Google, under an agreement with the Korean government, will take steps to protect minors from material deemed harmful on its Korean keyword search service, www.google.co.kr.
Google has recently been working closely with South Korea's Ministry of Information and Communication (MIC) to safeguard children and teens from exposure to potentially harmful or obscene sexual material when they use Korean keyword search.
To accommodate the South Korean government's requests, Google will launch its own "SafeSearch" filtering system along with an adult-authentication process. SafeSearch checks keywords, phrases, URLs and open-directory searches, and can filter adult-only sites out of query results.
For example, if a user enters sex-related keywords in a search query, adult authentication is required before the results can be displayed. If the user is underage, SafeSearch's filtered results are shown instead. Google aims to deploy the SafeSearch filtering and age-verification process in its Korean keyword search by the end of August.
With these measures, Google joins other major Korean portals that have implemented similar safeguards to keep minors from reaching inappropriate sites. The Korea Internet Safety Commission will provide Google with an updated list of harmful URLs, and Google will also commit key personnel to handling minors' safety.
"Our continued efforts with the Korean government to come up with the best solution for protecting Korean children and youth from pornographic material have paid off, and we are delighted," said Kent Walker, Google's lead attorney.
Meanwhile, an official from the Korean information ministry added, "I feel very positive about this latest effort by Google, working closely with the Korean government and showing commitment to protecting minors."
Hyojeoung Kim of ZDNet Korea reported from Seoul, South Korea.