Video sites grapple with specter of smut

Critics say smut and brutal images can be easy for children to access at top upload sites.
Written by Greg Sandoval, Contributor
The text accompanying the video says a man has stolen a pair of women's underwear.

The clip, first posted on video-sharing site YouTube on May 31 and viewed more than 1,500 times over six days, shows a man standing in what appears to be a dimly lit public bathroom, wearing what indeed appears to be panties. As the video plays, the man, shown from the stomach down and thus faceless, begins to fondle himself.

YouTube is not the only well-known video site where such graphic content appears. Many of the companies that let users display homemade videos on the Web are having difficulty keeping their pages smut-free. A weeklong review of some of the top user-generated video sites by CNET News.com unearthed scenes of beheadings, masturbation, bloody car accidents, bondage and sadomasochism. No child pornography was discovered, however.

Online video-sharing sites such as YouTube, Yahoo Video and Google Video are competing in one of the fastest-growing entertainment segments on the Web. They may also be victims of their own popularity. The vast majority of videos available on these sites depict budding musicians, comedians, filmmakers or just people vying for attention in innocuous--if sometimes oddball--ways.

But industry insiders say that as the sites collect greater amounts of video, tracking and purging sexually explicit and graphically violent content will become increasingly difficult. Prescreening millions of homemade videos is likely to be costly and problematic, they say, yet failing to police the sites could scare off advertisers and lead to clashes with family advocates and lawmakers.

Materials inappropriate for children are too easy for kids to get their hands on at Google Video, according to the New York State Consumer Protection Board, which issued a warning to parents on June 12. The board has a broad mandate to inform and educate consumers but has no regulatory powers. Nonetheless, it will continue to publicize the issue in an effort to force Google Video and other video-sharing sites to do more to protect children, said Jon Sorensen, the board's spokesman.

"Very few of the other (video-sharing) sites feature this kind of content on their front page," Sorensen said Thursday. "It's disappointing because we contacted (Google Video) two weeks ago, and they said they were trying to make changes. Still, this stuff continues" to show up.

In an e-mail to CNET News.com, Google said it removes such content when made aware of it.

Unlike New York's consumer protection board, the federal government does have the power to force change. A bill proposed this month in the U.S. Senate would require any Web site that offers sexually explicit content to post warning labels on each offending page or face imprisonment.

The authors of the bill, called the Stop Adults' Facilitation of the Exploitation of Youth Act, or the Safety Act, want to decrease the chances that children can inadvertently be exposed to pornography by Web sites that mislabel their materials either deliberately or through negligence.

Video-sharing sites are also likely to face enormous pressure from big advertisers to clean up their pages. Some companies are eager to partner with the sector's powerhouses but will steer clear if it means that one of their ads sits next to unsuitable content, said Greg Sterling, who operates Web research company Sterling Market Intelligence.

"There's absolutely a big opportunity for these sites to sell advertising, provided that they guarantee (what kind of) content...goes next to the ads," Sterling said. "Advertisers are going to want control of where their brands are placed."

That's not going to be easy for some sites. Take, for example, YouTube, the largest video-sharing site, with nearly 13 million users per month. Guaranteeing the quality of content on the site would mean hiring employees to eyeball each frame of the more than 50,000 videos that get posted daily. YouTube allows videos to last up to 10 minutes, but most are much shorter. If the average video is 3 minutes, then YouTube would be monitoring 2,500 hours' worth of video a day.
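
The arithmetic is straightforward to check. The snippet below simply reproduces the story's back-of-envelope figures; the reviewer head count at the end is an added, purely illustrative assumption (one person watching at real-time speed over an eight-hour shift), not anything the companies have disclosed.

```python
# Back-of-envelope math using the figures cited in this story:
# roughly 50,000 uploads a day, an assumed average length of 3 minutes.
uploads_per_day = 50_000
avg_minutes_per_video = 3

hours_per_day = uploads_per_day * avg_minutes_per_video / 60
print(f"Video to screen per day: {hours_per_day:,.0f} hours")  # 2,500 hours

# Illustrative assumption (not from the story): one reviewer watching at
# real-time speed for an 8-hour shift covers 8 hours of footage a day.
reviewers_needed = hours_per_day / 8
print(f"Reviewers needed at real-time playback: about {reviewers_needed:,.0f}")  # several hundred
```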

"It's going to be hard to guarantee absolute protection," said Mike McGuire, a research analyst at Gartner. "You have to wonder if (these sites) foresaw the kind of expense and effort that they are going to have to put into monitoring their sites."

Yahoo Video has installed a screening system that, when enabled, prevents visitors from accessing adult content that may wind up on the site. Google, which has a similar screening system for its photo site, hasn't installed one for Google Video. In its e-mail to News.com, the company said it has added new screening methods but declined to provide details.

YouTube doesn't prescreen any videos, said company spokeswoman Julie Supan. People are technically able to post anything they want, immediately. The company's user agreement, however--like those at most rival sites--prohibits material that could be considered pornographic, obscene or unlawful, and YouTube leaves it to the community to report violations.

"As the largest community for video on the Web, we could not review all the content that goes up on the site," Supan said. "Community policing on the Internet has proven very effective over the last 10 years."

YouTube users can flag content they think violates the user agreement. If a video collects enough flags (the company declines to disclose the number), YouTube will review the clip and pull it if executives agree the material is objectionable, Supan said.

But not all flagged material gets pulled. If executives think a clip doesn't violate the agreement, it remains on the site but is accessible only to registered users 18 and older. YouTube encourages visitors to register, a process that requires entering a birth date. People who say they're younger than 13 are barred from registering.
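
Supan's description amounts to a simple workflow: flags accumulate on a clip until an undisclosed threshold triggers a human review, which ends in removal, an 18-and-over restriction, or no action. The sketch below is purely hypothetical; the threshold value, function names and verdict labels are invented for illustration, since YouTube has not published how its system works internally.

```python
from dataclasses import dataclass

# Hypothetical sketch of the flag-then-review flow described above.
# The real threshold is undisclosed; FLAG_THRESHOLD here is invented.
FLAG_THRESHOLD = 5

@dataclass
class Clip:
    title: str
    flags: int = 0
    removed: bool = False
    age_restricted: bool = False

def flag(clip: Clip) -> None:
    """A viewer flags the clip; enough flags send it to human review."""
    clip.flags += 1
    if clip.flags >= FLAG_THRESHOLD and not clip.removed:
        review(clip)

def review(clip: Clip) -> None:
    """Reviewers decide to pull the clip, restrict it, or leave it alone."""
    verdict = human_judgment(clip)        # placeholder for the manual step
    if verdict == "violates_terms":
        clip.removed = True               # pulled from the site
    elif verdict == "mature_but_allowed":
        clip.age_restricted = True        # visible only to registered users 18 and older

def human_judgment(clip: Clip) -> str:
    # Stub: in reality this is a call made by the site's staff.
    return "mature_but_allowed"
```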

This restriction process, however, can be circumvented. In one instance, News.com encountered a clip that had been flagged and restricted, but an identical, unrestricted clip was available under a slightly different title.

And there's no guarantee that a potentially objectionable clip will come to light to begin with. An unrestricted clip of a female television host in Europe, who spoke to a live audience while wearing only a bikini bottom, was available on the site for at least three days.

Over at Google Video, which also said it relies on user feedback to monitor content, material uploaded in recent weeks includes a parody of a car commercial that features an announcer using numerous expletives during a mock sales pitch.

"Self-policing flat out doesn't work," said Peter Pham, director of business development at Photobucket, a fast-growing photo-sharing site that has recently jumped into video. "The problem is that most of the people finding this material are the people who are looking for this material. And they aren't going to complain."

By eyeballing each frame of every clip submitted, companies such as Photobucket and San Diego-based start-up VMix want to avoid angering advertisers or family advocates. All videos on Photobucket's site get reviewed, Pham said. The company has developed software that creates a frame-by-frame "map" of a video, allowing workers to evaluate content at a glance, and it recently hired 50 people to monitor incoming video and photos.
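
Photobucket has not described its tooling in any detail, so the sketch below is only one plausible way to build such a frame "map": sample frames at even intervals with OpenCV and tile them into a single contact sheet a reviewer can scan at a glance. The function name, grid size and thumbnail dimensions are all assumptions for illustration.

```python
import cv2          # pip install opencv-python
import numpy as np

def contact_sheet(video_path: str, cols: int = 6, rows: int = 4,
                  thumb_size: tuple = (160, 90)) -> np.ndarray:
    """Sample frames evenly across a clip and tile them into one image."""
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    wanted = cols * rows
    # Evenly spaced frame indices spanning the whole clip.
    indices = np.linspace(0, max(total - 1, 0), num=wanted, dtype=int)

    thumbs = []
    for idx in indices:
        cap.set(cv2.CAP_PROP_POS_FRAMES, int(idx))
        ok, frame = cap.read()
        if not ok:
            # Unreadable frame: use a black placeholder so the grid stays intact.
            frame = np.zeros((thumb_size[1], thumb_size[0], 3), dtype=np.uint8)
        thumbs.append(cv2.resize(frame, thumb_size))
    cap.release()

    # Tile the thumbnails into a rows x cols grid.
    grid_rows = [np.hstack(thumbs[r * cols:(r + 1) * cols]) for r in range(rows)]
    return np.vstack(grid_rows)

# Hypothetical usage:
# sheet = contact_sheet("incoming_upload.mp4")
# cv2.imwrite("incoming_upload_map.jpg", sheet)
```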

A family-friendly site doesn't come cheap: Photobucket's review operation is projected to cost $2 million per year, Pham said. VMix is doing something similar on a smaller scale.

"What you are trying to do is discourage people from posting this kind of material on your site," Jeff Davids, VMix's chief financial officer, said at the Digital Media Summit in Los Angeles earlier this month. "If they see that their material isn't going up on the site, they're going to go someplace else."

Prescreening may work for small companies with a minuscule market share, but YouTube is another matter. According to traffic-tracking site Hitwise, more than 42 percent of all visits to video-sharing sites occur at YouTube. The privately held company would conceivably need hundreds, if not thousands, of people reviewing video.

And that won't guarantee a clean site, Supan said. "There are always going to be people who try to take advantage of the system, whatever it is," she said.
