No, John, it's web designers who are ruining the web!

Summary: No matter how fast your internet connection -- and I have a fast one, by UK standards -- too many web sites are still bloated and slow. Indeed, the problem seems to be getting worse.

TOPICS: Tech Industry

No matter how fast your internet connection -- and I have a fast one, by UK standards -- too many web sites are still bloated and slow. Indeed, the problem seems to be getting worse. "Trendy" sites such as Twitter, Google Plus, Facebook and Quora are now slow performers even with Google Chrome, and they're appalling with Internet Explorer 8. We've reached the point where just opening a page or two in Google's Chrome browser consumes more Windows resources than supposedly bloated programs such as Microsoft Word 2010, and opening anything in IE is even worse.

This is crazy now that more people are trying to access websites on devices that have much less processor power, memory and storage space than the average PC -- namely, tablets and smartphones. Although Apple originally thought websites running HTML5 would provide free online applications for its iPad tablets, the message from the marketplace is that people are willing to pay for apps instead.

In Sunday's Observer newspaper, John Naughton apportioned blame in a column headlined "Graphic designers are ruining the web". His initial complaint seems to be aimed at designers who want "pixel-perfect" control of pages, as though they were designing magazine pages rather than websites.

This is probably still a factor, and it reminds me that I had a good rant about "Decorators with keyboards" in my Guardian column back in July 2003. For those who missed the point, I followed that up with "Are most commercial websites designed by children?"

My point was not that websites should be ugly. Of course, I'd like them to be as beautiful as possible. My point was that you don't need a megabyte of pictures, graphics, Flash and junk JavaScript to make a site look nice, or to do something useful. Google's home page is an extremely obvious example.
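That "megabyte of junk" claim is easy to check roughly. The sketch below is my own illustration, not any real auditing tool: it counts the external asset references in a page's HTML with simple regexes, which is often enough to spot a bloated page before fetching a single image.

```python
import re

def page_weight_report(html: str) -> dict:
    """Rough, regex-based tally of the assets a page pulls in.

    A real audit would fetch each asset and sum the bytes; this sketch
    just counts the references, which is usually enough to spot bloat.
    """
    return {
        "scripts": len(re.findall(r"<script[^>]*\ssrc=", html, re.I)),
        "stylesheets": len(re.findall(r"<link[^>]*rel=[\"']stylesheet", html, re.I)),
        "images": len(re.findall(r"<img[^>]*\ssrc=", html, re.I)),
        "html_bytes": len(html.encode("utf-8")),
    }
```

Run against the markup of a "trendy" site versus Google's famously spartan home page, the difference in the counts tells the whole story.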

However, if you don't have the taste or the talent to combine beauty and functionality, we all know that users prefer functionality. That's why the top websites include Amazon, eBay, Wikipedia, Yahoo, Reddit, Craigslist and many other sites that would never win a beauty contest.

Unfortunately, John Naughton tried to make his point by contrasting bloated sites with Peter Norvig's home page. This links to some good content but is, frankly, horrible. It's bad, outdated, non-design.

I would have gone for Oliver Reichenstein's iA (Information Architecture) website, which is lightweight while also being stylish and -- unlike most websites -- readable. If all those "decorators with keyboards" had only read his 2006 article, Web Design is 95% Typography, the web might be a vastly better place.

Naughton might also have made it clear that websites are not solely or even mainly the work of "graphic designers" nowadays, or even "web designers". Big websites are built by big teams.

Yes, the graphic design might be rubbish, but today there are other things that slow down websites. These include adverts, pointless plug-ins, connections to social networking sites, and all the furniture that comes along for the ride. It's possible to go to a website (e.g. The Next Web) to read a story and find that only the opening line or two is visible, because the header stuff takes up so much room.

It is puzzling that we still have a bloatware problem, for three reasons. First, the idea that "less is more" is not exactly new to design schools, and nor is Apple's successful implementation of that simple idea. Second, there are plenty of people pointing out bad design, from Jakob Nielsen's UseIt to Vincent Flanders' Web Pages That Suck. It doesn't take much effort to go through Flanders' checklist of 165 Mortal Sins That Will Send Your Site to Web Design Hell.

Third, website managers/developers/designers/whatever must surely know that bloated designs are costing them readers, because almost everybody quits before slow websites have finished downloading. The desirable response time is 1 second. Any site that takes more than 15 seconds will start losing people at a rapid rate. Even if they stay for one page, it will put them off loading any more.
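Those thresholds are simple enough to encode. The following is a minimal sketch of my own (the function names, the `timed` helper, and the category labels are all hypothetical, though the 1-second and 15-second figures come from the paragraph above):

```python
import time

# Thresholds from the column: about 1 second is desirable; beyond
# roughly 15 seconds, a site starts shedding readers rapidly.
GOOD_SECONDS = 1.0
ABANDON_SECONDS = 15.0

def classify_load_time(seconds: float) -> str:
    """Bucket a measured load time against the column's thresholds."""
    if seconds <= GOOD_SECONDS:
        return "good"
    if seconds <= ABANDON_SECONDS:
        return "tolerable"
    return "losing readers"

def timed(fetch) -> float:
    """Time any zero-argument page-fetch callable, in seconds."""
    start = time.perf_counter()
    fetch()
    return time.perf_counter() - start
```

Wrapping any page fetch in `timed` and passing the result to `classify_load_time` gives an instant verdict a site manager could track over time.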

I'm delighted that Google now includes response times in its website evaluations, and it has a page of useful tools at Let's make the web faster.

Perhaps Google should move every site down by one place for each second it takes to download, beyond 10 seconds. That really would speed up the web.
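The proposal fits in a few lines. This is purely a hypothetical sketch of the penalty as described, my own invention and nothing Google actually does:

```python
def demoted_rank(rank: int, load_seconds: float, grace_seconds: float = 10.0) -> int:
    """Drop a result one place for each full second of load time
    beyond the grace period; fast sites keep their rank."""
    return rank + max(0, int(load_seconds - grace_seconds))
```

A site ranked third that takes 13 seconds to load would fall to sixth place, while anything under 10 seconds stays put. The incentive would be hard to ignore.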

@jackschofield

Jack Schofield

About Jack Schofield

Jack Schofield spent the 1970s editing photography magazines before becoming editor of an early UK computer magazine, Practical Computing. In 1983, he started writing a weekly computer column for the Guardian, and joined the staff to launch the newspaper's weekly computer supplement in 1985. This section launched the Guardian's first website and, in 2001, its first real blog. When the printed section was dropped after 25 years and a couple of reincarnations, he felt it was time for a change....


Talkback

3 comments
  • I think part of the problem is that programmers today have grown up in a world where machine resources appear infinite, and they don't give nearly enough thought to the impact their applications have on an infrastructure, unlike us oldies who had to hoard every byte and clock cycle just to get anything to run at all.
    AndyPagin-3879e
  • Speed issue number one is third-party code: advertisements and web metrics. Devs do not have direct control over that; it is usually chosen by the marketing department and put on top of the website, whether devs like it or not. Often, management has no idea that it impacts performance so heavily. Management believes in data, and this is mostly not measured; what is not measured does not exist.
    The second source of slowdowns is websites that rely heavily on Ajax, but with a weak implementation. A well-structured client-side app is something new, and we have not got it right yet, but it is improving. In the next two years, most major websites will move toward better architectures.
    Yes, Google flagging slow websites somehow would certainly boost the improvement of the web. I'm 100% for that; it would make me a rich man, since I have a web performance consultancy.
    olivvv
  • @olivvv

    Many thanks! Interesting info....
    Jack Schofield