The final word on the 'megapixel myth'

Summary: There's been a lot of talk of the "megapixel myth" lately, all started by David Pogue of the New York Times declaring that megapixels are a "big fat lie".  Fellow blogger David Berlind also did an entire series of blogs with his own set of tests, declaring "there's a good chance you'll never need more than 3 or 4 megapixels from your camera".

There's been a lot of talk of the "megapixel myth" lately, all started by David Pogue of the New York Times declaring that megapixels are a "big fat lie".  Fellow blogger David Berlind also did an entire series of blogs with his own set of tests, declaring "there's a good chance you'll never need more than 3 or 4 megapixels from your camera".  I read the articles and blogs, took a look at the methodologies described, and found some serious problems that affect the conclusions.  Here's a list of the issues, and I'll explain each one.  [Update 11:00 PM - Pogue responded in the talkback and I responded to him.]

  1. Computer down-sampling was used to render lower resolution images
  2. Lack of adequate test patterns for resolution testing
  3. Lossy JPEG compression was used, which adds artifacts
  4. Lens and motion factors not accounted for
  5. Not all image sensors are created equal
  6. Megapixels are a unit of area and not resolution
  7. Actual resolutions needed for printing

1: The first MAJOR issue is the use of computer down-sampling (or down-sizing) to generate lower resolution samples.  Someone pointed out to Pogue that it might invalidate the results, but Pogue dismissed it with the simple comment "I'm not entirely convinced".  David Pogue is attempting to conduct a "scientific" experiment to prove his hypothesis that the digital camera industry is pushing the "big fat lie" of more megapixels, yet he sees no problem with synthesizing the data and violating fundamental scientific principles.  I understand Pogue is no scientist, but he reaches a massive audience, and he has a responsibility to be accurate when he's making such bold and damning assertions about an industry.  I'm going to explain why this completely invalidates his experiments.

Whenever you down-sample a computer image, especially with a high-quality algorithm like the one in Photoshop, you are guaranteed to get improved image quality because the noise, missed focus, and slight motion blur in the image are averaged out, leaving you with the purest of images that maximizes the effectiveness of every single pixel.  If you took a 4500x3000 (13.5 megapixel) image from a high-end camera and down-sampled it to a 2250x1500 (3.375 megapixel) image, I can guarantee you that the resulting 3.375 megapixel image will be vastly superior to anything captured by a native 4 megapixel camera.  Pogue simply assumes that a down-sampled image is the same as a lower resolution image captured with fewer megapixels.  Not only is it an assumption, it's a really bad one, and anyone who runs the experiment or works with digital images a lot knows that the native 4 megapixel camera will invariably produce noise and imperfections in the image.  The down-sampled 3.375 megapixel image, on the other hand, will have far fewer imperfections because the noise happens at the 13.5 megapixel level and is mostly averaged out, producing a very clean image at 3.375 megapixels.  [Updated 11:00 PM - Pogue did capitulate and allowed a professional photographer to use cropping instead of down-sampling to produce the lower megapixel samples.]
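
To see the effect for yourself, here's a minimal sketch of the kind of down-sampling Pogue relied on, written in Python with the Pillow imaging library (the file names are hypothetical; any high-resolution capture will do):

[pre]
# Down-sampling sketch using Pillow. "highres.tif" stands in for a
# hypothetical 4500x3000 (13.5 megapixel) capture. The high-quality Lanczos
# filter averages several source pixels into each output pixel, which is
# exactly why per-pixel sensor noise shrinks in the down-sampled result.
from PIL import Image

img = Image.open("highres.tif")                  # 4500x3000 source image
small = img.resize((2250, 1500), Image.LANCZOS)  # 3.375 megapixel output
small.save("downsampled.tif")
[/pre]

Compare that output against a native 3 or 4 megapixel capture of the same scene and the down-sampled version will look cleaner, which is exactly the bias that invalidates the comparison.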

The concept of optical resolution

2: The other big issue is the lack of an adequate test pattern.  You can't just shoot some random object, eyeball it, and say "yup, this one's better than the other one"; that's totally unscientific and inaccurate.  Any real resolution test uses a standardized pattern.  The EIA 1956 video resolution target, for example, is used to test television systems, but you can use five of them side by side to get five times the resolution, which is enough to test digital cameras.  Not using a standardized pattern for resolution testing is like going to an eye doctor who holds up his fingers and asks how many you can see to figure out whether your eyesight is good or bad.  Would you actually trust that optometrist?

You can purchase official EIA test patterns, but they're not cheap, especially if you need five of them for digital camera testing.  Note that you can buy one real chart and use four blank sheets of paper of equal size as space fillers.  I recreated a high-quality vector-based pattern in Corel Draw 7 back in late 1999 and posted it on the Internet for anyone to download and print.  I've reposted it here for you to download, but be aware that it's only available in Corel 7 and Illustrator 8 formats, so you'll need something that can import one of those files.  I tried an SVG version, but the plug-in did a horrible job on the print preview, and conversions to Visio and Word were equally disappointing.  Note that this can't be used for official testing, but it's pretty darn close if you print it on the brightest laser paper at 1200 DPI.  Here is what the pattern looks like below.

To measure digital camera resolution, you need five of these sheets plastered end to end, and you need to fill the entire camera frame with them.  Note that most SLR viewfinders only show you about 85% of the image, so you'll have to play with it until all five sheets fill the full frame.  You don't actually need a 5x5 grid of sheets; five sheets from top to bottom will suffice, as long as they fill the frame perfectly from top to bottom.  Then all you need to do is look at the two sections I highlighted in red and see where the wedges blur to the point of being indistinguishable.  Since a single EIA chart filling the full frame tests 200 to 800 lines of resolution, five of these sheets let you test a digital camera between 1000 and 4000 lines of resolution.  I did some sloppy measurements for illustration purposes, but the official methodology is to zoom in and use the eyedropper tool to see where the color reaches the halfway grey point between the lightest and darkest sections.  If the darkest part has a value of 200 and the lightest has a value of 100, then the point at which the reading hits 150 is the midpoint where the wedge is no longer distinguishable, and that's the optical resolution limit of the camera that captured the image.  Also note that I should have used a tripod, much more lighting, and a slow shutter with much more exposure to get a cleaner, lighter white, but the following was the best I could do in a hurry.
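
For anyone who wants to automate the eyedropper step, here's a minimal sketch of that halfway-grey test in Python; the readings and line markers below are made-up numbers for illustration, not measurements from a real chart:

[pre]
# Find the line marker where the dark-bar reading crosses the halfway grey
# point between the darkest and lightest values, e.g. (200 + 100) / 2 = 150.
def resolution_limit(readings, markers, dark, light):
    midpoint = (dark + light) / 2.0
    for value, lines in zip(readings, markers):
        # once a reading is closer to the light value than the midpoint is,
        # the wedge has blurred together and is no longer distinguishable
        if abs(value - light) < abs(midpoint - light):
            return lines
    return markers[-1]  # still resolved at the end of the wedge

# Hypothetical eyedropper readings of the dark bars as the wedge narrows:
print(resolution_limit([195, 185, 160, 145, 130],
                       [200, 250, 300, 350, 400], dark=200, light=100))
# -> 350, i.e. the wedge stops being distinguishable near the 350-line mark
[/pre]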

[Figure: Calculating optical resolution - ideal sample, vertical sample, and horizontal sample]

Note how my resolution peaks at the 400-line marker.  Since I'm using five of these sheets, the actual resolution is five times higher, which means I can do about 2000 lines with my 8 megapixel digital SLR even under less-than-ideal conditions without adequate exposure or a tripod.  I can assure you that a 4 megapixel digital camera wouldn't do nearly as well.  Some may wonder what kind of uber geek would have time to do this kind of optical resolution testing just to decide what camera to buy; my answer is that you don't have to.  Sites like Digital Photography Review do the most hardcore, in-depth reviews of digital cameras I've ever seen, such as this review of the Canon 400D, and they show you actual sample images under all sorts of conditions.

Other things to avoid when testing camera resolution

3: A resolution test should never use a lossy compression format like JPEG if we're trying to see the best quality a camera can deliver.  At the very least TIFF, or better yet the RAW image format, should be used.  [Update 11:00 PM - I should have been more specific here: it was David Berlind who used the JPEG format, not David Pogue]

4: Lens and motion issues must be kept to a minimum.  The slightest motion or misfocus prevents light from converging properly on the image sensor.  You could easily end up with an effective optical resolution that would only justify 4 megapixels rather than the 16 megapixel sensor you're actually using, because you shifted the camera by the length of one pixel or the lens was out of focus and each point of light spread across 2x2 pixels.  In that case you're looking at the limitation of the lens or an inadequate shutter speed causing blur, not the limitation of the image sensor.  No matter how many quality megapixels the camera has, it's the old saying: garbage in, garbage out.
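
As a rough back-of-the-envelope illustration of that point (my own sketch, not a formal optics model), if blur smears each point of light across an n x n block of pixels, the effective resolution drops by a factor of n squared:

[pre]
# If focus or motion smears each point of light over an n x n pixel block,
# only about 1/n^2 of the sensor's pixels carry independent detail.
def effective_megapixels(sensor_mp, blur_pixels):
    return sensor_mp / (blur_pixels ** 2)

print(effective_megapixels(16, 2))  # 16 MP blurred over 2x2 pixels -> 4.0 MP
[/pre]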

5: Not all digital image sensors are created equal.  A dirt-cheap camera with a supposed 4 megapixel sensor won't perform the same as a digital SLR with a 4 megapixel sensor.  So it's not just the number of megapixels; it's the quality of the megapixels that counts too.  The more accurate way of measuring resolution is to measure optical resolution using the EIA test charts.  If I had my way, I'd ban megapixel advertising and demand optical resolution specifications: how many lines the camera can resolve on a standardized target under a standardized set of lighting conditions.  That would tell you the true resolution capability rather than the theoretical capability in megapixels, which may or may not be accurate.

6: And finally, one of the biggest megapixel myths is the assumption that megapixels somehow equal resolution.  One naturally assumes that 8 megapixels must be twice as good, or twice the resolution, of 4 megapixels.  That's simply not true, because twice the vertical and horizontal resolution of a 4 megapixel image is 16 megapixels.  Megapixels are a unit of area, not a unit of resolution.  You can't expect something to have double the resolution unless it has 4 times the megapixels, or 4 times the resolution unless it has 16 times the megapixels.  You'd need a 64 megapixel camera to quadruple the resolution of a 4 megapixel camera.
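
The relationship takes only a few lines of arithmetic to verify: linear resolution scales with the square root of the pixel count.

[pre]
# Linear resolution gain between two sensors: the square root of the
# megapixel ratio, not the ratio itself.
import math

def linear_resolution_gain(old_mp, new_mp):
    return math.sqrt(new_mp / old_mp)

print(linear_resolution_gain(4, 8))   # ~1.41x, NOT double the resolution
print(linear_resolution_gain(4, 16))  # 2.0x, double the resolution
print(linear_resolution_gain(4, 64))  # 4.0x, quadruple the resolution
[/pre]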

7: If you want semi-professional-grade images printed at 8x12 inches at 300 DPI, you need 8.64 megapixels for an exact translation.  A 9x13.5 print (which costs about $2.50 at a one-hour shop) at 300 DPI requires roughly 11 megapixels, which is what newer digital SLRs under $1000 can deliver these days.  Of course, if you want to be able to crop and/or down-sample, then the more pixels the better, but anything over 11 megapixels at this time gets very expensive and puts you in the $4000+ range.
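
The print math is easy to check for any size; here's the calculation behind those two numbers:

[pre]
# Megapixels needed for an exact pixel-for-pixel print at a given size/DPI.
def print_megapixels(width_in, height_in, dpi=300):
    return (width_in * dpi) * (height_in * dpi) / 1e6

print(print_megapixels(8, 12))    # 8.64 MP for an 8x12 print at 300 DPI
print(print_megapixels(9, 13.5))  # ~10.9 MP for a 9x13.5 print at 300 DPI
[/pre]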

So in conclusion, megapixels absolutely do matter, though only the square root of that number gives you an accurate idea of how much theoretical resolution you're getting.  The quality of the sensor and other factors, such as the lens and the person operating the camera, play a huge role in the quality of your images.  The biggest factor completely outside the camera's control is the amount of quality light hitting the subject, and lower light levels will always produce more noise than bright daylight.  For example, daylight is about 1000 times stronger than indoor lighting, which is why it's possible to shoot at 1/1000th of a second outside but only get 1-second shutter speeds indoors.
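
That shutter speed arithmetic follows from holding exposure constant (ignoring aperture and ISO changes, which this simplified sketch leaves out): light level times shutter time must stay roughly the same.

[pre]
# Equal exposure requires (light level) x (shutter time) to stay constant,
# so a 1000x drop in light forces a roughly 1000x longer shutter speed.
daylight, indoor = 1000.0, 1.0   # relative light levels (illustrative)
outdoor_shutter = 1.0 / 1000     # 1/1000th of a second outside
indoor_shutter = outdoor_shutter * (daylight / indoor)
print(indoor_shutter)            # -> 1.0 second indoors
[/pre]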

The point is that we cannot make bold declarations that megapixels are a "big fat lie", especially when the test methodology is completely unscientific.  We're fortunate to have technology that keeps getting better and cheaper, and it's not uncommon to find 10+ megapixel cameras under $800 these days and 8 megapixel cameras for less than $400.  Don't base your buying decision on unscientific slogans; the smart advice is to read the in-depth reviews before you buy, because the more educated you are, the less likely you are to get ripped off.

Talkback

101 comments
  • Great article

    Great article George - couldn't agree more.

    The only definite area where megapixels are a bit of a scam is the amateur digicam market. My mom was very proud to come home with an 8 megapixel camera (which a good salesman obviously had something to do with), only to not realise that every picture she was taking was at 640x480 resolution! And she seemed quite upset and confused when I told her she didn't need 8 megapixels anyway if she never prints her photos (which she doesn't).

    Ah well ... it happened to MHz, maybe it'll happen to Megapixels.
    ross2000
    • MHz wasn't a myth, it just had limits

      Thanks for your comments.

      There was nothing wrong with more MHz. You get more speed from more instructions per cycle or from jacking up the MHz/GHz. No one should ever say MHz doesn't matter, because it does, but being able to process more instructions per cycle is also important. The problem with GHz was the thermal brick wall; Intel realized that with the Pentium 4 and switched to the Pentium M/Core 2.
      georgeou
      • Yes there is George...

        when there is a bottleneck in the bus, all the MHz in the world won't make a difference. Until hard drives are able to read/write faster at a reasonable cost... MHz doesn't matter. NOW if you combine MHz with RAM... okay, you are correct. ]:)
        Linux User 147560
        • Absolute statements are refuge of ignorant

          Intel has gotten around the FSB limitation with dual-channel, high-speed DDR2-533 or DDR2-667. Most overclockers run DDR2-800 or 1000. It's all an engineering tradeoff that Intel has made for now, and they're obviously winning on overall design and speed.

          With the next generation of Intel processors being released mid-2008, they'll go to a new memory architecture. For now, they're using brute force and brute channels.

          Anyone who says GHz don't matter is making a blanket absolute statement and they don't know what they're talking about.
          georgeou
          • Are you calling me ignorant?

            And I don't use Intel, I prefer AMD. ]:)
            Linux User 147560
          • Are you saying GHz doesn't matter?

            If you are, then you've answered your own question :). Like I said, absolute statements can never be backed up because they're so easy to blow out of the water. GHz absolutely DO matter within a given architecture. The only issue with GHz is the thermal ceiling. Intel was flying high with the Pentium 4 and blowing everyone out of the water when they raced to 3 GHz. The problem is that they hit a thermal brick wall and stayed in the 3 GHz zone for three years. Intel has since fixed that with a superior architecture and superior clock speeds in Core 2.
            georgeou
          • You prefer AMD? Surprise surprise.

            You may not be ignorant, just stubborn and foolish. I used to praise the very motherboard that had an AMD CPU mounted on it because for a good period of time they were the ruling class. But they are not the ruling class now.

            Perhaps between AMD and Intel this might be some kind of game, who knows; it would have to be a high-stakes game. It's clearly some kind of game to you, because you don't mind putting your money and your reputation down on the current loser (which is AMD currently) in the apparent hope that if and when AMD regains the title of champ for a period, you can somehow lay claim to some kind of higher knowledge that AMD was in some way always better than Intel.

            I suspect you are going to come back and say that all you have said is "I don't use Intel, I prefer AMD" and not that AMD currently has the faster processor. Big deal. Your statement that you prefer AMD smacks of smugness. Such a blank, unqualified statement implies that you prefer AMD over Intel for no other reason than you think AMD makes a superior CPU on some basis general enough that it would make a rat's a@$ of a difference to someone other than you. Well, the news is that, generally speaking, these days if you are prepared to spend more than a couple of hundred dollars on a CPU, the Intel Core 2 Duo is your best-performing and most cost-effective choice, unless you are stuck with an AMD motherboard you cannot afford to replace. And that, my friend, is not "preferring" AMD; that's just being too broke to afford to switch to the much better Intel CPU, so at that point your interest in AMD cannot be generalized.

            But I'm willing to give you one benefit of the doubt: I don't think you're ignorant. I suspect you're stubborn, obstinate and obtuse when it comes to computer hardware, and probably software as well. For me, I guess I'm just lucky I don't tie my self-esteem into my computer hardware/software. Personally, every time I purchase either hardware or software, the amount of money I have dictates what I buy, and I could care less who makes it, at all. It just has to do the best for what I need and want for what I can afford. And right now Intel has the best CPU every time for a new computer once you hit the $200+ mark on the CPU.

            Please don't come back and prove yourself to be ignorant, it makes me look bad for predicting you are not.
            Cayble
          • Extremists

            All extremists should be shot.

            think about it.
            Sxooter_z
          • ...counselling someone to commit an indictable

            ...offense is in itself a crime. In Qannadda, punishable to the same degree as the proposed offence itself.
            Feldwebel Wolfenstool
  • Optional Illusion

    I have no idea what you are talking about George.
    That impresses me.
    Good one.
    D T Schmitz
    • You're not an uber geek?!?!

      No! Please don't disappoint me! :)
      georgeou
      • Take your pick...

        [pre]
        Geek
        From Wikipedia, the free encyclopedia
        (Redirected from Uber geek)
        Jump to: navigation, search

        A geek is an individual who is fascinated by knowledge and imagination, usually electronic or virtual in nature. Geek may not always have the same meaning as the term nerd. The Merriam-Webster definitions are "1: a carnival performer often billed as a wild man whose act usually includes biting the head off a live chicken or snake 2: a person often of an intellectual bent who is disliked 3: an enthusiast or expert especially in a technological field or activity," though these are only three of many definitions.
        [/pre]

        ;)
        D T Schmitz
  • Going up rather than down

    Great article George.

    However, going the other way often gives quite pleasing results to the untrained eye and may explain why a lot of people may be happier with the lower resolution. I've often blown up and printed computer screens and low res photos (say 1200 pixels in width) to poster size using the algorithms in my favourite graphic editor PhotoImpact and been quite pleased with the results.

    The other reason the resolution may not matter much any more to a lot of people is the growing habit of showing digital photos on TVs and computer displays rather than printing them. However, for my own work I'll always get the highest-res photo I can and then step it down to my target dimensions - I agree, you can see the difference ;-)
    TonyMcS
    • Thanks, photos are precious

      Thanks, photos are precious and they're memories. I want as many of my memories preserved as possible.

      8 megapixel cameras are so cheap now and storage cards are so dirt cheap that you might as well get 8. Why go out of your way to buy a lousy 4 megapixel camera? To me, images are so precious that I want the best, and I store all my photos multiple times in RAW format. 8 MB of RAW storage per image works out to 5 photos per penny in storage! Back in the old days it cost $.50 to $1 per image depending on how good you wanted the film stock and the processing to be. That's 500x cheaper for digital RAW storage!
      georgeou
  • A minor quibble

    You used "downsampling" when talking about resizing the pics. In "graphicseze," downsampling refers to reducing the resolution of a pic (like the Acrobat Distiller press setting downsampling rasters above 400 dpi to 300 dpi, or Photoshop Save For Web downsampling to 72 dpi).

    Making things physically smaller is commonly referred to as downsizing or, more commonly, resizing.

    It amazes me how many people, as you pointed out, do not understand the difference between physical size (pixels x pixels) and resolution. A large part of my job is preparing pics for printing. I often get total garbage from pro photogs because, after switching from film, they don't understand the computer end of digital photography.

    By the way, the printer that Pogue had the pic run on was the Rolls Royce of digital color printers (starts at $65K). It comes with its own computer and a special interpolation scheme made especially for increasing the physical size of a print. It's not anything like normal printers and presses.
    j.m.galvin
    • Downsizing is right, but you are downsampling

      Downsizing is right, but you are also downsampling along with the downsizing when you save at 72 dpi for the web. Pogue's use of downsizing, with no EIA charts, to conclude that megapixels are a "big fat lie" is nonsense.
      georgeou
      • Megapixel = noun

        "Pogue's use of downsizing and no use of EIA charts to conclude
        megapixels are a "big fat lie" is nonsense"

        George. Until you read the article, you will continue to make very little
        sense.

        I never wrote "Megapixels are a big fat lie." This is the second time you've
        misquoted me in a nonsensical way. How can a noun be a lie?

        Why are you so adamant about not reading the actual article?

        There you would see what I'm calling the Megapixel Myth. It is this: "The
        more megapixels a camera has, the better the pictures."

        I can prove my point by showing you the 6-megapixel output from my
        Nikon D80, and compare it with the 10-megapixel output from the pocket
        cam of your choice.


        --David Pogue
        David Pogue
        • Your example is an exception

          Your example is an exception and not the rule, because of the superior optics in the lens and the sensor electronics. You also failed to explain why that is the case. I explained it by introducing readers to the concept of OPTICAL RESOLUTION, and I actually ran some tests using standardized resolution charts. I would think that's superior to the Photoshop downsizing method or the hairy arm method.
          georgeou
        • come on now

          How can a noun be a lie?

          I had no trouble understanding what George meant by "megapixels are a big fat lie." Everyone knows a digital camera's hottest selling point is its number of megapixels. When George said that you called megapixels a big fat lie, I didn't for one second think you were denying the existence of megapixels. That's stupid.

          You say "The
          more megapixels a camera has, the better the pictures." And that's exactly what I got from George's translation.

          When debating in a forum, it doesn't really benefit anyone to nitpick the most obvious of metaphors and allusions. Save it for the details that actually have the devil in them.
          mindilator9