The IT industry has been around since roughly the end of World War II — a shade over 60 years. Like any other sector, it is long enough in the tooth to have accumulated its fair share of founding myths — stories that are accepted as truth even though they may have no basis in reality.
For example, one tale often trotted out concerns a remark allegedly made by Thomas J Watson, a former president of IBM. Watson is rumoured to have said (dates as to when vary, but the 1940s tends to be the popular option): "I think there is a world market for maybe five computers."
Good story — but there is no evidence he ever said it. Even if he did, was it such a daft remark in the context of the time, when computers involved millions of scarce dollars and gazillions of temperamental valves?
In the interests of attempting to place our industry on a more scientific footing, ZDNet.co.uk has been trawling the archives to see whether certain myths stand up to scrutiny or crumble upon investigation.
1. Computers helped Britain win World War II
Undoubtedly the work of the team at Station X (Bletchley Park) was of great significance in aiding the Allied war effort. But, as victors, we have tended to exaggerate both the status of the Mark II Colossus as the first true "computer", and the impact of the intelligence Alan Turing and his team gathered by cracking German codes.
In fact the Germans, and indeed the Americans, can both claim to have beaten the Brits when it comes to creating the first computers — with Konrad Zuse's Z3 and Iowa State University's Atanasoff–Berry Computer respectively.
Myth truth value: 5/10
2. Britain had the first commercial computers
This patriotic belief has a sounder basis. It is certainly the case that Lyons' famous LEO (Lyons Electronic Office) was in operation by late 1951, a few months before any equivalent business-oriented computers got off the ground in the US.
A team from Lyons had been touring the US looking for good business ideas to pinch (and why not?), heard about these still-secret computer things, and one thing led to another. An interesting question is whether LEO, since it started doing calculations for Ford UK in 1956, can also claim to be the first platform for IT outsourcing.
Myth truth value: 8/10
3. The guy who invented the transistor did it because he was on his own in the lab
Amazingly, this version of events is pretty much supported by the facts behind the incredible contributions of engineer Jack Kilby to IT and the world in general.
Kilby came up with the world's first integrated circuit in 1958. He had recently joined Texas Instruments, starting just before the summer, and so hadn't accumulated enough service time to be eligible for the company-wide holiday. More or less on his own in the lab, he dreamed up the idea that sticking lots of components onto a single piece of semiconductor material would be worthwhile. The rest, as they say…
Myth truth value: 9/10
4. Steve Jobs ripped Xerox off
In one version of this tale, Steve Jobs, the "colourful" head of newly founded microcomputer firm Apple, was invited to look at the accomplishments of the innocent geniuses at Xerox's Parc (Palo Alto Research Center) in desktop/GUI design… and a few months later the Apple Mac came out, complete with Xerox innovations such as the mouse, icons and so on.
Some important facts need to be inserted here. The pioneering work Xerox Parc achieved was astonishing, yes — but it was not all 100 percent original work. The mouse, for instance, must be credited as an invention of Douglas Engelbart, from whose nearby Augmentation Research Center more than a few Parc people originated. In any case, as to what Jobs did or didn't do: Apple was specifically invited in to see Xerox's work, with Xerox taking shares in the young company in exchange. So, although lawyers grew rich off the back of this one for years, whatever else he is, Jobs on this count is no burglar.
Myth truth value: 4/10
5. IBM wanted to make Gary Kildall rich, but he ducked out of the meeting, so they gave the contract to Bill Gates instead
The story goes that representatives from IBM went to the headquarters of pioneering PC operating-system company Digital Research, wanting to talk to the main man, Gary Kildall. But he supposedly goofed off, choosing to fly his plane instead and leaving a flunky to take the meeting — and, when the man from IBM said that they could only speak if the representative signed a confidentiality agreement, the deal died on the spot. According to the story, Bill Gates' similarly nascent Microsoft won the lucrative order instead, to put an operating system on the first IBM PCs.
A couple of caveats. It's true that Kildall was in the air, but he was doing a highly responsible thing — rushing off to deliver software to a customer. It was his wife Dorothy and the company's own lawyer who met IBM and, quite sensibly, the former refused to sign until the big boss was back. In at least one version of the story Kildall came back in good time, before the IBM people had left, and signed the non-disclosure anyway.
The deal did fall apart and Microsoft did do very well as a result — which is why we labour under a DOS tyranny now, not a CP/M one, as it were. But that can't be attributed to what is often presented as an irresponsible or amateurish approach to business by the Kildalls.
Myth truth value: 3/10
6. SAP's rise to prominence was unstoppable
In some versions of the story, when SAP was formed in 1972 by five ex-IBM Deutschland employees, there was a clear game plan for (unfortunate phrase) global domination. And when the ERP craze burst onto the scene in the mid-1990s, SAP merely sat on the crest of the wave and became one of the most dominant software companies in the world.
In reality, SAP's rise to prominence was far from certain. In fact, it struggled for many years to break out of its home, German-speaking market, let alone Europe; for much of the 1980s and up until the mid-1990s, SAP was a dot on the horizon compared to established enterprise-software vendors such as Cincom and even Oracle.
It was the decision to take a chance and move Klaus Besier in to head up SAP's North America division in 1992 that turned things around. Besier built a backwater operation into a $700m (£338m) heavyweight in three years, and finally established SAP on the US corporate landscape. But he followed his own business methods, famously noting: "It is better to ask forgiveness than permission". Besier, of course, fell out massively with the rest of the management in Walldorf, and has since faded into obscurity, but it is worth recalling now and again that even SAP was at one time just another kid on the block.
Myth truth value: 4/10
7. Y2K was a total con
Well, it depends on your point of view. The Millennium Bug started to preoccupy many IT commentators from the mid-1990s. The problem was that older systems were crafted at a time when good programmer practice was to be as economical as possible with code due to machine-space limitations. Therefore, many dates were coded as the final two digits only — thus "99" instead of "1999". That was all well and good, but what would happen when the century clicked over? A stored "00" could mean either 1900 or 2000, and any arithmetic on such dates would go haywire as a consequence. All that code needed, surely, was to be checked and modified to prevent any problems.
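A minimal sketch makes the problem concrete. This is not code from any actual affected system, just an illustration of two-digit date arithmetic and the "windowing" technique that was one common remediation:

```python
def age_in_years(birth_yy, current_yy):
    """Naive two-digit date arithmetic, in the style of many pre-Y2K systems."""
    return current_yy - birth_yy

# In 1999 this works: someone born in "65" comes out as 99 - 65 = 34.
# In 2000, with the year stored as "00", the same person comes out as -65.
print(age_in_years(65, 99))  # 34
print(age_in_years(65, 0))   # -65

def expand_year(yy, pivot=50):
    """Windowing: map a two-digit year onto a 100-year range around a
    pivot, so existing data needn't be rewritten. Years at or above the
    pivot are treated as 19xx; years below it as 20xx."""
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(65))  # 1965
print(expand_year(0))   # 2000
```

The pivot value here is an arbitrary choice for illustration; real remediation projects picked pivots suited to the data in question (birth dates need a different window from expiry dates), which is part of why the fix was laborious rather than mechanical.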
But what started as a perfectly sensible call to action, to at least look at more mature systems and make sure they were not liable to generate runtime errors, became a millennial craze. Most of us will remember how doom-mongers saw Y2K as a potential civilisation-collapse issue. Millions of pounds were spent in an attempt to avert disaster. And when 1999 turned into 2000, not a single plane crashed as a result of the bug, nor did any nuclear power plant go into meltdown.
However, this is very much a hindsight perspective. At the time, there was considered to be enough credible cause for concern to warrant action. Indeed, there were documented Y2K bugs before the deadline, and there were in fact malfunctions on 1 January 2000, including at nuclear power plants in Japan.
Probably the best way to think about Y2K is that, at the very least, it was about best practice in systems maintenance and overhaul, and that the scale of the response may have headed off far more problems than those that occurred. Also, let's be honest, it was a good excuse to clear out the IT attic and refresh systems that were past their sell-by date.
Myth truth value: 3/10 (as we will never know what would have happened if no action had been taken)
8. "Microserfs" is an accurate depiction of staff life at Redmond
The cover story in the January 1994 issue of Wired magazine contained what many readers took to be a piece of straightforward journalism detailing the work and daily lives of IT toilers at Microsoft's main campus in Redmond, Washington — thus "Microserfs".
The piece had major impact, boosted not just by the cover "pictures" of the staffers profiled but also by the air of verisimilitude in the writing. It felt like a realistic depiction of a bunch of 20-year-olds working at Microsoft.
Yet it was just fiction. Canadian Generation X novelist Douglas Coupland had in essence printed the first chapters of what he would publish the following year as a full novel. But, in a move typical of its then innovative approach to publishing, the magazine had dressed it up as documentary rather than straight fiction.
Despite Coupland's imaginative writing and the authoritative style of the piece, insiders were not fooled for long: there is, for instance, no "Building 7" on the Redmond campus for new hires.
But, as no-one ever leaves Microsoft (see myth 10 below), the general public can't be blamed for feeling there is more than a hint of truth in these tales of harassed kids working 20-hour days and getting shouted at.
Myth truth value: 3/10
9. Computers help businesses operate better
Does anyone still believe this one? We are joking, of course: we all know that, in the 21st century, no IT project happens without complete line of business involvement; business and IT goals are firmly intertwined; and so forth.
Perhaps the best thing we can say on this one is to repeat, not so much a founding myth of IT, but one of the best yarns to have come out of IT consultancy. The story goes that a senior IT professional in a large retail group, shall we say, received a call that he was to go and see a famous chief executive. The executive's lack of interest in IT was famous within the group, so the IT professional went to see him in some confusion. As he arrived at the office, he saw a Securicor-type van outside, but went in anyway. The chief greeted the IT manager with the question: "Do you know what £1m looks like?" The manager had to admit he didn't. "No, I didn't think you did either, so come with me."
They walked down the corridor to another meeting room, where the larger-than-life entrepreneur opened the door for the manager to see banknotes piled everywhere on the desk and floor. "That is what a million quid looks like," the chief executive is alleged to have said. "And that is what your lack of ability with our systems has cost me — and why you are fired!"
Myth truth value: 10/10
10. Bill Gates is the devil
Gates has been the subject of a string of less-than-flattering books and even movies (for example, 1999's Pirates of Silicon Valley) that suggest he is an unscrupulous businessman, less than honest, and the creator of a predatory and vicious Microsoft culture.
Yet surely Gates's philanthropy since 2000 is the best "atonement" for whatever mistakes he has made along the way. One source credits the Bill & Melinda Gates Foundation with donating more than $29bn over the past seven years.
And we shouldn't forget that there are other personalities in the industry who rival Gates when it comes to negative press. Oracle's chief executive, Larry Ellison, is often accused of having created an equally rapacious business machine. His reputed arrogance resulted in Mike Wilson's unauthorised biography being called The Difference Between God and Larry Ellison: God Doesn't Think He's Larry Ellison.
Myth truth value: 6.66/10