Ask any senior executive why IT fails to deliver value and the honest ones will soon use words like "unresponsive" and "incompetent," while the more politically correct will offer platitudes about hiring problems, systems complexity, media-driven expectations, and communications barriers - all of which have IT's unwillingness or inability to respond promptly to user needs at their root.
As a consultant I've seen more than my share of IT failures - and incompetence, along with its kissing cousins, ill will and careerism, does play a role in many of them. What you mostly find in organizations with serious IT failures, however, is competent IT people competently doing the wrong things - and in most of those cases the blame affixes squarely to the organization's top executives.
Thus the number one reason IT fails to deliver value in many organizations is indeed incompetence - but it's incompetence among the top executives, not the IT people.
In most cases both the chicken and the egg in this situation is a socially fueled refusal by top management to involve itself with technical, particularly computer-related, issues - a continuation, I think, of the fear-based avoidance strategies by which the pretty party people - the same people who later in life desperately want their kids to be smart - separate themselves from the nerds while in school.
The applicable adage here is that you get what you measure - and when top management treats technology as lower caste, what it ends up measuring is conformance to inappropriate expectations: social expectations with respect to the IT managers they come in contact with, technology expectations in terms of popularity as measured by traditional media exposure, and performance expectations as measured against their own need to denigrate the nerds involved in making it all work.
In operation this produces a socially self-fulfilling prophecy: if you want to prove that IT people don't matter, hiring the same people to do the same things in the same ways using the same tools as your competitors ensures that IT doesn't matter - and therefore that IT people don't matter.
The most direct consequence of this is expressed in organizational structure: specifically, through the continuation of centralized IT control despite all evidence of its ineffectiveness and cost. Basically, centralized control is natural for data processing and unnatural for science-based computing - meaning that top management's distaste for dealing with nerds has allowed the data processing side to win where it should have lost.
This reflects history: when data processing started to develop its own organizational forms and cultural inertia back in the 1920s, it did so as a service group within Finance. As a result it was treated as a corporate cost, and growth in its operations was therefore generally sold on cost savings - usually to be achieved via lay-offs outside IT.
Thirty years later, when science-based computing came along, none of the people involved gave data processing any consideration - and when users started bringing this stuff into business and government during the sixties and seventies, they didn't either. That created an us-versus-them situation in which user management either bypassed the central data processing organization to buy packaged "solutions" they could control and implement directly, or adopted simple dodges like cloud computing (then known as time sharing) to get around centrally imposed budget and related controls.
The budget conflicts this created between users and central data processing organizations, which saw themselves as in control of corporate computing, came to a head for IBM and its clients when the first VAXes became capable of running complete resource management packages and thus enabled a user-driven play for access to the financial data needed to complete ERP suites.
Data processing won the ensuing conflict by leveraging its financial clout, its organizational position in finance, and - above all - senior management's inability to differentiate data processing from computing, and so ended up in control of the unified corporate IT budget.
As a result IT became as centralized, and as divorced from user needs, as data processing had always been. When the PC came along we saw a replay: users grabbed the thing for the control it gave them, only to slowly cede that control back to data processing - producing what we have now: high-cost systems combining insane levels of desktop complexity with locked-down centralized processing, centralized management, and the almost complete isolation of the user from the decision process.
Basically, management's abdication of its IT responsibilities allowed the data processing culture to evolve unimpeded - and thus created another self-fulfilling prophecy: treat IT as a cost, isolate it from users organizationally, and what you get is a process-driven, non-responsive cost center.
So what's the bottom line? When you see unhappy users, IT costs that are radically out of line with benefits, and a dramatic disconnect between what's happening in the organization and thirty years of computing progress - don't blame the IT people. Yes, many of them fit the MCSE stereotype; yes, many live by Windows or z/OS and z/VM; and yes, they're often completely techno-illiterate outside their comfort zones - but they're responding to directions set by upper management. Want to blame someone? Blame the people who aren't doing their jobs, the people who let this situation fester: blame, in other words, the people in the executive suite.