As IT evolves towards greater cloud adoption, more automation, and wider use of artificial intelligence (AI), machine learning (ML) and analytics, it's clear that the IT jobs landscape will change too. For example, tomorrow's CIO is likely to become more of a broker and orchestrator of cloud services, juggling the strategic concerns of the C-suite with more tactical demands from business units, and less of an overseer of enterprise applications in on-premises data centres. Meanwhile, IT staff are likely to spend more time in DevOps teams, integrating multiple cloud services and residual on-premises applications and enforcing cyber-security, and less time tending racks of servers running siloed client-server apps, or deploying and supporting endpoint devices.
Of course, some traditional IT jobs and tasks will remain, because revolutions don't happen overnight and there will be good reasons for keeping some workloads running in on-premises data centres. But there's no doubt which way the IT wind is blowing, across businesses of all sizes.
Historically, technology transitions have eliminated some jobs and created negative externalities such as increased inequality or environmental degradation. But new jobs -- not always clearly foreseen -- are usually created and, so long as the negatives are properly addressed, society as a whole generally benefits.
How will this scenario play out in the AI era, as the pace of change accelerates, and ever higher-level jobs are affected? Let's start with a 'macro' view, and then consider the IT industry in more detail.
In January 2016 the World Economic Forum (WEF) published The Future of Jobs, a report based on a survey of 350 of the world's largest companies, including over 150 of the Fortune Global 500. Respondents were mostly chief HR officers, as well as other C-suite executives with a strategic focus on talent. The WEF's focus was the impact of the 'fourth industrial revolution' -- a combination of developments such as artificial intelligence and machine-learning, robotics, nanotechnology, 3D printing, and genetics and biotechnology -- on business models and labour markets from 2015 to 2020.
The report's headline finding was that up to 7.1 million jobs could be lost in 15 major developed and emerging economies due to "redundancy, automation or disintermediation, with the greatest losses in white-collar office and administrative roles". Partially offsetting this figure, resulting in a net loss of some 5 million jobs, was the predicted creation of 2.1 million new jobs in sectors such as 'architectural and engineering' and 'computer and mathematical'.
Even in industries (such as information and communication technology, or ICT) where employment demand is predicted to be positive, the WEF report flagged up the conjunction of "hard-to-recruit specialist occupations with simultaneous skills instability across many existing roles" as an approaching challenge.
Fortunately for current employees, investment in re-skilling emerged as the top strategic future workforce priority among the WEF's survey respondents, with 65 percent pursuing this strategy across all industries. In second place was 'support mobility and job rotation' at 39 percent.
A December 2016 report -- one of the last from President Obama's administration -- examined Artificial Intelligence, Automation and the Economy, noting that "Researchers' estimates on the scale of threatened jobs over the next decade or two range from 9 to 47 percent," and that "Research consistently finds that the jobs that are threatened by automation are highly concentrated among lower-paid, lower-skilled, and less-educated workers. This means that automation will continue to put downward pressure on demand for this group, putting downward pressure on wages and upward pressure on inequality."
On the other side of the coin, the White House report acknowledged that "new jobs are likely to be directly created in areas such as the development and supervision of AI as well as indirectly created in a range of areas throughout the economy as higher incomes lead to expanded demand."
The report offered three main strategies for addressing the impact of AI-driven automation across the US economy: 'Invest in and develop AI for its many benefits'; 'Educate and train Americans for jobs of the future'; and 'Aid workers in the transition and empower workers to ensure broadly shared growth'.
The America First/Brexit factor
The reports quoted above were researched and produced before the recent outbreak of populist nationalism on both sides of the Atlantic made its presence felt. So it remains to be seen how President Trump's 'America First' doctrine and the UK's tortuous Brexit process (due to end on 29 March 2019) will affect business models and labour markets. However, given that a major driver of these political developments was a reaction to job losses in traditional industries and the resulting increase in inequality, further technology-driven unemployment can only exacerbate the situation in these and other countries, unless addressed in time.
In its recent Autumn Budget, the UK government set out a vision for "An economy driven by innovation that will see the UK becoming a world leader in new technologies such as Artificial Intelligence (AI), immersive technology, driverless cars, life sciences, and FinTech." Money was even set aside for AI and driverless cars, R&D, retraining, 5G mobile and fibre broadband. But will such ambitions survive the economic realities of Brexit?
IT industry trends
How will the IT industry cope with the coming 'fourth industrial revolution'? According to the World Economic Forum, the prospects for the ICT sector are good:
With the highest employment growth forecast and average levels of skills stability, ICT looks better placed to weather the next few years than any other sector, according to the WEF survey.
Looking at strategies for future workforce development, the WEF survey also detected an above-average determination in the ICT sector to invest in re-skilling current employees -- 81 percent versus 65 percent across all industries.
Also noticeable in the survey responses is an increased tendency in ICT to hire more short-term workers and collaborate with educational institutions, and (disappointingly) a decreased emphasis on targeting female talent.
In 2016, the US Department of Labor (DoL) provided a detailed breakdown of job prospects in Computer and Information Technology through to 2026. Overall, employment in the sector is projected to grow an above-average 13 percent between 2016 and 2026. More than half a million new jobs will be created, says the DoL, with demand generated by increased focus on "cloud computing, the collection and storage of big data, and information security".
Software developers form the most numerous group, both in 2016 and in the 2026 projection, followed by computer support specialists and computer network architects.
The fastest-growing ICT occupations, according to the DoL, will be information security analysts (28%), software developers (24%) and computer and information research scientists (19%). Computer programmers, meanwhile, are the only group covered by the DoL that is predicted to decline between 2016 and 2026 (-8%), mainly due to increased outsourcing to emerging economies.
Interestingly, three of the four highest-paid IT occupations -- computer and information research scientists (median 2016 pay $111,840/year), software developers ($102,280) and information security analysts ($92,600) -- also have the best growth outlook. The outlier is computer network architects, whose 2016 pay and 2026 growth projection figures are $101,210 per year and 6 percent respectively.
A near-term view on IT staffing comes from Spiceworks' recent 2018 State of IT report. This is based on a survey conducted in July 2017 that gathered responses from 1,003 IT professionals from North America and Europe working in organisations ranging from SMEs to enterprises. Industry sectors covered include manufacturing, healthcare, non-profits, education, government and finance.
When it comes to IT departments' staffing plans for the year ahead, there's a clear relationship with company size:
In small and medium-sized businesses, the predominant position is 'no change' in IT staff levels, whereas larger businesses (>500 employees) are more likely to be increasing IT headcounts in 2018.
Another annual finger on the pulse of the IT industry is the Gartner CIO Agenda Report, which for the 2018 edition canvassed the views of 3,160 CIOs across 98 countries (representing some $13 trillion in revenue/public-sector budgets and $277 billion in IT spending).
Gartner's survey provides some pointers to the IT specialisms most likely to be in demand in the near term. When asked which new technologies have required, or will require, new or hard-to-find skills in order to deploy them, artificial intelligence was the clear leader, followed by digital security and the Internet of Things.
Echoing research quoted earlier, 60 percent of respondents expect to identify and redeploy transferable skills, while more than half will recruit more permanent staff (54%) or upskill through internal or external training (52%).
Once again, the coming 'revolution' doesn't look as though it will sweep away the existing order -- at least as far as IT personnel are concerned.
When it comes to the specific skills sought by this survey's respondents, the cloud, data science (including AI) and software engineering lead the way.
Automation-related job losses are predicted across both developed and emerging economies, predominantly in white-collar office and admin roles. But new -- mostly higher-level -- jobs will also be created. Managing the transition to mitigate negative effects such as increased inequality will be an important task for governments and businesses.
The IT sector is well placed to weather the coming 'fourth industrial revolution', with good employment growth prospects and above-average determination among leading companies to invest in retraining current employees where necessary.
If you're working in IT or considering a career in this field, look to develop skills in key areas such as cybersecurity, software development and data science (including AI). Don't fear automation too much just yet, but be prepared to find yourself implementing and/or working with cognitive systems in the near future.