UK sees worst IT skills shortage for a decade

Summary: A National Computing Centre report warns employers to budget for increased training of existing staff to cope with the skills shortage

The UK's IT sector is suffering its worst shortage of skills for a decade, according to new figures.

Perceived shortages in the industry jumped from 4.2 percent last year to 6.8 percent this year, a national survey of IT salaries and employment trends has found.

Just under 40 percent of respondents indicated recruitment and retention issues, a significant increase on the 29 percent reported last year.

The report warns employers to budget for increased training, with 73 percent of those who said there was a need for new skills planning to get hold of them by retraining existing staff.

The report predicts that workers with business-analysis, network-support, .Net, Oracle, SAP, VMware, web-development and project-management skills will be in high demand over the next two years.

Salary growth in the sector remains stable, with respondents reporting an average wage increase of 3.7 percent.

The number of performance-related bonuses is also on the increase, up from six percent to 44 percent, with the value of those bonuses rising from 7.5 percent to 8.3 percent.

The annual poll of 244 organisations, which provides salary and employment details for 5,493 IT staff, was undertaken by the National Computing Centre.

About

Nick Heath is chief reporter for TechRepublic UK. He writes about the technology that IT decision-makers need to know about, and the latest happenings in the European tech scene.

Talkback

9 comments
  • When will the two ends meet?

    One week we read of a massive IT skills shortage, the next of massive IT unemployment. Why is it that we seem to have loads of people with good skills and experience, and at the same time an employment base screaming for the right people?

    Are the skills completely out of date, or is the problem in the matching mechanism? For myself, I'll go with option 2. When the people who will actually be working with the prospective employee pass the requirement off to their management, who pass it off to HR, who pass it off to a third-party recruiter, you have huge potential for mismatch. The original requirement is quite often, in reality, simply for "A Good Fit" (tm), so how do you put that on paper? Answer: frankly, you can't.

    If somebody is an expert in database A and the requirement is for someone conversant in database B, in real life, what is the distance between the two? Most techs would say: not that far. There isn't going to be that much difference between two products performing exactly the same job. Sure, the commands are different, but the effect will be comparable and the general architecture is very similar. The original supervisor would be able to judge how much lead-in time an applicant would need to get up to speed. The rest of the chain will have less and less idea.
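
    To make that concrete, here is a minimal sketch using Python's standard DB-API (PEP 249). The staff table is hypothetical, and the PostgreSQL variant assumes a running server with made-up credentials; the point is that the connect/cursor/execute/fetch shape is identical across products, and only the driver and dialect details move:

        # Same logical task against two different database products.
        # The DB-API keeps the shape of the code the same; only the
        # driver, connection details and SQL dialect differ.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE staff (name TEXT, salary REAL)")
        conn.executemany("INSERT INTO staff VALUES (?, ?)",
                         [("Alice", 52000), ("Bob", 47000), ("Carol", 61000)])
        cur = conn.cursor()
        cur.execute("SELECT name, salary FROM staff ORDER BY salary DESC LIMIT ?", (2,))
        print(cur.fetchall())  # the two highest-paid staff

        # Against PostgreSQL the architecture is untouched; only the driver
        # and the parameter style change (hypothetical database and user):
        #   import psycopg2
        #   conn = psycopg2.connect(dbname="hr", user="dba")
        #   cur = conn.cursor()
        #   cur.execute("SELECT name, salary FROM staff ORDER BY salary DESC LIMIT %s", (2,))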

    So where does this leave us? The job spec comes out with an absolute laundry list of technologies, probably a photocopy of the previous post-holder's skills. Potential applicants look at this list, and in 99% of cases a would-be good-fit applicant will say: "I have that, that, that and that, but not this. The last time I applied for a near fit like this, the recruiter made me feel like a fraud because I wasn't a 100.00% fit, so I won't bother."

    Then you have the fakers, who try to spin their CV to fit and, how should I say, "dissemble" their way through phone interviews. Their apparent close fit will eclipse the folks who in reality would do well in the post. They get through the process and get hired... only to prove in short order that they actually haven't a clue. The employer then sets up a chorus of "I can't find anyone for this job"... to a harmony of "Well, if you didn't ask for a who's who of product skills, we might have applied" from the folks who didn't.

    Recruitment is an expensive, complex and time-consuming process. If you scrimp on the effort, you get the results you paid for. Don't blame the industry, the education system, outsourcing or whatever else for your own failure to find the right people.

    I've been in this industry for nigh on 20 years. I'm not making this up. I've seen this happen in real life, again and again and again and again and again and ...
    Andrew Meredith
  • Spot on really.

    I have been in the business 30 years, many of them at the leading edge. I'm currently on the market, made redundant when resources were relocated to the US.
    Looking at the jobs on offer, they are a doddle or maybe a bit of a challenge, but I have now lost count of the number of times I have been told I am at too high a level for this lowly position and that "for high positions we need a 100% match", yet the jobs stay "available" for weeks.
    I fulfil all the various roles I take on and exceed requirements, but as I have more than one string to my bow I am not considered, by the agents, to be a specialist, even with a track record of attaining higher levels of knowledge than the majority of specialists in whatever field. As I am "not" a specialist, the agencies won't put me forward for roles they consider specialist. The agencies aren't entirely to blame, as many companies do not understand IT themselves. I recently "failed" an interview as an incomplete fit; a couple of months later they advertised for an additional person to take on half the original role, which I would have completely filled within a month.
    As mentioned, the requirements get taken verbatim, mishmashed by HR and then mostly misunderstood by the agents, following which I assume they all totally confuse each other; I presume the agents are struck off for putting "non-matching" candidates forward, from the way they want every box ticked.
    The article sums it up with the phrase "perceived shortage".
    Meanwhile, anyone in Dorset in difficulties?!
    Yellowcave-9fde3
  • Shortfall? From an ex-student

    As an IT graduate I was among a great many people in IT, and it took me 20 interviews to actually get a job. I don't think it is a lack of skills; it became apparent that companies wanted experience over qualifications, or both, in a nice low-paid package. Even with a job, having these qualifications doesn't make any difference to duties and wages. At least, it seems that way from my angle: qualifications are not as valuable as experience doing a set of tasks, even if they allow you to do more than one set of tasks.

    Having been through it all, I can say that if I had known then what I know now, I would not have bothered: £34K of debt and a wage that barely covers the student loans and tax. On top of that, working life is the same as student life, having to cut corners on food and so on. The whole thing leaves me thinking: yay, I got bits of paper allowing me to do lots of highly skilled jobs, but those without that paper who just have experience seem to be better off.

    To cut this short: the skills are there, they are just being overlooked. Most graduates cannot get a job using the skills they went to uni to get, and even with a job, the wages are normally lower. I think the government needs to work on helping students more, instead of punishing them for trying to better themselves.

    The industry also needs to start looking at graduates and making the most of the qualifications they work for. After all, they will never get experience in their chosen field if no one hires them to use the skills they studied for.

    Sorry for the rant, but this is how it feels from an ex-student's point of view.
    1000266287
  • We need to redefine I.T. skills

    There are as many reports claiming the existence of a skills shortage as there are those claiming that I.T. graduates can't find work.
    harpless
  • It starts at school

    When I were a lad, back in the early Palaeolithic, we were taught Computer Studies, which was a ground-up course in what a computer is and, in general terms, how it works. We had hard drives with the cases off, and short animated films showing the data going to and fro. We used a programming language called CESIL: Computer Education in Schools Instructional Language. It was kind of like assembler, only it had a PRINT command with preformatted output alongside register shifts and such. The lessons I took then, some 25 years ago, would work just as well today, AND THAT'S THE POINT. The fundamentals haven't changed in the slightest. By halfway through my teens I understood, in general terms, what went on inside a computer and the sorts of things it could and couldn't do. From that grounding, learning the specifics of how to use a given app was a very short step, and the same went for the next version of the same software, or indeed a different package doing the same thing.

    The kind of computer education you get in schools today is entirely superficial: a term on Word version XYZ; a term on Excel version ABC. Not how to use a word processor, just which button to press in Word. It will be obsolete by the time they leave school, let alone a quarter of a century later. Sure, it gets quick results, but they are only skin deep; they don't prepare people for a working life with ubiquitous IT, let alone produce future IT people.
    Andrew Meredith
  • ICT in schools

    And when I were a lad, perhaps in the Neolithic, I remember building logic circuits from some neat Boolean blocks that could be plugged together, and progressing to design and build devices such as a simple microcontroller board (which never worked, but that didn't matter, as nobody in a position to assess it could tell) and a camera made by slicing the top off a memory chip and measuring the time it took for the individual memory cells to dissipate their charge (again, it never worked, and it did blow up my Oric).

    Back then of course there were few applications to learn; if you wanted to do something you wrote your own app, probably in machine code or at best assembly.

    I wouldn't recommend that sort of upbringing to anybody (I'm sure those acid baths used for etching circuit boards can't be good for one's health), but then, like you say, I understand what makes computers tick. Get that, and the rest is child's play.

    Your description, Andrew, of students today being told which buttons to press seems, well, depressing. Of course it's useful to know how to use applications, but an education in computers has to go deeper than that. The question is: do our teachers have the necessary skills? I'm not for a moment suggesting that none do, just asking whether enough do.
    Lonester
  • Child's play

    I like the phrase "child's play". It very nicely sums up the point I think we are both making. If you can foster a fundamental understanding of the basic workings of modern technology in the mind of a child, and make it into a game, there is little left to do. We outstripped our teachers' understanding of this stuff in very short order, because it was a game. They booted us up and off we jolly well went.

    As to whether the teachers teaching today's younger generation understand enough to be able to teach this: I think you will find the answer depressing in a large number of cases. Experienced teachers of all subjects are being driven out of the education system in droves. Schools have become frightening places for adults and children alike: health and safety, compensation culture, mountains of paperwork, metal detectors at the gates, fingerprint scanners to check out library books, the proposed National Junior Identity Register. Just read other articles on this very site, and watch a few of the many documentaries on this subject, if you think I'm being melodramatic. Our society has much bigger educational problems than whether we are turning out decent systems administrators and programmers. A stunning percentage of school leavers can't even read.

    However, would you put someone up as a maths teacher if they were innumerate, or as an English teacher if they couldn't spell? Yet in countless cases we field people with no grasp of the underlying technology as ICT teachers. They can't actually teach the fundamentals, so they are left teaching the stuff they can do: which button to push. As a consequence, there is no point examining for that basic understanding, so we are left examining for the stuff that is actually taught, and we end up with GCSEs in Word 200X. I'm sure the vendors love it in the short term, but in the longer term it hits them too. Badly. Just re-read this article to hear the bleating.

    Oh, hang on though. I've just made a silly booboo. "Longer term"? Thinking beyond this quarter's bottom line? Anyone would think we were Japanese or Indian or something!!!
    Andrew Meredith
  • School, huh, what is it good for...

    I don't think you're being melodramatic.

    And to think that GCSEs cover specific applications sends shivers down my spine, for so many reasons.
    Lonester
  • I think the key is...

    Companies expect something for nothing these days. In the days of old, pretty much all employers seemed to have a good grasp of what they wanted, and understood that they had to acclimatise and train new hires to get them productive. On top of that, they were very often happy to pay for further training of their existing staff.

    That gave people a career. Now, if you're "old" you're deemed useless even if you're not; if you're young(ish), you have to fit exactly, or lie about it and hope to get a position.

    The whole concept of "good things take time" seems to be lost on the companies.
    ego.sum.stig