Careful coding is the order of the day

"It was clear from dealing with the millennium date change issue that manufacturers and suppliers, despite knowing that there was scope for problems with software driving safety critical applications, failed to supply safe and suitable software."

The millennium bug. Remember it? It may feel like a lifetime ago, but the IT industry's favourite creepy-crawly is back, and this time it's personal.

Eighteen months after the whole brouhaha died down, the UK's Health and Safety Executive (HSE) has released a paper suggesting that software suppliers should be held criminally liable when damage has been caused by events that could have been predicted. The paper proposes to:

"Place a duty of care on manufacturers, developers and suppliers etc, of computer software, where its use affects a safety critical function, to supply software that is safe, so far as it is reasonably practicable, when it is being used for its intended purpose of work."

Software has become ubiquitous - running transport, controlling machinery and managing medical systems. When such systems fail, lives are put at risk, and the HSE argues that software manufacturers, consultants and contractors should be held responsible.

The law is already used in cases where software developers negligently get things wrong. The HSE wants to go a step further and create a criminal offence which, it claims, will "allow preventive enforcement and have a deterrent effect".

Apart from the millennium bug, the HSE has not offered any practical examples of critical systems failing because of a software fault - although maybe it knows something we don't.

So should a software bug become a criminal offence? And if the new law goes through, will software be any safer?

Have your say by emailing email@example.com. We'll be bringing you more on this subject later this week.