One of the things that struck me as odd about the programming languages discussion here a few weeks ago was the fact that so many people seemed to think "security" a function of the programming language, not its usage.
On looking into this, I found it to be a fairly common opinion, with many writers claiming that languages like Java (and therefore C#) make it easy to write safe code while cheerfully referring to the insecurity of things like pointer usage in C as if these were real language issues - which they're not. At least, I don't think so; but then, I also think people, not guns, kill people.
The problem, of course, is that things aren't as simple as the aphorism about programmers, not languages, producing vulnerable code might make you think. Specifically, the gun analogy breaks down on responsibility: a person with the right skills isn't going to accidentally shoot somebody else, but a programmer with all the right skills can still produce an easily attacked binary.
The key reason for this is that languages like C rely on binary executables generated by compilers that can themselves introduce dangerous generalisations or compromise code safety in the interests of other goals, like performance.
On Unix and Windows, for example, C is a third generation language - a term that is not related to genealogy but derives from the fact that compilers were first known as code generators: a directly entered binary uses a code generator zero times, turning assembler into an executable requires one, and getting from C down to that zeroth generation takes two.
What makes a Java application harder to attack isn't a language intrinsic but the addition of a software layer, or sandbox, separating the executable from the hardware - and problems with the sandbox code affect the attackability of the system, but not that of the program being run.
In reality, therefore, C code compiled and run in a safe code environment is as safe as Java run in a virtual machine - and, by extension, a Java virtual machine is itself as vulnerable as any other C application. Under OS/400, for example, C and C++ code compiled with or without the AIX PASE stuff produces binaries executing via the same firmware layer separating other code from the hardware and is consequently just as hard to attack as anything else.
In the PC and Unix worlds the pcode interpretation idea initially fell to performance considerations, and when Intel adopted a RISC core for x86 execution they chose not to adapt the technology to manage security - a mistake repeated in the x86 extension set for Itanium. As a result badly written Java is likely to be harder to attack than badly written C, but only because the run-time conditions are different and not because one language is inherently safer than the other.
In fact C is a lot simpler than Java and should be correspondingly harder to attack - a reality that fits directly with the origin of the sandbox idea in late fifties and early sixties experimentation with virtualization as a means of keeping user processes separate. Thus you can think of the PC's BIOS, ring zero, kernel, and user modes as switchable virtual machines, note that this hardware design has determined a lot of the software evolution around it, and conclude that much of today's PC "security" problem is ultimately rooted in a mistake.
A mistake, not because virtualization was the wrong answer, but because a better answer was known: the use of typing instead of address-based authorisations. In fact, the whole C pointer business, now widely reviled as insecure because it is so easy to use poorly in the context of today's hardware and OSes, started as the basis for an alternate solution to multi-user memory access control - a solution which could have bypassed almost all of the problems the pseudo virtualization built into the x86 architecture attempts so ineffectually to solve.
Take it all together and what you have is a complex seeming outcome of historical processes and accommodations, but the bottom line on all of it is quite simple: it's wrong to think of a language like C as more dangerous than something like Java, because what counts is the entire execution environment - and on Windows and Unix most of the attributable risk ultimately comes from hardware, and therefore compiler, design rather than language design.