As part of its Government Security Program (GSP), Microsoft has offered the Russian Federal Security Service (FSB) a peek inside the source code of Microsoft Windows Server 2008 R2, Microsoft Office 2010 and Microsoft SQL Server.
This is the second time the company has (publicly) shared source code with the FSB, following a similar deal in 2002, which involved source code for Windows XP, Windows 2000 and Windows Server 2000. Microsoft has struck similar deals with China, first in 2003 and most recently in 2010.
However, in light of the quietly ongoing cyber-warfare arms race, the GSP's stated benefit of "providing insight and a deeper understanding of Microsoft products" could easily turn into a gold mine for discovering security flaws, or at least supply important pieces of the puzzle.
For starters, the program's restriction that "governments may read and reference the source code but may not modify it" is flawed, because it assumes that merely reading the code prevents a government from acting on what it learns, and thus from indirectly exploiting the source code for offensive purposes. From powerful DIY source code analysis tools to the managed services offered by various companies, it would not be hard for a government to run such an analysis and take advantage of any source code it can access.
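To illustrate what even a "read-only" reviewer can do, here is a minimal sketch of the kind of DIY source-code analysis mentioned above: a toy scanner that flags calls to C library functions with a long history of buffer-overflow misuse. The function names, sample code, and `scan_c_source` helper are all illustrative assumptions, not any real tool's API; real analysis tools go far deeper, but the principle of pattern-matching source for risky constructs is the same.

```python
import re

# C functions with a well-known history of buffer-overflow misuse.
RISKY_CALLS = ("strcpy", "strcat", "sprintf", "gets", "scanf")

def scan_c_source(source: str):
    """Return (line_number, function_name) pairs for risky-looking calls."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for func in RISKY_CALLS:
            # Match e.g. "strcpy(" as a whole word, not "my_strcpy_safe(".
            if re.search(rf"\b{func}\s*\(", line):
                findings.append((lineno, func))
    return findings

# Hypothetical snippet under review: an unbounded copy, a classic overflow risk.
sample = """#include <string.h>
void copy(char *dst, const char *src) {
    strcpy(dst, src);
}"""

print(scan_c_source(sample))  # -> [(3, 'strcpy')]
```

Nothing here modifies the code being reviewed, yet the output alone tells an attacker exactly where to start probing.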
Moreover, consider Linus's Law ("Given enough eyeballs, all bugs are shallow") against the geopolitical factors at play on an international scale. If Russia or China manages to find a security flaw in the source code Microsoft has shared with them for national security reasons, there is little to no chance they will go public with it, as the competitive advantage from a cyber-warfare and cyber-espionage perspective is indisputable.
Cambridge University's Richard Clayton seems to agree:
"If a government has the source code it can find different sorts of security vulnerabilities and perhaps exploit them, [but] it's unclear whether access to the source code makes people better or worse off," said Clayton.

A number of different factors make the situation complicated, Clayton said. Access to the code allows close analysis, which can reveal holes such as buffer overflow flaws; but equally, it is possible to run a fuzzing program, which throws random data at parts of an operating system or application to find vulnerabilities.
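The fuzzing approach Clayton describes can be sketched in a few lines. The `parse_record` target below is a made-up, deliberately fragile parser standing in for real software under test; the fuzzer simply generates random byte strings and records which inputs make the target throw.

```python
import random

def parse_record(data: bytes) -> bytes:
    """Toy parser: the first byte declares the payload length that follows."""
    length = data[0]
    payload = data[1:1 + length]
    # Bug: trusts the declared length without checking the actual size.
    assert len(payload) == length, "truncated record"
    return payload

def fuzz(target, trials=500, max_len=8, seed=1):
    """Throw random byte strings at `target`; collect inputs that crash it."""
    random.seed(seed)  # fixed seed so runs are reproducible
    crashes = []
    for _ in range(trials):
        data = bytes(random.randrange(256) for _ in range(random.randrange(1, max_len)))
        try:
            target(data)
        except Exception as exc:
            crashes.append((data, type(exc).__name__))
    return crashes

crashes = fuzz(parse_record)
print(f"{len(crashes)} crashing inputs out of 500")
```

Because a random length byte usually claims far more payload than the short input actually carries, the fuzzer trips the bug almost immediately, with no access to the source at all. This is exactly Clayton's point: source access enables close analysis, but it is not the only road to a vulnerability.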
Although sharing source code doesn't automatically result in zero-day flaws, it may supply crucial pieces of a puzzle that a particular country has already started assembling on its way to finding security flaws in Microsoft's operating systems and products, for defensive and, naturally, offensive purposes.
From a business perspective, nothing is more precious than a government contract. But in order for such a contract to ever see the light of day, a company sometimes loses sight of the big geopolitical picture, citing commercial gains or plain market-segment growth strategies.
Microsoft, don't just offer a peek at your source code: demand, and legally oblige, those who access it for national security reasons to share back data on important bugs and potential security flaws. How would Microsoft measure the effectiveness of such a bilateral contract? It could legally reserve the right to exclude countries that were purposely offered insecure source code and chose not to report it.
Is Microsoft forgetting the basics of geopolitics, namely that "Nations have no permanent friends and no permanent enemies. Only permanent interests."? Are the risks posed by sharing source code with deep-pocketed cyber-warfare players worth the emphasis on market penetration from a commercial perspective?
Should Microsoft finally switch from being the giver to being the receiver as well?
What do you think? Talkback.