On July 17, 2020, three individuals filed a lawsuit against Microsoft Corporation in the United States District Court for the Northern District of California, with a request for class action certification.
The plaintiffs contend that Microsoft is routinely violating the privacy of customers who pay for business subscriptions to Microsoft 365 (formerly Office 365). They allege that "Microsoft shares its business customers' data with Facebook and other third parties, without its business customers' consent." The complaint also accuses Microsoft of sharing business customers' data with third-party developers and with "hundreds of subcontractors ... without requiring the subcontractors to keep the data private and secure." And they maintain that Microsoft uses its business customers' private data "to develop and sell new products and services—and otherwise benefit itself."
Those charges would be explosive, if true. (The most important part of that sentence is "if true." Hold that thought.)
But after digging into the complaint and doing my own research, I'm convinced that none of their allegations are true. This lawsuit is predicated on embarrassing technical errors and almost comical misreadings of technical documentation.
And yet the plaintiffs have received their share of headlines from mainstream technical publications like The Register, which reported the allegations without even a pretense of research or fact checking.
You can read the full complaint (in PDF format) here: FRANK D. RUSSO, KOONAN LITIGATION CONSULTING, LLC, and SUMNER M. DAVENPORT & ASSOCIATES, LLC vs. MICROSOFT CORPORATION, Case 3:20-cv-04818. But I need to issue my own legal disclaimer first: If you have a technical understanding of how Microsoft cloud services work and you take medication to keep your blood pressure in check, please talk to your doctor before you click that link.
Before we get to the substance of the complaint, consider a few important facts here:
These shocking allegations are accompanied by no evidence. None.
You will find no technical detail in this pleading. No reports of expert forensic examinations. No casual descriptions of observed data transfers. No anecdotes, even. It's remarkable.
What's even more remarkable is that, if these allegations are true (again, a very very very big if), then multiple other sources with far more technical resources have somehow missed Microsoft's blatant violations of their business customers' privacy.
No security/privacy experts have ever flagged any untoward behavior in Office 365 that matches these accusations. No one in the Fortune 500 noticed Microsoft casually giving their data to Facebook, even after five long years. The extremely aggressive EU authorities, who have dragged Microsoft through the muck for technical privacy violations, totally missed this. No whistleblowers from Facebook have come forward. No actual victims have told their stories.
The plaintiffs, by the way, are not computer security experts or even amateur hackers. They are, respectively, a lawyer, a litigation consultant, and a web designer.
There are no footnotes.*
I am not a lawyer, but I read a lot of legal filings as part of my work. And the paucity of footnotes in this filing is noteworthy.
Most notably, as I went through the allegations trying to figure out what they really meant, I didn't have the luxury of footnotes that would include links to the Microsoft-owned web pages that purportedly prove these privacy violations. I had to search using snippets of quoted text to find the original documents. That sounds like a minor detail, but it's actually a pretty big deal.
* Ok, in the interests of technical accuracy, I confess that this complaint includes three footnotes at the very beginning, one of which alerts the court to the fact that Office 365 was recently renamed to Microsoft 365. But once they get into detailed accusations, there are no footnotes to be found. If opposing counsel and the judge and interested observers (like yours truly) have to use Google to find the context for your quotations, you might have a problem.
There's no demonstration of actual harm.
The entire section that sets out how Microsoft's actions have allegedly injured the plaintiffs is two sentences long. Here it is, in its entirety:
Plaintiffs and Microsoft's other business customers would not have purchased (or would have paid less for) Microsoft's services if Microsoft had not made the misrepresentations discussed above and had disclosed its sharing and use of its customers' data.
Microsoft's use and sharing of Plaintiffs' and Microsoft's other business customers' data also reduced their data's privacy and security.
To their credit, a Microsoft spokesperson resisted the urge to reply "LOLwut?" when asked for comment; instead, they told The Register: "We're aware of the suit and will review it carefully. However, while the allegations themselves are not very specific, as we understand them we don't believe they have merit. We have an established history of both robust privacy protections and transparency, and we're confident that our use of customer data is consistent with the instructions of our customers and our contractual commitments."
The three plaintiffs have paid between $120 and $150 a year for three individual subscriptions to Microsoft 365/Office 365 Business editions over the past four to five years. That's a total of roughly $2000. Meanwhile, in the just-concluded 2020 fiscal year, Microsoft's other, mostly much larger customers sent $20 billion in revenue to Redmond for Microsoft 365 commercial subscriptions. (That's out of $50 billion in overall cloud revenue.) And yet none of those customers, many of them with small armies of security experts on their payroll, seem to have noticed.
In fact, when I looked into the specifics of these allegations, I found them to be facepalm-worthy in the extreme. In many cases, they're based on a naïve misreading of various support documents and are just profoundly ignorant of how modern cloud computing works.
Let's start with the Facebook accusation.
Does Microsoft share customer data with Facebook?
The plaintiffs say, "Microsoft shares its business customers' data with Facebook and other third parties, without its business customers' consent." Here's the relevant portion from the complaint:
Microsoft routinely and automatically shares its business customers' contacts with Facebook—without those customers' consent— whether or not the customers or their contacts are Facebook users.
Even if a customer discovers and disables this Facebook-sharing "feature" after activating Office 365 or Exchange Online services, the damage has already been done. At that point, the business customer's contacts have been shared with Facebook. As Microsoft explains in an obscure technical instruction, "[o]nce contacts are transferred to Facebook, they cannot be deleted from Facebook's systems except by Facebook."
Well, that's not very specific, and the absence of footnotes makes it hard to figure out exactly what they're talking about. But a search for the quoted portion of that "obscure technical instruction" turns up a 2014 post from an Office 365 MVP in Portugal, Nuno Silva. His post appears to be a copy of a Microsoft Exchange Online support document titled "Advanced Privacy Options for Administrators."
Skip down to Section 2.2, "Facebook Contact Sync," which contains this text:
What does this feature do?
This feature shares information in your Outlook Contacts folder with Facebook, and imports your Facebook friends' contact information into your Outlook Contacts folder. Once contacts are transferred to Facebook, they cannot be deleted from Facebook's systems except by Facebook, even after the Contacts sharing feature is turned off by a user.
How do I turn this feature on/off?
This feature may be turned on by default. Administrators can turn this feature off by following the steps provided below or by using a powershell cmdlet.
I suppose if you know nothing about how Office 365 works, that might sound pretty damning. Indeed, in my Office 365 commercial account (a subscription that's identical to those used by the plaintiffs), the admin panel does indeed contain that exact setting.
But if I want to change the Outlook Web App mailbox policy to exclude Facebook contact sync, the option isn't there at all.
That seems very strange, unless you've been covering Office for a couple decades, as I have, and you remember the history of the Outlook Social Connector program. It ran on older versions of Microsoft Office and was briefly supported on Office 365. But Microsoft killed the Facebook Connect option in 2015.
My fellow Microsoft watcher, Mary Jo Foley, reported on this development more than five years ago: "Facebook integration no longer available for several Microsoft services." And you can read the official support document on Microsoft.com, which is bluntly titled "Facebook Connect is no longer available."
Scroll down to the "Microsoft 365 Outlook on the web" section and you'll see this text:
The following features will no longer be available:
- Facebook Connect – If you've connected to Facebook in Outlook on the web, your Facebook contacts will no longer be synchronized to your Microsoft 365 account.
- If you're a new user, you won't be able to connect to Facebook using Outlook on the web.
And sure enough, when I check Outlook on the Web, there are no Facebook options.
So, what happened? Back in 2007, Microsoft paid $240 million for a 1.6% stake in Facebook, and the two companies found ways to do business online. Facebook used Microsoft's online advertising service and its Bing search results. Microsoft integrated the social network into Outlook, introducing the Outlook Social Connector, which displayed activity for social media accounts in a People Pane when you viewed an email message from someone who you were friends with on Facebook. Outlook on the Web offered a similar feature.
But Facebook stopped using Microsoft's ad network in 2010 (and Microsoft took a $6.2 billion writeoff against its ad division in 2012). Facebook dumped Bing in 2014. By 2015 the two companies had mostly parted ways. Any business connections between the two companies and their respective customers dissolved into thin air. Microsoft appears to have sold all its Facebook stock.
Even in the heyday of that partnership, though, there was no automatic data exchange. Microsoft's customers had to configure Outlook on the Web or download the Facebook connector tool, sign in, and create a connection between Facebook and Outlook.
I am not sure why that setting in the Office 365 admin panel is still there, but it certainly wouldn't be the first time an outdated setting survived for years before being officially removed from a Microsoft control panel.
Meanwhile, if you are using a current version of Office 365, that setting has no effect, and your contacts stay within your organization, exactly as you would expect.
(It's also worth noting that according to the lawsuit, all three of the plaintiffs in this case began their subscriptions to Office 365 after this feature had been officially killed, so there's no way their contacts could ever have been synced to Facebook, even if they had wanted them to be.)
Does Microsoft share customer data with developers?
The next accusation is that "Microsoft shares its business customers' data with third-party developers, without its business customers' consent."
Oh lord. I wanted to slap someone as I read this section, because it is based on a complete misunderstanding of how third-party developers work with Microsoft cloud services. Here's the key paragraph from the complaint:
[E]ven if a business customer did not download a third-party application (and thus did not consent to sharing its data with the third-party), Microsoft nonetheless transmits the non-consenting business customer's data to third-party developers if another Office 365 user consented to the application.
Among other things, Microsoft gives third-party developers information about the documents and projects those non-consenting business customers worked on. Microsoft allows those third-party developers to search the content of its business customers' emails and to access their schedules, locations, and availability status, i.e., whether they are "available" or "away."
Microsoft explains to developers that they can "perform searches for people who are relevant to the [Microsoft] user and have expressed an interest in communicating with that user" about specific topics, such as pizzas. Microsoft explains that "[t]opics in this context are just words that have been used most by users in email conversations. Microsoft extracts such words and creates an index for this data to facilitate . . . searches."
Again, there's no footnote, but the quoted text appears to be from a developer document titled "Overview of people and workplace intelligence in Microsoft Graph." It's an extremely thorough document and well worth reading if you want to understand these issues.
Where do I even begin to describe how incredibly wrongheaded this entire section is? The plaintiffs seem to think that developers can write a few lines of code and thereby gain access to email messages, contact information, and other data from anyone in the world. ("Microsoft allows those third-party developers to search the content of its business customers' emails…")
That's not how it works. That's not how any of this works. (Sorry, I didn't mean to scream.)
Developers write apps that incorporate Microsoft APIs. Enterprise customers can then use those apps to perform searches on their own organization's data. If I install one of those apps and then sign in using my Office 365 work account, I can use the app to organize and extract information from my own organization and my own inbox. So if I have a meeting with my team coming up, the app can find emails from other team members and can also find related files from a SharePoint site, making it easier for me to prepare for the meeting.
The developer doesn't get that data. People within the organization can access that data through the app, but only if they have the appropriate authorization.
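To make that plumbing concrete, here's a minimal, hypothetical sketch of what a third-party app actually has to present before Microsoft Graph returns anything. The `/me/people` endpoint and `$search` parameter are the real Graph "people" API the complaint quotes from; the token value and function name here are placeholders of my own invention. The point is that every request rides on an OAuth access token issued for a specific signed-in user in a specific tenant; without one, the developer's code sees nothing.

```python
# Hypothetical sketch: how a third-party app queries the Microsoft Graph
# "people" API. The developer never receives a feed of customer data;
# each request must carry a user's OAuth access token, and Graph scopes
# results to that signed-in user's own tenant and permissions.
import urllib.request

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_relevant_people_request(access_token: str) -> urllib.request.Request:
    """Build a 'relevant people' search request (placeholder helper name).

    The bearer token is the gatekeeper: it encodes which user consented,
    which tenant they belong to, and which permissions an admin granted.
    Without a valid token, Graph answers 401 Unauthorized.
    """
    return urllib.request.Request(
        f"{GRAPH_BASE}/me/people?$search=pizza",  # the complaint's pizza example
        headers={"Authorization": f"Bearer {access_token}"},
    )

# Construct (but don't send) a request with a placeholder token.
req = build_relevant_people_request("EXAMPLE_TOKEN")
```

Note what's absent: there is no "give me every Office 365 customer's email" call. The `/me/` segment pins the query to the authenticated user, which is exactly why "another Office 365 user consented" doesn't open anyone else's mailbox.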
Does Microsoft share business customers' data with subcontractors?
The complaint alleges that Microsoft "shares its business customers' data with hundreds of subcontractors when sharing is not needed to provide the services, and without requiring the subcontractors to keep the data private and secure."
This one goes on for four paragraphs, boldly declaring that Microsoft gives away business data to subcontractors, doesn't anonymize the data, and doesn't require subcontractors to use encryption.
There's no way to fact-check this accusation, but based on my experience with Microsoft's data handling policies, I find it literally impossible to believe. Business customer data is stored within the tenant, not intermingled with the Microsoft cloud at large. (And if this were genuinely happening, I suspect it would have drawn serious attention from EU privacy regulators long ago.)
Does Microsoft use business customers' data to develop and sell new products and services?
This portion of the complaint hopscotches through at least three different, completely separate regions of Microsoft's cloud, displaying a profound misunderstanding of what each one does.
The overarching charge goes like this: "Despite Microsoft's repeated assurances that it will use its business customers' data only to provide them with the services they purchased, Microsoft mines that data to develop new products that it sells to other customers."
The complaint then goes on to list three products.
The first is the Microsoft Security Graph API, which the complainants describe as "an application program interface Microsoft sells to software developers so they can create new security-related products." Just as a fact-checking point, I feel compelled to note that Microsoft doesn't "sell" this API to anyone.
Microsoft boasts that Security Graph API is built off the "uniquely broad and deep" insights Microsoft obtained for itself by scanning "400 billion" of its customers' emails and "data from 700 million Azure user accounts."
Those scans are designed to detect, intercept, and block malicious code, including malware, ransomware, and phishing attempts. They're a standard part of every modern email server and cloud storage service. (For details about how these scans work, see this support document: "Malware and Ransomware Protection in Microsoft 365.") The Security Graph API aggregates data about those threats, their points of origin, and their effects, as collected by Microsoft's security team and its partners.
If you think that collecting and sharing data about external attackers targeting your infrastructure and your users is a violation of your privacy, you're just wrong.
Next, the plaintiffs throw in a completely random reference to the "Microsoft Audience Network," an online advertising tool that uses data from Microsoft consumer services and has nothing to do with the business services they're using. You can read all about it at the network's welcome page, which makes clear that it is "anchored in the consumer understanding provided by the Microsoft Graph [and designed to reach] hundreds of millions of people through brand-safe environments or placements on premium sites including MSN, Outlook.com, Microsoft Edge and other partners."
It's an understandable confusion (I'm being charitable here), because Microsoft has separate versions of the Microsoft Graph for personal, work, and educational services. The Microsoft Audience Network has nothing to do with business customers.
And then there's Cortana. This one's especially weird, because the transition of Cortana to the enterprise is relatively recent. (As an aside, I was extremely disappointed that there was no footnote explaining the origin of the Cortana brand.)
According to this pleading, "[T]hrough a default setting that applies when the customer first installs Office 365, Microsoft collects and uses business customer data (including documents, contacts, and calendar information) to develop and improve its virtual personal assistant 'Cortana.' It does so even if the customer is not using Cortana."
There's a scintilla of a glimmer of a tiny ray of accuracy in that accusation. Yes, Microsoft does use customer data for machine learning as part of its enterprise-based Cortana service. That's documented in the terms of service and in a detailed support document titled "Cortana enterprise services in Microsoft 365 experiences."
Microsoft uses Customer Data only to provide the services agreed upon, and for purposes that are compatible with those services. Machine learning to develop and improve models is one of those purposes. Machine learning is done inside the Office 365 cloud, and there is no human viewing, review or labeling of your Customer Data.
Your data is not used to target advertising.
One could make a case for opting an organization out of Cortana's machine learning, but I'd be hard-pressed to identify the damage caused by not opting out.
And that's it. That's the entire lawsuit. It is, to use a decidedly non-legal term, a hot mess.
I will continue to follow this case as it proceeds, but I fully expect it to be dismissed with prejudice early in its life. I'll keep you posted.