Facebook explains how it will notify third-parties about bugs in their products

Companies have 21 days to acknowledge reports and 90 days to patch vulnerabilities; otherwise, Facebook will go public with bug details.
Written by Catalin Cimpanu, Contributor

Facebook engineers manage one of the biggest software portfolios in the world, with dozens of apps and millions of lines of code that provide a wide variety of services to billions of users around the world.

Managing this gigantic codebase is hard work due to its sheer size, and, of course, its complexity.

Finding security bugs in this giant pile of code isn't always simple, but through in-house-developed static analysis tools like Pysa and Zoncolan, Facebook has made a concerted effort to find issues before they reach public-facing code.
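Pysa, for instance, performs taint analysis on Python code, tracing user-controlled input ("sources") toward dangerous operations ("sinks"). A minimal sketch of the kind of flaw such a tool is built to flag (the functions and schema here are hypothetical illustrations, not Facebook code):

```python
import sqlite3

def get_user(db: sqlite3.Connection, username: str):
    # Vulnerable: user-controlled input flows into a SQL string
    # (source -> sink), the classic pattern a taint analyzer flags
    # as SQL injection.
    return db.execute(
        f"SELECT id FROM users WHERE name = '{username}'"
    ).fetchone()

def get_user_safe(db: sqlite3.Connection, username: str):
    # Safe: a parameterized query keeps the input out of the SQL
    # text, breaking the taint flow.
    return db.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchone()
```

A static analyzer works on patterns like these without running the code, which is how such bugs can be caught before they reach production.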

However, not much has been revealed about what happens when Facebook engineers discover security bugs inside their code.

Obviously, the vulnerability is patched, but some bugs are harder to fix than others. That's because not all of Facebook's code is unique. A large portion of Facebook's applications is also propped up by smaller libraries developed by third parties.

For the past few years, Facebook has often found vulnerabilities in these third-party components, which the company's security team has always reported to their respective owners.

However, not all disclosures have gone to Facebook's liking. Some library developers have fixed bugs within days, while in other cases, Facebook had to fork libraries and patch the code itself, or develop its own in-house alternatives.

But Facebook doesn't believe this should be the norm, as it's not fair to the other users of these third-party libraries, most of whom will continue to use the unpatched code.

Facebook wants to address these problematic disclosures through a new policy, which the company intends to apply starting today.

Facebook's new vulnerability disclosure policy

Called a "vulnerability disclosure policy," this is a set of rules that Facebook engineers plan to apply when reporting vulnerabilities they find to third-party entities.

According to a summary of these new rules, Facebook promises to "make a reasonable effort to find the right contact for reporting a vulnerability" to any third-party entity.

After contact is made, Facebook says it will provide an in-depth technical report describing the bug, but if a company/developer doesn't acknowledge receiving this report within 21 days, its engineers will publicly disclose bug details online so other users/developers can protect their products.

Third parties that acknowledge reports have 90 days to fix the issues, the unofficial standard timeframe that bug hunters in the software community give companies to patch security flaws.

While Facebook might give some companies leeway over this 90-day deadline, once it passes, Facebook says it will publicly disclose bug details and let users and companies mitigate the third-party bug as they see fit.
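The policy's deadline arithmetic can be sketched as follows. This is only an illustration of the timeline described in the article, not any actual Facebook tooling, and the function name and parameters are invented for the example:

```python
from datetime import date, timedelta

ACK_WINDOW = timedelta(days=21)   # time for the vendor to acknowledge the report
FIX_WINDOW = timedelta(days=90)   # time to ship a fix, counted from the report date

def earliest_disclosure(report_sent: date, acknowledged: bool,
                        actively_exploited: bool = False,
                        extension: timedelta = timedelta(0)) -> date:
    """Earliest date the policy would allow public disclosure.

    Per the policy, the clock starts when Facebook sends the report.
    Real decisions are made case-by-case; this only models the defaults.
    """
    if actively_exploited:
        # Bugs under active exploitation may be disclosed right away.
        return report_sent
    if not acknowledged:
        # No acknowledgment within 21 days: disclosure rights kick in.
        return report_sent + ACK_WINDOW
    # Acknowledged reports get the standard 90 days, plus any agreed leeway.
    return report_sent + FIX_WINDOW + extension
```

For a report sent on January 1, an unacknowledged bug could be disclosed on January 22, while an acknowledged one would normally stay private until April 1.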

The only situation in which Facebook will go public right away is when a bug in a third-party component is under active exploitation. Even then, not all zero-days, as these bugs are also called, will be disclosed immediately, only those where disclosing the bug helps users stay safe.

These VDPs, or "ethics statements," as they're also known, are not unique to Facebook; other companies and even independent security researchers have them, usually listed on their websites.

For example, this is the VDP of Project Zero, a security team inside Google that specializes in finding security flaws in products usually deployed inside Google's own network.

Each VDP is unique, but Facebook's is fairly standard, so third parties shouldn't have any issues following its basic rules.

A more in-depth look at Facebook's VDP is available below:


  • Facebook will make a reasonable effort to find the right contact for reporting a vulnerability, such as an open source project maintainer. We will take reasonable steps to find the right way to get in touch with them securely. For example, we will use contact methods including but not limited to emailing security reporting emails (security@ or secure@), filing bugs without confidential details in bug trackers, or filing support tickets. 
  • The contact should acknowledge the report as soon as reasonably possible. 
  • The contact should confirm whether we've provided sufficient information to understand the reported problem. 
  • In its report, Facebook will include a description of the issue found, a statement of Facebook's vulnerability disclosure policy, and the expected next steps.
  • If needed, Facebook will provide additional information to the contact to aid in reproducing the issue. 
  • If we do not receive a response within 21 days from a contact acknowledging the report of a vulnerability, we will assume that no action will be taken. We then reserve the right to disclose the issue.
  • For purposes of the disclosure timeframe, Facebook's sending the report constitutes the start of the process. 
  • Facebook will generally decline to sign non-disclosure agreements specific to an individual security issue that we have reported.


Mitigation & Timeline

  • Whenever appropriate, Facebook will work with the responsible contact to establish the nature of the issue and potential fixes. We will share relevant technical details to help expedite the fix.
  • The contact should be as transparent as possible about the mitigation progress. They are expected to make reasonable effort to fix the reported issue within 90 days.
  • Facebook will coordinate the disclosure with the availability or rollout of the fix. 
  • If no fix is forthcoming at the 90-day mark, we will notify the contact of our intent to disclose the reported issue. 
  • If there are no mitigating circumstances, we will disclose the issue as soon as we are reasonably able to do so.



  • Depending on the nature of the problem, there may be a number of disclosure paths: 1) we may disclose the vulnerability publicly, 2) we may disclose it directly to the people using the project, or 3) we may issue a limited disclosure first, followed by a full public disclosure. Facebook will work with the contact to determine which approach is most appropriate in each case.
  • Our intent is to disclose vulnerabilities in a way that is most helpful to the community. For example, we may include guidance on workarounds, methods for validating patches are in place, and other material that helps people contain or remediate the issue. 
  • We may choose to include a timeline to document communication and remediation actions taken by both Facebook and the third party. Where reasonable, our disclosure will include suggested steps for mitigating actions.
  • We will include a CVE when available, and, if necessary, issue an appropriate CVE.

Additional disclosure considerations

  • Here are some potential scenarios when Facebook may deviate from our 90-day requirement:
    • If the bug is actively being exploited, and disclosing would help people protect themselves more than not disclosing the issue. 
    • If a fix is ready and has been validated, but the project owner unnecessarily delays rolling out the fix, we might initiate the disclosure prior to the 90-day deadline when the delay might adversely impact the public.
    • If a project's release cycle dictates a longer window, we might agree to delay disclosure beyond the initial 90-day window, where reasonable.
  • Facebook will evaluate each issue on a case-by-case basis based on our interpretation of the risk to people. 
  • We will strive to be as consistent as possible in our application of this policy.
  • Nothing in this policy is intended to supersede other agreements that may be in place between Facebook and the third party, such as our Facebook Platform policies or contractual obligations.
