
Should the cloud close the front door to the database?

The recent MongoDB ransomware hack in the cloud that made headlines earlier this week raises the question of whether cloud platform providers should take the initiative in babysitting their most negligent customers.
Written by Tony Baer (dbInsight), Contributor

Last week's reports that 27,000 MongoDB databases got hacked for ransom provided the latest fodder for IT in its perennial battle with shadow IT: how to ensure that enterprise data assets remain sacrosanct when developers or departments take the law into their own hands. There's nothing new about this issue; these questions have been around ever since the first PCs snuck in through the back door (often literally), purchased with departmental petty cash rather than through central IT procurement.

There was little doubt about the scale of the impact: roughly a quarter of all non-firewalled, unauthenticated cloud-based MongoDB instances were affected. Unlike in most ransomware attacks, the data was deleted, not encrypted. The attackers offered to restore your data for a modest fee. If you backed up your data, well, then you got by... this time.

The elementary nature of the hack of the MongoDB instances was almost comical. The intrusion affected instances that weren't even protected by password access controls or run behind a firewall. While passwords can easily be hacked, the incidents of the past couple of weeks prove that a first line of defense isn't worthless. So, if you enforced password access, we'll say it again, you got by... this time.

Clearly, database breaches are hardly new; the phenomenon has become depressingly routine. The most recent Independent Oracle Users Group study on database security reported that enterprises increasingly expect to suffer data breaches.

Often, the hacks come from sophisticated observation and spoofing of end-user behavior, involving measures like forged cookies, stolen third-party vendor credentials, RAM-scraping malware, and phishing, all of which require some degree of sophistication and casing of network infrastructure and user behavior.

Then again, let's not forget human error or negligence, which is the case here. Picking up where people leave off is where default settings in the database can help. It's not enough to support defenses such as encryption of data at rest or on the wire, authentication and authorization, query validation, and network connection settings (e.g., the database should not default to a 0.0.0.0 wildcard IP address). To idiot-proof security, what matters is often whether the database turns on these measures by default. MongoDB doesn't do that; instead, it makes its documentation prescriptive and refers users to a security best practices checklist.
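To make that concrete, here is a minimal sketch of what turning those measures on looks like in a mongod configuration file; the file path and the localhost-only bind address are illustrative assumptions, not settings MongoDB shipped with by default at the time:

    # /etc/mongod.conf -- illustrative hardening sketch, not the shipped defaults
    net:
      bindIp: 127.0.0.1        # listen on localhost only, not the 0.0.0.0 wildcard
      port: 27017
    security:
      authorization: enabled   # require authenticated, role-based access

Either setting on its own would likely have kept an instance out of this particular sweep, which relied on finding databases that were both Internet-facing and open to anonymous connections.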

The broad reach of the hacks in a way revealed MongoDB to be a victim of its own success: As a developer-friendly platform that emphasizes ease of use, MongoDB drew a huge practitioner base. As fellow contributor Liam Tung reported, it's the fourth most-popular database out there, trailing only Oracle, MySQL, and Microsoft SQL Server. It occupies a place in database developer hearts that MySQL did a decade ago when it was a pillar of the LAMP stack.

As you democratize the user base, you'll inevitably draw less sophisticated practitioners. Of course, that was originally the rap with Microsoft Visual Basic, which made programming accessible to a new generation of music and philosophy majors, or MySQL, which lowered the barriers to entry for implementing a real backend database on your website.

For the record, MongoDB provides the means for DBAs to control access to the database. Its own Atlas cloud service manages access control by default. And, as of the 3.4 version, MongoDB can reduce the level of vulnerability outside the firewall by enabling users to configure authentication without incurring downtime.
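As a rough sketch of how that 3.4-era rolling transition works, each replica set member can be restarted in a transitional state that accepts both authenticated and unauthenticated connections until the whole cluster has been converted; the keyfile path below is a placeholder:

    # per-member mongod.conf during a rolling move to authentication
    security:
      keyFile: /etc/mongodb/keyfile   # placeholder path to the shared cluster keyfile
      transitionToAuth: true          # accept both authenticated and open connections
    # once every member runs with the keyfile, drop transitionToAuth and restart

The point of the transitional flag is exactly the downtime issue mentioned above: members with and without authentication can coexist while the rollout proceeds.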


But if you implement MongoDB in the wild, you can turn features like authorization, authentication, encryption, and auditing on or off. If you're a developer running a test/dev database or sandbox with masked data, such measures can appear to be more hassle than they're worth. But, then again, if you're testing database apps that will later live in the cloud, do you want to give hackers the ability to watch your tests? In effect, you're tipping them off on the behaviors that you expect to be normal and the boundary conditions that you're hoping to protect.

The silent actor in all this is the cloud provider. Many enterprises are looking to the cloud out of the realization that, in a world where threats and intrusions are not only routine but constantly mutating, the cloud provider is probably best suited to play this game of whack-a-mole: infrastructure is its business.

When cloud providers offer managed database services, security that is turned on by default is one of the key selling points. But if you just use the cloud as Infrastructure as a Service (IaaS), then have at it: you're renting infrastructure to run as you want. Traditionally, the cloud provider is responsible for securing the infrastructure, while database and application security has been the client's responsibility.

The lesson of the recent MongoDB hacks is that you need to take the basic measures that might otherwise be taken for granted. Just as you would with on-premises systems, you'll need to enforce full "AAA" (authentication, authorization, and accounting) to guard entry. Then, of course, there is the basic hardening of the instances, going down to securing and patching the operating system, ensuring that only the right people can access the management console, and so on. That means all communications -- and we mean all -- between client, administrator interface, and the cloud target must be strongly encrypted, all the way down to passwords and keys.
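For instance, requiring encryption on every connection in a MongoDB deployment of that era might look roughly like this; the certificate paths are placeholders, and the option names sketch the pre-4.0 net.ssl form:

    # mongod.conf -- refuse unencrypted connections (MongoDB 3.x option names)
    net:
      ssl:
        mode: requireSSL                  # reject any client that won't speak TLS/SSL
        PEMKeyFile: /etc/ssl/mongodb.pem  # placeholder: server certificate plus private key
        CAFile: /etc/ssl/ca.pem           # placeholder: CA file for validating certificates

Encrypting the transport also protects the credentials themselves; authentication over a cleartext channel just hands passwords to anyone watching the wire.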


But enforcing the types of controls that are best practice is still not enough for the cloud: you're dealing with an external surface and, depending on where the data lives, more data movement. And then there's the dynamic elasticity that is one of the attractions of the cloud. For now, there are third-party providers that will offer you added protection for the cloud surface. The question, however, given human fallibility when it comes to cutting corners for expediency, is whether it's time for cloud providers themselves to act in loco parentis. Should they turn on access controls at the front door of the database (or application) by default to protect cloud customers from themselves?
