Update: How to Avoid a Malicious Attack That Ransoms Your Data
September 8, 2017 | Updated: January 19, 2018
A New Wave of Ransomware Attacks
Reports have emerged of a new wave of ransomware attackers searching for misconfigured and unmaintained instances of MongoDB. We have been monitoring the situation closely to help investigate and provide assistance.
It is important to note that this new wave of attacks does not indicate a new risk, just new targets. However, it displays some characteristics that merit further investigation: for example, a single attacker identity has claimed most of the newly targeted deployments. We have reviewed these details to understand where and when users left systems insecure – connected to the Internet with no password on the administrator account – and who is attacking them.
Here’s What’s Coming
Our approach is to facilitate safe choices for users, within a flexible product serving the many communities developing on and deploying MongoDB.
To help direct users toward safe network options, since release 2.6.0 we have made localhost binding the default configuration in our most popular deployment package formats, RPM and deb. This means all networked connections to the database are refused unless explicitly configured by an administrator. Beginning with development release 3.5.7, localhost-only binding is implemented directly in the MongoDB server, making it the default behavior for all distributions. This will also be incorporated into our upcoming production-ready 3.6 release.
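One quick way to confirm how an existing deployment is bound is to ask the server for its parsed startup options. The snippet below is a minimal sketch using PyMongo; the connection string is a placeholder, and the exact shape of the result depends on how your mongod was started.

```python
from pymongo import MongoClient

# Connect to the deployment you want to inspect (adjust the URI for your environment).
client = MongoClient("mongodb://localhost:27017/")

# getCmdLineOpts returns the options mongod was started with,
# including any net.bindIp setting from the config file or command line.
opts = client.admin.command("getCmdLineOpts")
bind_ip = opts.get("parsed", {}).get("net", {}).get("bindIp", "not set (package default applies)")

print("mongod is bound to:", bind_ip)
```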
In addition, we added a warning to our download center to ensure users are aware of the network configuration risks of non-packaged distributions.
MongoDB Atlas, our database-as-a-service, further simplifies deployment decisions by providing secure infrastructure by default. Whether users set up a free instance or full production cluster, choosing our cloud option means getting security best practices as a service, which prevents misconfigured instances.
We’re Always Striving to Make Safe Deployment Easier
Our post from earlier this year – titled “How to Avoid a Malicious Attack That Ransoms Your Data” – guided users through the simple steps to prevent, or to diagnose and respond to, such an attack.
If you or someone you love runs MongoDB, please point them to our freely available guides to MongoDB’s built-in security features: access controls, encryption, and detailed auditing. For example, our Security Checklist provides current best practices and links to in-depth documentation to ensure deployments are secured. We have also made it easy for users to run daily security tests that alert them if their instance is exposed to the public Internet. And we offer broader training on all features and deployment practices through free online MongoDB University courses such as M310: MongoDB Security, which covers native security features and third-party integrations.
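As a concrete starting point for the access controls mentioned above, here is a minimal sketch that creates an administrative user so a deployment can be run with authentication enabled. The user name, password, and connection details are placeholders, not recommendations.

```python
from pymongo import MongoClient

# Connect over the loopback interface before authentication has been enabled.
client = MongoClient("mongodb://localhost:27017/")

# Create an administrative user; once mongod is restarted with authorization
# enabled, clients must authenticate as this (or another) user.
client.admin.command(
    "createUser",
    "siteAdmin",
    pwd="use-a-strong-password-here",
    roles=[{"role": "userAdminAnyDatabase", "db": "admin"}],
)
```

Once the user exists, restart mongod with authorization enabled (for example, by setting security.authorization to "enabled" in the configuration file) and connect with credentials from then on.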
We thank the responders and researchers working on this and will continue to monitor and investigate.
About the Author - Davi Ottenheimer
Davi leads Product Security at MongoDB.
GDPR: Impact to Your Data Management Landscape: Part 2
Welcome to part 2 of our 4-part blog series. In part 1, we provided a primer on the GDPR, covering its rationale and key measures. In today’s part 2, we’ll explore what the GDPR means for your data platform. In part 3, we’ll discuss how MongoDB’s products and services can support you on your path to compliance. Finally, in part 4, we’ll examine how the GDPR can help improve customer experience, and provide a couple of case studies. If you can’t wait for all 4 parts of the series and would rather get started now, download the complete GDPR: Impact to Your Data Management Landscape white paper today.
Mapping GDPR to Required Database Capabilities
Like other regulations designed to enforce data security and privacy standards (e.g., HIPAA, PCI DSS, SOX, FISMA, FERPA), GDPR compliance can be achieved only by applying a combination of controls that we can summarize as People, Processes, and Products:
“People” defines specific roles, responsibilities, and accountability.
“Processes” defines operating principles and business practices.
“Products” defines technologies used for data storage and processing.
As with any data security regulation, enabling controls in a database storing personal data is just one step towards compliance – people and processes are also essential. There are, however, specific requirements stated in the GDPR text that define a set of controls organizations need to implement across their data management landscape. We can group these requirements into three areas:
Discover: identify the personal data that falls within the scope of the regulation.
Defend: implement measures to protect discovered data.
Detect: identify a breach against that data, and remediate security and process gaps.
The following sections examine GDPR requirements and map them back to the required database capabilities. Please note that the list below is illustrative only, and is not designed to be exhaustive.
Discover
Before implementing security controls, an organization first needs to identify the personal data stored in its databases, and determine how long it is permitted to retain that data. It also needs to assess the potential impact to the individual should the personal data be disclosed to an unauthorized party.
Identification of Impact to Personal Data
The GDPR requires organizations to undertake a Data Protection Impact Assessment, documented in Article 35 (clause 1) of the GDPR text, which states: “Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.” It is therefore important to have access to tools that enable the data controller to quickly and conveniently review their database content and, as part of an ongoing discovery process, to inspect what additional data will be captured as new services are developed.
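No single discovery tool is prescribed here, but as an illustrative sketch, the snippet below samples documents from each collection and lists the field names it finds, giving a reviewer a starting inventory of where personal data may live. The database name and sampling size are assumptions for the example, not part of the GDPR text.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
db = client["customer_data"]  # hypothetical database under review

# Sample a handful of documents per collection and collect the field names,
# so a reviewer can flag fields that look like personal data.
for name in db.list_collection_names():
    fields = set()
    for doc in db[name].aggregate([{"$sample": {"size": 50}}]):
        fields.update(doc.keys())
    print(f"{name}: {sorted(fields)}")
```

In practice, sampling is only a starting point; nested documents and evolving schemas also need review as new services come online.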
Retention of Personal Data
As noted in “Information to be Provided”, Article 13 (clause 2a), the GDPR text specifies that at the time data is collected from an individual, the organization must state: “the period for which the personal data will be stored, or if that is not possible, the criteria used to determine that period”. A required capability, therefore, is the ability to identify personal data and securely erase it from the database once the expiration period has been reached, or when an individual specifically requests erasure. As a result, storage, including backups, should provide the ability to provably erase data when requested by its owner.
Defend
Once the organization has completed its Discover phase, with an Impact Assessment and expiration policies defined, it needs to implement the controls that will protect citizen data.
General Security Requirements of the GDPR
“Security of Processing”, Article 32 (clause 1) provides an overview of the security controls an organization needs to enforce: “….the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:
(a) the pseudonymisation and encryption of personal data;
(b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
(c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;
(d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.”
Each of these clauses is further expanded upon within the GDPR text, as follows.
Access Control
The GDPR emphasizes the importance of ensuring that only authorized users can access personal data. As stated in “Data Protection by Design and by Default”, Article 25 (clause 2): “The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed”. This requirement is further reinforced in Article 29, “Processing Under the Authority of the Controller or Processor”, which states: “The processor and any person acting under the authority of the controller or of the processor, who has access to personal data, shall not process those data except on instructions from the controller….” Within the database, it should be possible to enforce authentication controls so that only clients (e.g., users, applications, administrators) authorized by the data processor can access the data. The database should also allow data controllers to define the specific roles, responsibilities, and duties each client can perform against the data. For example, some clients may be permitted to read all of the source data collected on a data subject, while others may only have permission to access aggregated data that contains no reference back to personal identifiers. This approach permits a fine-grained segregation of duties and privileges for each data processor.
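To make that segregation of duties concrete, here is a minimal sketch of a custom role that can read only a collection of aggregated reporting data, not the source collection holding personal identifiers. The database, collection, user, and role names are hypothetical.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")  # authenticate as a user administrator in practice
db = client["customer_data"]  # hypothetical database

# A role that can read aggregated reporting data, but has no access to the
# raw "customers" collection that holds personal identifiers.
db.command({
    "createRole": "aggregateReader",
    "privileges": [
        {
            "resource": {"db": "customer_data", "collection": "daily_aggregates"},
            "actions": ["find"],
        }
    ],
    "roles": [],
})

# Grant the role to a named analyst account.
db.command(
    "createUser",
    "reporting_analyst",
    pwd="use-a-strong-password-here",
    roles=[{"role": "aggregateReader", "db": "customer_data"}],
)
```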
Pseudonymisation & Encryption
In the event of a breach, the pseudonymisation and encryption of data are designed to prevent the identification of any specific individual from compromised data. In the definitions section of the GDPR text, pseudonymisation means: “….the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information”. Recital 28 states: “The application of pseudonymisation to personal data can reduce the risks to the data subjects concerned and help controllers and processors to meet their data-protection obligations.” One of the most effective and efficient means of pseudonymising data builds on the access control privileges defined in the previous step: the database redacts personal identifiers by filtering the query results returned to applications.
Encryption is specifically referenced in Article 32 (clause 1), quoted above. The advantages of encryption are further expanded in the text of “Communication of a Personal Data Breach to the Data Subject”, Article 34 (clause 3a), which states that communication to the data subject is not required if: “the controller has implemented appropriate technical and organisational protection measures, and those measures were applied to the personal data affected by the personal data breach, in particular those that render the personal data unintelligible to any person who is not authorised to access it, such as encryption;” The database should provide a means to encrypt both data “in transit” over network connections, and data “at rest” in storage and backups.
Resilience and Disaster Recovery
As stated in clauses (b) and (c) of “Security of Processing”, Article 32, cited above, systems and service availability, along with a means to restore data in a timely fashion, are both core operational requirements of the GDPR. As a result, the database needs to offer fault tolerance to systems failures, along with backup and recovery mechanisms to enable disaster recovery.
Data Sovereignty: Data Transfers Outside of the EU
Chapter 5 of the GDPR is dedicated to how the transfer of personal data outside of the EU should be handled, defining when such transfers are permissible and when they are not. Key to understanding data transfer is that EU citizens’ rights under the GDPR accompany the data wherever it is moved globally, and the same safeguards must be applied. To summarize the chapter, Article 45 (clause 1) states: “A transfer of personal data to a third country or an international organisation may take place where the Commission has decided that the third country, a territory or one or more specified sectors within that third country, or the international organisation in question ensures an adequate level of protection.” To support globally distributed applications, organizations are increasingly distributing data to data centers and cloud facilities located in multiple countries across the globe. In the context of the GDPR, it should be possible for the database to enforce data sovereignty policies by distributing and storing EU citizen data only in regions recognized as complying with the regulation.
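Zone-aware sharding is one way a distributed database can pin EU records to EU-based hardware. The following is a minimal sketch, assuming a sharded cluster whose EU shard is named "shard-eu-1" and whose shard key includes a region field; the namespace, names, and ranges are illustrative, and the commands must be run through a mongos router against a sharded collection.

```python
from bson import MinKey, MaxKey
from pymongo import MongoClient

client = MongoClient("mongodb://mongos.example.net:27017/")  # connect via a mongos router
admin = client.admin

# Tag the shard that lives in an EU data center with an "EU" zone.
admin.command("addShardToZone", "shard-eu-1", zone="EU")

# Route all documents whose shard key marks them as EU residents to that zone.
admin.command(
    "updateZoneKeyRange",
    "customer_data.customers",
    min={"region": "EU", "customer_id": MinKey()},
    max={"region": "EU", "customer_id": MaxKey()},
    zone="EU",
)
```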
Detect
In the event of a data breach, the organization must be able to detect and report on the issue in a timely fashion, and also to produce a record of what activities were performed against the data.
Monitoring and Reporting
Monitoring is critical to identifying potential exploits; the closer to real time, the better the chance of limiting the impact of a data breach. For example, sudden peaks in database resource consumption can indicate an attack in progress at the very moment it happens. In the GDPR text “Notification of a Personal Data Breach to the Supervisory Authority”, Article 33 (clause 1), it is stated: “In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority….” As a result, the database should offer management tools that enable constant monitoring of database behavior to proactively mitigate threats, and that enable the organization to report on any breaches within the specified timeframes.
Auditing
“Records of Processing Activities”, Article 30 (clause 1) emphasizes the requirement to maintain a log of activities performed against the data: “….Each controller and, where applicable, the controller's representative, shall maintain a record of processing activities under its responsibility”. “Processor”, Article 28 (clause 3h) further expands on the requirement for auditing, stating that the data processor: “makes available to the controller all information necessary to demonstrate compliance with the obligations laid down in this Article and allow for and contribute to audits, including inspections, conducted by the controller or another auditor mandated by the controller.” The database needs to offer a mechanism to record database activity and to present that activity for forensic analysis when requested by the controller.
Wrapping Up Part 2
That wraps up the second part of our 4-part blog series. In part 3, we’ll discuss how MongoDB’s products and services can help you meet the requirements we’ve discussed today. Remember, if you want to get started right now, download the complete GDPR: Impact to Your Data Management Landscape white paper today.
Disclaimer
For a full description of the GDPR’s regulations, roles, and responsibilities, readers are recommended to refer to the text of the GDPR (Regulation (EU) 2016/679), available from the Official Journal of the European Union, and to consult legal counsel for the interpretation of how the regulations apply to their organization. Further, in order to effectively achieve the functionality described in this blog series, it is critical to ensure that the database is implemented according to the specifications and instructions detailed in the MongoDB security documentation. Readers should consider engaging MongoDB Global Consulting Services to assist with implementation.
Security in Government Solutions: Why Secure By Default is Essential
Data security in government agencies is table stakes at this point. Everyone knows it’s essential, both for compliance and for data protection. However, most government agencies are working with solutions that require frequent security patches or bolt-on tools to protect their data. Today, the federal government is pushing its agencies to modernize their solutions and improve their security posture. For example, the Department of Homeland Security (DHS) and the Cybersecurity and Infrastructure Security Agency (CISA) recently issued a technical rule to modernize the Protected Critical Information Infrastructure (PCII) program – a program that provides legal protections for cyber and physical infrastructure information submitted to DHS.
“The PCII Program is essential to CISA’s ability to gather information about risks facing critical infrastructure,” said Dr. David Mussington, Executive Assistant Director for Infrastructure Security. “This technical rule modernizes and clarifies important aspects of the Program, making it easier for our partners to share information with DHS. These revisions further demonstrate our commitment to ensuring that sensitive, proprietary information shared with CISA remains secure and protected.”
So how can government agencies modernize their data infrastructure and find solutions that not only protect data but also power innovation? Let’s look at a few different strategies.
1. Why secure by default is key
Secure by default means that any piece of software ships with default security settings configured for the highest possible security out of the box. CISA Director Jen Easterly has addressed how using solutions that are secure by default is critical for any organization. “We have to have [multi-factor authentication] by default. We can't charge extra for security logging and [single sign-on],” Easterly said. “We need to ensure that we're coming together to really protect the technology ecosystem instead of putting the burden on those least able to defend themselves.” “The American people have accepted the fact that they’re constantly going to have to update their software,” she said. “The burden is placed on you as the user and that’s what we have to collectively stop.”
Easterly is right. Secure-by-design solutions are vital to the success of data protection. The expectation should always be that solutions have built-in, not bolt-on, security features. One approach that’s gaining traction in both the public and private sectors is the zero trust environment. In a zero trust environment, the perimeter is assumed to have been breached. There are no trusted users, and no user or device gains trust simply because of its physical or network location. Every user, device, and connection must be continually verified and audited. As the creator of zero trust, security expert John Kindervag, summed it up: “Never trust, always verify.” For government agencies, that means the underlying database must be secure by default, and it needs to limit users’ opportunities to make it less secure.
2. Security isn't just on-prem anymore; cloud is secure, too
Cloud can be a scary word for public sector organizations. Trusting your sensitive data to the cloud might feel risky for those who handle some of the country’s most sensitive data. But cloud providers are stepping up to meet the security needs of government agencies. There is no need to fear the cloud anymore.
Government agencies and other public sector organizations nationwide are navigating cloud modernization through the lens of the increased cybersecurity requirements outlined in the 2021 Executive Order on Improving the Nation’s Cybersecurity: “The Federal Government must adopt security best practices; advance toward Zero Trust Architecture; accelerate movement to secure cloud services, including Software as a Service (SaaS), Infrastructure as a Service (IaaS), and Platform as a Service (PaaS); centralize and streamline access to cybersecurity data to drive analytics for identifying and managing cybersecurity risks; and invest in both technology and personnel to match these modernization goals.”
Also, the major cloud providers are well-established, purpose-built options for government users. AWS GovCloud, for example, is more than a decade old and was “the first cloud provider to build cloud infrastructure specifically designed to meet U.S. government security and compliance needs.” This push by the federal government toward cloud modernization and increased cybersecurity will be a catalyst in the coming years for rapid cloud adoption and greater dependence on cloud solutions designed specifically for government users.
3. Security features purpose-built for government needs are essential
Government agencies are held to a higher standard than organizations in the private sector. From data used in sometimes life-or-death missions to data about students building their futures in educational institutions (and everything in between), security has real-world consequences. Today, security is non-negotiable and, as we explored above, it is especially crucial that public sector entities have built-in security measures to keep data protected. So, what built-in features should you look for?
Network isolation and access
It’s critical that your data and underlying systems are fully isolated from other organizations using the same cloud provider. Database resources should be associated with a user group contained in its own Virtual Private Cloud (VPC), and access should be granted by IP access lists, VPC peering, or private endpoints.
Encryption in flight, at rest, and in use
Encryption should be the standard. For example, when using MongoDB Atlas, all network traffic is encrypted using Transport Layer Security (TLS). Encryption for data at rest is automated using encrypted storage volumes. Customers can use field-level encryption to encrypt sensitive workloads, encrypting data in the application before it is sent over the network to MongoDB clusters. Users can bring their own encryption keys for an additional level of control. (A brief connection sketch follows this list of features.)
Granular database auditing
Granular database auditing allows administrators to answer detailed questions about system activity by tracking all commands run against the database. This ensures you always know who has access to what data and how they’re using it.
Multi-factor authentication
User credentials should always be stored using industry-standard, audited one-way hashing mechanisms, with multi-factor authentication options including SMS, voice call, a multi-factor app, or a multi-factor device, ensuring only approved users have access to your data.
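As a small illustration of the encryption-in-flight point above, here is a minimal sketch of connecting to a cluster with TLS and authenticating as a named user. The hostname, credentials, and CA file path are placeholders; managed services such as Atlas typically require TLS on every connection regardless of client settings.

```python
from pymongo import MongoClient

# Connect with TLS so traffic is encrypted in flight, and authenticate as a named user.
# All values below are placeholders for your own deployment.
client = MongoClient(
    "mongodb+srv://cluster0.example.mongodb.net/",
    username="app_user",
    password="use-a-strong-password-here",
    tls=True,
    tlsCAFile="/etc/ssl/certs/ca-certificates.crt",  # optional: explicit CA bundle
)

# A quick round trip to confirm the encrypted, authenticated connection works.
print(client.admin.command("ping"))
```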
MongoDB Atlas for Government: Purpose-built for public sector
As we’ve discussed, solutions that are purpose-built with built-in security are ideal for government agencies, and choosing the right one is the best way to keep sensitive data protected. MongoDB Atlas for Government on AWS GovCloud recently secured its FedRAMP Moderate authorization thanks to these security measures built into the solution. FedRAMP is a government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. To ensure the utmost levels of security, Atlas for Government is an independent, dedicated environment for the U.S. public sector, as well as for ISVs looking to build U.S. public sector offerings.
Public sector organizations carry a heavy burden when it comes to keeping data protected. However, with the right data platform underpinning modern applications – a platform with built-in security features – progress doesn’t mean compromising on security. Want to learn more about data protection best practices for public sector organizations? Attend our upcoming webinar on April 12 for deeper insight.