MongoDB 3.2.6 is released
MongoDB 3.2.6 is out and is ready for production deployment. This release contains only fixes since 3.2.5, and is a recommended upgrade for all 3.2 users.
Fixed in this release:
- SERVER-22970 Background index contains mismatched index keys and documents
- SERVER-22043 count helper doesn’t apply read preference
- SERVER-23394 AuthorizationManager may deadlock while building role graph if profiling is enabled
- SERVER-23766 Remove beta startup warning for inMemory storage engine
All Issues | Downloads | 3.2 Release Notes
As always, please let us know of any issues.
– The MongoDB Team
Back to Basics: Learn about MongoDB in Six Easy Steps
On May 5th we will launch a new Back to Basics webinar series. The goal of this series is to provide a gentle introduction to MongoDB for people who are new to NoSQL and want to accelerate their knowledge of MongoDB. This six-part series will take you from the basics of NoSQL to building and deploying your first MongoDB application. Participants are expected to have a technical background and at least a basic grasp of relational database technology. Some programming experience will definitely help you get your head around the examples. The six webinars in the series are described below. Visit here to register for all the webinars in the series.
Webinar 1: Introduction to NoSQL
We start the series by taking a look at NoSQL and why you should care. We will cover the differences between the main types of NoSQL databases: document stores, wide column stores, and key-value stores.
Webinar 2: Your First MongoDB Application
Next, we explore how to build an application in MongoDB. We will cover the types of entities we work with in a document database, how to build document-based applications, and how to manage performance, including the role of indexes.
Webinar 3: Schema Design, Thinking in Documents
In the third part of our series, we take a deeper look at the challenges of schema design. We will explore how to map a relational schema into MongoDB and how to optimize schema design for reads and writes. Finally, we’ll look at an interesting and unique feature of MongoDB: document validation.
Webinar 4: Advanced Indexing, Text and Geospatial Indexes
One of the key value propositions of MongoDB is its advanced library of indexing techniques. In this webinar we outline how to tune indexes. We then look at our text index capabilities, which allow free-text searching within fields in the database, and our geospatial capabilities, which allow you to search based on location.
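As a small, non-authoritative taste of the document queries covered in webinars 2 and 5, here is a sketch of a MongoDB aggregation pipeline spec written in PyMongo's dictionary syntax. The collection and field names ("orders", "customer", "status", "amount") are invented for illustration, and since no live server is assumed here, the same summary is also computed in plain Python:

```python
from collections import defaultdict

# The kind of pipeline spec you would pass to PyMongo's
# collection.aggregate(): filter completed orders, then total the
# "amount" field per customer. All names here are invented.
pipeline = [
    {"$match": {"status": "complete"}},
    {"$group": {"_id": "$customer", "total": {"$sum": "$amount"}}},
]

# A few sample documents standing in for a real collection.
orders = [
    {"customer": "ada", "status": "complete", "amount": 10},
    {"customer": "ada", "status": "complete", "amount": 5},
    {"customer": "bob", "status": "pending", "amount": 7},
]

# Equivalent of the $match stage, in plain Python.
matched = [doc for doc in orders if doc["status"] == "complete"]

# Equivalent of the $group / $sum stages.
totals = defaultdict(int)
for doc in matched:
    totals[doc["customer"]] += doc["amount"]

print(dict(totals))  # {'ada': 15}
```

Against a real deployment, the `pipeline` list would be handed to the server, which runs the stages where the data lives rather than in your application.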
Webinar 5: Introduction to the Aggregation Framework
The aggregation framework is one of the most powerful analytical tools available for MongoDB. In the fifth part of our series we explore how to create a pipeline of operations that can reshape and transform your data, and apply a range of analytics functions and calculations to produce summary results across a data set.
Webinar 6: Production Deployment
In the final talk of the series we explain how to create a stable production environment. You will learn how to create MongoDB production deployments that can survive many different failure scenarios, and how to create a scalable cluster that can handle any increase in the production workload. We will also introduce some of the production deployment tools that can be used to automate management and deployment.
The whole series will be recorded and made available for review by all participants. If, after attending these bite-sized pieces of training, you want to dig deeper, we recommend our extensive program of free training at MongoDB University. We look forward to seeing you online for the first episode on May 5th!
Back to Basics Webinar 1: Introduction to NoSQL
About the Author - Joe Drumgoole
Joe is Director of Developer Advocacy EMEA at MongoDB, where he helps developers understand and utilise MongoDB to unleash the power of software and data for innovators everywhere. He is a software entrepreneur with over 25 years of experience of successful product delivery at Digital Equipment Corporation, Nomura, Oracle Corporation, CR2, and Cape Clear Software, and has founded three software startups. Joe is a regular speaker at technical conferences and has provided mentoring and advice to many startups over the past ten years.
Security in Government Solutions: Why Secure By Default is Essential
Data security in government agencies is table stakes at this point. Everyone knows it’s essential, both for compliance and for data protection. However, most government agencies are working with solutions that require frequent security patches or bolt-on tools to protect their data.
Today, the federal government is pushing its agencies to modernize their solutions and improve their security posture. For example, the Department of Homeland Security (DHS) and the Cybersecurity and Infrastructure Security Agency (CISA) recently issued a technical rule modernizing the Protected Critical Infrastructure Information (PCII) Program, which provides legal protections for cyber and physical infrastructure information submitted to DHS.
“The PCII Program is essential to CISA’s ability to gather information about risks facing critical infrastructure,” said Dr. David Mussington, Executive Assistant Director for Infrastructure Security. “This technical rule modernizes and clarifies important aspects of the Program, making it easier for our partners to share information with DHS. These revisions further demonstrate our commitment to ensuring that sensitive, proprietary information shared with CISA remains secure and protected.”
So how can government agencies modernize their data infrastructure and find solutions that not only protect data but also power innovation? Let’s look at a few different strategies.
1. Why secure by default is key
Secure by default means that a piece of software ships with default settings configured for the highest possible security out of the box. CISA Director Jen Easterly has addressed how using solutions that are secure by default is critical for any organization. “We have to have [multi-factor authentication] by default. We can't charge extra for security logging and [single sign-on],” Easterly said.
“We need to ensure that we're coming together to really protect the technology ecosystem instead of putting the burden on those least able to defend themselves.”
“The American people have accepted the fact that they’re constantly going to have to update their software,” she said. “The burden is placed on you as the user and that’s what we have to collectively stop.”
Easterly is right. Secure-by-design solutions are vital to the success of data protection. The expectation should always be that solutions have built-in, not bolt-on, security features.
One approach gaining traction in both the public and private sectors is the zero trust environment. In a zero trust environment, the perimeter is assumed to have been breached. There are no trusted users, and no user or device gains trust simply because of its physical or network location. Every user, device, and connection must be continually verified and audited. As the creator of zero trust, security expert John Kindervag, summed it up: “Never trust, always verify.” For government agencies, that means the underlying database must be secure by default, and it must limit users’ opportunities to make it less secure.
2. Security isn't just on-prem anymore; cloud is secure, too
Cloud can be a scary word for public sector organizations, and trusting sensitive data to the cloud might feel risky for those who handle some of the country’s most sensitive data. But cloud providers are stepping up to meet the security needs of government agencies, and there is no need to fear the cloud anymore. Government agencies and other public sector organizations nationwide are navigating cloud modernization through the lens of the increased cybersecurity requirements outlined in the 2021 Executive Order on Improving the Nation’s Cybersecurity:
“The Federal Government must adopt security best practices; advance toward Zero Trust Architecture; accelerate movement to secure cloud services, including Software as a Service (SaaS), Infrastructure as a Service (IaaS), and Platform as a Service (PaaS); centralize and streamline access to cybersecurity data to drive analytics for identifying and managing cybersecurity risks; and invest in both technology and personnel to match these modernization goals.”
The major cloud providers also offer well-established, purpose-built options for government users. AWS GovCloud, for example, is more than a decade old and was “the first cloud provider to build cloud infrastructure specifically designed to meet U.S. government security and compliance needs.” This push by the federal government toward cloud modernization and increased cybersecurity will be a catalyst in the coming years for rapid cloud adoption and greater dependence on cloud solutions designed specifically for government users.
3. Security features purpose-built for government needs are essential
Government agencies are held to a higher standard than organizations in the private sector. From data used in sometimes life-or-death missions to data for students building their futures in educational institutions (and everything in between), security has real-world consequences. Today, security is non-negotiable, and as we explored above, it’s especially crucial that public sector entities have built-in security measures to keep data protected. So, what built-in features should you look for?
Network isolation and access
It’s critical that your data and underlying systems are fully isolated from other organizations using the same cloud provider. Database resources should be associated with a user group contained in its own Virtual Private Cloud (VPC), and access should be granted by IP access lists, VPC peering, or private endpoints.
Encryption in flight, at rest, and in use
Encryption should be the standard.
For example, when using MongoDB Atlas, all network traffic is encrypted using Transport Layer Security (TLS). Encryption for data at rest is automated using encrypted storage volumes. Customers can use field-level encryption for sensitive workloads, which lets you encrypt data in your application before sending it over the network to MongoDB clusters, and can bring their own encryption keys for an additional level of control.
Granular database auditing
Granular database auditing allows administrators to answer detailed questions about system activity by tracking all commands run against the database. This ensures you always know who has access to what data and how they’re using it.
Multi-factor authentication
User credentials should always be stored using industry-standard, audited one-way hashing mechanisms, with multi-factor authentication options including SMS, voice call, a multi-factor app, or a multi-factor device, ensuring only approved users have access to your data.
MongoDB Atlas for Government: Purpose-built for the public sector
As we’ve discussed, solutions that are purpose-built with built-in security are ideal for government agencies, and choosing the right one is the best way to keep sensitive data protected. MongoDB Atlas for Government on AWS GovCloud recently secured FedRAMP Moderate authorization thanks to these security measures built into the solution. FedRAMP is a government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. To ensure the utmost levels of security, Atlas for Government is an independent, dedicated environment for the U.S. public sector, as well as for ISVs building U.S. public sector offerings.
Public sector organizations carry a heavy burden when it comes to keeping data protected.
However, with the right data platform underpinning modern applications, one with built-in security features, progress doesn’t mean you have to compromise on security.
Want to learn more about data protection best practices for public sector organizations? Attend our upcoming webinar on April 12 for deeper insight.
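The "industry-standard one-way hashing" described under multi-factor authentication above can be sketched with nothing but the Python standard library. This is a minimal illustration of salted, one-way credential storage, not MongoDB Atlas's actual credential mechanism; the iteration count and salt size are arbitrary illustrative choices:

```python
import hashlib
import hmac
import os

# Hash a password with a random per-user salt using PBKDF2. One-way:
# the original password cannot be recovered from salt + digest.
def hash_password(password, salt=None):
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

# Verify by re-hashing with the stored salt and comparing in
# constant time to avoid leaking information through timing.
def verify_password(password, salt, digest):
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Storing only the salt and digest means that even a full database leak does not directly expose user passwords, which is exactly why bolt-on plaintext or reversible schemes fail the secure-by-default test.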