Last June we introduced MongoDB Atlas, the database as a service for MongoDB. Atlas is designed in accordance with best practices for managing MongoDB, so using it is like having a professional MongoDB ops team on your side. It is the easiest and most cost-effective way to run MongoDB in the cloud, and it is already helping thousands of teams, from innovative startups like Bond to established industry leaders like eHarmony and Thermo Fisher, build apps more efficiently by making database management as easy as possible.
We’re incredibly excited by the success our customers have had with Atlas so far, and today I’d like to share some updates to the service that will make it even easier to get started with Atlas.
Making Atlas data migrations simple with MongoMirror
It’s a cinch to spin up a MongoDB cluster with Atlas, but if you’re already running an application, you still have to migrate your data, which until now has been a manual process. Today we’re introducing a new utility called MongoMirror that automates that process. MongoMirror live migrates data to MongoDB Atlas from any pre-existing MongoDB 3.0+ replica set, making it even easier to move your existing applications to Atlas.
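As a rough sketch of what a migration might look like, the invocation below points the utility at an existing replica set and an Atlas destination. The host names, replica set names, and credentials are placeholders, and the exact flags may differ in the released tool, so treat this as an illustrative template rather than a reference:

```shell
# Live-migrate an existing replica set into an Atlas cluster.
# All host names, user names, and the password below are placeholders.
mongomirror \
  --host myReplicaSet/source-host-1:27017,source-host-2:27017 \
  --destination atlasReplicaSet/atlas-host-0.mongodb.net:27017 \
  --destinationUsername admin \
  --destinationPassword <password> \
  --ssl
```

The tool tails the source oplog while copying data, so the application can keep writing to the source cluster until you are ready to cut over.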
Get MongoDB in the cloud for free with the new M0 tier
We’re also making it easier than ever to experiment with a real cloud environment for MongoDB. The new “M0” cluster type is a free cluster, ideal for learning MongoDB or building a prototype. Like our existing cluster types, the M0 tier comes with security features, high availability, and managed upgrades by default.
More to come
The M0 tier and MongoMirror remove even more barriers between developers and the execution of their ideas. Now you can get started with MongoDB Atlas for free, migrate without downtime, and scale up as you need, completely seamlessly. In the coming months, we’ll be bringing MongoDB Atlas to Google Compute Engine and Microsoft Azure, and we’re actively working on even more tools for migrating existing workloads to MongoDB Atlas, so stay tuned.
About the Author - Eliot Horowitz
Eliot Horowitz is CTO and Co-Founder of MongoDB. Eliot is one of the core MongoDB kernel committers. Previously, he was Co-Founder and CTO of ShopWiki, where he developed the crawling and data extraction algorithm at the core of its technology. He quickly became one of Silicon Alley's up-and-coming entrepreneurs and was selected as one of BusinessWeek's Top 25 Entrepreneurs Under Age 25 nationwide in 2006. Earlier, Eliot was a software developer in the R&D group at DoubleClick (acquired by Google for $3.1 billion). Eliot received a BS in Computer Science from Brown University.
Leaf in the Wild: World’s Most Installed Learning Record Store Migrates to MongoDB Atlas to Scale Data 5x, while Reducing Costs
Learning Locker moves away from ObjectRocket to scale its learning data warehouse, used by the likes of Xerox, Raytheon and U.K. universities.

From Amazon’s recommendations to the Facebook News Feed, personalization has become ingrained in the consumer experience, so it should come as no surprise that resourceful educators are now trying to improve learning outcomes with that same concept. After all, no two students are identical, in much the same way that no two consumers are exactly alike. Developing a truly personalized educational experience is no easy feat, but emerging standards like the xAPI are helping to make this lofty goal a reality.

xAPI is an emerging specification that enables communication between disparate learning systems in a way that standardizes learning data. That data could include things like a student’s attendance in classes or participation in online tools, but can also stretch to performance measures in the real world: how students apply their learning. This data-led approach to Learning Analytics is helping educators improve learning practices, tailor teaching, and intervene early if it looks like a student is moving in the wrong direction.

But the implications of this go far beyond the classroom, and increasingly companies are using these same techniques to support their employees’ development and to measure the impact of training on performance outcomes. Whilst educators are predicting the chances of a particular student dropping out, businesses can use these same tools to forecast organizational risk, based on compliance training and performance data, for example.

We recently spoke with James Mullaney, Lead Developer at HT2 Labs, a company at the forefront of the learning-data movement. HT2 Labs’ flagship product, Learning Locker, is an open source data warehouse used by the likes of Xerox, Raytheon and a wide range of universities to prove the impact of training and to make more informed decisions on future learning design.
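To make the standardized-data idea concrete, an xAPI "statement" is a JSON document describing who did what to what: an actor, a verb, and an object. The sketch below builds a minimal statement in Python; the three top-level field names follow the xAPI specification, but the concrete values (addresses, URLs, course names) are invented for illustration:

```python
import json

# A minimal xAPI-style statement. "actor", "verb", and "object" are the
# three required parts of a statement; all concrete values here are
# invented placeholders.
statement = {
    "actor": {
        "mbox": "mailto:student@example.com",
        "name": "Example Student",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/xyz",
        "definition": {"name": {"en-US": "Course XYZ"}},
    },
}

# Statements travel between learning systems as plain JSON payloads.
payload = json.dumps(statement)
print(payload)
```

Because every system emits the same shape of document, a Learning Record Store can ingest activity from many different tools into one store without per-vendor translation.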
To continue to scale the project, better manage its operations and reduce costs, Learning Locker migrated from ObjectRocket to the MongoDB Atlas database as a service.

Tell us about HT2 Labs and Learning Locker.

HT2 Labs is the creator of Learning Locker, which is a data warehouse for learning activity data, commonly referred to as a Learning Record Store (LRS). We have a suite of other learning products that are all integrated; Learning Locker acts as the hub that binds everything together. Our LRS uses the xAPI, which is a specification developed in part by the U.S. Department of Defense to help track military training initiatives. It allows multiple learning technology providers to send data into a single data store in a common format. We started playing around with xAPI around four years ago, as we were curious about the technology and had our own Social Learning Management System (LMS), Curatr. Today, Learning Locker receives learning events via an API, analyzes the data stored, and is instrumental in creating reports for our end customers.

Who is using Learning Locker?

The software is open source, so our users range from hobbyists to enterprise companies, like Xerox, who use our LRS to track internal employee training. Another example is Jisc, the R&D organization that advances technologies in UK Higher & Further Education. Jisc are running one of the largest national-level initiatives to implement Learning Analytics across universities in the UK, and our LRS is used to ingest data and act as a single source of data for predictive models. This increased level of insight into individual behavior allows Jisc to do some interesting things, such as predict and preempt student dropouts.

How has Learning Locker evolved?

We’re currently on version two of Learning Locker. We’ve open sourced the product and we’ve also launched it as a hosted Software as a Service (SaaS) product. Today we have clients using our LRS in on-premise installations and in the cloud.
Each on-prem installation comes packaged with MongoDB. The SaaS version of Learning Locker typically runs in AWS, supported by MongoDB Atlas, the managed MongoDB as a service.

Tell us about your decision to go with MongoDB for the underlying database.

MongoDB was a very natural choice for us, as the xAPI specification calls for student activities to be sent as JSON. These documents are immutable. For example, you might send a document that says, “James completed course XYZ.” You can’t edit that document to say that he didn’t complete it; you would have to send another document to indicate a change. This means that scale is very important, as there is a constant stream of student activity that needs to be ingested and stored. We’ve been very happy with how MongoDB, with its horizontal scale-out architecture, is handling increased data volume; to be frank, MongoDB can handle more than our application can throw at it. In fact, our use of MongoDB is actually award-winning: last year we picked up the MongoDB Innovation Award for best open source project.

Beyond using the database for ingesting and storing data in Learning Locker, how else are you using MongoDB?

As mentioned earlier, our LRS runs analytics on the data stored, and those analytics are then used in reporting for our end users. For running those queries, we use MongoDB’s aggregation framework and the associated aggregation APIs. This allows our end users to get quick reports on information they’re interested in, such as course completion rates, score distribution, etc. Our indexes are also rather large compared to the data. We index on a lot of different fields using MongoDB’s secondary indexes. This is absolutely necessary for real-time analytics, especially when the end user wants to ask many different questions. We work closely with our clients to figure out the indexes that make the most sense based on the queries they want to run against the data.

Tell us about your decision to run MongoDB in the cloud.
Did you start with MongoDB Atlas or were you using a third-party vendor?

Our decision to use a MongoDB as a service provider was pretty simple: we wanted someone else to manage the database for us. Initially we were using ObjectRocket, and that made sense for us at the time because we were hosting our application servers on Rackspace.

Interesting. Can you describe your early experiences with MongoDB Atlas and the migration process?

We witnessed the launch of MongoDB Atlas last year at MongoDB World 2016 and spun up our first cluster with Atlas in October. It became pretty clear early on that it would work for what we needed. First we migrated our Jisc deployment and our hosted SaaS product to MongoDB Atlas, and we also moved our application servers to AWS for lower latency. The migration was completed in December with no issues.

Why did you migrate to MongoDB Atlas from ObjectRocket?

Cost was a major driving force for our migration from ObjectRocket. We’ve been growing and are now storing five times as much data in MongoDB Atlas at about the same cost. ObjectRocket was also pretty opaque about what was happening in the background, and that’s not the case with MongoDB Atlas, which gives you greater visibility and control. I can see, for example, exactly how much RAM I’m using at any point in time. And finally, nobody is going to tell you that security isn’t important, especially in an industry where we’re responsible for handling potentially sensitive student data. We were very happy with the native security features in MongoDB Atlas and the fact that we aren’t charged a percentage uplift for encryption, which was not the case with ObjectRocket.

Do you have any plans to integrate MongoDB with any other technologies to build more functionality for Learning Locker?

We’re looking into Hadoop, Spark, and Tableau for a few of our clients. MongoDB’s native connectors for Hadoop, Spark, and BI platforms come in handy for those projects.
Any advice for people looking into MongoDB and MongoDB Atlas?

Plan for scale. Think about what you’re doing right now and ask yourself, “Will this work when I have 100x more data? Can we afford this at 100x the scale?” The MongoDB Atlas UI makes most things extremely easy, but remember that some things you can only do through the mongo shell. You should ensure your employees learn or retain the skills necessary to be dangerous in the CLI. And this isn’t specific to just MongoDB, but think about the technology you’re partnering with and the surrounding community. For us, it’s incredibly important that MongoDB is a leader in the NoSQL space, as it’s made it that much easier to talk about Learning Locker to prospective users and clients. We view it as a symbiotic relationship; if MongoDB is successful, then so are we.

James, thanks for taking the time to share your experiences with the MongoDB community. We look forward to seeing you at MongoDB World 2017.

For deploying and running MongoDB, MongoDB Atlas offers the best mix of speed, scalability, security, and ease of use. Learn more about MongoDB Atlas.
Serverless development with Node.js, AWS Lambda and MongoDB Atlas