Companies optimizing for the real-time web are looking for tools and techniques to analyze data more effectively and efficiently. MongoDB’s fast write performance and horizontal scaling make it a great fit for storing and processing high-volume data feeds such as social streams, stock quotes, logs, and telemetry. People are eager to learn more about it, and since the community holds the expertise, we want to hear from you.
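As a rough sketch of the kind of design such systems often use (the `batch_feed` helper, field names, and batch size here are illustrative assumptions, not something from this post), a high-volume feed writer commonly buffers incoming documents and writes them in bulk rather than one round trip per document:

```python
from datetime import datetime, timezone

def batch_feed(docs, batch_size=1000):
    """Buffer incoming feed documents into fixed-size batches.

    With MongoDB, each yielded batch would typically be written with a
    single bulk insert (e.g. PyMongo's insert_many) instead of one
    network round trip per document.
    """
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush any remainder
        yield batch

# Simulated stock-quote feed (field names are illustrative only).
quotes = ({"symbol": "XYZ", "price": 10.0 + i % 5,
           "ts": datetime.now(timezone.utc)} for i in range(2500))
batches = list(batch_feed(quotes, batch_size=1000))
print([len(b) for b in batches])  # three batches: 1000, 1000, 500
```

Batching like this trades a little latency for much higher write throughput, which is usually the right trade-off for feed-style workloads.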
Share your stories on how you built a system for high volume data feeds with MongoDB, including the insights you gained and how others in the community can learn from your experience.
- Write a blog post on your personal or company blog about how you designed a system for processing large quantities of data with MongoDB. Gear your post toward a technical audience, because 10gen engineers will be picking the winner.
- Once you’re finished, submit the link to your blog post in the comments section of this post by August 10, 2012. We’ll announce the winner shortly thereafter.
- Please also tweet it out with the #MongoDB hashtag.
The winner will receive:
- MongoDB Timbuk2 backpack, t-shirt, and book
- Free entry for you and two friends to the MongoDB conference of your choice
- Your blog post tweeted out to the more than 20,000 followers of @mongodb and @10gen, and a mention in the MongoDB Newsletter
- All participants receive a MongoDB t-shirt
- Expect many of the best posts to be tweeted out from @mongodb!
For additional information on using MongoDB for high-volume data feeds, register for 10gen’s free webinar on August 2, or check out the use cases page on 10gen.com.