Atlas + Next.js + Now Connection Issues

I am trying to use MongoDB in a serverless environment (Vercel’s Now), but I am constantly dealing with my connection dying.

What does my code do?

  • I have a list of 1000 blogs
  • I scan the blog for new posts via RSS feed
  • I add the post to my db if certain criteria are met

So a call to my /scan-blogs API endpoint will potentially scan and add hundreds of blog posts. All of this was working fine on mLab, but after switching to Atlas I’ve had to really limit the concurrency of these scans to dance around the connection limit.
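For context, the cap looks roughly like this (a simplified, generic sketch; `scanBlog` is a stand-in for my real scan function):

```javascript
// Minimal concurrency limiter: runs at most `limit` tasks at a time.
// Each task is a zero-argument function that returns a promise.
async function runWithLimit(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;

  async function worker() {
    while (next < tasks.length) {
      const i = next++; // claim the next task index (single-threaded, so safe)
      results[i] = await tasks[i]();
    }
  }

  // Start `limit` workers that drain the shared task queue.
  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  await Promise.all(workers);
  return results;
}
```

So instead of `Promise.all(blogs.map(scanBlog))`, I do something like `runWithLimit(blogs.map(b => () => scanBlog(b)), 10)`.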

Here is my shared db utility that I use in all of my serverless functions:

and here are the errors that I keep getting; about a third of the requests fail.


Hey Andrew,

As I mentioned in my Twitter message, could you please try this utility instead: https://github.com/vercel/next.js/blob/canary/examples/with-mongodb/util/mongodb.js and let me know if it helps? I think that would be a good first step in troubleshooting this.

Thanks,
Ado

and my connections shoot way up to 500:

(screenshot: Atlas connection count spiking to 500)

It didn’t work.

Same error.

I think what’s happening here is that the Atlas M0 free sandbox allows only 500 concurrent connections, and Vercel is spawning more than 500 concurrent Lambda functions, each making a separate connection. Is there a way to reduce the concurrency a bit or reuse Lambda contexts?


Same error. Have you found a fix for this issue?

We are still trying to reproduce this, but one thing you could try is closing the MongoClient via close() when your process exits. You do not want to do this after each operation, since you want to reuse the MongoClient object while the Lambda function is warm, but you should call .close() when your function goes cold.
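A sketch of what I mean (names are illustrative: `cachedClient` stands for whatever client your shared utility caches, and whether your platform actually delivers SIGTERM before recycling a container varies, so treat this as a starting point rather than a guaranteed mechanism):

```javascript
// Hedged sketch: close a cached MongoClient when the container is about
// to be recycled, while keeping it open between warm invocations.
let cachedClient = null; // set by your connect helper on first connect

function closeOnColdShutdown() {
  process.once('SIGTERM', async () => {
    if (cachedClient) {
      await cachedClient.close(); // release the pooled connections
      cachedClient = null;
    }
    process.exit(0);
  });
}
```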

Hey Andrew,

In your next.config.js file, is your target set to “serverless” or “server”? I’ve been trying to get to the bottom of this for a little while, and have found that if your target is “server”, when your app is deployed to Vercel it will create just two Lambda functions that bundle the /api/ routes and the pages that use getServerSideProps(). If you have it set to “serverless” mode, then each page will create a unique Lambda function, thus potentially opening too many connections.
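For reference, the setting lives in next.config.js (a sketch, assuming you have no other custom options):

```javascript
// next.config.js (sketch, assuming no other custom options)
// With target "server" (the default), Vercel bundles the /api/ routes and
// getServerSideProps pages into a couple of Lambdas; "serverless" builds
// one Lambda per page, which multiplies MongoClient instances.
module.exports = {
  target: 'server',
};
```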

In my testing, with 10 different API routes and multiple pages using getServerSideProps, I never go over 10 connections. The other thing to consider is shutting down and deleting prior builds.

Hope this helps.

Andrew_Lisowski - We reached out to you over MongoDB Atlas’ in-app chat so that we can provide more tailored help with this connection-limit issue. Could you kindly log into https://cloud.mongodb.com/ and look for the chat that we opened?

h_b - It looks like one of our Atlas support agents already helped you solve the issue you were having connecting with Mongoose. Please open another chat if you still need help!

Hi @Andrew_Davidson,
I’m new to serverless and I’m also facing an issue with my database hitting the limit of 500 open connections. I cache the connection in my serverless functions as well. I just want to ask: how do you detect when a function goes “cold”? Is that the job of Mongo or of the serverless system? (I’m using Netlify, which uses AWS Lambda.) Thank you!

Hi @Jessica_Dao,

Sorry to hear that you are facing this issue.

To answer your question, Atlas does not know that you’re using a serverless environment.

We would not expect you to hit the limit if you’re caching the MongoClient (unless you actually peak at 500 concurrent operations), and we would like to help you debug your issue in the context of your Atlas cluster and your specific environment. Could you use the in-app chat icon in the lower-right corner? You can ask for me specifically.

Kind regards,
Angela

Hi @Angela_Shulman ,

Thanks for getting back to me. Yesterday I reached out to MongoDB support, but they said this question was not in their scope. Is there any other way you could help me?

Best regards,
Jessica

Hi @Jessica_Dao,

Sorry for the confusion - could you kindly look for an updated chat from us? We look forward to working with you.

Angela
