AWS services moving forward - 3rd party services deprecation

Hi Folks – Just wanted to update this thread that we pushed two additional changes to Realm’s Function engine yesterday which we believe will lead to significant performance improvements across many packages such as AWS, Stripe, and Axios. Very interested in hearing any and all feedback from this group!

Hi @Drew_DiPalma thank you for the update! I will test this later today and report back.

Thank you

@Tyler_Collins were you able to see improvements with the SDK? @Drew_DiPalma any future enhancements / timelines we should be aware of?

Cheers!

You know, I didn’t get around to testing this. I will do this today and see if there is an improvement.

Hi Folks – We have seen significant improvements in the past few weeks across all packages that make external requests. We also did just have another improvement go out towards the end of this week. We are always looking to make more improvements to our engine and interested in hearing feedback, but currently do not have any specific packages that we are working to optimize. Unless there are specific updates, it is likely that we’ll close this thread soon in favor of opening new threads for any future questions, etc.

A post was split to a new topic: Stripe Webhooks and AWS-S3 function call issue

I’ve tested the realm functions since the update and I am not seeing an improvement with aws-sdk.

I have aws-sdk 2.737.0 installed as a dependency in Functions.

Using the 3rd Party Services AWS context, my function takes 37 ms to return.
Using the aws-sdk package, my function takes 4000 ms to return.

For context, I am using the aws-sdk package to generate pre-signed URLs.
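
For reference, a minimal sketch of what that looks like with the aws-sdk package (the bucket name, region, and value names below are placeholders rather than my exact setup):

// generateSignedUrl (illustrative sketch, not my exact function)
exports = async function(key) {
  const S3 = require('aws-sdk/clients/s3'); // require must live inside the exported function
  const s3 = new S3({
    accessKeyId: context.values.get("AWS_ACCESS_KEY"),
    secretAccessKey: context.values.get("AWS_SECRET_ACCESS_KEY"),
    region: "us-east-1", // placeholder region
  });

  // Pre-signed GET URL, valid for 15 minutes
  return s3.getSignedUrlPromise('getObject', {
    Bucket: "my-bucket", // placeholder bucket
    Key: key,
    Expires: 900,
  });
};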

Thank you

@Drew_DiPalma I wanted to make sure you saw my previous comment. I am still seeing long runtimes when importing aws-sdk.

Hi Tyler – Thanks for checking in! This function seems to be an outlier from a performance perspective, so we’ll be taking a closer look. That being said, I think we’ve seen it cut down quite a bit over the past few weeks in our benchmarks (though there are certainly still improvements to be made).

Hi all,

I had another play with this just now, and I think the elevated run time we are seeing is due to the time spent loading the S3 SDK into memory.

The S3.getObject() call itself runs in a reasonable time, so I suspect the slowness comes from the SDK having to be loaded into the function.

Just to add, I’m using the latest AWS SDK v2 (2.1133.0).
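
To show where the time goes, here is a rough sketch that splits the SDK load time from the request time inside a function (the bucket/key arguments and value names are placeholders):

// timeSdkLoad (illustrative sketch)
exports = async function(bucket, key) {
  const t0 = Date.now();
  const S3 = require('aws-sdk/clients/s3'); // most of the elapsed time appears to be spent here
  const t1 = Date.now();

  const s3 = new S3({
    accessKeyId: context.values.get("AWS_ACCESS_KEY"),
    secretAccessKey: context.values.get("AWS_SECRET_ACCESS_KEY"),
    region: "us-east-1", // placeholder region
  });
  await s3.getObject({ Bucket: bucket, Key: key }).promise();
  const t2 = Date.now();

  console.log(`require: ${t1 - t0} ms, getObject: ${t2 - t1} ms`);
  return { requireMs: t1 - t0, getObjectMs: t2 - t1 };
};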

Thanks

I confirmed on my end, and this looks to be exactly the case. In the short term, until you have guidance on how to improve this performance, I am using the 3rd party service, which is currently very fast.

Thank you

@Tyler_Collins thank you for sharing your current solution. Are you able to provide a code snippet on how you are using the 3rd party service? Or at least elaborate a little bit more on how you are using this service? I have been dealing with this AWS issue for some time now and would love to use your solution to this performance bottleneck.

Cheers!

This should help… But bear in mind that it is deprecated and external dependencies are the direction being pushed now.
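
For anyone else reading along, this is roughly the shape of it with the (deprecated) 3rd party AWS service. The service name "AWS", the region, and the action names here follow the old Stitch-style S3 service API, so treat it as a sketch rather than exact code:

// presignViaThirdPartyService (illustrative sketch of the deprecated service)
exports = async function(bucket, key) {
  // "AWS" must match the name of the AWS service configured on the Realm app
  const s3 = context.services.get("AWS").s3("us-east-1"); // placeholder region

  // Pre-signed GET URL; ExpirationMS is in milliseconds
  return s3.PresignURL({
    Bucket: bucket,
    Key: key,
    Method: "GET",
    ExpirationMS: 900000,
  });
};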

Ah, I see now, I think I just misunderstood the solution being used. Thanks for sharing though!

Hey @henna.s / @Drew_DiPalma

Any update on the AWS service?

Is it feasible, from MongoDB’s side, to keep the 3rd party services around just for AWS, as it performs so well?

Thanks

Hi @Adam_Holt,

Apologies for the delay in getting back to you.

Could you let me know if you have tested the service recently? I have feedback from the engineering team’s test: it takes up to 937 ms to load the AWS package.

> took 1.155882501s
> logs: timespent: 937
> result: { "$undefined": true }
> result (JavaScript): EJSON.parse('{"$undefined":true}')

However, the first run could take a longer time.

I will check in with the team on the continuity of the 3rd party services and will update you once I have more information available.

Your patience is genuinely appreciated.

Kind Regards,

@henna.s

Any other updates on the AWS services? The issue has persisted for some time now, and I am curious if / when it will be prioritized, as the current AWS performance really cannot support production-level applications. I am more than happy to help however I can.

In my use case, I am trying to upload a base64 image to AWS S3. The only workaround I have managed is passing in very low-quality images to keep the upload time under Realm’s timeout, but even then it is not guaranteed.

This is similar to @Adam_Holt’s question. If using 3rd party services for AWS is the solution, that’s fine, but some clarity on how long these services are guaranteed to be around would be very helpful before my team (or others) invest resources in changes that may become obsolete within 6 months.

Thank you for your help.

G’Day @Jason_Tulloch,

I appreciate you raising your concern, but could you please provide a brief summary of your current issue? Unfortunately, multiple concerns have been added to this thread under the same topic heading.

Could you confirm the issue is loading the AWS S3 package in the function and it times out before you are able to perform any operation?

Could you share the comparable timings for both AWS S3 and 3rd party services for uploading your images?

The 3rd party service is available until the end of the year.

I look forward to your response.

Best,
Henna :performing_arts:

Hey @henna.s,

Thank you for the quick reply. I apologize that you have been trying to manage the various concerns flowing through this thread; mine relates to the original issue shared here, regarding AWS (specifically S3) performance. I am simply trying to upload an image from Realm to AWS S3.

Loading the AWS SDK package (which I understand is the direction this thread started to take) is not the issue. We have not yet tested the 3rd party services, given the deprecation anticipated by the end of the year.

The image being uploaded is in base64. My understanding is that as the image quality (and therefore the number of bytes) increases, the upload speed drops drastically.

// uploadImageToS3
exports = async function(newId, newPicture, bucket) {
  const S3 = require('aws-sdk/clients/s3'); // require calls must be in the exports function
  const s3 = new S3({
    accessKeyId: context.values.get("AWS_ACCESS_KEY"),
    secretAccessKey: context.values.get("AWS_SECRET_ACCESS_KEY"),
    region: "AWS_REGION", // placeholder for the actual bucket region, e.g. "us-east-1"
  });

  const putResult = await s3.putObject({
    Bucket: bucket,
    Key: newId,
    ContentType: "image/jpeg",
    Body: Buffer.from(newPicture, 'base64'),
    ContentEncoding: 'base64'
  }).promise();

  return putResult;
};

Link to documentation: Class: AWS.S3 — AWS SDK for JavaScript
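
For completeness, this is roughly how the function gets called (the wrapper function and argument values here are just for illustration):

// Example caller (argument values are placeholders)
exports = async function() {
  const base64Image = "..."; // base64-encoded JPEG produced on the client
  return context.functions.execute("uploadImageToS3", "some-object-key", base64Image, "my-bucket");
};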

It is my understanding that using base64 may be driving the issue, but if that is so, it leads me to the following questions:

  • This would imply to me that normal- to high-quality base64 images are not really supported by Realm. I say that because my Realm function times out after 60 seconds, and sending the image at its original quality will force a timeout. To avoid a timeout consistently, we have been reducing the base64 image quality before sending it to the Realm function. Is that a fair takeaway?
  • What image type should be passed through Realm? We are using base64 strings because, frankly, it was the only solution we could find that seemed to work. I’m not sure if documentation has been updated or added to clarify which image formats we can upload.
  • Are there better solutions?

Your help is really appreciated on this. If we need to share more, I am happy to do so.

Cheers,
Jason

3 posts were split to a new topic: Issues with getSignedUrl AWS S3 function