AWS services moving forward - 3rd party services deprecation

Hi All,

There have been some recent messages popping up about the deprecation of 3rd party services such as AWS and Twilio. I think this is going to cause people a lot of confusion, so I would advise MongoDB to add some examples of using the AWS package as a dependency.

In my projects across multiple clients, I have utilised the 3rd party AWS service for lots of things across S3, SQS, SFN, and SNS. I have therefore just explored how to implement this change.

The example below is a sort of “hello world” showing how to get the AWS SDK working.

The access key and secret are stored in Realm using the Secrets and Values area.

The installed aws-sdk package is v2.737.0. More recent versions don’t seem to work yet; I found this out from @Drew_DiPalma’s post HERE

exports = async function() {
  // Load the AWS SDK for Node.js
  const AWS = require('aws-sdk');

  // Set the AWS config using credentials stored in Realm Values
  AWS.config = new AWS.Config();
  AWS.config.accessKeyId = context.values.get("AWS_ACCESS_KEY");
  AWS.config.secretAccessKey = context.values.get("AWS_ACCESS_SECRET");
  AWS.config.region = context.values.get("AWS_REGION");

  // Create an S3 service object
  const s3 = new AWS.S3({apiVersion: '2006-03-01'});

  // Call S3 to list the buckets
  const buckets = await s3.listBuckets().promise();
  return buckets;
};
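
For reference, the resolved value is the standard ListBuckets response, so pulling just the bucket names out of the result above looks like this (a tiny follow-on sketch; the names are of course examples):

const names = buckets.Buckets.map(b => b.Name); // e.g. ["my-bucket-1", "my-bucket-2"]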

One thing I would like to know from the MongoDB Realm team is what sort of overhead this is going to add to our functions. Is it going to slow them down considerably if we need to load in the AWS SDK? The 3rd party services worked well in my opinion and didn’t really add any overhead.

As I mentioned, it would be great if there could be some detailed documentation with examples of the dependencies people will be needing, mainly around AWS and HTTP, so an example using the axios or node-fetch library would be ideal. Something like the sketch below would be a good start.
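
For instance, a minimal node-fetch (v2) replacement for a context.http.get call might look like this (an untested sketch; it assumes node-fetch v2 is installed as a dependency, and the URL is just a placeholder):

exports = async function() {
  // node-fetch v2 can be require()'d; v3 is ESM-only and may not work here
  const fetch = require('node-fetch');

  // Placeholder URL, replace with the endpoint you actually call
  const response = await fetch("https://httpbin.org/json");
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.json();
};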

Thanks


Hello @Adam_Holt,

Thanks for raising this query. I acknowledge it has been a while since you asked, but I have good news to share :slight_smile:

Please find the documentation link describing how 3rd party services can be replaced with npm modules.
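
For a quick picture of the change, an S3 PutObject call that used to go through the built-in service can be written with the npm module roughly like this (a minimal sketch, assuming the same Value names as in your example; the bucket, key, and body are placeholders):

exports = async function() {
  // Old way: const s3 = context.services.get("AWS").s3("us-east-1");
  // New way: import the S3 client from the aws-sdk npm package
  const S3 = require('aws-sdk/clients/s3');
  const s3 = new S3({
    accessKeyId: context.values.get("AWS_ACCESS_KEY"),
    secretAccessKey: context.values.get("AWS_ACCESS_SECRET"),
    region: context.values.get("AWS_REGION"),
  });

  // PutObject via the npm module
  return s3.putObject({
    Bucket: "my-bucket",
    Key: "hello.txt",
    Body: "Hello from a Realm function"
  }).promise();
};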

You asked: “What sort of overhead is this going to add to our functions? Is it going to slow them down considerably if we need to load in the AWS SDK?”

There should not be any difference in overhead between the AWS service and the aws-sdk package. The service wraps the official Go SDK from Amazon, so under the hood they are doing the same thing.

I hope this helps. Please feel free to post a query should you run into issues.

Cheers,
Henna


Hi @henna.s

Thanks for the response. Following the documentation, I have now given this a go with the new approach of importing the S3 Node.js client.

However, I’m seeing significant overhead when using the Node.js AWS package.

For example, take a look at these functions. There is a 40x increase in the time it takes to get an object from S3!

2326ms vs 57ms :scream:

Node.js package - 2326ms

exports = async function() {
  // Load the S3 client from the AWS SDK for Node.js
  const S3 = require('aws-sdk/clients/s3');
  const s3 = new S3({
    accessKeyId: context.values.get("AWS_ACCESS_KEY"),
    secretAccessKey: context.values.get("AWS_ACCESS_SECRET"),
    region: "ap-southeast-2",
  });

  // Call S3 to get an object, timing the request
  const beforeNodeSDK = new Date();
  const getResult = await s3.getObject({
    Bucket: "myBucket",
    Key: "myKey"
  }).promise();
  const afterNodeSDK = new Date();
  const timeTakenNodeSDK = afterNodeSDK - beforeNodeSDK;

  return timeTakenNodeSDK; // (result = 2326)
};

3rd Party Services (Go SDK) - 57ms

exports = async function() {
  // Load the built-in AWS service
  const s3 = context.services.get("AWS").s3("ap-southeast-2");

  // Call S3 to get an object, timing the request
  const beforeGoSDK = new Date();
  const result = await s3.GetObject({
    Bucket: "myBucket",
    Key: "myKey"
  });
  const afterGoSDK = new Date();
  const timeTakenGoSDK = afterGoSDK - beforeGoSDK;

  return timeTakenGoSDK; // (result = 57)
};

Thanks!

Thanks, @Adam_Holt for reporting this.

Could you please check whether subsequent requests are faster, or whether they are consistently slow?

I look forward to your response.

Kind Regards,
Henna

Hey,

I ran the functions about 10 times each and the time taken was always similar.

Thanks

Thanks, Adam. I have reported this and I should be able to get you an update soon.

I appreciate your patience in the meantime.

Cheers,
Henna


I am seeing very similar results while working with ‘aws-sdk’, although my results seem to be more random.

I have a simple function which signs URLs for uploads to S3. In some cases the function returns in under 1 second, and in other cases it takes upwards of 15 seconds.

Hello @Adam_Holt , @Tyler_Collins,

Thank you for your patience.

Could you try running the function with AWS SDK version 3, like this, and check if you get the same slow performance?
const {S3Client, GetObjectCommand, PutObjectCommand} = require("@aws-sdk/client-s3");
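
A minimal v3 version of your GetObject test would look something like this (a sketch, assuming the same Value names as your earlier example; the bucket and key are placeholders):

exports = async function() {
  const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

  const client = new S3Client({
    region: "ap-southeast-2",
    credentials: {
      accessKeyId: context.values.get("AWS_ACCESS_KEY"),
      secretAccessKey: context.values.get("AWS_ACCESS_SECRET"),
    },
  });

  // v3 uses a send(command) pattern instead of v2's method().promise()
  const result = await client.send(new GetObjectCommand({
    Bucket: "myBucket",
    Key: "myKey",
  }));

  // Note: result.Body is a stream in v3, not a Buffer as in v2
  return result.ContentLength;
};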

Could you also share the test S3 file so we can try to reproduce the issue?

I look forward to your response.

Cheers,
Henna

Hi @henna.s

I’m unable to install that package. I get the following error info in the UI.

“failed to transpile node_modules/aws-crt/scripts/build.js. “aws-crt” is likely not supported yet. unknown: Unexpected reserved word ‘package’ (142:8)”

I did not specify a version, just the package “@aws-sdk/client-s3”.

Thanks

Hi @Adam_Holt,

Thanks for sharing the result. This error happens when you add it by clicking “Add New Dependency”.

Could you try the old way (uploading a node_modules.tar file) to install the dependency, and let me know if the function still takes the same time?

Thanks a lot for the feedback; “Add New Dependency” should also work, and the team is investigating it now.

I look forward to your response.

Cheers,
Henna


When I try to install the module at the current version, I receive the following error:
Failed to install dependencies failed to transpile node_modules/aws-crt/scripts/build.js. "aws-crt" is likely not supported yet. unknown: Unexpected reserved word 'package' (145:8)

I receive this regardless of whether I upload a node_modules.tar or install through the UI. The only version I can get to install is 2.737.0.

I saw the deprecation yesterday too, and I’m seeing times increase from 5-6ms to around 800ms after replacing context.http with node-fetch (v2). (I didn’t deploy the change after seeing the times.) In my case I was just using triggers + functions to send some Slack notifications.

Even though it is a different use case, the underlying problem might be similar.

Hello @Tyler_Collins,

Thanks for sharing the feedback. Could you please share the test S3 file so that we can try to reproduce the issue on our end?

I look forward to your response.

Cheers :performing_arts:


Hi @henna.s

Here is an example of the two scripts side by side doing a simple GetObject from S3.

Node.js module - 5827ms
Built-in Go version - 56ms

So in this case, it’s 104x slower.

Thanks

@henna.s Please see the attached code.

// arg = [["key(filename)", "filetype"]]
exports = async function(arg) {
  const S3 = require('aws-sdk/clients/s3');
  console.log(arg);

  // Credentials stored in Realm Values backed by Secrets
  const AWSAccessKeyID = context.values.get("AWSAccessKeyID_value");
  const AWSSecretKey = context.values.get("AWSSecretKey_value");

  // Configuring AWS
  const s3 = new S3({
    accessKeyId: AWSAccessKeyID,
    secretAccessKey: AWSSecretKey,
    region: "us-east-1",
  });

  // Bucket name (redacted here)
  const Bucket = "";

  // PUT URL generator
  const generatePutUrl = (Key, ContentType) => {
    return new Promise((resolve, reject) => {
      // Note: Bucket is taken from the constant above.
      const params = { Bucket, Key, ContentType, Expires: 900 };
      // The operation in this case is putObject
      s3.getSignedUrl('putObject', params, function(err, url) {
        if (err) {
          reject(err);
          return;
        }
        // If there are no errors we can send back the pre-signed PUT URL
        resolve(url);
      });
    });
  };

  const URLS = await Promise.all(arg.map((res) => generatePutUrl(res[0], res[1])));

  return { urls: URLS };
};
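
For reference, I call it with an array of [key, contentType] pairs from another function, something like this (the function name here is just a placeholder for whatever it is saved as):

// Hypothetical caller, run from inside another async Realm function
const { urls } = await context.functions.execute(
  "generatePutUrls",
  [["photos/cat.png", "image/png"], ["docs/report.pdf", "application/pdf"]]
);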

As a quick update from the Realm engineering team: we’re actively working on tickets related to these performance issues, and we expect to have a few significant improvements soon. While we have a date for removing 3rd Party Services, we are also going to continue monitoring usage and will work with the community to make sure the deprecation date is fair, extending it if necessary.


Hello @Henrique_Silva,

Thank you for raising the query. This appears to be different from the aws-sdk issue. Could you please create a separate topic for it and share the details that can help us investigate?

I look forward to your response.

Cheers :performing_arts:

Do you have an ETA on the performance upgrades for aws-sdk?

It looks like the PresignURL function from the S3 3rd party service no longer works, so I am stuck waiting for the performance fixes to be made.

Thank you

Hey @Tyler_Collins

Try this code out. It works fine in my app.

exports = function(s3Path) {
  const s3 = context.services.get("AWS").s3("ap-southeast-2");
  const bucket = "my-bucket";

  const presignedUrl = s3.PresignURL({
    Bucket: bucket,
    Key: s3Path,
    // HTTP method that is valid for this signed URL. Can use PUT for uploads, or GET for downloads.
    Method: "GET",
    // Duration of the lifetime of the signed url, in milliseconds
    ExpirationMS: 900000
  });
  return presignedUrl;
};
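
For uploads, switch Method to “PUT” and have the client send an HTTP PUT with the file body straight to the returned URL. A rough sketch of a caller (it assumes node-fetch v2 is installed, and “presignUrl” is just a placeholder for whatever the function above is named):

exports = async function(s3Path, body) {
  const fetch = require('node-fetch');
  // Ask the function above (with Method set to "PUT") for a signed URL
  const url = await context.functions.execute("presignUrl", s3Path);
  // Upload the body straight to S3 with a plain HTTP PUT
  const response = await fetch(url, { method: "PUT", body });
  return response.status; // expect 200 on success
};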

This worked, thank you
