AWS services moving forward - 3rd party services deprecation

Hey @henna.s / @Drew_DiPalma

Any update on the AWS service?

Is it feasible from MongoDB's side to keep the 3rd party services just for AWS, since they perform so well?

Thanks


Hi @Adam_Holt,

Apologies for the delay in getting back to you.

Could you let me know if you have tested the service recently? I have feedback from an engineering test: it takes up to 937 ms to load the AWS package.

> took 1.155882501s
> logs:
> timespent: 937
> result:
> { "$undefined": true }
> result (JavaScript):
> EJSON.parse('{"$undefined":true}')

However, the first run can take longer.

I will check in with the team on the continuity of the 3rd party services and will update you once I have more information available.

Your patience is genuinely appreciated.

Kind Regards,

@henna.s

Any other updates on AWS services? The issue has persisted for some time now, and I am curious if/when it will be prioritized, as the current performance with AWS really cannot support production-level applications. I am more than happy to help however I can.

In my use case, I am trying to upload a base64 image to AWS S3. The only solution I have managed is passing in very low-quality images to speed up the upload before Realm times out, but even then success is not guaranteed.

Similar to @Adam_Holt's question: if using 3rd party services for AWS is the solution, that's fine, but some clarity on how long these services are guaranteed to stay around would be very helpful before my team (or others) invest resources into changes that may become obsolete within 6 months.

Thank you for your help.

G’Day @Jason_Tulloch,

I appreciate you raising your concern, but could you give a brief summary of your current issue? Unfortunately, this thread has accumulated multiple concerns under the same topic heading.

Could you confirm the issue is that loading the AWS S3 package in the function times out before you are able to perform any operation?

Could you share the comparable timings for both AWS S3 and 3rd party services for uploading your images?

The 3rd party service is available until the end of the year.

I look forward to your response.

Best,
Henna :performing_arts:

Hey @henna.s,

Thank you for the quick reply. I apologize that you have been managing various concerns flowing through this thread; mine relates to the original issue shared, specifically AWS S3 performance. I am just trying to upload an image from Realm to AWS S3.

Loading the AWS SDK package (which I understand is the direction this thread started to take) is not the issue. We currently have not tested 3rd party services, given the deprecation anticipated by the end of the year.

The image being uploaded is in base64. My understanding is that as the image quality (number of bytes) increases, there is a drastic reduction in upload speed.

// uploadImageToS3
exports = async function(newId, newPicture, bucket) {
  const S3 = require('aws-sdk/clients/s3'); // require calls must be inside the exported function
  const s3 = new S3({
    accessKeyId: context.values.get("AWS_ACCESS_KEY"),
    secretAccessKey: context.values.get("AWS_SECRET_ACCESS_KEY"),
    region: "AWS_REGION", // placeholder: replace with the bucket's region, e.g. "us-east-1"
  });

  // Decode the base64 string and upload it as a JPEG object
  const putResult = await s3.putObject({
    Bucket: bucket,
    Key: newId,
    ContentType: "image/jpeg",
    Body: Buffer.from(newPicture, 'base64'),
    ContentEncoding: 'base64'
  }).promise();
  return putResult;
}

Link to documentation: Class: AWS.S3 — AWS SDK for JavaScript

It is my understanding that using base64 may be driving the issue, but if that is so, it leads me to the following questions:

  • This would imply to me that normal to high-quality base64 images are not really supported by Realm. I say that because my Realm function times out after 60 seconds, and sending the original image quality will force a timeout. To avoid timeouts consistently, we have been reducing the base64 image quality before sending it to the Realm function. Is that a fair takeaway?
  • What image type should be passed through Realm? We are using base64 strings because, frankly, it was the only solution we could find that seemed to work. I'm not sure if documentation has been updated or added to clarify the types of images we can upload.
  • Are there better solutions?
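One mitigation we could apply in the meantime (a sketch of my own; the helper names and the 100 KB ceiling are assumptions, not anything Realm documents) is a client-side pre-flight check that estimates the decoded size of the base64 payload and refuses uploads likely to hit the 60-second function timeout:

```javascript
// Hypothetical pre-flight check: estimate the decoded byte size of a base64
// string and skip uploads likely to exceed the 60 s Realm function timeout.
// Every 4 base64 characters encode 3 raw bytes, minus any '=' padding.
function decodedBytes(b64) {
  const padding = (b64.match(/=+$/) || [""])[0].length;
  return (b64.length * 3) / 4 - padding;
}

const MAX_UPLOAD_BYTES = 100 * 1024; // assumed safe ceiling; tune per testing

function isSafeToUpload(b64) {
  return decodedBytes(b64) <= MAX_UPLOAD_BYTES;
}
```

This only avoids wasted calls; it does not make large uploads any faster.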

Your help is really appreciated on this. If we need to share more, I am happy to do so.

Cheers,
Jason

3 posts were split to a new topic: Issues with getSignedUrl AWS S3 function

@Jason_Tulloch Thank you so much for giving the problem statement in detail.

Could you please share a sample high-quality image that is not uploading so that the engineering team can test it on their end?

I look forward to your response.

Best,
Henna

@henna.s thanks for the quick reply.

Happy to share an image, but I don't think this is the right forum to share the image string. Depending on the image quality, the string length is as follows (275px x 150px image):

  • High Quality (Not being used): ~750k characters — Realm times out after 60 seconds
  • Current Quality (Being used): ~250k characters — this consistently takes between 15-60 seconds to upload

Can I share the base64 string another way? Or the actual image and engineering can quickly convert it to base64?

I also want to stress that I do not need my images to be in base64, which I know increases the image size by ~1/3. They are originally JPGs in most cases before being converted (prior to being sent to my Realm function), but this was the only way I could figure out how to upload an image from Realm to AWS per the docs, albeit a portion that is being deprecated.
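As a quick illustration of that ~1/3 overhead (a standalone Node sketch, not part of our app), base64 encodes every 3 raw bytes as 4 characters, so the encoded payload is 4/3 the original size:

```javascript
// Verify the base64 size overhead: 3 raw bytes -> 4 encoded characters.
const raw = Buffer.alloc(90 * 1024);          // stand-in for a ~90 KB JPEG
const encoded = raw.toString("base64");
const overhead = encoded.length / raw.length; // exactly 4/3 here (size divisible by 3)
```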

Thanks again.

Hi @Jason_Tulloch ,

Thank you for your reply. You can share the image, and hopefully the team can convert it to base64.

Could you also please share the complete code snippet above? How are you passing “newPicture” to the function?

Is this the image string you are talking about that you are passing to the function?

I look forward to hearing from you.

Best,

Hey @henna.s,

An example image is below. It's just an image from my camera roll that I was unable to upload due to Realm's 60-second timeout. "newPicture" is just an image's base64 string; as an example, the strings look like this:

“/9j/4AAQSkZJRgABAQAA2ADYAAD/4QCMRXhpZgAATU0AKgAAAAgABQESAAMAAAAB…”

Is there anything else you need when you are asking for my complete code snippet? The code above is the entire function.

Thank you!

G’Day @Jason_Tulloch,

The engineering team has been able to reproduce the slow image uploads using the latest aws-sdk S3 client.

The time to upload an image depends on the image size, likely in a roughly linear way: a small image of ~3 KB takes ~500 ms; a medium image of ~117 KB takes ~3.5 s; a large image of ~447 KB takes ~12 s.
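For a rough sense of scale, a back-of-envelope linear fit over those reported numbers (an illustration only, not an engineering measurement) puts the 60-second timeout ceiling well above the sizes discussed in this thread:

```javascript
// Linear fit of the reported timings (sizes in KB, times in ms); the slope,
// intercept, and 60 s ceiling are extrapolations, not measured values.
const points = [[3, 500], [447, 12000]]; // smallest and largest reported samples
const slope = (points[1][1] - points[0][1]) / (points[1][0] - points[0][0]); // ~25.9 ms per KB
const intercept = points[0][1] - slope * points[0][0];
const maxKB = (60000 - intercept) / slope; // size at which a 60 s timeout would be hit (~2300 KB)
```

Note that Jason's reported timings above (15-60 s for a ~250k-character string, roughly 190 KB decoded) are far slower than this fit predicts, so the extrapolation should be treated as optimistic.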

I will update you once I have more info available. Meanwhile, would you be ok to use small images, or is it not a workable solution at all?

Best,
Henna

@henna.s,

Thank you so much for following up. We can continue to use smaller images for now. We’ve been dealing with this issue for several months so just knowing it is on the team’s radar is a step in the right direction.

Regards,
Jason


G’Day @Jason_Tulloch ,

Thank you for your patience. I wanted to let you know that the date for the deprecation of 3rd party services can be pushed to ensure function performance matches developer expectations. Using 3rd party services in the meantime is acceptable.

If you have no further questions, I will close the topic. Please feel free to open a new topic if any more assistance is needed.


@henna.s,

Thank you for the update. I am going to try to break up my follow-up questions for clarity…

  1. Do you have any clarity as to what pushing the 3rd party services deprecation date actually means? I know it's hard to give an exact timeline, but it's an investment on our end to switch to these services, and not knowing when we will need to roll back our changes is a challenge.

  2. It sounds to me like this issue was not resolved and there is no expectation as to when we will be able to reasonably upload base64 images to s3 through Realm. Is that a fair assumption? Learning and confirming this will have an impact on our decisions going forward.

  3. Is there an alternative to 3rd party services that is recommended? For instance, we are uploading base64 images, but that is not necessary on our end; it is just the solution we found. We could not find clear documentation for other examples but are happy to explore them if the issue is specifically related to Realm + S3 + base64.

Hi @Jason_Tulloch,

I appreciate your concerns, and I have some feedback from the product team:

  1. The deprecation date can be pushed by at least 6 months, as support for some packages isn't fully at parity.
  2. The best path forward would be to use pre-signed URLs + old 3rd party services at the moment if you want a more performant image upload
  3. Uploading images (especially large ones) is not something we’ve optimized our serverless functions for, as they are more for working with MongoDB Data/data transfer/shorter-lived functionality. Other alternatives that optimize for uploading images could involve using something like AWS Lambda directly to help solve this.

I hope the provided information is helpful.


@henna.s,

Thank you for following up (again) and for the clarity here. It's helpful to know that we have at least ~1 year before these services are deprecated, and I appreciate that the team will not just drop support without offering a better solution.

Regarding your comment on #3: maybe it's just my opinion, and maybe I am a niche case without realizing it, but the images I had been trying to upload were small (<100 KB). I would have imagined they fit within your comment that the serverless functions are shorter-lived, given their size.

One final question: it would be reassuring to see the warning in App Settings changed to note that 3rd party services are no longer planned to be deprecated on 12/1/2022, or the documentation calling out that AWS may have a different date. I am sure this is in the pipeline. Is that possible?

I really appreciate that you have offered 2 different solutions and your responsiveness these past few weeks. Time to change some code.

Cheers.


Hello @Jason_Tulloch,

I believe this should fit in the comment, as the test results mentioned previously show that images between ~3 KB and ~117 KB take 500 ms to 3.5 s; if it is taking more time than that, it needs to be looked at. Could you also test the alternate method using signed URLs?

I will confirm this with the product team and get back to you.

Thanks again for your patience.

Cheers, :performing_arts:

Hi @Jason_Tulloch and @henna.s,

I just wanted to chime in here, as we have now rolled out the transition to production and no one has noticed any performance changes, which is great for me! When I started this thread I was quite worried.

@Jason_Tulloch I was also running into the same speed issue. Uploading images directly to the serverless function by converting them to binary in memory and then using the 3rd party services to upload worked well. However, when I changed to s3.putObject() from the AWS SDK, it was really slow.

Therefore I now upload the images/files by creating an S3 presigned URL with a Realm function, which is super quick, and then I use axios to do a PUT request of the file to that presigned URL. I have put an example below for you.

Realm function

exports = async function (args) {
  const bucket = "my-s3-bucket";

  const S3 = require("aws-sdk/clients/s3");
  const s3 = new S3({
    accessKeyId: context.values.get("awsAccessKeyId"),
    secretAccessKey: context.values.get("awsSecretAccessKey"),
    region: "ap-southeast-2"
  });

  const presignedUrl = await s3.getSignedUrlPromise("putObject", {
    Bucket: bucket,
    Key: args.Key,
    ContentType: args.ContentType,
    // Lifetime of the signed URL in seconds (the AWS SDK expects
    // seconds, not milliseconds); 900 s = 15 minutes
    Expires: 900
  });
  return presignedUrl;
};

Frontend JS code

import { realmUser } from "../../main";
const axios = require("axios");

export default class UploadFile {
  async handleFileUpload(file) {
    const returnObj = {
      s3Path: "",
      s3: {},
    };
    if (file.size > 26214400) {
      alert("No files over 25 MB supported");
      return false;
    }

    const key = `files/${file.name}`;
    returnObj.s3Path = key;
    // AWS S3 Request
    const args = {
      ContentType: file.type,
      Key: key,
    };

    try {
      const presignedUrl = await this.getPresignedS3URL(args);
      const options = {
        headers: {
          "Content-Type": file.type,
        },
      };
      // Saves the file to S3
      await axios.put(presignedUrl, file, options);
      returnObj.s3.key = key;
    } catch (error) {
      console.log(error);
    }

    // Return the data back to a component
    return returnObj;
  }

  async getPresignedS3URL(args) {
    // realmUser.functions already returns a promise, so it can be
    // forwarded directly instead of wrapped in a new Promise
    return realmUser.functions.uploadTestBuyFile(args);
  }
}

Hope this helps you!

Cheers


G’day @Jason_Tulloch , @Adam_Holt ,

Thank you, Adam, for sharing the code snippets and your experience using signedUrl :smiley: Happy to know this is working well for you.

Jason, the deprecation date for 3rd party services has now moved to Aug 2023 and will soon be reflected in the docs and on the App Settings UI page. I hope this provides some assurance.

Cheers, :performing_arts:


This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.