AWS S3 can't read binary data anymore

Hi there.
I’m having trouble migrating from third-party services to npm modules. I updated my uploadImageToS3 code, taken from the O-FISH repo, but now I’m getting the error:

Failed to upload image to S3: Error: Expected params.Body to be a string, Buffer, Stream, Blob, or typed array object

This is the code from my function; the image is a byte array sent from the client.

exports = function(name, image) {
  var bucket = "mybucket";
  const AWS = require('aws-sdk');

  AWS.config.update({
    accessKeyId: context.values.get("aws_access_key"),
    secretAccessKey: context.values.get("aws_secret_key"),
    region: "eu-central-1"
  });

  const s3 = new AWS.S3({apiVersion: '2006-03-01'});
  return s3.putObject({
    "Bucket": bucket,
    "Key": name,
    "ContentType": "image/jpeg",
    "Body": image,
    "ACL": "public-read"
  }).promise();
};

What's wrong here?

As suggested in another thread, I’ve tried calling image.toBase64(), but now nothing happens. Any help would be greatly appreciated. I’m finding the documentation really lacking, and migrating from third-party services to endpoints really should not be this difficult.

Rasmus - I can certainly relate to your problem and may have found a solution. Hopefully the documentation can / will be updated to clarify the best approach as well. In my example below I am using React Native and react-native-image-picker. This is probably more detail than you need, but I'm including everything in case it helps others.

Addressing the Specific Issue
If you want to convert the image to base64, try the following:

Buffer.from(image.Body).toString('base64')

For anyone reading this who is uncertain what ‘image.Body’ represents: it should be a buffer with the following form:

{"data": [23,4,5,6,...], "type": "Buffer"}

If you do this in JavaScript, be sure to import the native dependency (no import is required in Realm functions) via

import { Buffer } from 'buffer'

Image Upload

import { useState } from 'react'
import { launchImageLibrary } from 'react-native-image-picker'

const [image, setImage] = useState(null)

const onPressUploadImage = async () => {
    launchImageLibrary({
        mediaType: 'photo',
        maxWidth: 256,
        maxHeight: 256,
        selectionLimit: 1,
        includeBase64: true
    }, (response) => {
        // includeBase64: true makes the picked asset available as a base64 string
        const imageToUpload = response.assets[0].base64
        setImage(imageToUpload)
    })
}

At this point, an image has been successfully saved to ‘image’ as base64. You can confirm this by rendering the following (note that defining the height and width is required):

<Image
   style={{ height: 50, width: 50 }}
   source={{ uri: `data:image/jpeg;base64,${image}`}}
/>

You should be able to see your image was saved to a variable on the frontend. Knowing that is the case, I added a button to upload the image to aws s3. On press, I simply did the following:

const onPressSendToS3 = async () => {
   try {
      await user.functions.uploadImageToS3(image)
   } catch (error) {
      console.error(error)
   }
}

Alright, so what is happening in our Realm function…

exports = async function(image) {
    const S3 = require('aws-sdk/clients/s3')
    const s3 = new S3({
      accessKeyId: context.values.get("AWS_ACCESS_KEY"),
      secretAccessKey: context.values.get("AWS_SECRET_ACCESS_KEY_LINKED"),
      region: "us-west-1",
    });

    // 'image' arrives as a base64 string from the client, so decode it into a Buffer
    const putResult = await s3.putObject({
      Bucket: "bucketName",
      Key: 'any image name',
      ContentType: "image/jpeg",
      Body: Buffer.from(image, 'base64'),
      ContentEncoding: 'base64'
    }).promise();

    return putResult
}

A few comments on the function above (based on some issues I ran into / read about):

  1. You need to upload ‘aws-sdk’ as a dependency. The version needs to be v2, as v3 is not supported yet; I used 2.737.0 in this example.
  2. When accessing a secret, you must create the secret and then create a value that links to the secret (see the sketch after this list).
  3. The bucket name should be a string matching the bucket you set up in S3.
  4. The key should be a string and can be custom, whatever you want to call the image.
  5. Buffer.from is built in and you do not need a dependency for it to work.
  6. ContentEncoding: this appears to be required (based on my testing).
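
To make point 2 concrete, here is roughly how the pieces relate (a sketch only; the names are placeholders for whatever you configure in the App Services UI):

// "AWS_ACCESS_KEY" is a Value defined in the Realm UI. That Value is configured to
// link to a Secret (e.g. one named aws_access_key_secret) holding the actual credential.
// Functions can only read the Value; the Secret itself is never readable directly.
const accessKey = context.values.get("AWS_ACCESS_KEY")

// Same pattern for the secret access key used in the functions above.
const secretKey = context.values.get("AWS_SECRET_ACCESS_KEY_LINKED")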

Fetch and View an Image
Including this as these issues are related for the most part. Create the following function in Realm.

exports = async function(picture) {
    const S3 = require('aws-sdk/clients/s3')
    const s3 = new S3({
      accessKeyId: context.values.get("AWS_ACCESS_KEY"),
      secretAccessKey: context.values.get("AWS_SECRET_ACCESS_KEY_LINKED"),
      region: "us-west-1",
    });

    const getResult = await s3.getObject({
      Bucket: "bucketName",
      Key: picture,
    }).promise();

    // Round-trip through JSON so the Body buffer becomes a plain
    // { "type": "Buffer", "data": [...] } object the client can consume
    const image = JSON.parse(JSON.stringify(getResult))
    return image
}

Then in React Native I did the following

const [picture, setPicture] = useState('picture key in AWS s3')
    useEffect(() => {
        async function fetchImage () {
            try {
                if (picture.length > 0) {
                    const image = await user.functions.getS3Image(picture)
                    const imageClean = Buffer.from(image.Body).toString('base64')
                    setPicture(imageClean)
                }
            } catch (error) {
                console.error(error)
            }
        }
        fetchImage()
    }, [])

Lastly, to display the image:

<ImageBackground
   style={styles.activityImage}
   source={{ uri: `data:image/jpeg;base64,${picture}`}}
>
</ImageBackground>

Hope this helps and always welcome feedback. I am currently trying to find a way to enhance the speed of these functions, but at least this is a start!

Good write up @Jason_Tulloch!

Are you also seeing a big slowdown compared to the old 3rd party services method?

Take a look at what I just tested out here

Cheers @Adam_Holt !

Unfortunately (or fortunately) this was my first integration of the aws-sdk. I don’t have the exploratory code I was working on earlier, but I did notice that converting the image from base64 into a buffer

Body: Buffer.from(image, 'base64')

may be driving the issue. I was going to try tinkering with this conversion before sending the image to the Realm function.
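
One quick way to check that (just a sketch, not something I have tested end to end) is to time the conversion on its own inside the function and compare it with the overall call time:

// Rough timing sketch: measure only the base64 -> Buffer conversion,
// separately from putObject, to see where the time actually goes.
const start = Date.now()
const body = Buffer.from(image, 'base64')
console.log(`Buffer.from took ${Date.now() - start} ms for ${body.length} bytes`)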

Not sure if that helps (or gives you any thoughts), but I will keep trying to find a better solution. If I do, I will definitely share it here.

Me again… I think I found a workaround, depending on the use case of the files you are saving to S3.

Originally, I was uploading images of ~100 KB and was seeing upload times between 10 and 30 seconds. I tried a few iterations of passing a base64 string and/or a buffer to my Realm function without any positive results, and given the simplicity of the function, I turned to the images themselves.

Now, unfortunately, my solution is specific to the package I am using, react-native-image-picker, so apologies that I cannot give a better answer. In my case, the quality of the image was too ‘good’, or the size was too large.

Updating the function discussed earlier, I changed my maxWidth and maxHeight to 50 (from 256). The upload time is well under 2 seconds.

const onPressUploadImage = async () => {
    launchImageLibrary({ 
        mediaType: 'photo',
        maxWidth: 50,
        maxHeight: 50,
        selectionLimit: 1,
        includeBase64: true
    }, (response) => {
        const imageToUpload = response.assets[0].base64
        setImage(imageToUpload)
    })
}

Alternatively, and where I ultimately settled, is reducing the quality of the image. Once again, this is specific to the frontend package I am working with, but by setting the quality of the image to 0.5 vs. 1 (the default?) I was able to set a higher max height and width and still keep the upload time under 2 seconds.

const onPressUploadImage = async () => {
    launchImageLibrary({ 
        mediaType: 'photo',
        maxWidth: 300,
        maxHeight: 200,
        quality: 0.5,
        selectionLimit: 1,
        includeBase64: true
    }, (response) => {
        const imageToUpload = response.assets[0].base64
        setImage(imageToUpload)
    })
}

With these small tweaks, I am now uploading images closer to 10 KB.
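
If you want to sanity-check the payload on the client before calling the function, you can estimate the decoded size straight from the base64 string (a rough sketch; base64 packs 3 bytes into 4 characters, so this ignores padding):

// Rough estimate of how many bytes the base64 string represents
const approxBytes = Math.ceil(image.length * 3 / 4)
console.log(`~${(approxBytes / 1024).toFixed(1)} KB will be sent to the Realm function`)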

I know this comment is specific to the frontend, but I would be shocked if this is the right solution. Fortunately I am working with simple, smaller images, but there does seem to be a larger issue at play here. I am probably overstepping my knowledge, but is 100 KB really too large a file size?

There should be a different solution to this. I am experiencing the same issue. If I upload images from my localhost to AWS, a 150 KB file takes around 150 ms. With MongoDB functions using the SDK it takes 30 seconds. I hope they’ll find a solution, because I can’t…

@Sahmed_Matsko I completely agree, the solution above is just a temporary band-aid. I am currently following the thread below; the MongoDB team appears to be working on a solution.
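
In the meantime, one other direction that might be worth exploring (I have not tried this myself, and the names below are placeholders): have the function hand back a presigned PUT URL via aws-sdk v2's s3.getSignedUrl, and let the client upload directly to S3 so the image bytes never pass through the function call at all.

exports = async function(key) {
    const S3 = require('aws-sdk/clients/s3')
    const s3 = new S3({
      accessKeyId: context.values.get("AWS_ACCESS_KEY"),
      secretAccessKey: context.values.get("AWS_SECRET_ACCESS_KEY_LINKED"),
      region: "us-west-1",
    });

    // getSignedUrl is synchronous in aws-sdk v2 and returns a time-limited URL.
    // The client then PUTs the raw image bytes to this URL with the same Content-Type.
    return s3.getSignedUrl('putObject', {
      Bucket: "bucketName",
      Key: key,
      ContentType: "image/jpeg",
      Expires: 300
    });
}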

@Jason_Tulloch Thanks for sharing! Hope they will come out with an update soon…