Trying to download my datasets after connecting to my cluster

Hello Community, I am new to the M201 class. In the first lab, I am trying to download the dataset as instructed, but when I run the commands I get this error:


SyntaxError: Missing semicolon. (1:14)

mongoimport --drop -c people --uri mongodb+srv:

I tried putting the semicolon where the arrow pointed, but that didn't resolve it.

Can someone please point me in the right direction?

Hi @Oludare_Jolayemi you will get that error if you try to run the mongoimport command from inside of the mongosh tool. You need to run mongoimport at your operating system prompt.

Thank you so much Doug. Question: how do I run mongoimport at the operating system level from my terminal?

If you used an installer to install the tools, then they should be in your PATH environment variable and you could just type in mongoimport. This is the same as you did for mongosh.

If you downloaded a zip file and extracted the tools somewhere, then you would just change directory to that location with the cd command and then type in mongoimport (Windows) or ./mongoimport (Linux/Mac).
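As a sketch of the extracted-zip case: instead of cd'ing into the tools directory every time, you can prepend its bin directory to PATH for the current shell session (the extraction path below is just an example — adjust it to wherever you unpacked the tools):

```shell
# Hypothetical extraction path -- adjust to where you unpacked the tools.
toolsdir="$HOME/mongodb-database-tools-macos-x86_64-100.6.0/bin"

# Prepend it to PATH for the current shell session:
export PATH="$toolsdir:$PATH"

# mongoimport now resolves from any directory, e.g.:
#   mongoimport --version
```

To make this permanent on macOS with zsh, add the export line to your ~/.zshrc.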

Hello again Doug, I did as instructed, but I still get this error after running the command below:

mongoimport --drop -c people mongodb+srv:// people.json

zsh: no matches found: mongodb+srv:// people.json

Also, the path to my mongoimport is /Users/mymacusername/mongodb-database-tools-macos-x86_64-100.6.0/bin.

I cd'd to the path above and ran the same command, but still got the same error.

Can you please post a screenshot of what you’re doing? You should not be getting a syntax error if you're running mongoimport from the command line, which is what it sounds like you're doing.

This is what I get if I use the command provided in your last message:

I can connect, but not authenticate, which makes sense as I’m hoping that is not the correct username/password.

That is a zsh error and not an error with mongoimport itself.

You’ve most likely got an * in your password (you made some characters hard to read, although most could be guessed, so it’s hard to tell). If you do have an * in the password, you need to escape it by placing a \ in front of it so zsh doesn’t try to do shell expansion.
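To illustrate with a made-up password (pa*ss here is purely hypothetical, not anyone's real credential): an unquoted * makes zsh attempt filename globbing, and if nothing matches it aborts with "no matches found". Escaping or quoting the asterisk passes it through literally:

```shell
# 'pa*ss' is a hypothetical password containing an asterisk.
echo pa\*ss        # backslash-escaped: prints pa*ss
echo 'pa*ss'       # single quotes also prevent shell expansion: prints pa*ss
```

Quoting the whole password string in the --uri value works just as well as escaping the single character.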


Thanks Doug, I do have an * in my password. When I followed the instructions you gave, I got this error:

OK, so MongoDB is now rejecting your credentials. I just created a test user with an * in the password and had no issues, but I did notice the following warning during creation of the user:

Can you try replacing the \* in the command, as I previously instructed, with %2A, which is the URL encoding for an asterisk?
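As a quick sketch of what that substitution looks like (again with the hypothetical password pa*ss): each * in the password becomes %2A before it goes into the connection URI:

```shell
# Hypothetical password; replace every * with its percent-encoding %2A.
password='pa*ss'
encoded=$(printf '%s' "$password" | sed 's/\*/%2A/g')
echo "$encoded"    # prints pa%2Ass
```

Because the percent-encoded string contains no shell metacharacters, it also sidesteps the zsh globbing problem entirely.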

The weird thing is that I’m able to run mongoimport with both \* and %2A (see below) so not sure why it’s failing for you.

Thanks Doug, I still can't get it to work after changing that. I still get the same error message.

Can you use mongosh to connect to the database using those credentials?

If that is your Atlas instance, you could try creating another user that has a password with no special characters to see if you are able to run your mongoimport command.

Yes, I could connect using mongosh. I will create a new cluster and get back to you.

You should just be able to add a new user to the current cluster.


OK, so I just looked through the M201 course and see the following documentation from Lab 1.1:

$ # Example for my Atlas cluster, you will need to change the
$ # user, password and Atlas cluster name

$ mongoimport --drop -c people --uri mongodb+srv:// people.json

$ mongoimport --drop -c restaurants --uri mongodb+srv:// restaurants.json

It looks like you’re trying to use the server that the instructor set up for the course. You would want to connect to your own cluster. I am now surprised that you were able to connect to the server using mongosh.

Wow… thank you Doug, it worked. I changed the cluster to mine and also added a \ in front of the *. It was able to import all my documents.
I really appreciate your help.

Glad you got it all worked out and that you were able to finally import your data. Best of luck in the course.


This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.