Issues with local Atlas on macOS 14

I’ve followed the instructions here (https://www.mongodb.com/docs/atlas/cli/stable/atlas-cli-deploy-local/) to install all the dependencies and atlas-cli using Homebrew. When I run "atlas deployments setup" and accept the defaults for a local deployment, the CLI eventually fails with an "Error: context deadline exceeded" error. I ran through this a few times and managed to grab the podman logs for both the MongoDB server container (docker.io/mongodb/mongodb-enterprise-server:7.0-ubi8) and the Atlas Search container (docker.io/mongodb/mongodb-atlas-search:preview).
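
For reference, the rough sequence I ran looks like this (just a sketch from memory; the exact Homebrew formula names and the container names may differ on your machine):

brew install mongodb-atlas-cli podman      # formula names as I recall them
atlas deployments setup                    # accept the defaults for a local deployment

# After the failure, list the containers and capture their logs
podman ps --all
podman logs <mongodb-server-container>     # mongodb-enterprise-server container
podman logs <atlas-search-container>       # mongodb-atlas-search (mongot) container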

The MongoDB server appears to start OK, but the Atlas Search container has pages of exceptions, starting with:

	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2023-11-28T14:09:32.376+0000 W MONGOT [nioEventLoopGroup-3-4] [c.x.m.s.tcp.netty.MessageHandler] unexpected exception
java.lang.IllegalArgumentException: no command registered for getParameter
	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2023-11-28T14:09:32.378+0000 W MONGOT [nioEventLoopGroup-3-2] [c.x.m.s.tcp.netty.MessageHandler] unexpected exception
java.lang.IllegalArgumentException: no command registered for buildInfo
	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2023-11-28T14:09:32.381+0000 I MONGOT [nioEventLoopGroup-3-5] [c.xgen.mongot.server.tcp.TcpServer] Received connection fe687dfffef95a99-00000005-00000005-49d44877304382d1-0232e3d0 from /10.89.0.10:49368

Looks like the first exception was truncated; here it is:

2023-11-28T14:09:32.374+0000 W MONGOT [nioEventLoopGroup-3-3] [c.x.m.s.tcp.netty.MessageHandler] unexpected exception
java.lang.IllegalArgumentException: no command registered for atlasVersion
	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)

Any thoughts on an approach to resolving this would be appreciated. I've tried the usual reboot/reinstall/drink more coffee/stand on one leg, but no luck.

Hi @Tony_Curwen,

Thanks for raising this one. I was able to reproduce the same mongot "no command registered for atlasVersion" messages you posted, however I wasn't able to actually get the "Error: context deadline exceeded" error from your initial post. Could you provide the following:

  1. Terminal input and output from the point of running atlas deployments setup to the error
  2. atlascli version (atlas -v should provide this)

Redact any personal or sensitive information before posting here.
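
In case it helps, something like the following should capture both in one go (just a sketch, assuming a standard shell):

atlas -v > atlas-setup.log 2>&1
atlas deployments setup 2>&1 | tee -a atlas-setup.log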

Regards,
Jason

Hi Jason,

The MacOS terminal session looks like this:

tony@MacBook-Air-M2 ~ % atlas deployments setup
? What type of deployment would you like to work with? local

[Default Settings]
Deployment Name   local2772
MongoDB Version   7.0
Port              56714

? How do you want to set up your local Atlas deployment? default
Creating your cluster local2772
1/2: Starting your local environment...
2/2: Creating your deployment local2772...
Error: context deadline exceeded

The atlas -v output is:

tony@MacBook-Air-M2 ~ % atlas -v
atlascli version: 1.13.0
git version: homebrew-release
Go version: go1.21.4
   os: darwin
   arch: arm64
   compiler: gc

I had a similar problem, but my local deployment hangs on step 3 and then fails with "Error: context deadline exceeded" after 5 minutes.

? How do you want to set up your local Atlas deployment? default
Creating your cluster local1793 [this might take several minutes]
1/3: Starting your local environment...
2/3: Downloading the MongoDB binaries to your local environment...
3/3: Creating your deployment local1793...
Error: context deadline exceeded

Thanks for providing the terminal output from setup, Andreo. I'm going to move this onto another topic that had the same error. Did you receive this error while performing any of the troubleshooting steps provided in the previous post, or was it generated just by following the steps for an initial setup?

I have the exact same problem with a MacBook M1. Step 3 crashes with the message "Error: context deadline exceeded".

I tried the default setup as well as custom setups with MongoDB 7.0 and MongoDB 6.0. I always get the same error.

I did all the steps in the "Local machine Issues" and "Podman issues" troubleshooting sections with no success.
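
Roughly, that meant something like the following (my own paraphrase of those sections - the exact commands in the docs may differ):

atlas deployments list
atlas deployments delete <deployment-name> --force    # remove the failed deployment
podman machine stop
podman machine rm
podman machine init
podman machine start
atlas deployments setup                               # then try the setup again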

I don't think this is the same issue as "`atlas deployments setup` hangs with no output".

I'm also facing the same issue on a MacBook Air M2, and have tried the various things suggested in https://www.mongodb.com/docs/atlas/cli/stable/troubleshooting without much success.


Hi all,
@Jason_Tran
Is there any progress on this issue?
I'm getting the same error on my machine.

No progress on my side. I wanted to run some tests with vector search/embeddings on a local deployment, but so far that hasn't been possible on my system.

Hi All,

This is being looked into (the error was reproduced), but I have no further updates at this stage. I will update here if there are any relevant changes that may resolve the error.

I noticed that mostly ARM Macs are mentioned here. @Ariel_Benesh, are you running on an Apple silicon Mac as well, or Intel?
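
If you're not sure, the following in Terminal will tell you (arm64 means Apple silicon, x86_64 means Intel):

uname -m
sysctl -n machdep.cpu.brand_string    # e.g. "Apple M1 Pro"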

Regards,
Jason

Hi @Jason_Tran
I'm using an Apple silicon Mac (Apple M1 Pro).
Thanks

Hi everyone,

Thank you for your patience here. During the investigation it turned out that the error was caused by an issue in Podman which, according to the release notes, was fixed in version 4.8.1. Upgrading to that version on our end fixed the issue.
Please try it out and let us know how it goes.
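
If you installed Podman through Homebrew, upgrading and recreating the Podman machine would look roughly like this (a sketch - adjust for your own setup):

brew update && brew upgrade podman
podman version                 # should now report 4.8.1 or later
# Recreating the Podman machine may be needed after the upgrade
podman machine stop
podman machine rm
podman machine init
podman machine start
atlas deployments setup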

Thanks,
Jakub

Hi Jakub - upgrading podman fixed it for me. Thanks for chasing this.

I may have spoken too soon. The environment appears to start, but checking the logs of the Atlas Search container I see the following:

2023-12-08T18:17:06.814+0000 I MONGOT [main] [com.xgen.mongot.Mongot] [Starting Mongot - Mongot Version: local]
2023-12-08T18:17:07.028+0000 I MONGOT [main] [c.xgen.mongot.util.security.Security] installing FIPS security providers
2023-12-08T18:17:07.743+0000 I MONGOT [main] [com.xgen.mongot.Mongot] Bootstrapping with mongod connection 10.89.0.10:27017
2023-12-08T18:17:07.808+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] refreshExecutorThreads not configured, defaulting to 1.
2023-12-08T18:17:07.808+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] numMaxMergeThreads not configured, defaulting to 1.
2023-12-08T18:17:07.808+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] numMaxMerges not configured, defaulting to 1.
2023-12-08T18:17:07.809+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] ramBufferSizeMb not configured, defaulting to 20.0.
2023-12-08T18:17:07.810+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] nrtCacheEnabled not configured, defaulting to false.
2023-12-08T18:17:07.810+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] nrtTotalCacheSizeMb not configured, defaulting to 15.0.
2023-12-08T18:17:07.810+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] nrtMaxMergeSizeMb not configured, defaulting to 1.5.
2023-12-08T18:17:07.811+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] maxMergedSegmentSize not configured, defaulting to 474.0 MiB.
2023-12-08T18:17:07.812+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] concurrentSearchExecutorThreads not configured, defaulting to 2.
2023-12-08T18:17:07.812+0000 I MONGOT [main] [c.x.m.i.lucene.config.LuceneConfig] using Lucene Version 9.7.0.
2023-12-08T18:17:07.815+0000 I MONGOT [main] [c.x.m.r.m.MongoDbReplicationConfig] numConcurrentInitialSyncs not configured, defaulting to 2.
2023-12-08T18:17:07.815+0000 I MONGOT [main] [c.x.m.r.m.MongoDbReplicationConfig] numConcurrentChangeStreams not configured, defaulting to 4.
2023-12-08T18:17:07.815+0000 I MONGOT [main] [c.x.m.r.m.MongoDbReplicationConfig] numIndexingThreads not configured, defaulting to 1.
2023-12-08T18:17:07.816+0000 I MONGOT [main] [c.x.m.r.m.MongoDbReplicationConfig] changeStreamMaxTimeMs not configured, defaulting to 1000.
2023-12-08T18:17:07.817+0000 I MONGOT [main] [c.x.m.r.mongodb.DurabilityConfig] numCommittingThreads not configured, defaulting to 1.
2023-12-08T18:17:07.825+0000 I MONGOT [main] [c.x.m.i.lucene.LuceneGlobalSettings] Lucene max clause count set to: 1024
2023-12-08T18:17:08.282+0000 I MONGOT [main] [org.mongodb.driver.cluster] Cluster created with settings {hosts=[10.89.0.10:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='10000 ms'}
2023-12-08T18:17:08.344+0000 I MONGOT [main] [org.mongodb.driver.cluster] Cluster created with settings {hosts=[10.89.0.10:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='10000 ms'}
2023-12-08T18:17:08.361+0000 I MONGOT [InitialSyncDispatcher] [c.x.m.r.m.i.InitialSyncQueue$InitialSyncDispatcher] 0 queued initial syncs.
2023-12-08T18:17:08.372+0000 I MONGOT [cluster-ClusterId{value='65735da41996e74dea6aaf8d', description='null'}-10.89.0.10:27017] [org.mongodb.driver.connection] Opened connection [connectionId{localValue:1, serverValue:5}] to 10.89.0.10:27017
2023-12-08T18:17:08.379+0000 I MONGOT [cluster-rtt-ClusterId{value='65735da41996e74dea6aaf8d', description='null'}-10.89.0.10:27017] [org.mongodb.driver.connection] Opened connection [connectionId{localValue:2, serverValue:4}] to 10.89.0.10:27017
2023-12-08T18:17:08.395+0000 I MONGOT [SynonymSyncDispatcher] [c.x.m.r.m.synonyms.SynonymManager] 0 queued synonym syncs.
2023-12-08T18:17:08.397+0000 I MONGOT [cluster-ClusterId{value='65735da41996e74dea6aaf8d', description='null'}-10.89.0.10:27017] [org.mongodb.driver.cluster] Monitor thread successfully connected to server with description ServerDescription{address=10.89.0.10:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=21, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=22486699, setName='rs-localdev', canonicalAddress=mongod-local4580:27017, hosts=[mongod-local4580:27017], passives=[], arbiters=[], primary='mongod-local4580:27017', tagSet=TagSet{[]}, electionId=7fffffff0000000000000001, setVersion=1, topologyVersion=TopologyVersion{processId=65735d9f9b83b49e0d9c93d9, counter=6}, lastWriteDate=Fri Dec 08 18:17:05 UTC 2023, lastUpdateTimeNanos=576309941879761}
2023-12-08T18:17:08.406+0000 I MONGOT [main] [c.x.m.c.manager.DefaultConfigManager] Initializing
2023-12-08T18:17:08.415+0000 I MONGOT [main] [c.x.m.c.manager.DefaultConfigManager] No config journal found, nothing to initialize
2023-12-08T18:17:08.443+0000 I MONGOT [cluster-ClusterId{value='65735da41996e74dea6aaf8e', description='null'}-10.89.0.10:27017] [org.mongodb.driver.connection] Opened connection [connectionId{localValue:3, serverValue:6}] to 10.89.0.10:27017
2023-12-08T18:17:08.443+0000 I MONGOT [cluster-ClusterId{value='65735da41996e74dea6aaf8e', description='null'}-10.89.0.10:27017] [org.mongodb.driver.cluster] Monitor thread successfully connected to server with description ServerDescription{address=10.89.0.10:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=21, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=14469800, setName='rs-localdev', canonicalAddress=mongod-local4580:27017, hosts=[mongod-local4580:27017], passives=[], arbiters=[], primary='mongod-local4580:27017', tagSet=TagSet{[]}, electionId=7fffffff0000000000000001, setVersion=1, topologyVersion=TopologyVersion{processId=65735d9f9b83b49e0d9c93d9, counter=6}, lastWriteDate=Fri Dec 08 18:17:05 UTC 2023, lastUpdateTimeNanos=576310013124766}
2023-12-08T18:17:08.447+0000 I MONGOT [cluster-rtt-ClusterId{value='65735da41996e74dea6aaf8e', description='null'}-10.89.0.10:27017] [org.mongodb.driver.connection] Opened connection [connectionId{localValue:4, serverValue:7}] to 10.89.0.10:27017
2023-12-08T18:17:08.607+0000 I MONGOT [main] [c.x.m.s.tcp.server.AbstractTcpServer] Starting query server on address 0.0.0.0/0.0.0.0:27027
2023-12-08T18:17:08.608+0000 I MONGOT [main] [c.x.m.s.tcp.server.MessageServer] starting on 0.0.0.0/0.0.0.0:27027
2023-12-08T18:17:08.896+0000 I MONGOT [nioEventLoopGroup-3-1] [c.xgen.mongot.server.tcp.TcpServer] Received connection 6a8944fffe08e2d6-00000005-00000001-919889b12e958c17-f9d49bbc from /10.89.0.10:51140
2023-12-08T18:17:08.937+0000 I MONGOT [nioEventLoopGroup-3-2] [c.xgen.mongot.server.tcp.TcpServer] Received connection 6a8944fffe08e2d6-00000005-00000002-a0124ef12e958c47-beb9e530 from /10.89.0.10:51152
2023-12-08T18:17:08.937+0000 I MONGOT [nioEventLoopGroup-3-3] [c.xgen.mongot.server.tcp.TcpServer] Received connection 6a8944fffe08e2d6-00000005-00000003-9f9dcef12e958c47-1c029e09 from /10.89.0.10:51156
2023-12-08T18:17:08.955+0000 W MONGOT [nioEventLoopGroup-3-3] [c.x.m.s.tcp.netty.MessageHandler] unexpected exception
java.lang.IllegalArgumentException: no command registered for atlasVersion
	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2023-12-08T18:17:08.964+0000 W MONGOT [nioEventLoopGroup-3-2] [c.x.m.s.tcp.netty.MessageHandler] unexpected exception
java.lang.IllegalArgumentException: no command registered for buildInfo
	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2023-12-08T18:17:08.965+0000 I MONGOT [nioEventLoopGroup-3-4] [c.xgen.mongot.server.tcp.TcpServer] Received connection 6a8944fffe08e2d6-00000005-00000004-c3d0d3712e958c4d-bc741460 from /10.89.0.10:51166
2023-12-08T18:17:08.969+0000 I MONGOT [nioEventLoopGroup-3-5] [c.xgen.mongot.server.tcp.TcpServer] Received connection 6a8944fffe08e2d6-00000005-00000005-709037712e958c4f-e20794eb from /10.89.0.10:51180
2023-12-08T18:17:08.970+0000 W MONGOT [nioEventLoopGroup-3-4] [c.x.m.s.tcp.netty.MessageHandler] unexpected exception
java.lang.IllegalArgumentException: no command registered for getParameter
	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2023-12-08T18:17:08.974+0000 W MONGOT [nioEventLoopGroup-3-5] [c.x.m.s.tcp.netty.MessageHandler] unexpected exception
java.lang.IllegalArgumentException: no command registered for aggregate
	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2023-12-08T18:17:09.003+0000 W MONGOT [nioEventLoopGroup-3-4] [c.x.m.s.tcp.netty.MessageHandler] unexpected exception
java.lang.IllegalArgumentException: no command registered for getLog
	at com.xgen.mongot.server.command.registry.CommandRegistry.getCommandRegistration(CommandRegistry.java:61)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.processMessage(MessageHandler.java:70)
	at com.xgen.mongot.server.tcp.netty.MessageHandler.channelRead(MessageHandler.java:56)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)
	at com.xgen.mongot.server.tcp.netty.ShutdownRequestRejector.channelRead(ShutdownRequestRejector.java:27)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:829)
2023-12-08T18:17:10.005+0000 I MONGOT [nioEventLoopGroup-3-1] [c.xgen.mongot.server.tcp.TcpServer] Closing connection 6a8944fffe08e2d6-00000005-00000001-919889b12e958c17-f9d49bbc
2023-12-08T18:17:10.006+0000 I MONGOT [nioEventLoopGroup-3-2] [c.xgen.mongot.server.tcp.TcpServer] Closing connection 6a8944fffe08e2d6-00000005-00000002-a0124ef12e958c47-beb9e530
2023-12-08T18:17:10.006+0000 I MONGOT [nioEventLoopGroup-3-1] [c.x.m.server.tcp.op.OperationManager] killing ops associated with channel 6a8944fffe08e2d6-00000005-00000001-919889b12e958c17-f9d49bbc
2023-12-08T18:17:10.006+0000 I MONGOT [nioEventLoopGroup-3-2] [c.x.m.server.tcp.op.OperationManager] killing ops associated with channel 6a8944fffe08e2d6-00000005-00000002-a0124ef12e958c47-beb9e530
2023-12-08T18:17:10.006+0000 I MONGOT [nioEventLoopGroup-3-5] [c.xgen.mongot.server.tcp.TcpServer] Closing connection 6a8944fffe08e2d6-00000005-00000005-709037712e958c4f-e20794eb
2023-12-08T18:17:10.006+0000 I MONGOT [nioEventLoopGroup-3-5] [c.x.m.server.tcp.op.OperationManager] killing ops associated with channel 6a8944fffe08e2d6-00000005-00000005-709037712e958c4f-e20794eb
2023-12-08T18:17:10.007+0000 I MONGOT [nioEventLoopGroup-3-4] [c.xgen.mongot.server.tcp.TcpServer] Closing connection 6a8944fffe08e2d6-00000005-00000004-c3d0d3712e958c4d-bc741460
2023-12-08T18:17:10.007+0000 I MONGOT [nioEventLoopGroup-3-4] [c.x.m.server.tcp.op.OperationManager] killing ops associated with channel 6a8944fffe08e2d6-00000005-00000004-c3d0d3712e958c4d-bc741460
2023-12-08T18:17:10.005+0000 I MONGOT [nioEventLoopGroup-3-3] [c.xgen.mongot.server.tcp.TcpServer] Closing connection 6a8944fffe08e2d6-00000005-00000003-9f9dcef12e958c47-1c029e09
2023-12-08T18:17:10.007+0000 I MONGOT [nioEventLoopGroup-3-3] [c.x.m.server.tcp.op.OperationManager] killing ops associated with channel 6a8944fffe08e2d6-00000005-00000003-9f9dcef12e958c47-1c029e09

Hi all and @Jason_Tran,
Just updating: it wasn't the Podman issue on my machine, because I'm already on the latest Podman version (4.8.0). But I saw that there is a new version of the CLI, so I installed it via Homebrew and it works like a charm.
I can create search indexes, run queries on them, etc.
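
For anyone wanting to do the same, this is roughly what I ran (formula and command names from memory, so double-check them against the docs):

brew update && brew upgrade mongodb-atlas-cli
atlas -v                                  # confirm the new CLI version
atlas deployments setup                   # recreate the local deployment
atlas deployments search indexes create   # then create a search index interactively
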
thanks.


Is there anything we can do to help you get more info on this? On my side, even with an updated Podman it's still stuck with no log information. I'm also running on a Mac M1. Let me know.

Hi @Herbert_Pimentel ,
Can you please provide more details about the issues you're running into? Ideally, can you describe the steps you're taking and, when you run into issues, share the command output in --debug mode? Including the Atlas CLI, Podman, QEMU, and OS versions will help too.

@Tony_Curwen, were you able to make it work? Or are you still running into issues? If so, can you provide the steps, the --debug output, and version information?
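
Something like the following should gather most of that (just a sketch - binary names such as qemu-system-aarch64 may differ on your setup, and please redact anything sensitive before posting):

atlas -v
atlas deployments setup --debug 2>&1 | tee atlas-debug.log
podman version
sw_vers                          # macOS version
qemu-system-aarch64 --version    # QEMU version on Apple silicon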

Hey there, sorry for intruding - I'm not on macOS, but this is the closest thread I found to what I'm facing.
I'm having the same issue on my Ubuntu 22.04 LTS (Jammy Jellyfish) server.
Here is the docker compose file:

services:
  mongo:
    image: mongodb/atlas
    privileged: true
    entrypoint: '/home/scripts/entrypoint.sh'
    tty: true
    volumes:
      - ./atlas:/home/scripts
    ports:
      - 27017:27017
    env_file:
      - .env

and the entrypoint.sh inside the atlas folder:

#!/usr/bin/env bash

DEPLOYMENT_INFO=$(atlas deployments list | grep '$ATLAS_DEPLOYEMENT_NAME')

if [[ $DEPLOYMENT_INFO ]]; then
    # Restart a deployment
    atlas deployments start $ATLAS_DEPLOYEMENT_NAME
else
    # Create a new deployment
    atlas deployments setup $ATLAS_DEPLOYEMENT_NAME --type local --port 27778 --username $ATLAS_USERNAME --password $ATLAS_PASSWORD --bindIpAll --skipSampleData --force
fi

# Pause the deployment whenever this container is shutdown to avoid corruption.
function graceful_shutdown() {
    atlas deployments pause $ATLAS_DEPLOYEMENT_NAME
}
trap 'graceful_shutdown' EXIT

sleep infinity &
wait $!
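
A side note on the script above: with single quotes the shell won't expand $ATLAS_DEPLOYEMENT_NAME inside the grep pattern, so the list is matched against the literal string. If the intent is to match the deployment name, that line would need double quotes, e.g.:

DEPLOYMENT_INFO=$(atlas deployments list | grep "$ATLAS_DEPLOYEMENT_NAME")

(That shouldn't be related to the "context deadline exceeded" error itself, though.)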

The current logs are:

[+] Running 1/0
 ⠿ Container architectural_components-mongo-1  Created                                                                                                                              0.0s
Attaching to architectural_components-mongo-1
architectural_components-mongo-1  | 
architectural_components-mongo-1  | To list both local and cloud Atlas deployments, authenticate to your Atlas account using the "atlas login" command.
architectural_components-mongo-1  | 
architectural_components-mongo-1  | [Default Settings]
architectural_components-mongo-1  | Deployment Name   centrodocumentazione3513035143458465
architectural_components-mongo-1  | MongoDB Version   7.0
architectural_components-mongo-1  | Port              27778
architectural_components-mongo-1  | 
architectural_components-mongo-1  | Creating your cluster %ATLAS_DEPLOYEMENT_NAME%
architectural_components-mongo-1  | 1/2: Starting your local environment...
2/2: Creating your deployment %ATLAS_DEPLOYEMENT_NAME%...
Error: context deadline exceeded