C# REALM sync anomalies abound

It would seem there are extreme nuances in this product, Realm in particular.
It's been suggested that I'm doing something unusual, but I don't know if that's the case?

I will tell you the struggle is real though.

What I feel should be simple isn't, and it's frustrating not knowing what principle I'm missing. I'm sure I'm missing one.

All I want to do is have a collection (Sales) with a couple of nested objects, even in a list. I gave up on Sales as that's a more involved collection. How about Products? Only two Embedded objects, as such. This is a test harness to just read the realm.

public class Product : RealmObject
{
    private IList<Price> _priceList;

    public Product()
    {
        _id = ObjectId.GenerateNewId();
        _priceList = new List<Price>();
    }

    [BsonId]
    [PrimaryKey]
    [MapTo("_id")]
    public ObjectId _id { get; set; }

    [MapTo("_partition")]
    [BsonElement("_partition")]
    public string _partition { get; set; } = "6310fa126afd4bc77f5517e4";

    [MapTo("ref_id")]
    [BsonElement("ref_id")]
    public int RefId { get; set; }

    [MapTo("descript")]
    [BsonElement("descript")]
    public string Descript { get; set; }

    /*
    [MapTo("prod_type")]
    [BsonElement("prod_type")]
    public ProductType ProdType { get; set; }
    */

    [MapTo("price")]
    [BsonElement("price")]
    public double Price { get; set; }

    [MapTo("price_list")]
    [BsonElement("price_list")]
    public Price[]? PriceList
    {
        get => _priceList.ToArray();
        set
        {
            _priceList.Clear();
            foreach (var price in value)
            {
                _priceList.Add(price);
            }
        }
    }

    [MapTo("active")]
    [BsonElement("active")]
    public bool Active { get; set; }
}

public class ProductType : EmbeddedObject
{
    [MapTo("prod_type_id")]
    [BsonElement("prod_type_id")]
    public int ProdTypeId { get; set; }

    [MapTo("type_descript")]
    [BsonElement("type_descript")]
    public string? TypeDescript { get; set; }
}

public class Price : EmbeddedObject
{
    [MapTo("primary")]
    [BsonElement("primary")]
    public bool Primary { get; set; }

    [MapTo("charge")]
    [BsonElement("charge")]
    public double? Charge { get; set; }
}

I use these same models to load the data (pulling from an existing DB) in another console app.
Then I generated the exact same models in a client-side console app to test, which is basically this, to list out the products.

try
{
    var config = new AppConfiguration("nrt-product-xkyox");

    _RealmAppProduct = App.Create(config);
    Console.WriteLine("Test Instance generated");

    var user = await _RealmAppProduct.LogInAsync(Credentials.Anonymous());

    if (!user.State.Equals(UserState.LoggedIn))
        Console.WriteLine("User = null");
    else
        Console.WriteLine("Authenticated");

    Console.WriteLine("Instance of PRODUCT REALM Generated");

    var _partition = "6310fa126afd4bc77f5517e4";

    var partition_config = new PartitionSyncConfiguration(_partition, user);
    Console.WriteLine("Test user Generated");

    _Realm_Test = await Realm.GetInstanceAsync(partition_config);
    Console.WriteLine("Waiting for download");

    await _Realm_Test.SyncSession.WaitForDownloadAsync();
    Console.WriteLine("Test Downloaded");

    var test = _Realm_Test.All<Product>();

    foreach (var s in test)
    {
        Console.WriteLine(s.Descript);
    }

    Console.ReadLine();
    // Read Sales data
}
catch (Exception ex)
{
    Console.WriteLine($"InitializeRealm Exception\n{ex.Message}\n\n{ex.InnerException}");
}

Clearly I'm missing something because this just doesn't work, and I'm not clear enough to understand what's happening. Note I commented out the ProdType property to try and simplify. That gives a worse error that makes no sense to me.

Without it I keep being told it’s made my partition key optional? Why would that be optional?

Realms.Exceptions.RealmException: The following changes cannot be made in additive-only schema mode:

  • Property ‘Product._partition’ has been made optional.

If I'm trying to think logically, that should be "required"? Either way, it won't work… despite it downloading the data into the local Realm DB. It's just inaccessible, I guess?

Even when you look at Realm Studio, it's decided to convert my object into something else.

[screenshot: Realm Studio]

That was before I commented the prod_type property out.

I don't even know what to ask yet. I was ambitious and had all my collections (Sales, Employees, Product) all set out and figured "how hard could it be"? Match up some schemas… hack a little, and it should be workable even if it takes a bit to figure out… right? I'm not any further than when I started at this point and just don't know if this thing has any stability at all. Maybe all you can do is some basic primitives and that's it? That would be pointless.

When I had my original 3 collections set up with one App Service for each - that was the plan - it was a cluster.

I thought maybe it was because I wanted my console app (expecting some threading issues even though I loaded Nito) to use the driver and Realm in the same app? That was the "design".

  1. Start with batching the data into Atlas
  2. Sync the local Realms (simple partitions to start)
  3. Simple CRUD on the realms
  4. All that in a service (console to start dev)

Except you can't do that easily. The errors are abundant and, at this point at least, make little sense.

Can you use the same model to load Atlas and in the Realm CRUD? I realize it inherits from RealmObject, but what's happening in the background? It certainly loads Atlas fine, but on the other side it just fails and complains and then reports things in other object models etc…

So, I thought I'd separate the load from the Realm CRUD operations as above. All that's supposed to do is read the realm.

So, it seems Realm is doing some reflection and failing, but I'm not sure what it's doing.

The expectation would have been: monitor a data source and do operations on the Realms.

Not sure where/why this seems insurmountable?

Okay, I think I understand what's going on - unfortunately, the Realm .NET SDK doesn't yet process models annotated for nullability correctly - that's why it treats string as a nullable type. You need to add the [Required] attribute to all string properties that you want treated as non-nullable. We realize that getting your local models and cloud models to sync is a little tricky, which is why there's a tool you can use to export them - in your cloud app, navigate to "Realm SDKs" under the "Build" section and then choose the second tab, "Realm Object Models" - you'll have the option to export the generated object models in a language of your choosing. Unfortunately, it will not add nullability annotations for nullable strings, so you'll need to do that manually, but it will add all the [Required] attributes.

My hunch that you’re using nullability annotated files is due to ProductType.TypeDescript being declared as string?. If that’s not correct, then it may be a different issue. Note that after changing your C# models, you may need to delete your local Realm file to avoid schema validation errors.
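To illustrate the advice above, here is a sketch only - the property names mirror the models earlier in the thread, and the partition value is the one used there:

```csharp
using MongoDB.Bson;
using Realms;

// Sketch: [Required] marks a string as non-nullable for Realm,
// independent of any C# nullability annotations on the property.
public class Product : RealmObject
{
    [PrimaryKey]
    [MapTo("_id")]
    public ObjectId _id { get; set; } = ObjectId.GenerateNewId();

    [Required]
    [MapTo("_partition")]
    public string _partition { get; set; } = "6310fa126afd4bc77f5517e4";

    [Required]
    [MapTo("descript")]
    public string Descript { get; set; }
}
```

After changing the models, the stale local file can be discarded with `Realm.DeleteRealm(config)` (with all instances for that configuration closed), forcing a clean re-sync.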


Much appreciated @nirinchev !!
That gives me hope that maybe it’s still possible. :slight_smile:

I've scrapped it and am starting fresh-ish, one model/field at a time. I've managed to sync Products with only the primitives. Now I'm going to attempt to add a single nested object, starting with prod_type, which is an embedded ProductType model.

public class ProductType : EmbeddedObject
{
    [MapTo("prod_type_id")]
    public int ProdTypeId { get; set; }

    [MapTo("type_descript")]
    public string? TypeDescript { get; set; }
}

Is this not a workable property?

public ProductType prod_type { get; set; }

If that’s workable GREAT!!

Next will be an array of objects. If I get there?

I know that it can’t be annotated as [Required] due to it being an object?

I persist :confused:

Thanks!!
CPT

You can definitely use embedded objects as properties, but make sure you’re not forming a loop. E.g. this is totally fine:

public class Product : RealmObject
{
    // ...
    public ProductType? prod_type { get; set; }
}

public class ProductType : EmbeddedObject
{
    // ...
}

But the following isn’t:

public class ProductType : EmbeddedObject
{
    public ProductType self_link { get; set; }
}

On the nullability note, make sure to annotate your ProductType property as nullable as object references can never be required with Realm. Realm will ignore the nullability annotation, but it will be there for the rest of your code to take advantage of.


Thanks for the response.
I did end up doing that, but never got it to work.

Any idea what this error represents?

Realms.Exceptions.RealmException: The following changes cannot be made in additive-only schema mode:

  • Property ‘Product.prod_type’ has been changed from ‘<Product_prod_type>’ to ‘’.

So, I've separated the Atlas loading and the Realm into separate test harnesses.

The loading model, which does a batch to Atlas:

public class Product : RealmObject
{
    [PrimaryKey]
    [MapTo("_id")]
    public ObjectId _id { get; set; } = ObjectId.GenerateNewId();

    [Required]
    public string _mypartition { get; set; } = "6310fa126afd4bc77f5517e4";

    [Required]
    public string descript { get; set; }

    [Required]
    public int? ref_id { get; set; }

    public bool active { get; set; }

    public ProductType? prod_type { get; set; }
}

Similarly, the Realm test model:

public class Product : RealmObject
{
    [PrimaryKey]
    public ObjectId _id { get; set; }

    [Required]
    public string _mypartition { get; set; }

    [Required]
    public string descript { get; set; }

    [Required]
    public int? ref_id { get; set; }

    public bool active { get; set; }

    public ProductType? prod_type { get; set; }
}

The schema worked out to be:

"title": "Product",
  "required": [
    "_id",
    "active",
    "descript",
    "_mypartition",
    "ref_id"
  ],
  "properties": {
    "_id": {
      "bsonType": "objectId"
    },
    "_mypartition": {
      "bsonType": "string"
    },
    "active": {
      "bsonType": "bool"
    },
    "descript": {
      "bsonType": "string"
    },
    "prod_type": {
      "bsonType": "object",
      "properties": {
        "ProdTypeId": {
          "bsonType": "int"
        },
        "TypeDescript": {
          "bsonType": "string"
        }
      }
    },
    "ref_id": {
      "bsonType": "int"
    }
  }

I’m not exactly sure what “additive mode” is or what road that leads to.

It does populate and I can see it in Studio. Except, of course, the same strange occurrence happens to that property?

[screenshot: Realm Studio]

Feels close :confused:

CPT

Hey, sorry for the delay - as far as I can tell, the issue is that your JSON schema doesn't define a title for the prod_type object. I guess this causes the server to assume this object is of type "" (i.e. an empty string).

Hello @nirinchev !!
I've been hacking a little and, thanks to some of your insight (.NET model exports), maybe I'm onto a path.

What I noticed is that the exported models were old models? Some of the attributes were named differently (caps etc.) from changes I'd been making. So, that was alarming.

I decided to copy the export and overwrite my current models to match. And what do you know… that error went away, AND I could print out the properties. HOWEVER, my Realm Studio still showed the object as such, so that still wasn't accessible.

[screenshot: Realm Studio]

Nevertheless, I’m tearing it down again with that new tidbit.

I'm going to load the data and build my nested models as they come out and see what happens. Maybe it's just a matter of doing a new Realm app? Idk… I've been trying to do each from scratch every time, but I can't reconcile how those old properties hung around.

Right now I'm loading fresh data and will build a new Realm App Service around it and see whether what I'm thinking works or not :frowning:

Thanks for responding… it's a tremendous help as it strings me along thinking this is "possible". If I get to some "understanding" about objects and arrays of objects, then I'd be able to make some progress.

Very nuanced product, I'd say. But when it printed data I sh*t my pants!! :rofl: So, onward!!

Cheers.
CPT

Hmph. I got it!! As I said, I am loading things into Atlas with my

Model:

public class Product : RealmObject
{
    [PrimaryKey]
    [MapTo("_id")]
    public ObjectId _id { get; set; } = ObjectId.GenerateNewId();

    [Required]
    public string _mypartition { get; set; } = "6310fa126afd4bc77f5517e4";

    [Required]
    public string descript { get; set; }

    [Required]
    public int? ref_id { get; set; }

    public bool active { get; set; }

    public ProductType? prod_type { get; set; }
}

public class ProductType : EmbeddedObject
{
    public int? ProdTypeId { get; set; }

    public string TypeDescript { get; set; }
}

I guess this is where Idk what's happening, but I deduced via the .NET SDK model export that MongoDB doesn't name it the same. So, you have to follow what Mongo is doing. I think?

The export of the Product and ProductType models as Mongo sees them is as follows. So, they renamed the embedded model?

public class Product : RealmObject
{
    [MapTo("_id")]
    [PrimaryKey]
    public ObjectId Id { get; set; }
    [MapTo("_mypartition")]
    public string Mypartition { get; set; }
    [MapTo("active")]
    public bool Active { get; set; }
    [MapTo("descript")]
    [Required]
    public string Descript { get; set; }
    [MapTo("prod_type")]
    public Product_prod_type ProdType { get; set; }
    [MapTo("ref_id")]
    public int RefId { get; set; }
}

public class Product_prod_type : EmbeddedObject
{
    public int? ProdTypeId { get; set; }
    public string TypeDescript { get; set; }
}

Once I used THEIR naming convention and generated a class model of the same name… bingo.

var test = _Realm_Test.All<Product>();

foreach (var s in test)
{
    Console.WriteLine($"{s.RefId}\t{s.Descript}\t{s.Active}\t{s.Mypartition}\t{s.ProdType.TypeDescript}");
}

The results… which are correct:

Waiting for download
Test Downloaded
1       Test Product x  False   6310fa126afd4bc77f5517e4        Ordering Product
2001    Test Prod 22    True    6310fa126afd4bc77f5517e4        Ordering Product
2002    Default Message True    6310fa126afd4bc77f5517e4        Manual Keyboard
2003    Default Seat    True    6310fa126afd4bc77f5517e4        Seating Position
2004    /as Appetizer   True    6310fa126afd4bc77f5517e4        Option (ie Hold)
2005    /as Main        True    6310fa126afd4bc77f5517e4        Option (ie Hold)
2006    Hold and Fire   True    6310fa126afd4bc77f5517e4        Delay Print Command
2007    Clear Table     True    6310fa126afd4bc77f5517e4        Bussing Command
2009    Arizona Iced Tea        True    6310fa126afd4bc77f5517e4        Ordering Product
2010    Cup of Joe      True    6310fa126afd4bc77f5517e4        Ordering Product

I absolutely don't know what assumptions I'm making with this hack I/we got to, but I'm curious whether this is the expected way to do what I'm doing?

Ugh…
Anywho, it's forward movement, and now I gotta see how an array looks in Mongo's eyes.

Cheers,
CPT

@nirinchev , I think I have some success!!
I haven't merged my Realm app back into the originally intended console (to be a service), but I've loaded the Product data (11,000 docs) with arrays of objects, and I can manipulate the array of objects directly even with no setter defined, which the compiler will not allow when inheriting from RealmObject.

I can't imagine this is the final driver version, which is a concern, as I suspect I'd have to rewrite all my models to the intended naming/models.

It would seem “logical” to use the same model for both Atlas and Realm work, but I’ll see how that goes.

Not that intuitive, I'd think, but it's good to finally get some results. My expectation is that I'll have to include redundant object models, naming the objects as Mongo does.

At least there’s a “process”.

  1. Load the documents for all collections
  2. Create your redundant models from the .NET SDK model export, which will give you class names for the embedded objects
  3. Write your CRUD operations to the Realm(s) with the redundant models

I was about to embark on using constructors to set the object arrays, but it turns out that functions as expected.

var test = _Realm_Test.All<Product>().FirstOrDefault(t => t.RefId == 2001);

_Realm_Test.Write(() =>
{
    test.PriceList[1].Primary = false;
    test.PriceList[1].Charge = 15.5;
});
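As an aside - and this may explain why the write above works - collection properties in the Realm .NET SDK are conventionally declared as getter-only `IList<T>` properties that Realm populates and manages itself, so the array-with-setter workaround from earlier in the thread shouldn't be needed. A minimal sketch (names mirror this thread's models):

```csharp
using System.Collections.Generic;
using Realms;

public class Price : EmbeddedObject
{
    [MapTo("primary")]
    public bool Primary { get; set; }

    [MapTo("charge")]
    public double? Charge { get; set; }
}

public class Product : RealmObject
{
    // Realm supplies the backing collection; mutate it inside a
    // Write transaction via Add/RemoveAt/indexer - never assign it.
    [MapTo("price_list")]
    public IList<Price> PriceList { get; }
}
```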

Can’t tell you how much I appreciate your assistance!! It got me moving forward when I didn’t think I could!!

:rofl: :smiling_face_with_three_hearts:

I'm sure I'll be back to the forum sooner rather than later. The "support" doesn't seem to be as quick… when they respond (average 2 days per message), I've already moved past the issue thanks to you!!

Best regards,
CPT

Regarding the naming, it looks like cloud generated the product type class name based on the parent collection title and the property name - i.e. (Product + _ + prod_type). If you want to change that, you can always add a title to your json schema:

  "title": "Product",
  "properties": {
    // ...
    "prod_type": {
      "bsonType": "object",
      "title": "ProductType", // <-- Set this to something custom
      "properties": {
        // ...
      }
    },
  }

Or you can add MapTo to your C# model to get a nicer name:

[MapTo("Product_prod_type")]
public class ProductType : EmbeddedObject
{
    public int? ProdTypeId { get; set; }
    public string TypeDescript { get; set; }
}

Happy to hear things are finally moving forward and hope the worst issues are behind us now. If you do encounter more problems, we’re here to help.

Good day @nirinchev . Is it safe to say this Realm stuff is a hot mess beyond anything usable?
I'm quite aware that I could be missing something fundamental, but I'm kind of a "brute force" guy and I've hacked, re-worked, taken different directions, re-built, done it again, taken another angle… etc… etc. I separated the driver application from the Realm app, and it just doesn't work. Is it fundamentally ONLY for tiny mobile data? Like one form or something?

I always get one Realm working, and as soon as I try to access another (say Employee after Products is working), just to observe the iterated data… it's over. It starts complaining about nullable here and there, and not even for its own object model. At this point I go from one working Realm to nothing, as the required fields are now wrong???

Can things in different models NOT have the same property names or something? Why is it even asking about another model? I'm using the Employee model… wth does it have to do with the Product model? Once that (attempting to use two Realms in the same program) starts to unravel, everything is about done and needs to be rebuilt, pretty much, because nothing makes sense anymore and who knows what's happening?

Trying to just run the Employee Realm/object model, which I've done 100 times, suddenly it thinks the property Product.active_price is nullable on one side but not the other? Is it now?

This product (Realm) doesn't work, does it?

I think I started this thread trying to use the driver to load data and Realm to manipulate it after the load.
I think you were asking "why would I want to do that"? And although you said "technically" it's workable, you didn't understand why I would do it. Which I think I tried to explain?

I don't think it works either way. I think, at this point, you can "maybe" work with one Realm and that's about it.
Frankly, I haven't even gotten to that point, because BEFORE that happens I needed to establish working with multiple realms in one program. However, I'm going in circles and getting to the same point every time.

  1. Get my collection loaded
  2. Generate the schema and Realm
  3. Get an iteration of the synced Realm data and even a test write
  4. And… scene. :frowning:

I always get that done, and I don't know why I can write to lists that only have getters? That might be a sign of something bad? I'm not sure why arrays of objects seem accessible in the driver and/or Realm, as I think I've seen docs saying the opposite.

As soon as I go down the path of the second model - doesn't matter which - all hell breaks loose, and it seems like there's no recovery.

I just spent 2 days separating the driver loads from the REALM program because I was thinking maybe Realm and the drivers can't be in the same application space. So, I'll load things and just run a separate dedicated program for the Realm component. That in and of itself is a hot mess, and it turns out it's no better anyway.

I'm distraught by how much wasted time is required here just to get something small to work.
The docs say some things, but in practice it does something else - what does that mean? Such as: I can't use setters on lists, but I can?

Anywho, that's a lot, and considering you're the only one who was listening - and an excellent resource, by the way - I just had to quit after the last failure. It's endless.

So, the only real questions are:

  1. Does Realm even work for C#?
  2. If it does, I know it's limited. In what capacity are the limits?
  3. Are there not enough C# developers for it to get something workable?
  4. Does Node.js- and Java-based development work any better? I'm thinking that's where the major support is focused with Mongo?
  5. I've opened support tickets, but they're not very quick, and it seems they're not as advanced necessarily. So, that brings me here.
  6. Are there plans to make it work, or did I miss something sooooo fundamental that I'll never get there with my basic strategy?

Anyway, I always appreciate it and I suspect I’ll throw out a support ticket to see what they say.

Mongo wins, I’m throwing in the towel for now.

uncle :rofl:
CPT

Everyone decides for themselves whether a product is usable or not, and I won't presume to make that decision for you and your company. Generally speaking, we do have a number of customers using Realm and it suits their needs.

We're aware that the schema validations are somewhat annoying, especially early on in development, and we do have some projects that would improve that, but for now it is a requirement that your client models exactly match the JSON schema defined on the server.

One way to do that, as I mentioned, is to use the generated models from the cloud UI - if you're using the exported models, you should not be running into these schema mismatch errors. If you do run into them, we would need to see your C# model and your JSON schema to try and understand the cause of the discrepancy.

Regarding your questions:

  1. Yes, we do have multiple customers using the C# SDK.
  2. The only limitation that appears to be relevant for your use case is that the SDK is not annotated for nullability and as such will ignore nullability annotations on your models, resulting in a bit of double bookkeeping. This should not prevent any use cases though.
  3. As I said, workable is something everyone defines for themselves. We believe the SDK is usable and has been used.
  4. The issues you’re describing (model mismatches between the client and the server) are not something inherent for the C# SDK. You are absolutely free to try out the other SDKs, but they are all using the same Core database (the shared C++ component in the heart of all SDKs), so it’s possible that you’ll run into the same issues.
  5. The support team has SLAs based on the support plan you’ve chosen and the severity of the issues being opened. They are handling tickets from many people and have their own process for prioritizing work. I cannot speak to your experience with them. The forum is a purely voluntary medium we monitor on an ad-hoc basis and there’s no contractual commitment on our end for the responses here. Sometimes people get quick responses, other times they don’t.
  6. As I said, we believe the product works, but we also feel the onboarding experience isn’t optimal. We do plan to make improvements to make the schema validation more forgiving, but the core functionality should work.

Again, I understand and sympathize with your frustration. It appears you're trying to get a fairly complex schema syncing by manually specifying both the client and the server schema - this is always going to be difficult, and the general recommendation we have is to choose only one side and go from there. If you want your C# models to be the source of truth, you can enable dev mode on the server and have the schema generated from the client. Alternatively, you can specify your JSON schema on the server and use the model generation tool to generate the C# classes.

Good day @nirinchev!!
Well, I kept plugging away and DID make a little progress. I broke it all up, simplified (no lists), and got both collections synced as expected.

I feel like a couple of things happened and confusion ensued due to my lack of understanding of REALM processing. I still don't know what it's doing. However, I hacked a few things to get something working. Still holding before I move forward.

For example, I started specifying the .realm db file instead of letting it default, which generated two local files/directories, one for each Realm (Employee, Products).

What I still don’t understand is how “partitioning” is processing?
I happen to have designated a "_partition" field in my collections, and in this case it represents a location ObjectId. So, truthfully, it's the same in Employee and Products for my testing. What I'm confused about is that if all I load initially is Products, I noticed the system loads both collections, even though I didn't even ask for Employees? So, that's confusing and leads me to believe there's something global about these partition keys? Which is also weird, as I believe I can interact with both Realms, even though I didn't sign into it (Employee). I haven't confirmed all those details on whether it syncs etc. or what, but for me it's not very intuitive. My initial thought was each App (Realm) was its own app, but the crossover isn't clear?

I created a server App ID for each Realm (App Service), but again, it doesn't really need it. Once I log in with one, I think I can do stuff, since it's loaded everything locally? Right now, I'm trying to understand the "correct" method of:

  1. loading the Realms I want locally, and the exact process with authentication methods (server App Keys in this instance)
  2. switching between the realms - or do I even have to, since everything is loaded anyway? Can I just read/write to the singular db file? It's all in there on one login?
  3. why does every realm load based on what seems to be a global partition key? Does that mean I have to give each collection a different field for the partition key, i.e. _emp_partition, _prod_partition?

When I'm editing the Product app, I find it weird that I can see the Employee object models in the Realm SDK along with the embedded Price object model. However, editing the Employee object models shows Product and Employee but not the embedded Price model? Not sure if that's by design or being worked on. Idk why the crossover, but I still haven't got the basic Realm operations down.

Well, that's already too much… again :frowning: I'm down but still not out. I'll keep hacking along in hopes of unwinding how this functions. I certainly don't find a whole lot of docs relating to multiple-realm access and the "standard" practice. Do I log out and log in on each access? I heard I have to null it, but how expensive is signing in every time? Maybe it's not, because offline is local anyway.

If I get there, I'm golden. :slight_smile: I have ONE list of objects working in my Products (a price list). I'm going for a location list in the Employee object model today. Inch by inch until it explodes :wink:

Best regards,
CPT

Hey, so I’m a little confused and I think it might be a good idea to take a few steps back. Let’s start with the idea behind Realm and Device Sync. I imagine you have a pretty large MongoDB database with multiple products and employees. If you were to sync that to every single device, it’d be a ton of data that most users likely don’t need. So the idea of partitioning your data is to define subsets of that dataset to store locally on the end-user device and synchronize with the server. Different users have different partitioning schemes, but the rule of thumb is: all your documents with a particular value for the _partition field will be grouped together in one partition. So for example, if you have "abc" for _partition, all Product and Employee documents that have that value will get synchronized to the same local Realm file.

While this works pretty well, it’s not particularly flexible - e.g. it doesn’t allow you to get all products and only a subset of the employees. That’s why we have a different synchronization mode, called “Flexible Sync”. With flexible sync, you define the queries that you want to subscribe to and the server will make sure to send you the documents that match. For example, you can do something like:

realm.Subscriptions.Update(() =>
{
    var californiaEmployees = realm.All<Employee>().Where(e => e.State == "CA");
    realm.Subscriptions.Add(californiaEmployees);
    
    var allProducts = realm.All<Product>();
    realm.Subscriptions.Add(allProducts);
});

That way, you can explicitly request the data you need from the server. It’s up to you to decide which sync mode you prefer to use, but one thing to keep in mind is that with partition sync, every document may exist in exactly one partition. On the other hand, with flexible sync, you can have the same document be sent to multiple clients depending on their subscriptions.
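Opening a flexible-sync realm looks roughly like this - the app id is a placeholder, and `FlexibleSyncConfiguration` is the flexible-sync counterpart of the `PartitionSyncConfiguration` used earlier in the thread:

```csharp
using Realms;
using Realms.Sync;

// "my-flex-app-id" is a placeholder for your App Services app id.
var app = App.Create("my-flex-app-id");
var user = await app.LogInAsync(Credentials.Anonymous());

// No partition value - data is selected by subscriptions instead.
var config = new FlexibleSyncConfiguration(user);
var realm = await Realm.GetInstanceAsync(config);
// ...then register subscriptions as shown in the snippet above.
```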

Regarding your questions:

  1. Not sure I understand that one, can you clarify it a bit?
  2. Generally, with flexible sync, you’d want a single Realm database and you just select the queries that you want. With partition sync, you may need to have multiple files if you need the results from multiple partitions.
  3. With partition sync, you can only have a single global partition key. You can’t define partition keys per collection.
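On point 2, one way to keep multiple partition-sync realms apart in a single process is to give each configuration its own file path. A sketch, assuming the `optionalPath` constructor parameter and placeholder file names; `user` and `employeeUser` stand for already-logged-in users of the respective apps:

```csharp
using Realms;
using Realms.Sync;

// Same partition value as used throughout this thread.
var productsConfig = new PartitionSyncConfiguration(
    "6310fa126afd4bc77f5517e4", user, optionalPath: "products.realm");
var employeesConfig = new PartitionSyncConfiguration(
    "6310fa126afd4bc77f5517e4", employeeUser, optionalPath: "employees.realm");

// Each configuration opens its own local .realm file.
using var productsRealm = await Realm.GetInstanceAsync(productsConfig);
using var employeesRealm = await Realm.GetInstanceAsync(employeesConfig);
```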

Thanks for responding!! I always appreciate a knowledgeable resource. :slight_smile: Well, ironically, I think I'm way further along than you might think. :slight_smile:

So, I actually understand the idea of partition sync and knew I'd get to Flexible after I proved out partition sync, though I want to understand clearly why partition sync is even a problem. The _partition field I use in both the Product and Employee collections, in THIS case, should be fine. The Employee collection could span 1000 restaurants but be partitioned by "location". I happen to have ONE location configured, and its ObjectId is 6310fa126afd4bc77f5517e4, which I use in the _partition field for both Products and Employee.

So, at least I understand that. I’d like to know about your response on point 3, when you said “…With partition sync, you can only have a single global partition key. You can’t define partition keys per collection…”.

What’s the net effect of that? So, I can have several collections (Employee, Sales, Product, Schedules), which I believe would be a separate App Service per collection. That’s how I understand it. The “basic” understanding would be that the “location id” could be used as a partition, so one location gets only its products, employees, sales, etc.

Shouldn’t that work?

The odd thing, to me, was how that partition was defining what was being downloaded into the local Realm DB. Meaning I just had to sign into anything, and it would load ALL collections with that partition. So, what’s the point of security if I can do that? I figured I’d need an App Key for each realm/collection and sign into each, which would give me granular control if I wanted it. But it seems I just need any login and a partition and I can see everything in Realm Studio.

Yes, flexible sync is where it’s at. I’m just trying to get over this basic hurdle before I get more complex. Which is the root of my stress.

I thought it was simple.

  1. Create the Atlas collections
  2. Create Realm App services per collection
  3. On the client side, download each realm I had a sign-in key for, with the assigned access
  4. Do CRUD.

that’s it.

So, what I feel I’m missing at this point is the best way to interact with multiple realms (Product, Sales, Employee, Schedules, etc.) that are AT LEAST partitioned by location ID.

Like I said, I did server API keys for each Realm (Product, Employee).
So, let’s say now I’m in a loop or using timers of some sort (basically monitoring a local SQL db for changes in said Sales, Employees, Products, Schedules, etc.).

When changes occur they update the specific Realm.

Do I need separate local DBs, or can a single file db work fine? Or does the partition limit that?

var config1 = new PartitionSyncConfiguration(_locationId, user, "C:\\Users\\...\\Documents\\iMonkey\\iMonkey.realm");
var _Realm = Realm.GetInstance(config1);

looking forward to the next step!!

:slight_smile:
CPT

One thing I’m not sure I’m getting is why do you need multiple app services per collection (assuming by App Service, you mean server-side App). Is there a reason why you can’t use a single app that takes care of all collections?

@Colin_Poon_Tip I didn’t read through this entire thread but I’m a Xamarin dev who’s had some success with MongoDB + Realm Sync so here is my advice.

First of all just to make sure we are on the same page with a few things: a Realm can contain multiple collections and you sync them down with a partition key. You don’t need timers to check if things change - the whole point of Realm Sync is that the local db and backend are kept in sync for you. Just use one App Service in Atlas to hold your collections.

I would create a new app, turn DEV mode on, create a simple RealmObject, start your app, and let it automagically generate the schema for you on the backend. Don’t do stuff like string? … you don’t need nullable strings. If the string is optional just have it be string… if it’s required put [Required] over it. Keep your models basic until you know what you’re doing.

Here are some models I’m using in a prototype that you can follow if you need:[.NET MongoDB + Realm Sync models demonstrating relationship, embedded objects, and view model behaviour · GitHub](https://.NET Realm Models). You can see that the main model is an Issue with multiple embedded objects and a to-one relationship to a Contractor that the issue is assigned to. I also have some [BsonIgnore] attributes because you can’t put enums in a RealmObject… so we get/set to an int through an enum property. I also have Folder and Document models. A folder can contain many folders and documents and has one parent folder. Hopefully you can use these as a reference for mapping some relationships in your app.

Using the models from this gist, my prototype syncs all the collections for a single project by passing a partition key of “projectId=89f97030-a9e4-11ec-b909-0242ac120002”. This way I get all the Folders, Documents, Issues, Contractors, and I have relationships setup so I can navigate the object graph through Realm.
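A rough sketch of that flow in the .NET SDK (the `Issue`/`Contractor` property names here stand in for the gist’s models and are assumptions, not its exact code): one partition value syncs every collection tagged with it, and relationships are then navigated entirely through the local realm.

```csharp
// Sync every collection tagged with one project's partition value,
// then walk the object graph locally -- no extra queries to the server.
var partition = "projectId=89f97030-a9e4-11ec-b909-0242ac120002";
var config = new PartitionSyncConfiguration(partition, user);
using var realm = await Realm.GetInstanceAsync(config);

foreach (var issue in realm.All<Issue>())
{
    // to-one relationship: straight from the issue to its assigned contractor
    Console.WriteLine($"{issue.Title} -> {issue.AssignedTo?.Name}");
}
```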

Hope this helps


To answer the rest of your question: you can control on the backend who has permission to sync what data. So even though anybody could attempt to sync a location… if they aren’t an employee at that location for example, they won’t have permission to sync down. Or maybe you give them permission to read certain data from that location but not write it.

Check the docs, but this is what I have setup in a different app of mine in Apps → Build → Functions:

exports = async function(partitionValue) {
  try {
    const callingUser = context.user;

    // The user custom data contains a canReadPartitions array that is managed
    // by a system function.
    const {canReadPartitions} = callingUser.custom_data;
    
    // If the user's canReadPartitions array contains the partition, they may read the partition
    return canReadPartitions && canReadPartitions.includes(partitionValue);

  } catch (error) {
    console.error(error);
    return false;
  }
};

And over in my Triggers:

exports = async function createNewUserDocument({user}) {
  
  const cluster = context.services.get("mongodb-atlas");
  const customUserData = cluster.db("myApp").collection("CustomUserData");
  
  return customUserData.insertOne({
    _id: user.id,
    _partition: `user=${user.id}`,
    canReadPartitions: [`user=${user.id}`],
    canWritePartitions: [],
  });
  
};

So when a new user is created I give them access to partitions with their user ID, but in your example you could give employees access to location partitions based off some other logic. You could manage it with other functions etc if say a Manager is giving Employees access to Locations through a web portal or your app.

Let me know if this helps!

I believe you. It’s one of those fundamental questions I’ve been trying to answer. I think it’s about the terminology.
When I think of a “realm”. I’m thinking each app service (realm) I generate around a collection in MongoDB.

I saw them as separate App services as they each have an appId. Meaning, I assume I have to log into each one as such in my client side application (console app I’m testing with):

for my product AppService(realm)

var config = new AppConfiguration("productrt-pnyci");
var apiKey = "S4JZU00zKPbLr69upuTYFXh6ZhKooh56h9owBrPOi3G0Kn0b3zxqW2hVzYbNcpdw";
_RealmApp = App.Create(config);
var user = await _RealmApp.LogInAsync(Credentials.ApiKey(apiKey));

Is it not the case that if I want to connect to my other AppService (realm), Employees, I’d mirror the above with the Employee AppId to sign in?

var config = new AppConfiguration("employeert-ueqyw");
var apiKey = "PulSpxewf6XjpeUKFCLAVfBu3SipfVRDufWW2gKnsnF9pkNUB7xTGA2jO1tsp3lT";
var _RealmApp = App.Create(config);
var user = await _RealmApp.LogInAsync(Credentials.ApiKey(apiKey));

I generated API keys for each App Service, as I expected that to be a method of controlling access later.

Maybe I’m mixing the term “realm” up, or maybe I’m answering my own question?

Are you saying that in the MongoDB App Services configuration I can create a SINGLE App Service and serve multiple collections in it? So, if I generated a new App Service in MongoDB called Restaurant, I could load all the collections I need? In other words, the schema would hold the collections Sales, Employees, Product, Schedules, etc.? Holy cow… that would change everything in my thinking!! Which would explain why my partition key spanned both collections when I only asked for one realm (collection).

hmm…o-boy.
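If that’s right, a sketch of what I think the single-App-Service setup would look like (the "restaurant-xxxxx" App ID and the model types are placeholders, not a real app):

```csharp
// One App Service, one login, one realm holding ALL the collections.
var app = App.Create("restaurant-xxxxx");                    // placeholder App ID
var user = await app.LogInAsync(Credentials.ApiKey(apiKey));

var config = new PartitionSyncConfiguration(locationId, user);
using var realm = await Realm.GetInstanceAsync(config);

// Every collection for this location, from the same realm file:
var sales     = realm.All<Sale>();
var employees = realm.All<Employee>();
var products  = realm.All<Product>();
var schedules = realm.All<Schedule>();
```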

@Derek_Winnicki Helps TREMENDOUSLY!! :slight_smile:
I’m gonna rebuild with what I feel I had wrong the whole time, which, I believe, was what an AppService (Realm) was. I started my tutorials thinking one App Service per collection, and I think that put me in a world of hurt. So… now I have to put that lesson to the test. OR, I’ll fail miserably with another wrong assumption :rofl:

Thanks for responding, it’s super helpful!!
CPT