
How to prevent duplication in an update statement

I have documents like this in my MongoDB collection:

my_dict = {"chat": 1, "entity": 2, "count": 55}

I need to update them and if the document does not exist (entirely), then insert a new one:

results_collection.update_one(my_dict,
                              {"$set": {"chat": 1,
                                        "entity": 2,
                                        "count": 60}},
                              upsert=True)

But if the document exists, it duplicates it! It should just update, because chat and entity are the same, but instead it inserts a new one.
So I end up with two similar documents that differ only in count.
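To see why this happens, note that the first argument to update_one is a match filter. Once the stored count differs from the count in the filter, nothing matches and the upsert inserts a new document. A plain-dict sketch of the matching logic (no database needed; the matches helper is made up for illustration):

```python
# After one successful update, the stored document has count 60.
stored = {"chat": 1, "entity": 2, "count": 60}

# Re-running the upsert with the old my_dict as the filter:
my_dict = {"chat": 1, "entity": 2, "count": 55}

def matches(doc, flt):
    """Simplified top-level equality matching, like a basic MongoDB filter."""
    return all(doc.get(k) == v for k, v in flt.items())

print(matches(stored, my_dict))                   # False -> upsert inserts a duplicate
print(matches(stored, {"chat": 1, "entity": 2}))  # True  -> would update in place
```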


See the attached screenshot showing how the document is duplicated.

I want the new one to replace the above one.

How can I do that?

>Solution :

You need a filter that finds the existing document. In your case, match on the fields that identify it:

results_collection.update_one({"chat": my_dict["chat"], "entity": my_dict["entity"]},
                              {"$set": {"chat": 1,
                                        "entity": 2,
                                        "count": 60}},
                              upsert=True)

The code in the original question uses my_dict as the filter to find the document to update, but my_dict includes count. After the first update the stored count differs, so the filter never matches the document, and the upsert inserts a new one.
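The fix generalizes: build the match filter from the identity fields only, and put the whole document in $set. A minimal sketch, assuming chat and entity identify a document (KEY_FIELDS and build_upsert_args are illustrative names, not part of pymongo):

```python
# Identity fields assumed from the question: they uniquely identify a document.
KEY_FIELDS = ("chat", "entity")

def build_upsert_args(doc):
    """Split a document into a match filter (identity fields only) and a $set update."""
    match_filter = {k: doc[k] for k in KEY_FIELDS}
    return match_filter, {"$set": doc}

my_dict = {"chat": 1, "entity": 2, "count": 60}
match_filter, update = build_upsert_args(my_dict)
# match_filter == {"chat": 1, "entity": 2} -- no "count", so the existing
# document is found and updated instead of duplicated:
# results_collection.update_one(match_filter, update, upsert=True)
```

To also enforce this at the database level, a unique compound index on chat and entity (e.g. results_collection.create_index([("chat", 1), ("entity", 1)], unique=True)) would make an accidental duplicate insert raise an error instead of silently succeeding.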
