Write dict to csv with integers as keys

I have a dict with epoch times as keys. It looks like this:

my_dict = {199934234: "val1", 1999234234: "val2"}

When trying to write it to a csv, I get the error "iterable expected, not int". There is no problem, however, when using regular keys.

import csv

with open('my_file.csv', 'w', newline='') as csvfile:
    writer = csv.writer(csvfile)
    writer.writerows(my_dict)

I want to write it to a csv so that I can load it at a later point again as a dictionary so that I can update it … and then write it to a csv again. The csv will be accessed by my website later on.


What would be the best way to do this? Otherwise I would use RRD, but in this case I do not have irregular update times.

>Solution :

writer.writerows() treats the dictionary as an iterable of rows, i.e. an iterable of iterables.

If you iterate over a dictionary directly, you get only the keys — and an int key is not an iterable row, hence the error:

for item in my_dict:
    print(item)
# 199934234
# 1999234234

Instead, you want an iterable containing your rows. You can use my_dict.items() for that:

print(my_dict.items())
# dict_items([(199934234, 'val1'), (1999234234, 'val2')])

So:

import csv

with open('my_file.csv', 'w', newline='') as csvfile:
    writer = csv.writer(csvfile)
    writer.writerows(my_dict.items())
# File content.
199934234,val1
1999234234,val2
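
Since the question also asks about loading the file back as a dictionary later, here is a minimal sketch of the round trip: write the dict as above, then read the rows back with csv.reader and convert the first column to int so the epoch-time keys keep their original type (csv always reads values as strings).

```python
import csv

my_dict = {199934234: "val1", 1999234234: "val2"}

# Write as shown in the answer above.
with open('my_file.csv', 'w', newline='') as csvfile:
    csv.writer(csvfile).writerows(my_dict.items())

# Read back, converting the first column to int so the
# epoch-time keys are integers again, not strings.
with open('my_file.csv', newline='') as csvfile:
    reader = csv.reader(csvfile)
    loaded = {int(key): value for key, value in reader}

print(loaded)
# {199934234: 'val1', 1999234234: 'val2'}
```

After updating the loaded dict, the same writerows(loaded.items()) call rewrites the file.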