Can we prevent pandas from prefixing the columns when normalizing a dict/JSON?

My input is a Python dictionary:

d = {
  "username": "foo999",
  "email": "bar999@example.com",
  "address": {
    "city": "Faketown",
    "state": "Fakeshire"
  },
  "is_premium_user": True,
}

Using pd.json_normalize(d, record_prefix=False), I’m still getting prefixed columns like address.city:

  username               email  is_premium_user address.city address.state
0   foo999  bar999@example.com             True     Faketown     Fakeshire

My expected output is:

  username               email  is_premium_user         city         state
0   foo999  bar999@example.com             True     Faketown     Fakeshire

The code below works, but it seems hacky and is just a workaround. Also, it assumes I know all the nested keys in advance:

df.columns = df.columns.str.replace(r"(?:address|2nd_key|...)\.", "", regex=True)

Solution:

I’m quite sure it’s not possible. The prefix avoids ambiguity in case subkeys are identical:

pd.json_normalize({"address": {"city": "Faketown"},
                   "destination": {"city": "Othertown"}})

Some other workarounds:

df = (pd.json_normalize(d, record_prefix='X')
        .rename(columns=lambda x: x.split('.')[-1])  # keep only the part after the last dot
     )

Or:

df = pd.json_normalize(d, record_prefix='X')
df.columns = df.columns.str.replace(r'.*\.', '', regex=True)  # strip everything up to and including the last dot
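
If you’d rather not rename after the fact, you can also flatten the dictionary yourself before building the DataFrame. A minimal sketch (the flatten_leaves helper is my own name, and it assumes leaf keys don’t collide across nested dicts):

import pandas as pd

def flatten_leaves(obj):
    # Recursively merge nested dicts, keeping only the innermost key names
    flat = {}
    for key, value in obj.items():
        if isinstance(value, dict):
            flat.update(flatten_leaves(value))
        else:
            flat[key] = value
    return flat

df = pd.DataFrame([flatten_leaves(d)])

This gives the unprefixed city and state columns directly, at the cost of the very ambiguity the prefixes are there to prevent.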