How to select a Scala DataFrame column with a special character in it?

I am reading a JSON file where a key contains special characters, e.g.:

[{
        "ABB/aws:1.0/CustomerId:2.0": [{
            "id": 20,
            "namehash": "de8cfcde-95c5-47ac-a544-13db50557eaa"
        }]
}]

I am creating a Scala DataFrame and then trying to select the column "ABB/aws:1.0/CustomerId:2.0" using spark.sql. That's when it complains about the special characters.

The dataframe looks like this:
[screenshot of the dataframe]


Solution:

Use backticks to select a column whose name contains special characters. Check the code below.

scala> df.select("`ABB/aws:1.0/CustomerId:2.0`").show(false)
+--------------------------------------------+
|ABB/aws:1.0/CustomerId:2.0                  |
+--------------------------------------------+
|[{20, de8cfcde-95c5-47ac-a544-13db50557eaa}]|
+--------------------------------------------+
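The same backtick quoting also works inside a spark.sql query, which is what the question attempted. A minimal sketch, assuming a running SparkSession named spark and the JSON above saved as "customers.json" (a hypothetical path):

```scala
import org.apache.spark.sql.functions.col

// Read the JSON shown in the question (hypothetical file name).
val df = spark.read.json("customers.json")

// DataFrame API: wrap the column name in backticks.
df.select("`ABB/aws:1.0/CustomerId:2.0`").show(false)

// col() accepts the same backtick-quoted name.
df.select(col("`ABB/aws:1.0/CustomerId:2.0`")).show(false)

// Spark SQL: backticks quote identifiers in the query string too.
df.createOrReplaceTempView("customers")
spark.sql("SELECT `ABB/aws:1.0/CustomerId:2.0` FROM customers").show(false)
```

The backticks tell Spark's SQL parser to treat the whole string as a single identifier, so characters like / and : are not interpreted as operators.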