
Using a SQL query in Spark SQL: error in execution

I am trying to execute this query in PySpark, but I get an error every time. I have looked everywhere but I can't figure out why it doesn't work; I'd appreciate any help. The goal of this query is to populate a new column, temp_ok, that I will create later.
This is my code:

CASE WHEN _temp_ok_calculer='non' AND Operator level 2 ="XXX" OR  Operator level 2= "AAA" AND Auto Cleaning criteria !="YYY" Auto Cleaning criteria <> "AA"  AND Workstation Type = "Chaine A" THEN 'ok' ELSE CASE WHEN _temp_ok_calculer='ok' THEN 'ok' ELSE 'ko' END END

My table contains these columns: _temp_ok_calculer, Operator level 2, Auto Cleaning criteria, Workstation Type


>Solution:

Spark SQL uses the back-tick (`) as its identifier delimiter, and identifiers that contain spaces must be delimited. So:

CASE WHEN _temp_ok_calculer='non' AND `Operator level 2` ="XXX" OR . . .
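For reference, here is a sketch of what the full corrected expression might look like. Beyond the back-ticks, it assumes an `AND` was intended between the two `Auto Cleaning criteria` conditions (one is missing in the question), adds parentheses around the two `Operator level 2` alternatives (since `AND` binds more tightly than `OR`, the original mixed expression would not group the way the prose suggests), and uses single quotes for string literals, since Spark SQL may treat double-quoted values as identifiers depending on parser settings. The nested `ELSE CASE` has also been flattened into a second `WHEN` branch, which yields the same result:

```sql
CASE
  -- rows already marked 'ok' stay 'ok' (the inner CASE in the original)
  WHEN `_temp_ok_calculer` = 'ok' THEN 'ok'
  -- rows marked 'non' become 'ok' only when all the other criteria match
  WHEN `_temp_ok_calculer` = 'non'
       AND (`Operator level 2` = 'XXX' OR `Operator level 2` = 'AAA')
       AND `Auto Cleaning criteria` != 'YYY'
       AND `Auto Cleaning criteria` <> 'AA'
       AND `Workstation Type` = 'Chaine A'
  THEN 'ok'
  ELSE 'ko'
END
```

The intended grouping of the `AND`/`OR` conditions is a guess from the question's wording; adjust the parentheses if a different logic was meant.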