
Unable to process a list of sets with multiple elements per set

When I scrape websites for all the emails on each site, the output is a data frame where each row holds a list of sets of emails (one set per scraped page):

URL_WITH_EMAILS_DF = pd.DataFrame(data=[{'main_url': 'http://keilstruplund.dk', 'emails': [{'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, set(),{'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersens@hk.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'prima-rent@youseepost.dk', 'jb@rentind.dk', 'frisoren01@gmail.com', 'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'stigterndrup@gmail.com', 'psn@psn.dk', 'samuel@malerfirmaet-lykkebo.dk', 'jan@mundt-reklame.dk', 'ole.norlin@mail.dk', 'nordsjalland@phonixtag.dk', 'jp@rudersdalmaleren.dk', 'vvs@hestetangen.dk', 'steenkragelund@mail.tele.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'kasserer@keilstruplund.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}, {'ole.norlin@mail.dk', 'ole.gregersen@hk.dk'}]}, 
                                    {'main_url': 'http://kirsebaergaarden.com', 'emails': [{'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'soesterb@gmail.com', 'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}, {'info@kirsebaergaarden.com'}]},
                                     {'main_url': 'http://koglernes.dk', 'emails': [{'info@koglernes.dk'}, {'info@koglernes.dk'}, {'info@koglernes.dk'}, {'info@koglernes.dk'},set(), set(), {'info@koglernes.dk'}, {'info@koglernes.dk'}]},
                                      {'main_url': 'http://kongehojensbornehave.dk', 'emails': []}
                                   ])


However, I want to process the data frame to look like the following:


URL_WITH_EMAILS_DF = pd.DataFrame(data=[{'main_url': 'http://keilstruplund.dk', 'emails': ['ole.norlin@mail.dk', 'ole.gregersen@hk.dk', 'prima-rent@youseepost.dk', 'jb@rentind.dk', 'frisoren01@gmail.com','stigterndrup@gmail.com', 'psn@psn.dk', 'samuel@malerfirmaet-lykkebo.dk', 'jan@mundt-reklame.dk',  'nordsjalland@phonixtag.dk', 'jp@rudersdalmaleren.dk', 'vvs@hestetangen.dk', 'steenkragelund@mail.tele.dk', 'kasserer@keilstruplund.dk']},                                        
                                     {'main_url': 'http://kirsebaergaarden.com', 'emails': ['info@kirsebaergaarden.com']},
                                     {'main_url': 'http://koglernes.dk', 'emails': ['info@koglernes.dk']},
                                      {'main_url': 'http://kongehojensbornehave.dk', 'emails': []}
                                   ])


How can this be achieved?

I have tried the following code, but it only manages to return the emails from a single set, and it raises an error when the email list for a website is empty:

URL_WITH_EMAILS_DF['emails'] = [', '.join(x.pop()) if not None else "" for x in URL_WITH_EMAILS_DF['emails'].values]
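For context, a minimal sketch (with hypothetical sample data) of why that one-liner misbehaves: `if not None` is always true, so the `""` fallback is never taken, `list.pop()` joins only the last set, and popping an empty list raises `IndexError`:

```python
# Hypothetical sample: a list of per-page email sets for one website.
emails = [{'a@x.dk'}, {'b@x.dk'}]

# `not None` is always True, so the "" branch is dead code,
# and list.pop() removes (and joins) only the LAST set:
result = ', '.join(emails.pop()) if not None else ""
print(result)  # b@x.dk

# With an empty email list, pop() raises instead of returning "":
try:
    [].pop()
except IndexError as err:
    print(err)  # pop from empty list
```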

Please help. Thanks

P.S.: As shown in the first dataframe, I collect a set of emails for each web page because a single website can have multiple web pages, and I do not want to keep duplicate emails across pages.

> Solution:

itertools.chain.from_iterable can flatten the list of sets, and wrapping the result in set() removes the duplicates:

from itertools import chain
# URL_WITH_EMAILS_DF as defined in the question above


URL_WITH_EMAILS_DF['emails'] = URL_WITH_EMAILS_DF.emails.apply(lambda x: list(set(chain.from_iterable(x))))
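As a small usage sketch (hypothetical data, not the question's full dataframe): flattening with chain.from_iterable, deduplicating with set(), and sorting for a deterministic order instead of plain list() if that matters:

```python
from itertools import chain

import pandas as pd

df = pd.DataFrame(data=[
    {'main_url': 'http://example.dk',
     'emails': [{'a@x.dk', 'b@x.dk'}, set(), {'b@x.dk', 'c@x.dk'}]},
    {'main_url': 'http://empty.dk', 'emails': []},
])

# Flatten the per-page sets into one iterable, deduplicate, and sort.
df['emails'] = df['emails'].apply(lambda x: sorted(set(chain.from_iterable(x))))
print(df['emails'].tolist())  # [['a@x.dk', 'b@x.dk', 'c@x.dk'], []]
```

Note that an empty list of sets simply flattens to an empty list, so the case that broke the original one-liner is handled for free.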
