Selenium Can't find elements by CLASS_NAME

I’m trying to scrape a website and collect every element with the class "meal_box meal_container row" in a list via driver.find_elements, but for some reason I couldn’t. I tried By.CLASS_NAME, since it seemed the logical choice, but the length of my list was 0. Then I tried By.XPATH, and the length was 1 (I understand why). I could use XPath to get them one by one, but I’d rather not do that if I can handle it in a for loop.

I don’t know why find_elements(By.CLASS_NAME, "print_name") works but find_elements(By.CLASS_NAME, "meal_box meal_container row") does not.

I’m new to both web scraping and Stack Overflow, so if any other details are needed, I can add them.

Here is my code:

meals = driver.find_elements(By.CLASS_NAME,"meal_box meal_container row")
print(len(meals))

for index, meal in enumerate(meals):
    foods = meal.find_elements(By.CLASS_NAME, 'print_name')
    print(len(foods))
    
    if index == 0:
        mealName = "Breakfast"
    elif index == 1:
        mealName = "Lunch"
    elif index == 2:
        mealName = "Dinner"
    else:
        mealName = "Snack"
    # Note: the loop variable is renamed so it doesn't shadow the outer `index`
    for title in foods:
        recipe = {}
        print(title.text)
        print(mealName + "\n")
        recipe["name"] = title.text
        recipe["meal"] = mealName

Here is a screenshot of the HTML:
[screenshot of the HTML]

>Solution :

Your code looks fine, but By.CLASS_NAME expects a single class name, so a space-separated compound like "meal_box meal_container row" matches nothing. Join the classes with dots instead, like "meal_box.meal_container.row". Try this:

meals = driver.find_elements(By.CLASS_NAME, "meal_box.meal_container.row")
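Equivalently, you can pass the dotted form to By.CSS_SELECTOR, which makes the intent explicit. As a sketch, a small helper (the function name is mine, not from the original post) can build that selector from the space-separated class attribute you copied out of the HTML:

```python
def classes_to_css(class_attr):
    """Turn a space-separated class attribute into a CSS class selector.

    For example, "meal_box meal_container row" becomes
    ".meal_box.meal_container.row".
    """
    return "." + ".".join(class_attr.split())

# Assumed usage with Selenium (requires a live `driver`, so shown as a sketch):
# meals = driver.find_elements(
#     By.CSS_SELECTOR, classes_to_css("meal_box meal_container row")
# )
```

Either way, the key point is the same: a class *attribute* can hold several classes separated by spaces, but a class *locator* must name them one at a time, joined by dots.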
