for page in range(1, pages + 1):
def append_organizator(organizator, organizatorzy=[]):
    organizatorzy.append(organizator)
    for i in organizatorzy:
        try:
            query = "INSERT INTO stypendia (organizator) values(%s)"
            values = []
            values.append(organizatorzy.pop())
            cursor.execute(query, values)
            conn.commit()
        except:
            pass
def append_type(rodzaj, rodzaje=[]):
    rodzaje.append(rodzaj)
    for i in rodzaje:
        try:
            query = "INSERT INTO stypendia (rodzaj) values(%s)"
            values = []
            values.append(rodzaje.pop())
            cursor.execute(query, values)
            conn.commit()
        except:
            pass
Those are two functions that insert the data scraped from the website into the database.
The program iterates through all available pages on the site, and the scraped data is inserted into the database.
As you can see in the screenshot, the title is inserted 7 times (the number of pages), then the organizator another 7 times, and so on.
How can I solve this problem and have everything at the same indexes?
>Solution :
You need to combine the insert operations: each INSERT creates a new row, so inserting each column separately produces separate rows. You should also just pass the parameters directly; building a list and popping from it isn't needed.
This example only handles two parameters (the same as your code above). Add additional parameters as needed and adjust the INSERT statement.
def append(organizator: str, rodzaj: str):
    try:
        query = "INSERT INTO stypendia (organizator, rodzaj) values(%s, %s)"
        values = (organizator, rodzaj)
        cursor.execute(query, values)
        conn.commit()
    except:
        pass
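To see why the combined INSERT fixes the misaligned rows, here is a minimal self-contained sketch. It uses Python's built-in sqlite3 as a stand-in for the MySQL connection in the question (sqlite3 uses `?` placeholders where MySQL drivers use `%s`); the table name and columns match the code above, but the sample values are made up.

```python
import sqlite3

# Stand-in for the question's MySQL connection and cursor.
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE stypendia (organizator TEXT, rodzaj TEXT)")

def append(organizator: str, rodzaj: str):
    # One combined INSERT creates one row holding both values,
    # so the columns stay aligned on the same row index.
    query = "INSERT INTO stypendia (organizator, rodzaj) VALUES (?, ?)"
    cursor.execute(query, (organizator, rodzaj))
    conn.commit()

# Hypothetical scraped values, one call per scraped item.
append("Fundacja A", "naukowe")
append("Fundacja B", "socjalne")

rows = cursor.execute("SELECT organizator, rodzaj FROM stypendia").fetchall()
print(rows)  # each row now has both columns filled
```

Calling `append` once per scraped item, inside the page loop, keeps organizator and rodzaj paired instead of producing a batch of one-column rows per page.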