Multiplying value if true


I’m writing a payroll script and on top of the default pay, there’s a bonus pay.

However, for that bonus pay there is a condition you must meet:

df['Qualified'] = df['Average Parcels'].apply(lambda x: 'Qualified' if 10 <= x < 18 else 'Not Qualified')

Now for IDs that are qualified, I’d like to multiply the number of orders fulfilled by 1.20.

I’ve tried something like this, but I get a syntax error:

df['Extra'] = df['Orders'].apply(lambda x: x * 1.2 if ['Qualified'] == 'Qualified')

I’m not sure if doing it this way is the most efficient approach, so I’m open to other ideas. I’m still new to Python and Pandas, so thanks for your help.

>Solution:

You don’t need to create the "Qualified" column, you can directly get the "Extra" column:

df["Extra"] = df["Orders"].mul(1.2).where(df["Average Parcels"].between(10, 18, inclusive="left"), 0)

Full working example:

import pandas as pd
import numpy as np

np.random.seed(100)
df = pd.DataFrame({"Average Parcels": np.random.randint(1, 100, size=15),
                   "Orders": np.random.randint(1, 100, size=15)})

df["Extra"] = df["Orders"].mul(1.2).where(df["Average Parcels"].between(10, 18, inclusive="left"), 0)

>>> df
    Average Parcels  Orders  Extra
0                 9      25    0.0
1                25      16    0.0
2                68      61    0.0
3                88      59    0.0
4                80      17    0.0
5                49      10    0.0
6                11      94  112.8
7                95      87    0.0
8                53       3    0.0
9                99      28    0.0
10               54       5    0.0
11               67      32    0.0
12               99       2    0.0
13               15      14   16.8
14               35      84    0.0
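For reference, the original two-step attempt can also be made to work. It failed for two reasons: a Python conditional expression requires an `else` clause, and the lambda only receives the "Orders" value, so `['Qualified']` never refers to the row's "Qualified" column. A minimal sketch of the fixed version, using a row-wise `apply` with `axis=1` and made-up sample data:

```python
import pandas as pd

# Hypothetical sample data for illustration
df = pd.DataFrame({"Average Parcels": [9, 11, 15, 25],
                   "Orders": [25, 94, 14, 16]})

# Step 1: flag qualifying rows (10 <= x < 18, matching between(10, 18, inclusive="left"))
df["Qualified"] = df["Average Parcels"].apply(
    lambda x: "Qualified" if 10 <= x < 18 else "Not Qualified")

# Step 2: the conditional expression needs an `else`, and the lambda must
# receive the whole row (axis=1) to see the "Qualified" column
df["Extra"] = df.apply(
    lambda row: row["Orders"] * 1.2 if row["Qualified"] == "Qualified" else 0,
    axis=1)
```

The vectorized `.where(...)` one-liner above is still preferable for larger frames, since a row-wise `apply` calls the Python lambda once per row.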
