```
import math
point_dist = 0.0
x1 = float(input())
y1 = float(input())
x2 = float(input())
y2 = float(input())
point_dist = math.pow(math.sqrt(x2 - x1) + (y2 - y1), 2.0)
print('Points distance:', point_dist)
```

Here is what I have written so far; it keeps giving me incorrect numbers for output.

input: 1.0, 2.0, 1.0, 5.0

expected output: 9.0

what I'm getting: 3.0

### Solution

`math.dist` can do this for you.

```
import math
point_dist = 0.0
x1 = float(input())
y1 = float(input())
x2 = float(input())
y2 = float(input())
point_dist = math.dist((x1, y1), (x2, y2))
print('Points distance:', point_dist)
```

Note that the distance between (1, 2) and (1, 5) *is* 3.
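If you'd rather keep your manual formula, the bug is that you need to square *each* coordinate difference before summing, and take the square root of the sum; your version adds the raw differences and squares at the end. A minimal sketch of the corrected formula (the helper name `euclidean_distance` is just for illustration):

```
import math

def euclidean_distance(x1, y1, x2, y2):
    # sqrt((x2 - x1)^2 + (y2 - y1)^2): square each difference first,
    # then sum, then take the square root
    return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

print(euclidean_distance(1.0, 2.0, 1.0, 5.0))  # 3.0
```

`math.hypot(x2 - x1, y2 - y1)` computes the same thing and is another standard-library option.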