Given two points $A(x_1, y_1)$ and $B(x_2, y_2)$ on a coordinate plane, the distance $d$ between them is given by the following formula.
$$d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$$
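The formula translates directly into code. The sketch below is a minimal illustration, assuming the points are given as four separate coordinates; the function name `distance` is chosen here for illustration and is not part of the original text.

```python
import math

def distance(x1, y1, x2, y2):
    # Square the coordinate differences, add them, and take the principal root
    return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

# A 3-4-5 right triangle: legs of length 3 and 4 give a hypotenuse of 5
print(distance(1, 2, 4, 6))  # 5.0
```

Python's standard library also offers `math.hypot(x2 - x1, y2 - y1)` and `math.dist((x1, y1), (x2, y2))`, which compute the same value.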
Proof
Start by plotting $A(x_1, y_1)$ and $B(x_2, y_2)$ on the coordinate plane. For simplicity, both points can be plotted in Quadrant I; note that the position of the points in the plane does not affect the proof. Assume that $x_2$ is greater than $x_1$ and that $y_2$ is greater than $y_1$. Next, draw a horizontal segment from $A$ and a vertical segment from $B$. These segments meet at the point $C(x_2, y_1)$, forming a right triangle whose hypotenuse is $\overline{AB}$.
The difference between the $x$-coordinates of the points is the length of one of the legs of the triangle. Furthermore, the length of the other leg is given by the difference between the $y$-coordinates. Therefore, the lengths of the legs are $x_2 - x_1$ and $y_2 - y_1$. Now, consider the Pythagorean Equation.
$$a^2 + b^2 = c^2$$
Here, $a$ and $b$ are the lengths of the legs, and $c$ is the length of the hypotenuse of a right triangle. Substitute the expressions for the legs for $a$ and $b$, then solve the equation for $c$ to find the hypotenuse's length.
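Written out, the substitution and solving steps look like this:

```latex
\begin{aligned}
a^2 + b^2 &= c^2 \\
(x_2 - x_1)^2 + (y_2 - y_1)^2 &= c^2 \\
c &= \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}
\end{aligned}
```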
Note that, when solving for $c$, only the principal root was considered. The reason is that $c$ represents the length of a side and therefore must be positive. Keeping in mind that $c$ is the distance between $A(x_1, y_1)$ and $B(x_2, y_2)$, it follows that $c = d$. By the Transitive Property of Equality, the Distance Formula is obtained.