If you have 10 dots on a table, can you always cover them with 10 pennies without overlapping?

*Remark: You can assume the dots are not too close to the edge of the table.*

Assume the dots lie in a plane and that a penny has radius 1. Tile the plane with an infinite hexagonal grid of circles of radius 1 (the densest circle packing), and choose one of them; call it C. Let H be the regular hexagon circumscribed about C, so that H has inradius 1 and C is its inscribed circle. Now place the grid in the plane so that the position of the center of C is chosen uniformly at random among the points of H.

For any fixed point in the plane, the probability that it ends up inside some circle of the grid equals the ratio S(C)/S(H), where S stands for "area". A short calculation gives S(C)/S(H) = π/(2√3) ≈ 0.9069, which is bigger than 90%. Therefore the probability that a fixed point does not end up inside any circle of the grid is less than 10%. With 10 points, the union bound shows that the probability that at least one of them misses every circle is less than 10 · 10% = 100%; in other words, with positive probability all ten dots land inside circles of the grid. So there exists a placement of the grid for which every dot is covered by some circle. Now just place the pennies where those circles are; at most ten circles are involved, one per dot.
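The two numbers the argument rests on can be checked directly: the area ratio S(C)/S(H) for a unit circle inscribed in a regular hexagon, and the union bound over ten dots. A minimal sketch (the variable names are mine, not from the original):

```python
import math

# Unit circle C inscribed in a regular hexagon H of inradius 1.
# Such a hexagon has side length 2/sqrt(3), hence area 2*sqrt(3).
circle_area = math.pi
hexagon_area = 2 * math.sqrt(3)

# Packing density of the hexagonal grid: probability that a fixed
# point lands inside some circle.
density = circle_area / hexagon_area  # = pi / (2*sqrt(3))
print(f"S(C)/S(H) = {density:.4f}")   # ≈ 0.9069 > 0.9

# Union bound: probability that at least one of 10 dots is uncovered.
miss = 1 - density
union_bound = 10 * miss
print(f"P(some dot uncovered) < {union_bound:.4f}")  # ≈ 0.9310 < 1
```

Since the union bound is strictly below 1, a random placement of the grid covers all ten dots with positive probability, which is exactly what the proof needs.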