For average speed over a long distance, a GPS takes the distance traveled and divides it by the time taken - exactly as the other answers said.
For instantaneous speed - how fast you were going at a given moment - this approach doesn't work, which is a pity, because that's exactly what you need to know if you want to check that your drivers aren't speeding.
Each GPS position has an error of up to 5 meters. That means that if you take two positions a second apart, measure the distance between them, and calculate the speed from that, you could be off by up to 10 meters per second, or about 22 mph.
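To put numbers on that worst case, here is a rough sketch in Python (the 5 m error and 1 s interval are the figures above; the rest is just arithmetic):

```python
# Worst-case speed error from differencing two GPS fixes.
POSITION_ERROR_M = 5.0   # per-fix position error, as above
INTERVAL_S = 1.0         # time between the two fixes

# If one fix errs 5 m "backwards" and the next errs 5 m "forwards",
# the measured distance between them is off by up to 10 m.
worst_case_distance_error_m = 2 * POSITION_ERROR_M

# Dividing by the interval turns that into a speed error.
speed_error_ms = worst_case_distance_error_m / INTERVAL_S
speed_error_mph = speed_error_ms * 2.23694   # 1 m/s ~ 2.23694 mph

print(f"Speed could be off by up to {speed_error_ms:.0f} m/s "
      f"(about {speed_error_mph:.0f} mph)")
# -> Speed could be off by up to 10 m/s (about 22 mph)
```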
The way a GPS actually calculates speed at any given moment is by measuring the Doppler shift in the signals from the satellites. Just as a car engine or police siren sounds higher-pitched when it's heading towards you than when it's heading away, the frequency of the signals from the satellites changes very slightly depending on how fast you are moving towards or away from them.
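To get a sense of the scale involved, here is a back-of-the-envelope sketch. It uses the GPS L1 carrier frequency of about 1575.42 MHz; the receiver speed is an illustrative assumption:

```python
# Rough size of the Doppler shift contributed by a receiver moving
# along the line of sight to a satellite: delta_f = f * v_radial / c.
C = 299_792_458.0          # speed of light, m/s
F_L1 = 1575.42e6           # GPS L1 carrier frequency, Hz

v_radial = 30.0            # assumed radial speed of the receiver, m/s (~67 mph)
delta_f = F_L1 * v_radial / C

print(f"Doppler shift: {delta_f:.0f} Hz")   # roughly 158 Hz
```

A tiny shift compared to the carrier frequency, but easily measurable by the receiver.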
If you know where you are (you have a GPS position) and you know where the satellites are and how fast they are moving (they broadcast that information), then by measuring the Doppler shift on the signals you can calculate how fast the receiver is moving, far more accurately than by taking the difference between two positions.
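In outline, the receiver converts each satellite's Doppler shift into a range rate (how fast the distance to that satellite is changing) and then solves for its own velocity. Here is a simplified least-squares sketch; the function name and inputs are made up for illustration, and receiver clock drift is ignored even though a real receiver solves for it as an extra unknown:

```python
import numpy as np

def receiver_velocity(rx_pos, sat_pos, sat_vel, range_rates):
    """Estimate receiver velocity (m/s) from Doppler-derived range rates.

    rx_pos      : (3,)  receiver position (from the normal GPS fix)
    sat_pos     : (N,3) satellite positions (known from the broadcast ephemeris)
    sat_vel     : (N,3) satellite velocities (also known from the ephemeris)
    range_rates : (N,)  range rates measured from the Doppler shifts
    """
    # Unit line-of-sight vectors from the receiver to each satellite.
    los = sat_pos - rx_pos
    u = los / np.linalg.norm(los, axis=1, keepdims=True)

    # Measured range rate: rho_dot_i = (v_sat_i - v_rx) . u_i
    # Rearranged:          u_i . v_rx = u_i . v_sat_i - rho_dot_i
    b = np.einsum("ij,ij->i", u, sat_vel) - range_rates

    # Least-squares solution for the three velocity components.
    v_rx, *_ = np.linalg.lstsq(u, b, rcond=None)
    return v_rx
```

Because clock drift adds a fourth unknown in practice, a real receiver needs at least four satellites in view to solve for velocity this way.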
How accurate is it? As with anything to do with GPS, it depends on how many satellites are being tracked and how strong the signals are. It would be rare for it to be off by more than one or two mph.