If I had realized this thread was a sticky I'd have posted my findings here to begin with. It's actually a little more complicated than this. The game runs in "ticks" (equivalent to frames), speed is measured in pixels per tick, and you can't have a fraction of a tick; the layer moves a distance equal to speed*ticks. One block length equals 32 pixels.
What makes it dicey is that the number of ticks is a bit off from what you'd expect.
No wonder it's so hard to prevent cycling layers from getting farther and farther off if you don't use the same speed and delay both ways. A 2 second delay isn't even the same as two 1-second delays!
The number of ticks for a given number of seconds appears to be:
(seconds*65)+1, rounded to the nearest ODD number if it ends in .5
Normally .5 is rounded to the nearest EVEN number (round-half-to-even), but presumably the rounding happens before the 1 is added, so the final tick count ends up rounding toward odd numbers instead.
In that case the distance moved (in pixels) is:
((seconds*65)+1, rounded to the nearest ODD number if it ends in .5)*speed
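That formula can be sketched in Python (the function names are mine, and this assumes the rounding model described above; conveniently, Python's built-in round() also uses round-half-to-even):

```python
def ticks(seconds):
    # Assumed model: ticks = round(seconds * 65) + 1, where a .5
    # fraction rounds half-to-even BEFORE the +1 is applied, so the
    # final tick count lands on an odd number.
    return round(seconds * 65) + 1

def distance(speed, seconds):
    # Pixels moved: speed (pixels per tick) times the whole number of ticks.
    return speed * ticks(seconds)

print(ticks(1))      # 66
print(ticks(2))      # 131 -- note: NOT twice ticks(1)
print(ticks(0.5))    # 33  -- 32.5 rounds to 32, then +1
```

This also shows why a 2-second delay isn't the same as two 1-second delays: ticks(2) is 131, but two 1-second delays give 132 ticks total.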
The extra 1 is what tends to throw off the timing of layers. For example, a speed of -1 for 2 seconds and then a speed of 2 for 1 second moves the layer -131 pixels and then +132, so it drifts farther off by 1 pixel each cycle.
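Using the same assumed tick formula, you can check that drift directly:

```python
def ticks(seconds):
    # Assumed: round-half-to-even, then +1 (see formula above).
    return round(seconds * 65) + 1

# One full cycle: speed -1 for 2 seconds, then speed +2 for 1 second.
drift = -1 * ticks(2) + 2 * ticks(1)
print(drift)  # -131 + 132 = 1 pixel of drift per cycle
```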
However, moving it by -1 for 1 second and +2 for 0.5 seconds does work, because the number of ticks for 1 second (66) is exactly twice the number for 0.5 seconds (33). Another solution is to split up the first delay, so the layer moves at -1 for 1 second, then -1 for 1 second again, then +2 for 1 second.
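Both workarounds can be verified with the same assumed formula; each cycle sums to zero pixels of drift:

```python
def ticks(seconds):
    # Assumed: round-half-to-even, then +1 (see formula above).
    return round(seconds * 65) + 1

# Fix 1: -1 for 1 second, then +2 for 0.5 seconds.
fix1 = -1 * ticks(1) + 2 * ticks(0.5)       # -66 + 66
# Fix 2: -1 for 1 second twice, then +2 for 1 second.
fix2 = -1 * ticks(1) * 2 + 2 * ticks(1)     # -132 + 132
print(fix1, fix2)  # 0 0
```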