I'd be grateful if someone could sense-check the following two hypothetical scenarios (am I correct in what's said?)...
Scenario 1:
In this scenario, the sprite moves with 'perfect' smoothness (1 pixel per refresh on a 144Hz monitor, so it never jumps pixels) across the screen at 144 pixels per second, thus taking 2560/144 ≈ 17.8 seconds to go from screen left to right.
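As a quick check of that arithmetic (a throwaway sketch; the 2560-pixel screen width and 144Hz refresh rate are the values assumed above):

```python
# Scenario 1: 'perfect' smoothness = 1 pixel per refresh on a 144Hz monitor.
SCREEN_WIDTH_PX = 2560
REFRESH_HZ = 144
STEP_PX_PER_FRAME = 1

speed_px_per_sec = STEP_PX_PER_FRAME * REFRESH_HZ       # 144 px/s
crossing_time_sec = SCREEN_WIDTH_PX / speed_px_per_sec  # ~17.8 s
print(f"{speed_px_per_sec} px/s, crossing in {crossing_time_sec:.1f} s")
```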
Scenario 2: We then swap to a 500Hz monitor...
To sum up the above, we can generalise things as follows...
The 'blurriness' we see in entity movement, any graphics data movement, or 'camera panning' in games, TV programmes and cinema film is primarily because the image data is being moved across the screen in increments of more than 1 pixel per frame. This is especially prominent in cinema, where e.g. 30 frames per second looks blurry on even moderate moving/panning content (for instance, a pan moving at 960 pixels per second shown at 30 frames per second jumps 32 pixels every frame)...
In the 'ideal' scenario, to maximise movement smoothness we need to (a) move data across the screen by 1 screen pixel per refresh (or shoot at a high enough frame rate in e.g. cinema movies), (b) have a monitor refresh rate high enough to support the desired speed of the data being moved across the screen, and (c) for computers/graphics cards, have a program movement update loop that runs fast enough to support that speed (see the sketch below).
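As a rough illustration of point (c) (my own sketch, not taken from any particular engine; the speed, screen width and refresh rate constants are all assumed values), a frame-rate-independent loop scales the per-frame step by the elapsed time, so the on-screen speed stays the same whatever rate the loop runs at:

```python
import time

# A rough frame-rate-independent movement loop: the per-frame step is
# speed * elapsed-time, so the on-screen speed is identical at 144Hz or
# 500Hz -- only the size of each individual jump changes.
SPEED_PX_PER_SEC = 144.0   # desired movement speed (Scenario 1's value)
SCREEN_WIDTH_PX = 2560     # assumed screen width
REFRESH_HZ = 144           # assumed refresh rate; try 500 for smaller steps

x = 0.0                    # position tracked with sub-pixel precision
last = time.perf_counter()
while x < SCREEN_WIDTH_PX:
    time.sleep(1.0 / REFRESH_HZ)   # stand-in for waiting on vsync
    now = time.perf_counter()
    dt, last = now - last, now     # seconds since the previous update
    x += SPEED_PX_PER_SEC * dt     # ~1 px/frame at 144Hz, ~0.29 px at 500Hz
    # draw_sprite(round(x))        # hypothetical draw call
```

At 500Hz the same speed works out to roughly 0.29 pixels per update, which is why the position has to be tracked in sub-pixel precision and only rounded when drawn.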
We have some way to go before technology allows us that 1-pixel-per-refresh movement at the data movement speeds used today (whether the source is a graphics card or a digital/analogue film camera).
However, 500Hz monitors are beneficial for movement 'smoothness' in gaming where the game loop is optimised to use that higher refresh rate, i.e. moving data by fewer pixels on each update while retaining the same overall movement speed because of the higher refresh rate (the quick calculation below shows the effect). Not every game will be optimised for high-refresh-rate monitors.
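To make that concrete (again my own illustration; the 1000 px/s figure is an arbitrary assumed value for a fast pan), here is the per-frame step for a fixed on-screen speed at various refresh rates:

```python
# For a fixed on-screen speed, the per-frame step shrinks as the refresh
# rate rises -- smaller jumps per update is what reads as smoother motion.
SPEED_PX_PER_SEC = 1000.0  # assumed fast pan speed, purely illustrative

for hz in (30, 60, 144, 500):
    print(f"{hz:>4} Hz -> {SPEED_PX_PER_SEC / hz:6.2f} px per frame")
```

Output: 30Hz → 33.33 px, 60Hz → 16.67 px, 144Hz → 6.94 px, 500Hz → 2.00 px per frame.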