When we talk about FPS in PC gaming in general, the reference is not to the screen refresh rate, but to the rate at which new GAME frames are displayed on screen. In other words, each frame is computed, rendered and prepared for display. The graphics card displays whatever it has in the frame buffer - simply a portion of its graphics RAM - 60, 30 or 85 times per second (or whatever "refresh rate" the user selects). Regardless of that, the PC has to fill that buffer with useful information.
FPS in this (and our) context means: how many times per second the PC manages to prepare new scenes (frames) for display. While the card always displays at the selected monitor refresh rate, regardless of the monitor technology, the new scenes are not necessarily prepared by the PC that many times per second.
So if the hardware is not up to the necessary specs, the PC may push new scene information to the graphics card's frame buffer only two, three or ten times per second. The fact that the graphics card displays the frame buffer 60 times per second is then irrelevant: the human eye will detect those 2, 3 or 10 FPS immediately.
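A minimal sketch of the arithmetic above (the 60 Hz refresh rate is just an assumed example, not a quote from the post): when the PC prepares fewer frames per second than the monitor refreshes, each prepared frame is simply shown again and again.

```python
REFRESH_HZ = 60  # assumed monitor refresh rate for this illustration

def repeats_per_frame(game_fps: float, refresh_hz: float = REFRESH_HZ) -> float:
    """How many refresh cycles the same game frame stays on screen."""
    return refresh_hz / game_fps

for fps in (60, 30, 10, 3):
    # e.g. at 10 game-FPS, each frame is displayed ~6 refresh cycles in a row
    print(f"{fps:>2} game-FPS -> each frame shown ~{repeats_per_frame(fps):.0f} times")
```

The monitor never stops refreshing 60 times per second; what changes is how often the buffer it displays actually contains a new scene.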
The argument in that thread, and here too, seems to revolve around two different definitions of FPS - the "game FPS" and the display-system FPS - which are two very different things.
When WE - and the majority of the gaming community - say FPS, we refer to how many times per second the PC and the graphics subsystem can prepare a new scene for display.
This is completely different from anything connected with the monitor and display side, whose refresh rates are governed by standards and individual hardware capabilities.
That's why MS still has to explain what THEY mean by FPS: normally 15 game-FPS should result in a choppy display, yet in FSX's case it does not.
In our case, anything above 24-30 FPS will give a fluid image - in accordance with regular "movie" and physiological expectations - while the monitor may be refreshed 60 or 75 or however many times per second. Even taking your projector shutter explanation, it is easy to see why these two terms are not connected: you may flip the shutter 24 times per second, but if the movie runs slower by advancing to a new "scene" only every two shutter flips (without affecting the shutter itself), effectively dropping the FPS to 12, the viewers will see a choppy movie.
Regardless of the game FPS, the real-timeness of the sim must be maintained - up to a reasonable, hardware-dependent extent. So while the clock and sim engine run in real time, say one tick every 20 ms, the displayed frame rate may be slower, with choppy "animation" etc. due to an under-spec PC.
What we did with the new engine, among other things, was to optimize the real-time engine so that it also works correctly at very low game FPS.
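The decoupling described above - a sim that ticks in real time every 20 ms while rendering runs as slowly as the hardware forces it to - is commonly implemented as a fixed-timestep loop with an accumulator. This is a generic sketch of that pattern, not the actual engine code; the function names and the 20 ms constant are assumptions taken from the post.

```python
import time

TICK = 0.020  # 20 ms simulation tick, as mentioned in the post

def run(sim_step, render, duration=1.0, now=time.perf_counter):
    """Fixed-timestep loop: the sim advances in real-time 20 ms ticks even
    when rendering is slow; render() runs only as often as the hardware allows."""
    start = previous = now()
    accumulator = 0.0
    ticks = frames = 0
    while now() - start < duration:
        current = now()
        accumulator += current - previous  # real wall-clock time that passed
        previous = current
        while accumulator >= TICK:
            sim_step(TICK)        # catch up: several sim ticks per slow frame
            accumulator -= TICK
            ticks += 1
        render()                  # may be arbitrarily slow (low game-FPS)
        frames += 1
    return ticks, frames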
/Admin