To be fair to S7 x10^8, 30 FPS became the benchmark because it was readily attainable on older hardware and gave good performance.
Now, Jzargo hit on some points, but there are some issues with his explanation. We have to look at the hardware to really get a good picture.
You have 3 main things you have to take into account (there are others, but I don't want to get into that much detail):
Framerate = FPS
Refresh rate = Hz
Pixel density = P/in^2
So let's say you have a computer with a standard 60 Hz, 1080p monitor (there's your pixel density) and a graphics card that can run whatever game you choose at 300 FPS.
In this situation, anything over 60 FPS is absolutely pointless. The frame rate is going to be capped by the maximum refresh rate, i.e. the monitor is not capable of updating the picture more than 60 times per second.
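A toy way to picture that cap (just a sketch; with vsync off you get tearing instead of a clean cap, and real drivers are more complicated than this):

    def displayed_fps(render_fps, refresh_hz):
        # With vsync on, the panel can only show as many unique frames per
        # second as its refresh rate, no matter how fast the GPU renders.
        return min(render_fps, refresh_hz)

    print(displayed_fps(300, 60))   # 60 - the other 240 frames never reach your eyes
    print(displayed_fps(300, 120))  # 120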
Now let's change the hardware...
Same graphics card, new fancy monitor: 120 Hz refresh rate, same 1080p resolution.
In this case, you could crank up to 120 FPS and there would be a measurable difference (measurable being the key word here). That said, you might notice the jump from 60 FPS to 90 FPS, but it would be marginal, and between 90 and 120 FPS there would be even less. This is because the pixel density is just not enough to really showcase the difference.
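Part of why each jump feels smaller is just arithmetic on frame times. A quick back-of-the-envelope (my own numbers, not tied to any specific monitor):

    # How much frame time (in ms) you actually shave off with each step up
    # in FPS. The savings shrink fast as the frame rate climbs.
    for fps in (30, 60, 90, 120, 180, 240):
        print(f"{fps:>3} FPS -> {1000 / fps:6.2f} ms per frame")

Going from 30 to 60 FPS saves you about 16.7 ms per frame; going from 90 to 120 saves less than 3 ms.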
Changing the hardware one more time...
Same graphics card, brand new monitor: 4K with a 240 Hz refresh rate (yes, they make them; no, they have no point in 95% of situations).
In this case you would see a marked difference running a fast-paced game (or movie) at 120 FPS over 60 FPS, and you would even see a difference between 120 FPS and 180 FPS. Go much higher, though, and it gets less and less noticeable the closer you get to the maximum theoretical ability of our eyes to detect movement in an electronic format.
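To put rough numbers on the pixel-density side of this (the 27-inch screen size is just my assumption; density depends on the physical panel size, not the resolution alone):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixels per inch along the diagonal: sqrt(w^2 + h^2) / diagonal.
        return math.hypot(width_px, height_px) / diagonal_in

    # Assuming a 27-inch panel for both:
    print(round(ppi(1920, 1080, 27), 1))  # ~81.6 PPI for 1080p
    print(round(ppi(3840, 2160, 27), 1))  # ~163.2 PPI for 4K

Roughly double the pixel density on the same size screen, which is why the 4K setup can actually showcase the higher frame rates.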
Jzargo touched on this, but was mistaken about part of it. The ~300 Hz limit is true, but that is because of the format we are using. If you are curious to see a similar phenomenon in action, look at your computer or TV screen through your camera; you will notice a shudder/flicker/rolling aspect in the video. Things that are not displayed electronically can be seen at much higher rates, but not the 7,000,000 FPS he was talking about; FPS is pointless for measuring anything other than video rate. There is a good paper on this that you can read if you are interested:
http://www.nature.com/neuro/journal/v5/n10/full/nn924.html

There is no quick way to determine what the "best frame rate" is; it is entirely subjective to the application, the hardware, your own personal sensitivity, etc. Yes, the higher your frame rate the better your experience will be, and the same goes for resolution. But you must always take into account the law of diminishing returns: past a certain threshold, things become less and less productive.