Is the high frame rate of Avatar: The Way of Water the future of cinema?

And not just because of its aesthetics. Projection technology may be evolving too quickly for cinema's standard frame rate to keep up.

A brief overview: in the days before sound, movie projectors didn't necessarily run at 24 fps as standard. Early projectors were often hand-cranked, leaving the actual frame rate in the skilled hands of the projectionist. Silent films were usually shot at around 16 frames per second and then projected at around 20, giving older films that "sped up" look. It took the arrival of synchronized sound for a standard to be established: 24 frames per second was the lowest rate a studio could get away with while still maintaining a reasonably convincing illusion of motion. The standard modern audiences are accustomed to was adopted for reasons of economy.
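That "sped up" effect is simple arithmetic: playing footage back faster than it was captured multiplies its apparent speed by the ratio of the two rates. A minimal Python sketch, using the approximate 16 and 20 fps figures above:

```python
def apparent_speed(capture_fps: float, playback_fps: float) -> float:
    """Return the apparent speed multiplier when footage shot at
    capture_fps is played back at playback_fps."""
    return playback_fps / capture_fps

# Silent-era footage shot at ~16 fps but projected at ~20 fps
# appears roughly 25% faster than real life.
print(apparent_speed(16, 20))  # 1.25
```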

With the advent of video cassettes and, later, digital cameras, shooting at different frame rates became more common. Newscasts, for example, were shot on video at 30 fps (60 interlaced fields per second under NTSC), and audiences were trained to see that smoother motion as the look of television. As video games progressed, frame rates climbed there too, and most modern games now run at 60 fps. When a movie tried 60 fps, however, audiences complained that it looked too much like a game.

Movies, meanwhile, tried to stay put at 24 fps. In 2022, however, a schism emerged between the technology used to shoot films and the technology used to present them. As newer digital projectors process more and more information, they are coming up against the century-old conventions of 35mm film, including its characteristic flicker. The result is visible judder in the presentation: the image appears to stutter when the camera pans horizontally across a scene.
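The judder complaint also comes down to arithmetic: during a pan, the image jumps between frames by the pan speed divided by the frame rate, so a higher frame rate means smaller, smoother jumps. A minimal Python sketch; the 30-degrees-per-second pan speed is an illustrative assumption, not a figure from the article:

```python
def per_frame_jump_deg(pan_speed_deg_per_s: float, fps: float) -> float:
    """Degrees of horizontal motion between consecutive frames
    during a camera pan at a constant angular speed."""
    return pan_speed_deg_per_s / fps

# Hypothetical pan sweeping 30 degrees per second:
for fps in (24, 48, 60):
    print(f"{fps} fps: {per_frame_jump_deg(30, fps):.2f} deg/frame")
# 24 fps: 1.25 deg/frame  -> large jumps read as judder
# 48 fps: 0.62 deg/frame
# 60 fps: 0.50 deg/frame  -> smaller jumps read as smooth motion
```

Doubling the frame rate halves the per-frame displacement, which is the core of the case for high-frame-rate presentation.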
