Tuesday, 18 March 2008 01:25 pm

(no subject)

[personal profile] deckardcanine
There's something I've never quite understood about video cameras. Many a TV program or home video since the '70s has a frames-per-second rate that looks darn close to real-life motion (think of any game show from this period), yet movies invariably have it slower. Generally speaking, in fact, the more expensive fare will not have lifelike FPS. And sometimes when there's a show within a show, like on an episode of "Seinfeld," it switches to the lifelike look.

The only explanation I can conceive is that a lower FPS makes a film editor's job much easier. But I wouldn't have expected the lavish and often perfectionist industry to cut corners like that. Besides, movies have more freedom to extend their deadlines than TV shows. And when I consider the special effects that were used on some "lifelike" shows, I'd expect them to have required about as much editing as "Seinfeld" or "Friends." (Oh, right: those two shows didn't spend on anything super-expensive, just on comedians who mostly have little popularity in other endeavors.)
Date: Tuesday, 18 March 2008 06:29 pm (UTC)

From: [identity profile] nefaria.livejournal.com
It may be that the TV refresh rate and the movie refresh rate are out of synch (24 fps for TV vs. 36 for movies?), so for movies you're losing 1 frame out of 3, which makes it look choppy on TV.
Date: Tuesday, 18 March 2008 06:43 pm (UTC)

From: [identity profile] deckardcanine.livejournal.com
But it's not just on TV. The rate looks about the same on the silver screen.
Date: Tuesday, 18 March 2008 08:59 pm (UTC)

From: [identity profile] thatcatgirl.livejournal.com
Maybe just cheaper to use less film total? Film's pretty expensive by itself.
Date: Tuesday, 18 March 2008 10:16 pm (UTC)

From: [identity profile] deckardcanine.livejournal.com
So why didn't "Clarissa Explains It All" do that? It was a pretty cheap show.
Date: Wednesday, 19 March 2008 04:57 am (UTC)

From: [identity profile] stevenroy.livejournal.com
Film has traditionally used 24 FPS. American TV, however, uses 30 (or 60, depending on who you ask). That means most shows recorded on film have to have every fourth frame duplicated. That's a source of choppiness.

Another cause is that the NTSC TV signal actually carries 60 fields per second, but the picture is interlaced, meaning that only every other line is drawn on each pass. Even when the source is tape instead of film, it's common practice to hold a single frame on screen until both its even and odd scanlines have been drawn. This results in 30 FPS video even when 60 FPS is possible (and even used elsewhere in the same show).
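
To put numbers on the frame duplication described above, here's a minimal Python sketch. It's purely illustrative: the function and frame labels are made up, and real telecine works on interlaced fields via 3:2 pulldown rather than whole frames, but the arithmetic is the same (24 * 5/4 = 30).

def film_to_video(film_frames):
    """Stretch 24 fps film to 30 fps video by repeating every fourth frame."""
    video_frames = []
    for i, frame in enumerate(film_frames):
        video_frames.append(frame)
        if i % 4 == 3:                   # after every fourth film frame...
            video_frames.append(frame)   # ...show it a second time
    return video_frames

one_second_of_film = ["frame%d" % n for n in range(24)]
print(len(film_to_video(one_second_of_film)))  # 30: six duplicates per second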
Date: Wednesday, 19 March 2008 02:38 pm (UTC)

From: [identity profile] deckardcanine.livejournal.com
Well, that makes sense, thanks, but it still doesn't explain why pricier = fewer FPS.
Date: Thursday, 20 March 2008 12:21 am (UTC)

From: [identity profile] octan.livejournal.com
24 was selected because it was considered ideal for viewing. Deviate too far from that rate, in either direction, and the video starts looking unrealistic to some people. However, AC devices like TV sets operated at 60Hz, because that was the frequency of the 120V current. So they went with a system that places every other line, then goes back and fills in the gaps for each frame, for a combined frame rate of 30, which is reasonably close to the 24 fps "ideal." The phosphors that make up the screen can only stay glowing so long after being hit by the cathode ray, so I'm guessing a triple-phase process that produced 20 fps (three passes per frame at 60 Hz) would lead to an unsatisfactory amount of flicker.

In jolly old England, meanwhile, AC power is supplied at 240V and operates at 50Hz, so they went with 25 fps, which is even closer to the ideal.
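
A quick Python sketch of that arithmetic, using the nominal rates (NTSC's actual field rate is 59.94 Hz, but the principle holds): the mains frequency sets the field rate, and interlacing halves it into the effective frame rate.

# Mains frequency sets the field rate; interlacing (two fields per
# complete picture) halves that into the effective frame rate.
MAINS_HZ = {"NTSC (US, 60 Hz mains)": 60, "PAL (UK/Europe, 50 Hz mains)": 50}

for system, mains_hz in MAINS_HZ.items():
    field_rate = mains_hz        # one interlaced field per AC cycle
    frame_rate = field_rate / 2  # odd lines + even lines = one full frame
    print("%s: %d fields/s -> %d frames/s" % (system, field_rate, frame_rate))
# NTSC (US, 60 Hz mains): 60 fields/s -> 30 frames/s
# PAL (UK/Europe, 50 Hz mains): 50 fields/s -> 25 frames/s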
