I shoot in the US and use 30 (29.97) fps; however, while viewing a "making of" I noticed that Downton Abbey was shot at 60 fps. Why do you think this is done? Advantages?
High motion content. Shooting at 30p often results in unacceptable judder. Shooting at 24p is even worse. I've seen lots of documentaries on PBS and Smithsonian that were shot at 24p and produce really terrible judder if there is any significant panning. I personally don't like such judder--others do, I guess. Shooting at a higher frame rate gives you more options, in my view. I think that's a real limitation of 4K at the moment--at least for cameras that I can afford. Just an opinion.
With the high refresh rates of post-2010 displays, 60p is pretty much the new NTSC standard these days.
Downton is all about costumes and locations. Even though there is not much action, the increased frame rate means a faster shutter, i.e. sharper images with less blurring.
Maybe you could share the link you read.
Here, this link says Downton was shot on a D-21, which is only 2K 30p. Unlikely they shot in HD, but maybe they did back in 2008. The article does reference them shooting to Log 4:2:2, so maybe they were running the D-21 in 60p mode.
Downton Abbey shoots on Alexas. The first season was shot on the D-21. They shoot 24p, as does every scripted show I can think of that comes out of Hollywood, at least. I suppose some reality shows might shoot 60p, but I'm not in that loop, so I couldn't say.
120 fps is a very useful frame rate as it is evenly divisible by 24, 30 and 60. You want 24 fps, just use every 5th frame, 30 fps, use every 4th frame...
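That decimation is just index arithmetic. Here's a quick sketch (toy frame lists, not any NLE's actual API) showing why 120 fps divides so cleanly:

```python
def decimate(frames, src_fps, dst_fps):
    """Convert frame rates by keeping every (src_fps // dst_fps)-th frame.

    This only works cleanly when src_fps is an integer multiple of dst_fps,
    which is exactly why 120 fps is convenient: 120/24 = 5, 120/30 = 4,
    and 120/60 = 2 -- no blending or pulldown needed.
    """
    if src_fps % dst_fps != 0:
        raise ValueError("source rate must be an integer multiple of target rate")
    step = src_fps // dst_fps
    return frames[::step]

frames_120 = list(range(120))              # one second of 120 fps footage
print(len(decimate(frames_120, 120, 24)))  # 24 frames kept: every 5th
print(len(decimate(frames_120, 120, 30)))  # 30 frames kept: every 4th
```

Rates like 24 from 30, by contrast, fail the divisibility test and force pulldown or blending.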
I understand your 120P divisible comment, but perhaps you can clarify something ...
I often shoot at 60P but render to 30P (don't ask why, it's not important for my question). Can I assume that Vegas just drops every other frame? Does it blend two frames to get one frame? Does it matter if "Smart Render" is on or off?
No, Vegas does not just drop every other frame by default; it will resample a 60p clip to 30p by blending frames, and this is not always desirable. In the clip's properties, set it to "Disable Resample" and change the "Undersample Rate" to .500; Vegas will then treat the video just as if it had been shot at 30p.
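To illustrate the difference between the two behaviors -- this is only a sketch of the general idea using brightness values as stand-in "frames," not Vegas's actual resampling algorithm:

```python
def drop_to_half_rate(frames):
    """60p -> 30p by discarding every other frame (what disabling resample gets you)."""
    return frames[::2]

def blend_to_half_rate(frames):
    """60p -> 30p by averaging adjacent frame pairs (a crude stand-in for resampling).

    Blending softens motion, which is why resampled footage can look
    'ghosted' compared with simply dropped frames.
    """
    return [(a + b) / 2 for a, b in zip(frames[::2], frames[1::2])]

sixty_p = [0, 10, 20, 30, 40, 50]    # toy one-tenth second of 60p
print(drop_to_half_rate(sixty_p))    # [0, 20, 40]
print(blend_to_half_rate(sixty_p))   # [5.0, 25.0, 45.0]
```

Dropping preserves each frame exactly; blending produces in-between values that never existed in the source.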
"From 120->24 you will have too sharp frames, without natural motion blur."
Yes, you are absolutely correct. That said, I don't know why 24 fps is even still a thing. I can understand 30 fps for online stuff, but 60 fps is currently my preferred frame rate.
This subject has such divided opinions - I cannot stand 25p/30p. I have shot 50p for some years now, and with my new Sony A7sii I tried 4K with the aim of downscaling or cropping in. I simply can't stomach the 25p, so I am content with 1080 50p.
Some people's eyes notice this more - some find it fine and "cinema like". I find going to the cinema an "epilepsy inspiring" affair and desperately seek out the HFR versions. Just me. For instance, the second-to-last regular 24p film I saw at the cinema was one of the Lord of the Rings films. It had some epic shots, I assume done from a helicopter or drone, over an awesome landscape. The jitter I saw made it almost unwatchable for me. I could hardly see what was going on. I wondered how any director, editor, or quality-assurance person could ever consider this acceptable. The last film was the new James Bond - even worse, being a high-impact film. I could hardly see a damn thing.
I am on your side. In this day and age we should not limit ourselves to old habits and their technology. I shoot 4K 30p, but only for my personal use, and I have to restrain myself when I do pans and tilts; other than that, it's 1080 60p.
So, when will a 4K 60P camera be available that mere mortals can afford? Given the severe limitations of the current crop of 4K offerings, I can't imagine that it's not being pursued. Even a 2.7K 60P much like the GoPro would be an improvement. Any speculation?
It depends on what you consider affordable. The Panasonic AG-DVX200 is certainly obtainable; I am just not a Panasonic fan and wait for a comparable Sony product.
>>>scroll down to behind the scenes, read clapper board
Normally, you don't see fps on the slate unless it's a frame rate other than the normal frame rate of the show, and then you put it on a big, bright piece of tape announcing it's something different, so Post will know that this is an FX shot (slo-mo, etc.).
As for shooting in 24, it's the "film look." People are familiar with the artifacts, so they in effect disappear.
And I'd always learned that "judder" was not the "bip-bip-bip-bip-bip" strobing effect of a fast pan, but rather the "slow-fast-slow-fast-slow-fast" pan effect as a result of the 3-2 pulldown when converting 24 film to 30 video.
Thirty years ago 60fps was thought to be the Holy Grail of filmmaking, but like so many other improvements it failed to reach critical mass for universal adoption. And some argued back then that 24 or even slower would be just fine in a future of high-powered digital interpolation. Plug in as many intermediate frames as you want.
It's not just a question of what you consider affordable, it should also include what you consider "4K".
There's at least one Sony camera that has a "4K" recording capability, but you need to read the specifications carefully. Yes, it will record UHD, but at 8-bit 4:2:0. Switch to 1080 and the camera records 10-bit 4:2:2. Most likely, if you shot HD with this camera and upscaled it to UHD, it'd look as good as footage shot in the camera's UHD mode.
Regarding frame rate: for sure, the wider field of view that 4K demands calls for higher frame rates. However, keep in mind that when the frame rate is doubled (at the same shutter angle), the shutter is open for half as long, i.e. one stop of light is lost.
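The light loss works out like this, assuming the common 180-degree shutter rule (shutter open for half of each frame interval):

```python
import math

def shutter_time(fps, shutter_angle=180.0):
    """Shutter open time per frame, in seconds, for a given frame rate and angle."""
    return (shutter_angle / 360.0) / fps

def stops_lost(fps_a, fps_b):
    """Exposure difference, in stops, going from fps_a to fps_b at the same angle."""
    return math.log2(shutter_time(fps_a) / shutter_time(fps_b))

print(shutter_time(30))    # 0.01666... -- i.e. a 1/60 s shutter
print(shutter_time(60))    # 0.00833... -- i.e. a 1/120 s shutter
print(stops_lost(30, 60))  # 1.0 -- exactly one stop less light at 60p
```

Every doubling of frame rate at a fixed shutter angle costs exactly one stop, which has to come back from the aperture, gain, or lighting.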
I also hate 24p, but I have only recorded with home equipment.
With pro cinema cameras 24p looks beautiful; the strobing is different than with my devices. Yes, I tried using ND filters so I could lower my shutter speed, but my footage is still almost unwatchable during fast motion, while cinema films are fine!
I'd like to know the secret :)
Yes, we tried recording 25p footage at a 1/50 s shutter with my friend's Panasonic camera, but there was still strobing everywhere. Individual frames have motion blur, but the end result is worse than a 24p movie shot at a 1/48 shutter. Maybe it's a digital sensor vs. film thing?
"Thirty years ago 60fps was thought to be the Holy Grail of filmmaking, but like so many other improvements it failed to reach critical mass for universal adoption."
I think it failed 30 years ago because it was just too costly. Imagine buying almost triple the quantity of film stock for a 60p movie, and all the subsequent costs of handling it. Today it is a different story, but only to some degree, as post-processing and special effects at higher frame rates are also more expensive.
""Disable Resample" and change the "Undersample Rate" to .500"
Only the disable is needed. Setting the undersample rate is superfluous in this case; all it will do is incur extra CPU cycles.
In effect, disabling resampling accomplishes the same thing as undersampling whenever slowing down the frame rate, but simpler and faster.
I meant to say "or" and not "and"; nevertheless, you are correct. As you point out, disabling resampling is probably the more universal option, as it takes care of some of the oddball frame rates that smartphones cough up. For example, 60p on my Samsung Note 4 is not exactly 59.94 fps.