Comments

wwaag wrote on 3/2/2016, 1:05 PM
High motion content. Shooting at 30p often results in unacceptable judder, and shooting at 24p is even worse. I've seen lots of documentaries on PBS and Smithsonian that were shot at 24p and produce really terrible judder whenever there is any significant panning. I personally don't like such judder--others do, I guess. Shooting at a higher frame rate gives you more options, in my view. I think that's a real limitation of 4K at the moment--at least for cameras that I can afford. Just an opinion.

wwaag

AKA the HappyOtter at https://tools4vegas.com/. System 1: Intel i7-8700k with HD 630 graphics plus an Nvidia RTX 4070 graphics card. System 2: Intel i7-3770k with HD 4000 graphics plus an AMD RX550 graphics card. System 3: Laptop, Dell Inspiron Plus 16, Intel i7-11800H, Intel Graphics. Current cameras include Panasonic FZ2500, GoPro Hero11 and Hero8 Black, plus a myriad of smartphones, pocket cameras, video cameras and film cameras going back to the original Nikon S.

astar wrote on 3/2/2016, 1:49 PM
With the high refresh rates of 2010+ displays, 60p has pretty much become the new standard, the way NTSC once was.

Downton is all about costumes and locations. Even though there is not much action, the increased frame rate means a faster shutter and therefore sharper images with less blurring.

Maybe you could share the link you read.

This link says Downton was shot on a D-21, which is only 2K 30p. It's unlikely they shot in HD, but maybe they did back in 2008. The article does reference them shooting to Log 4:2:2, so maybe they were running the D-21 in 60p mode.

http://www.arrirental.co.uk/news/view/13/downton-abbey
ddm wrote on 3/2/2016, 4:02 PM
Downton Abbey shoots on Alexas. The first season was shot on the D-21. They shoot 24p, as does every scripted show out of Hollywood that I can think of, at least. I suppose some reality shows might shoot 60p, but I'm not in that loop so I couldn't say.
John_Cline wrote on 3/2/2016, 6:03 PM
120 fps is a very useful frame rate, as it is evenly divisible by 24, 30 and 60. If you want 24 fps, just use every 5th frame; for 30 fps, use every 4th frame...
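To make the arithmetic concrete, here's a minimal Python sketch (purely illustrative, not Vegas code) of that decimation pattern:

```python
# Keep every Nth frame of a 120 fps clip to hit a lower target rate.
# 120/24 = 5, 120/30 = 4, 120/60 = 2, so the division is always exact.
def decimate(frames, source_fps, target_fps):
    step = source_fps // target_fps
    return frames[::step]

one_second = list(range(120))              # frame indices of 1 s at 120 fps
print(len(decimate(one_second, 120, 24)))  # 24
print(len(decimate(one_second, 120, 30)))  # 30
print(len(decimate(one_second, 120, 60)))  # 60
```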
Rich Parry wrote on 3/3/2016, 12:47 AM
@John_Cline,

I understand your 120P divisible comment, but perhaps you can clarify something ...

I often shoot at 60P but render to 30P (don't ask why, it's not important for my question). Can I assume that Vegas just drops every other frame? Does it blend two frames to get one frame? Does it matter if "Smart Render" is on or off?

Thanks in advance,
Rich

CPU Intel i9-13900K Raptor Lake

Heat Sink Noctua NH-D15 chromax.black

MB ASUS ProArt Z790 Creator WiFi

OS Drive Samsung 990 PRO NVMe M.2 SSD 1TB

Data Drive Samsung 870 EVO SATA 4TB

Backup Drive Samsung 870 EVO SATA 4TB

RAM Corsair Vengeance DDR5 64GB

GPU ASUS NVIDIA GeForce GTX 1080 Ti

Case Fractal Torrent Black E-ATX

PSU Corsair HX1000i 80 Plus Platinum

OS Microsoft Windows 11 Pro

Rich in San Diego, CA

John_Cline wrote on 3/3/2016, 1:42 AM
No, Vegas does not just drop every other frame by default; it will resample a 60p clip to 30p by blending frames, and this is not always desirable. In the clip's properties, set it to "Disable Resample" and change the "Undersample Rate" to .500, and Vegas will treat the video just as if it had been shot at 30p.
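For anyone curious what the two behaviours amount to, here is a minimal NumPy sketch. This is an illustration of the idea only; Vegas's actual blend weighting is internal to the app, so the simple average below is an assumption:

```python
import numpy as np

# 60 dummy luma frames, i.e. one second of 60p (tiny size just for illustration)
frames_60p = np.random.rand(60, 16, 16).astype(np.float32)

# Resample ON (default): each 30p frame blends two adjacent 60p frames
blended_30p = (frames_60p[0::2] + frames_60p[1::2]) / 2

# "Disable Resample": keep every other frame, discard the rest
dropped_30p = frames_60p[0::2]

print(blended_30p.shape, dropped_30p.shape)  # (30, 16, 16) (30, 16, 16)
```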
relaxvideo wrote on 3/3/2016, 2:56 AM
Going from 120 -> 24 you will get frames that are too sharp, without natural motion blur.
You can add a motion blur FX, but this needs a lot of CPU power.

#1 Ryzen 5-1600, 16GB DDR4, Nvidia 1660 Super, M2-SSD, Acer freesync monitor

#2 i7-2600, 32GB, Nvidia 1660Ti, SSD for system, M2-SSD for work, 2x4TB hdd, LG 3D monitor +3DTV +3D projectors

Win10 x64, Vegas21 latest

John_Cline wrote on 3/3/2016, 5:52 AM
"From 120->24 you will have too sharp frames, without natural motion blur."

Yes, you are absolutely correct. That said, I don't know why 24 fps is even still a thing. I can understand 30 fps for online stuff, but 60 fps is currently my preferred frame rate.
pilsburypie wrote on 3/3/2016, 9:09 AM
This subject has such divided opinions--I cannot stand 25p/30p. I have shot 50p for some years now, and with my new Sony A7S II I tried 4K with the aim of downscaling or cropping in. I simply can't stomach the 25p, so I am content with 1080 50p.

Some people's eyes notice this more--some find it fine and "cinema like". I find going to the cinema an "epilepsy inspiring" affair and desperately seek out the HFR versions. Just me. For instance, the second-to-last regular 24p film I saw at the cinema was one of the Lord of the Rings films. It had some epic shots, which I assume were done from a helicopter or drone over an awesome landscape. The jitter I saw made it almost unwatchable for me; I could hardly see what was going on. I wondered how any director, editor or quality-assurance person could ever consider this acceptable. The last film was the new James Bond--even worse, being a high-impact film. I could hardly see a damn thing.
OldSmoke wrote on 3/3/2016, 9:25 AM
pilsburypie

I am on your side. In this day and age we should not limit ourselves to old habits and their technology. I shoot 4K 30p, but only for my personal use, and I have to restrain myself when I do pans and tilts; other than that, it's 1080 60p.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A
Ram: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970 Pro NVMe, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner
PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

wwaag wrote on 3/3/2016, 10:32 AM
So, when will a 4K 60p camera be available that mere mortals can afford? Given the severe limitations of the current crop of 4K offerings, I can't imagine that it's not being pursued. Even 2.7K 60p, much like the GoPro offers, would be an improvement. Any speculation?

AKA the HappyOtter at https://tools4vegas.com/. System 1: Intel i7-8700k with HD 630 graphics plus an Nvidia RTX 4070 graphics card. System 2: Intel i7-3770k with HD 4000 graphics plus an AMD RX550 graphics card. System 3: Laptop, Dell Inspiron Plus 16, Intel i7-11800H, Intel Graphics. Current cameras include Panasonic FZ2500, GoPro Hero11 and Hero8 Black, plus a myriad of smartphones, pocket cameras, video cameras and film cameras going back to the original Nikon S.

OldSmoke wrote on 3/3/2016, 11:15 AM
It depends on what you consider affordable. The Panasonic AG-DVX200 is certainly obtainable; I am just not a Panasonic fan and am waiting for a comparable Sony product.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A
Ram: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970 Pro NVMe, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner
PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

ddm wrote on 3/3/2016, 11:24 AM
>>>scroll down to behind the scenes, read clapper board

Normally you don't see fps on the slate unless it's a frame rate other than the normal frame rate of the show, and then you put it on a big, bright piece of tape announcing that it's something different, so Post will know this is an FX shot (slo-mo, etc.).

Something more like this slate shot

http://www.latimes.com/entertainment/tv/showtracker/la-ca-0106-hollywood-backlot-downton-abbey_pictures-photogallery.html



riredale wrote on 3/5/2016, 1:26 PM
As for shooting in 24, it's the "film look." People are familiar with the artifacts, so they in effect disappear.

And I'd always learned that "judder" was not the "bip-bip-bip-bip-bip" strobing effect of a fast pan, but rather the "slow-fast-slow-fast" pan effect that results from the 3-2 pulldown when converting 24 fps film to 30 fps video.
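A tiny Python sketch of that cadence (schematic only) shows where the slow-fast rhythm comes from:

```python
# 3-2 pulldown: 4 film frames are spread across 10 video fields (60i),
# so each film frame is held for alternately 3 and 2 field times.
film_frames = ["A", "B", "C", "D"]
cadence = [3, 2, 3, 2]  # fields per film frame

fields = []
for frame, count in zip(film_frames, cadence):
    fields.extend([frame] * count)

print(fields)
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# On a pan, 3/60 s holds alternate with 2/60 s holds: slow-fast-slow-fast.
```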

Thirty years ago 60fps was thought to be the Holy Grail of filmmaking, but like so many other improvements it failed to reach critical mass for universal adoption. And some argued back then that 24 or even slower would be just fine in a future of high-powered digital interpolation. Plug in as many intermediate frames as you want.
farss wrote on 3/5/2016, 2:56 PM
It's not just a question of what you consider affordable; it should also include what you consider "4K".

There's at least one Sony camera that has a "4K" recording capability, but you need to read the specifications carefully. Yes, it will record UHD, but at 8-bit 4:2:0. Switch to 1080 and the camera records 10-bit 4:2:2. Most likely, if you shot HD with this camera and upscaled it to UHD, it would look as good as footage shot in the camera's UHD mode.
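To see the trade-off in rough numbers, here is a back-of-the-envelope Python sketch. It considers only chroma subsampling and bit depth, not codec or sensor differences, so it's an illustration of the spec-sheet arithmetic rather than a claim about either camera:

```python
# Chroma resolution arithmetic for the two recording modes mentioned above.
# 4:2:0 halves chroma both horizontally and vertically; 4:2:2 only horizontally.
modes = {
    "UHD 8-bit 4:2:0": (3840, 2160, 2, 2, 8),   # (w, h, h_sub, v_sub, bits)
    "HD 10-bit 4:2:2": (1920, 1080, 2, 1, 10),
}
for name, (w, h, h_sub, v_sub, bits) in modes.items():
    cw, ch = w // h_sub, h // v_sub
    print(f"{name}: luma {w}x{h}, chroma {cw}x{ch}, {bits} bits/sample")
```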

Regarding frame rate: for sure, the wider field of view that 4K demands calls for higher frame rates. Keep in mind, however, that when the frame rate is doubled, the shutter is open for half as long, i.e. one stop of light is lost.
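A quick worked example (assuming a 180-degree shutter, i.e. the shutter is open for half of each frame):

```python
import math

# Exposure loss vs 24 fps as the frame rate rises, with a 180-degree shutter.
for fps in (24, 30, 60, 120):
    shutter = 1.0 / (2 * fps)                # seconds the shutter is open
    stops = math.log2(shutter / (1.0 / 48))  # relative to 24 fps (1/48 s)
    print(f"{fps:>3} fps: shutter 1/{2 * fps} s, {stops:+.2f} stops vs 24p")
```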

Bob.
relaxvideo wrote on 3/6/2016, 12:43 AM
I also hate 24p, but I've only recorded it with home equipment.
With pro cinema cameras 24p looks beautiful; the strobing is different than with my devices. Yes, I tried using ND filters to lower my shutter speed, but my footage is still almost unwatchable (during fast motion), while cinema films are OK!
I'd like to know the secret :)

#1 Ryzen 5-1600, 16GB DDR4, Nvidia 1660 Super, M2-SSD, Acer freesync monitor

#2 i7-2600, 32GB, Nvidia 1660Ti, SSD for system, M2-SSD for work, 2x4TB hdd, LG 3D monitor +3DTV +3D projectors

Win10 x64, Vegas21 latest

farss wrote on 3/6/2016, 1:32 AM
"I'd like to know the secret :)"

Shutter speed (angle), as you've already noted, is part of it. Other parts are:

Watch your panning and tracking rates.
Lack of "detail", a.k.a. in-camera sharpening. If your camera has settings for these, try turning them down.

Bob.
relaxvideo wrote on 3/6/2016, 1:36 AM
Yes, we tried recording 25p footage at a 1/50 sec shutter with my friend's Panasonic camera, but there was still strobing everywhere. Individual frames have motion blur, but the end result is worse than a 24p movie shot at 1/48. Maybe it's a digital sensor vs film thing?

#1 Ryzen 5-1600, 16GB DDR4, Nvidia 1660 Super, M2-SSD, Acer freesync monitor

#2 i7-2600, 32GB, Nvidia 1660Ti, SSD for system, M2-SSD for work, 2x4TB hdd, LG 3D monitor +3DTV +3D projectors

Win10 x64, Vegas21 latest

OldSmoke wrote on 3/6/2016, 8:50 AM
"Thirty years ago 60fps was thought to be the Holy Grail of filmmaking, but like so many other improvements it failed to reach critical mass for universal adoption."

I think it failed 30 years ago because it was just too costly. Imagine buying almost triple the quantity of film stock for a 60p movie, plus all the subsequent costs. Today is a different story, but only to some degree, as post-processing and special effects at higher frame rates are also more expensive.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A
Ram: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970 Pro NVMe, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner
PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

Chienworks wrote on 3/6/2016, 10:02 AM
""Disable Resample" and change the "Undersample Rate" to .500"

Only the disable is needed; setting the undersample rate is superfluous in this case. All it will do is incur extra CPU cycles.

In effect, disabling resampling accomplishes the same thing as undersampling whenever you're slowing the frame rate down, but it's simpler and faster.
John_Cline wrote on 3/6/2016, 4:04 PM
"Only the disable is needed."

I meant to say "or" and not "and"; nevertheless, you are correct. As you point out, disabling resampling is probably the more universal option, as it also takes care of some of the oddball frame rates that smartphones cough up. For example, 60p on my Samsung Note 4 is not exactly 59.94 fps.