Thread: Tech Talk
Old 12-18-2009, 04:09 PM   #77
Just Me
Banned
 
Join Date: Oct 2009
Location: Phoenix, Arizona
Posts: 94

Quote:
Originally Posted by nartkd0924
OK, Full HD is not only screen resolution but frame rate as well: there are 1080p24, 1080p30, 1080p50, 1080p60, and 1080p100, and different countries use different standards. The number is the frame rate, i.e., how many times the picture is refreshed per second, so 1080p24 means a 1920x1080 screen resolution refreshed 24 times per second. In movies or TV there's not much difference in quality; the main point of 1080p50 and higher is PC gaming, where 50 FPS means less lag.
Frame rate has NOTHING to do with HD at all. A film (not digital) camera can capture at many frame rates, and film is not HD. Cinema frame rates are usually 24 frames per second, and cinema is not HD either.
So frame rate has absolutely nothing to do with whether a video capture is High Definition.

HD is about how much pixel density and latitude an image sensor can capture.
720p was an HD standard back in 2003. At present, 1080p is Full HD, and 1080i is not considered full HD by today's standards.
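To put rough numbers on that (just my own width-times-height arithmetic, in Python for illustration):

Code:
# Rough pixel-count comparison between SD and the HD resolutions
# mentioned above. Pure width x height arithmetic.
resolutions = {
    "480 (SD DVD)": (720, 480),
    "720p": (1280, 720),
    "1080p (Full HD)": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels per frame")

# 1080p carries 2.25x the pixels of 720p and about 6x a DVD frame,
# which is the "more information in the same screen size" point.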

Again... p stands for progressive, which means true lines of resolution: every line of the frame is drawn in each pass.
i stands for interlaced, which means each frame is split into two alternating fields of odd and even lines, so only half the vertical resolution is on screen at any instant; reconstructing the missing lines is, in theory, similar to upscaling.
For example, consumer DVD players can upscale a DVD from 480 or 720 lines up to 1080 for the new standard widescreen TVs. An upscaled signal still looks like crap on a full HD TV.
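Here's a toy sketch of what interlacing actually does to a frame (numpy, with numbers picked purely for illustration):

Code:
import numpy as np

# Toy illustration of interlaced vs. progressive frames.
# A progressive (p) frame carries every line; an interlaced (i)
# signal sends the even and odd lines as two separate fields.
height, width = 1080, 1920
frame = np.random.randint(0, 256, (height, width), dtype=np.uint8)

top_field = frame[0::2]      # even-numbered lines (540 of 1080)
bottom_field = frame[1::2]   # odd-numbered lines (540 of 1080)

# Each field holds only half the vertical resolution, so at any
# instant a 1080i display shows just 540 "true" lines.
print(top_field.shape, bottom_field.shape)   # (540, 1920) twice

# A naive "weave" deinterlace puts the fields back together; when
# there is motion between fields, this is where combing comes from.
woven = np.empty_like(frame)
woven[0::2] = top_field
woven[1::2] = bottom_field
assert np.array_equal(woven, frame)  # static content reassembles cleanly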

If you have a full HD TV, such as my Samsung 55" 1080p 240 Hz LED, you can in fact see the difference between all the resolutions clearly; otherwise there would be no need for Sony to invent Blu-ray to deliver full 1080p, which broadcast stations cannot do at this time. Blu-ray runs 20 Mbps to 45 Mbps, which is way too high for the equipment they currently have for broadcasting HD signals.
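Quick math on what those bitrates mean in practice (my own arithmetic; the roughly 19 Mbps ATSC channel cap is from memory, so treat it as approximate):

Code:
# What a given video bitrate costs in storage per hour of footage.
# Pure arithmetic; the Blu-ray figures are the ones quoted above.
def gb_per_hour(mbps: float) -> float:
    """Convert megabits per second into gigabytes per hour."""
    return mbps * 3600 / 8 / 1000  # Mb/s to MB/s to GB/h

for label, mbps in [("Blu-ray (low)", 20),
                    ("Blu-ray (high)", 45),
                    ("ATSC broadcast channel (approx. cap)", 19)]:
    print(f"{label}: {mbps} Mbps is about {gb_per_hour(mbps):.1f} GB/hour")

# A 45 Mbps Blu-ray stream simply does not fit in a ~19 Mbps
# broadcast channel, which is the point being made above.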

There is a major difference between all the mentioned resolutions; you just need a TV that can display full 1080p to be able to see it.

Refresh rate has nothing to do with frame rate... Frame rate is how many pictures pass through the glass per second. For example, when shooting movies on film, the frame rate is how many frames of the film strip move past the projection or capture light per second.
The higher the frame rate, the higher the quality: higher frame rates capture more motion information and detail.
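As a back-of-the-envelope illustration of "more frames = more information" (uncompressed numbers, 24-bit color assumed):

Code:
# Raw (uncompressed) data rate at different frame rates for the
# same 1920x1080 frame size, assuming 24 bits per pixel.
width, height, bits_per_pixel = 1920, 1080, 24

for fps in (24, 30, 60):
    mbps = width * height * bits_per_pixel * fps / 1e6
    print(f"1080p{fps}: about {mbps:,.0f} Mbit/s uncompressed")

# 1080p60 pushes 2.5x the raw data of 1080p24: more frames per
# second means more captured motion detail before any compression.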

Refresh rate is what graphics cards and PC monitors do, hence a 120 Hz refresh rate.
Each country's standard follows its rated power supply: the UK is 50 Hz, the USA is 60 Hz.
So to make it all work properly, the frame rates differ with the power frequency:
60 Hz regions get 60p, 30p, 24p, 60i, etc.
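That's also why 24p film still plays on a 60 Hz set: the player repeats frames in a 3:2 pattern. A rough sketch of that cadence (my own illustration, not any player's actual code):

Code:
# 3:2 pulldown: how 24 film frames per second fill 60 fields per
# second on a 60 Hz display. Frames alternate between being held
# for 3 fields and 2 fields, since 24 * (3 + 2) / 2 = 60.
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeats)
    return fields

one_second_of_film = list(range(24))   # frame numbers 0..23
fields = pulldown_32(one_second_of_film)
print(len(fields))    # 60 fields, matching a 60 Hz display
print(fields[:10])    # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]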

I cannot shoot 50i and display it properly in the United States, since I am on a faster electrical system at 60 Hz. 50i is PAL, rated for the UK's 50 Hz.

Again, screen size has nothing to do with what HD actually is.
High Definition is how much information is within that given screen size.
I have two full HD video cameras: a Sony EX1, a three-chip professional video camera, and a Canon 5D MkII DSLR, which captures full HD on a single chip. Both cameras shoot 1920x1080p.
However, the Sony EX1 splits the image across three chips, one per color channel, rather than relying on just one, so it can capture higher-quality video. The EX1 records more colors, more shades, and far more information with its three dedicated chips than the 5D MkII can see with its single chip. The Sony captures at 35 Mbps; the Canon 5D MkII captures at 15 to 18 Mbps.
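Using the bitrates quoted above, here's what that means for recording time (approximate; a 16 GB card is just an example size, and audio and overhead are ignored):

Code:
# Minutes of footage per memory card at the camera bitrates
# quoted above. Approximate: ignores audio and filesystem overhead.
def minutes_on_card(card_gb: float, mbps: float) -> float:
    card_megabits = card_gb * 1000 * 8    # GB to megabits
    return card_megabits / mbps / 60      # seconds to minutes

for cam, mbps in [("Sony EX1 at 35 Mbps", 35),
                  ("Canon 5D MkII at ~17 Mbps", 17)]:
    print(f"{cam}: about {minutes_on_card(16, mbps):.0f} min per 16 GB")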

Rob delivers 8,000 kbps on FTV, which is nowhere near what our cameras actually capture before the footage is compressed for the web.
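Taking those two figures at face value, the squeeze looks like this:

Code:
# How much the web stream is squeezed relative to the camera master,
# using the two bitrates mentioned above.
camera_mbps = 35.0          # Sony EX1 acquisition bitrate
web_kbps = 8000.0           # FTV delivery rate
web_mbps = web_kbps / 1000

print(f"web stream keeps about {web_mbps / camera_mbps:.0%} "
      f"of the camera bitrate")   # roughly 23%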

Last edited by Just Me; 12-18-2009 at 04:45 PM.