Resolution is the visible detail in an image. Since the pixel is the smallest unit of information in a digital image, it would seem that comparing pixel counts is a good way to compare relative resolution.
Film is analog, so it has no real “pixels.” However, based on converted measurements, a 35mm frame holds the equivalent of 3 to 12 million pixels, depending on the stock, the lens, and the shooting conditions. An HD frame has about 2 million pixels (1920 x 1080). By this measure, 35mm appears vastly superior to HD.
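The pixel counts above can be checked with quick arithmetic. The 35mm figure below assumes a 4096 x 3112 full-aperture 4K scan, which is one common digitization size used here only for illustration:

```python
# Full HD raster: 1920 columns x 1080 scan lines.
hd_pixels = 1920 * 1080
print(f"HD frame:  {hd_pixels:,} pixels")    # about 2 million

# One common full-aperture 4K scan of a 35mm frame (assumed size).
film_pixels = 4096 * 3112
print(f"35mm scan: {film_pixels:,} pixels")  # about 12.7 million
```

This is where the oft-quoted "2 million versus 12 million" gap comes from; lower-resolution scans and grainier stocks land nearer the 3-million end of the range.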
This is the argument most film purists use. The truth is, pixels are not the right way to compare resolution. Beyond a short viewing distance, the human eye cannot resolve individual pixels. What we can see are lines.
Consequently, manufacturers measure the sharpness of photographic images and components using a parameter called Modulation Transfer Function (MTF). This process uses lines (not pixels) as a basis for comparison.
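A sketch of the idea behind MTF may help. At a given spatial frequency (line pairs per millimeter), the contrast of a test pattern is measured before and after it passes through the lens or film; MTF is the ratio of the two. The luminance readings below are hypothetical numbers chosen for illustration:

```python
def modulation(l_max, l_min):
    """Contrast of a line pattern: (max - min) / (max + min)."""
    return (l_max - l_min) / (l_max + l_min)

def mtf(out_max, out_min, in_max, in_min):
    """MTF at one spatial frequency: output modulation over input modulation."""
    return modulation(out_max, out_min) / modulation(in_max, in_min)

# Hypothetical chart readings: the target has contrast 0.90,
# the recorded image retains contrast 0.54.
print(round(mtf(154, 46, 190, 10), 2))  # prints 0.6
```

An MTF of 1.0 means the lines were reproduced at full contrast; as the lines get finer, MTF falls toward zero, which is why sharpness is quoted as a curve rather than a single number.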
Since MTF is an industry standard, we will maintain that standard for comparing HD with 35mm film. In other words, we will make the comparison using lines rather than pixels. Scan lines are also how video images are compared, so the approach makes sense from that standpoint as well.
HD Resolution
As discussed previously, standard definition and high definition refer to the number of scan lines in the video image. Standard definition is 525 horizontal lines for NTSC and 625 lines for PAL.
Technically, anything that exceeds the PAL maximum of 625 lines could be called high definition. The most common HD formats are 720p and 1080i.
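The scan-line names correspond to full rasters, which is where the pixel counts come from. A quick tabulation (the widths assume the standard 16:9 HD rasters):

```python
# HD raster sizes behind the scan-line names.
formats = {
    "720p":  (1280, 720),
    "1080i": (1920, 1080),
}
for name, (w, h) in formats.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")
```

Note that 1080i delivers its 1080 lines as two interlaced fields of 540 lines each, so both formats refresh a comparable amount of picture information per second.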
35mm Resolution
There is an international study on this issue, called Image Resolution of 35mm Film in