Our test scene is at the core of our camera testing, designed to allow like-for-like comparisons between cameras. Here we explain how the tests are conducted and why we work this way. In the coming weeks, we will publish a video explaining what we look for in the scene and where we look.
Our studio test scene is used to give a consistent and reproducible means of comparing camera output. We have a well-established testing methodology designed to shed light on the performance differences of camera sensors and the results of their JPEG engines. All analyses are cross-checked against our real-world experiences.
To enable comparison between cameras with different pixel counts, we offer a ‘Compare’ mode that resizes every image to the largest resolution shared by all the selected cameras – in other words, the lowest pixel count among them.
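The matching logic can be sketched in a few lines. This is a simplified illustration of the idea, not our actual implementation; the camera names and pixel counts are hypothetical:

```python
# Sketch of the 'Compare' mode size matching: every image is resampled
# to the largest resolution all selected cameras can supply, which is
# the lowest pixel count among them.

def common_comparison_size(resolutions):
    """Return the (width, height) with the fewest pixels - the largest
    size every selected camera can deliver natively."""
    return min(resolutions, key=lambda wh: wh[0] * wh[1])

# Hypothetical selections (names and resolutions are illustrative):
selected = {
    "camera_a": (8192, 5464),   # ~45 MP
    "camera_b": (6000, 4000),   # 24 MP
    "camera_c": (9504, 6336),   # ~60 MP
}

target = common_comparison_size(selected.values())
print(target)  # (6000, 4000): the higher-resolution images are downsampled
```

Downsampling the larger files, rather than upsampling the smaller ones, avoids inventing detail that the lower-resolution cameras never captured.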
Lenses and focal lengths
Interchangeable lens cameras are shot using prime lenses that offer around 85mm equivalent field-of-view – a decision that stems from our historical use of each brand’s 50mm lenses on APS-C, which are generally very sharp and consistent across the frame when stopped down a little. The aim is to remove, as much as possible, the impact of the lens. Our testing has shown the use of dedicated own-brand primes to be more reliable than using multiple copies of third-party lenses.
Compact cameras are test-shot across a range of focal lengths and apertures. We then choose the focal length closest to 85mm equivalent that offers sharpness and across-frame consistency that fairly represents the lens performance as a whole. Unlike with our ILC tests, any would-be buyer will have to use the built-in lens, so we aim to include, rather than remove, the lens performance. As such, we do not cherry-pick the best performance if it’s unrepresentative of the rest of the lens’s performance, nor do we rigidly use the 85mm equivalent setting if it’s uncharacteristically poor.
JPEG images are exposed assuming that most users will rely either on their camera’s meter or on the histogram and, as such, are shot using whatever shutter speed is required to give correctly exposed middle grey values. White balance is set manually for the daylight scene, and low light is shot using the default Auto White Balance setting, to show the degree to which the camera tries to correct a very orange light source.
Cameras are mounted securely on a macro rail on a heavily weighted-down tripod, to minimize external vibrations. Self-timer and any available anti-shock modes are also employed to minimize the impact of shutter shock.
Raw images are shot using set combinations of shutter speeds and apertures to allow the assessment of sensor performance on a common basis (so at any given ISO, all cameras will receive the same amount of light). At higher ISOs, we reduce the illumination of the scene by up to two stops if a camera doesn’t offer sufficiently fast shutter speeds to allow correct exposure. If this still isn’t sufficient, we then stop down the camera’s aperture, again ensuring that the net effect of illumination, shutter speed and aperture is consistent across cameras.
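The equivalence bookkeeping above can be expressed as a simple stop count: the light reaching the sensor scales with the scene illumination and the shutter time, and falls by two stops for each doubling of the f-number. A minimal sketch of the idea; the EV level, shutter speeds and f-numbers below are illustrative assumptions, not our actual test settings:

```python
import math

def exposure_stops(scene_ev, shutter_s, f_number):
    """Relative sensor exposure in stops: scene illumination (in EV)
    plus log2 of the shutter time, minus two stops per doubling of
    the f-number. Only differences between setups are meaningful."""
    return scene_ev + math.log2(shutter_s) - 2 * math.log2(f_number)

# Hypothetical standard setup at a given ISO: 10 EV scene, 1/4000 s, f/4.
standard = exposure_stops(10, 1 / 4000, 4.0)

# A camera whose fastest shutter is 1/1000 s (two stops slower): dim the
# lights by two stops and the net exposure is unchanged.
dimmed = exposure_stops(10 - 2, 1 / 1000, 4.0)

# If that still isn't enough (fastest shutter 1/250 s), also stop the
# aperture down two stops, from f/4 to f/8.
stopped_down = exposure_stops(10 - 2, 1 / 250, 8.0)

assert math.isclose(standard, dimmed)
assert math.isclose(standard, stopped_down)
```

Because only differences in stops matter, any combination of dimming, shutter speed and aperture that sums to zero stops of change leaves the sensor receiving the same amount of light.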
These files are processed using Adobe Camera Raw with noise reduction minimized and with shadows brightened to reveal the differences in shadow performance. All Raw images are white balanced during processing.
How can I check which settings you used?
All relevant shooting settings can be viewed by clicking the [i] icon at the lower right of each comparison window. If the [i] is illuminated in yellow, then some aspect of that particular shot is considered non-standard in such a way that it is not 100% comparable with other images. The cause of this inconsistency should be noted in the information tab if you click on the [i] icon.
We offer two lighting conditions: a ‘Daylight’ mode that is illuminated to 10 EV using daylight-balanced Kino Flo RF55 lamps, and a low-light mode lit by a 25W tungsten incandescent light bulb.
Like all measurement processes, ours has sources of variation (error), including differences in chart alignment, focus and lens performance over time. While we have done everything possible to minimize the impact of these errors (including using a large, easy-to-align chart, careful manual focusing and reserving specific copies of lenses solely for studio testing), it is impossible to eliminate experimental error altogether.
Our comparison tool is sensitive enough to reveal differences that fall within the bounds of even well-controlled error, so we trust our readers not to read too much into very slight differences in apparent performance.
Overall, the aim of the test scene is to provide fair, consistent and comparable images across every camera that comes through our test studio. We endeavor to maintain the highest possible standards and are happy to discuss and investigate any apparent inconsistencies raised by personal message or feedback email.