

Professor Graham Finlayson

Research Fellow


Grants awarded

From Pixels to Perception

Scheme: Wolfson Research Merit Awards

Organisation: University of East Anglia

Dates: Oct 2008-Sep 2013

Value: £50,000

Summary: We are now surrounded by images, whether we view them on conventional devices like televisions or on the latest smartphones. Ours is an era of ubiquitous image content. Modern digital devices, however, have the advantage that images can be processed prior to display. The most important processing step is the one that makes images more pleasing (where pleasing might mean 'attractive', 'vivid', 'preferred' or 'cinematic'). Indeed, it is truly shocking to look at the raw image recorded by a camera and see how poor it looks compared to the final reproduction. Yet even the pictures we see on our medium of choice can often be further improved by hand (hence the prevalence of applications such as iPhoto and Photoshop, which sell in their tens of millions).

Making pictures look right requires an in-depth knowledge of image processing, computer algorithms and the human visual system. In the 'Pixels to Perception' project, I aim to advance the state of the art in the understanding of our own color vision so that we can develop automatic algorithms which produce better-looking images. Viewed critically, many images have a wash of color that did not appear to be in the original scene. In fact, scene colors depend as much on the color of the light as on the color of objects. Yet we do not see the color cast, thanks to our own remarkable color constancy processing. I am investigating how these mechanisms work so that we can improve color constancy processing in digital devices.
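To illustrate the kind of processing the summary describes, below is a minimal sketch of one classic color-constancy heuristic, the gray-world assumption (the average color of a scene is assumed to be achromatic, so any deviation in the channel means is attributed to the illuminant and divided out). This is a standard textbook baseline, not the method developed in this project; the function name and array layout are illustrative assumptions.

```python
import numpy as np

def gray_world_balance(image):
    """Remove a global color cast under the gray-world assumption.

    `image` is an H x W x 3 float array with values in [0, 1].
    Each channel is scaled so its mean matches the overall mean,
    which maps an average-gray scene back to neutral.
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)  # per-channel average
    gray = channel_means.mean()                        # target neutral level
    gains = gray / channel_means                       # per-channel correction
    return np.clip(image * gains, 0.0, 1.0)

# Example: a scene rendered under a warm (reddish) illuminant.
rng = np.random.default_rng(0)
scene = rng.uniform(0.2, 0.8, size=(16, 16, 3))
cast = scene * np.array([1.0, 0.8, 0.6])   # simulated illuminant cast
balanced = gray_world_balance(cast)
```

Gray-world fails when the scene is genuinely dominated by one color (a field of grass, for instance), which is precisely why richer models of human color constancy, of the kind this project pursues, are needed.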
