Of course, one of the reasons (though by no means the only one) that the iPhone has been so successful is the quality of its built-in camera. It was certainly one of the features that made me switch from Nokia about 3 years ago after more than 15 years of loyalty to the Finnish brand. So I was interested to read recently that the next iPhone may feature advanced colour-correction methods and promises to be even better than its predecessors. You can read about the story here.
Colour correction is necessary because different cameras use different RGB primaries and because the activation of the RGB sensors when taking an image depends upon the quantity and quality of the ambient illumination. So, for example, if the light was very red, then the R channel of the camera would be more strongly activated than if the light was whiter. However, our visual systems are able to compensate for this, so that most of the time we don’t notice objects changing colour when we move from one room to another or from inside to outside. Colour correction is inspired by human colour constancy and attempts to correct images so that the objects in the scene retain their daylight appearance. However, colour correction is difficult; that is, it is very difficult to get it right all of the time. One frustration I have is taking a photo of my band (I play drums in a covers band) under very colourful lighting. Often the images are very disappointing and lack the intensity of the original scene. That is because human colour constancy is only partial, and under extreme lighting things really do change colour markedly – such as under our intense LED stage lighting. In these cases I think the automatic colour correction is sometimes actually too much, and I have found that I have to modify the images I capture on my Mac to try to recreate what I think the original scene looked like. So auto colour correction – the state of the art – is certainly not perfect. Let’s hope this story about an advance made by Apple is true.
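To make the idea concrete, here is a minimal sketch of one of the simplest colour-correction strategies, the "grey world" assumption: if the average colour of a scene ought to be neutral grey, each channel can be scaled until the channel means are equal, which undoes a red-heavy illuminant. This is my own illustration of the general principle, not Apple's method or any particular camera's pipeline; all the names and numbers below are invented for the example.

```python
import numpy as np

def grey_world_correct(image):
    """Grey-world colour correction: scale R, G and B so that their
    means match the overall mean intensity.

    image : float array of shape (height, width, 3), values in [0, 1].
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)   # mean R, G, B
    gains = channel_means.mean() / channel_means        # per-channel gain
    return np.clip(image * gains, 0.0, 1.0)

# A scene lit by very red light: the R channel is strongly activated.
reddish = np.ones((2, 2, 3)) * np.array([0.8, 0.4, 0.2])
corrected = grey_world_correct(reddish)
```

After correction the three channel means are equal, i.e. the cast has been neutralised. Real cameras use far more sophisticated illuminant estimation, and, as noted above, under extreme stage lighting this kind of full correction can be exactly the "too much" that flattens the scene.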
Another simulator on the market that shows you what your image or website would look like to someone who is colour blind. This one is from a company called ETRE – for further details see http://www.etre.com/tools/colourblindsimulator/
In the image series below the left image is normal and the ones in the middle and right show protanopia and deuteranopia respectively.
For more on colour blindness see my earlier post.
A while ago I posted about whether colour blindness was something that designers should take more seriously. After all, about 8% of all the men in the world are colour blind. Of course, this does not mean that they cannot see colour (the term colour blindness is a bit of a misnomer) but it does mean that they have difficulty discriminating between colours that the rest of us can easily tell apart. In my original post I was referring to the computer game Call of Duty, and whether the gameplay could be degraded for colour-blind players who may have difficulty telling apart the various colour tags that appear on the screen.
So it was quite interesting that I just came across news that the developers of SimCity have added three special colour filters that make adjustments to the colours on screen so that colour blind players can better discriminate. A great idea – but about time!!
Readers may be interested in a new colour-related blog by the SDC’s Chief Executive Graham Clayton. The SDC – the Society of Dyers and Colourists – is the world’s leading independent educational charity dedicated to advancing the science and technology of colour worldwide. It is a professional, chartered Society and becoming a member gives access to SDC’s professional coloration qualifications. I have been a member since about 1982 and I am a Chartered Colourist and a Fellow of the SDC.
I also recently came across another colour blog called chromatic notes. It’s not clear from the web site who runs this blog but there is a great deal of technical information there.
There is now an official MATLAB page for our MATLAB colour book. See here.
Over the summer I was asked to take part in a BBC documentary about the recent discovery of the first colour movie film, which was found at the National Media Museum (Bradford). I met the presenter Antonia Quirke (who was very nice) and we filmed for half a day. In the end only a few minutes of our footage made the final cut. Still, it was nice to be on TV – and BBC1 at that!! For further details see here.
The films were made by a young British photographer and inventor called Edward Turner, a pioneer who can now lay claim to being the father of moving colour film, well before the pioneers of Technicolor.
The footage will be shown to the public from 13 September at the museum in Bradford. And a BBC documentary, The Race for Colour, will be broadcast on 17 September in the Yorkshire and south-east regions on BBC1. I will feature in the film for a minute or two. Exciting.
For further details see the story in the Guardian.
Quite a lot of people are colour blind and have poor colour discrimination. There are tests that can be carried out and these include the Ishihara test (which is a screening test that I certainly remember from school) and the Munsell 100-hue test (where people have to arrange a number of coloured discs in order). These tests need to be performed whilst being viewed in daylight. There are online tests but these are less reliable – partly because the viewing conditions vary such a lot. I recently came across a new online test provided by X-rite. It seems to be based on the 100-hue test (or, at least, something similar) and I can see how it could work, despite being an online test. I just had a go. It gave me a score of 34 and suggested that for my age group (and gender) the best score was 0 (perfect colour acuity) and the worst was 99 (low colour acuity). Hmmmmmmmmm. I have a version of the 100-hue test and I can perform it perfectly. My real score should be 0. I have perfect colour discrimination. So, much as I like the X-rite test, I have not changed my opinion that online tests like these should be used for fun and should be understood not to provide an accurate assessment of your colour vision. On the other hand, I could just be bitter because I only scored 34.
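For readers curious how an arrangement test can produce a numerical score at all, here is a simplified sketch of the scoring idea behind tests of the 100-hue family (this is my own simplification, not X-rite's actual algorithm). Each disc carries a hidden reference number; a disc's score is the sum of the absolute differences between its number and its two neighbours' numbers, and its error is how far that sum exceeds 2, the value it takes in a perfect ordering.

```python
def arrangement_error(order):
    """Total error score for an arrangement test.

    order : list of the hidden reference numbers in the order the
    subject arranged the discs (the end discs are taken as fixed,
    so only interior discs are scored).
    """
    error = 0
    for i in range(1, len(order) - 1):
        # Sum of differences to the two neighbouring discs.
        cap_score = abs(order[i] - order[i - 1]) + abs(order[i] - order[i + 1])
        error += cap_score - 2      # 2 is the score in a perfect ordering
    return error

perfect = arrangement_error([1, 2, 3, 4, 5])    # no transpositions
swapped = arrangement_error([1, 3, 2, 4, 5])    # one pair transposed
```

A perfect arrangement scores 0 and every misplaced disc adds to the total, which is consistent with the 0-is-best scale the X-rite test reports.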
I came across an article in the Daily Mail about a phone app that can be used to measure skin colour. It is produced by Fujitsu. Trying to measure colour using a digital camera is difficult. The RGB values you obtain depend upon the lighting and the settings of the camera and even on the make of the camera. Technically, we say that the RGB values are device-dependent. Fujitsu have got around this by using a mask (see picture) that contains some standard colours that are skin tones. Presumably the app grabs the RGB values of the standard colours and then uses these to make an adjustment to the captured RGB values of the actual skin. It’s very clever and I am impressed.
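One plausible form such an adjustment could take (the article gives no details, so this is a hedged guess, not Fujitsu's actual method) is to fit a 3×3 matrix mapping the captured device-dependent RGB of the mask's reference patches to their known true values, then apply that matrix to the skin pixels. All the numbers below are invented for illustration.

```python
import numpy as np

# Captured RGB of the mask's reference patches under the current lighting,
# and their known true values (illustrative numbers only).
captured = np.array([[0.9, 0.4, 0.3],
                     [0.7, 0.5, 0.4],
                     [0.5, 0.3, 0.2],
                     [0.8, 0.6, 0.5]])
true_vals = np.array([[0.72, 0.48, 0.45],
                      [0.56, 0.60, 0.60],
                      [0.40, 0.36, 0.30],
                      [0.64, 0.72, 0.75]])

# Least-squares fit of a 3x3 matrix M such that captured @ M ~= true_vals.
M, *_ = np.linalg.lstsq(captured, true_vals, rcond=None)

skin_rgb = np.array([[0.85, 0.45, 0.35]])   # captured RGB of the skin
corrected_skin = skin_rgb @ M               # lighting-corrected estimate
```

Because the reference patches are photographed under exactly the same lighting and camera settings as the skin itself, the fitted matrix absorbs those device-dependent effects – which is presumably why the patches are skin tones, so the correction is most accurate in the region of colour space that matters.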
An interesting project, Color Forecast, has been developed by Pedro Cruz and runs feeds from high definition cameras in Milan, Paris and Antwerp to track the colour of fashions worn across those cities. The software analyses the passing colours and shows in real time which colours are worn most often; the colours are then compiled into an infographic to show how trends evolve. For further information see here.
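The core of that kind of analysis can be sketched very simply (this is a toy illustration of the general idea, not Pedro Cruz's actual pipeline; the colour names and bin edges are my own): quantise each pixel's hue into a few coarse named bins and count which bin dominates a frame.

```python
import colorsys
from collections import Counter

# Coarse hue bins as (upper edge, name) pairs; hue runs from 0 to 1,
# wrapping around so red appears at both ends. Edges are illustrative.
HUE_BINS = [(1/12, "red"), (1/6, "orange"), (1/4, "yellow"),
            (1/2, "green"), (3/4, "blue"), (11/12, "magenta")]

def hue_name(r, g, b):
    """Map an RGB pixel (values in [0, 1]) to a coarse colour name."""
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)
    for upper, name in HUE_BINS:
        if h < upper:
            return name
    return "red"        # hues above 11/12 wrap back round to red

def dominant_colour(pixels):
    """pixels : iterable of (r, g, b) tuples from one video frame."""
    counts = Counter(hue_name(*p) for p in pixels)
    return counts.most_common(1)[0][0]

frame = [(0.9, 0.1, 0.1), (0.8, 0.2, 0.1), (0.1, 0.2, 0.9)]
```

Tallying these per-frame winners over hours of footage gives exactly the kind of running "most worn colour" statistic the project visualises; a real system would also have to segment clothing from the background and handle the device-dependent RGB issues discussed above.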