Daily Bulletin

  • Written by Adrian Dyer, Associate Professor, RMIT University

When you step outside your house, the sunlight changes the apparent colour of everything around you. You probably haven’t even noticed, because your brain compensates so that you see colours as much the same under most conditions.

But it’s a phenomenon that is incredibly hard to mimic with technology, and so our cameras often struggle to interpret colour with the same accuracy.

Published today, our latest research identifies how the eyes and brains of honeybees work together to process colour information.

For a honeybee, matching colours under all conditions is vital to visiting the right flowers. It’s literally a matter of life or death for the colony.

If we can design technology to mimic the way bees do this, we’ll be able to create better cameras and image-processing systems for drones and robots.

Bees can help us design better camera technology.

What is colour constancy?

Being able to see an object as the same colour – the “true” colour – independent of illumination is called colour constancy. This visual phenomenon fascinated the impressionist Monet, who painted the Rouen Cathedral in France in different weather conditions, noting the changes in the stone’s colour.

Image: Monet’s Series Paintings, in which he painted many views of the same subject under different lighting conditions, are an attempt to illustrate the importance of light in our perception of a subject at a given time and place. Wikimedia Commons

Many theories of colour constancy rely on the assumption that, averaged over a whole scene, colours should come out grey. From that average, the approximate colour of the illuminating light can be estimated, whether by an animal or by a camera system. However, these approaches, and even more complex methods, are imperfect.
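
For readers who like to see the idea in code, here is a minimal sketch of a grey-world white balance in Python with NumPy. The function name and the choice to scale every channel towards the overall mean are illustrative assumptions, not details of the study or of any particular camera.

import numpy as np

def grey_world_balance(image: np.ndarray) -> np.ndarray:
    # "image" is assumed to be an RGB array of shape (height, width, 3)
    # with values between 0 and 1.
    channel_means = image.reshape(-1, 3).mean(axis=0)  # estimated illuminant colour
    grey = channel_means.mean()                        # target grey level
    gains = grey / channel_means                       # per-channel correction
    return np.clip(image * gains, 0.0, 1.0)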

Despite our large and complex brains, colour constancy is not perfect even in humans. How do bees solve the problem with a brain the size of a sesame seed? They must be able to identify the same flowers in direct sunlight, in shade, or under the green light of a forest.

Like most people, bees have trichromatic (three-colour) vision. For humans, the primary colours we perceive are red, green and blue, with wavelengths (measured in nanometres, nm) of around 570 nm, 540 nm and 430 nm. The forward-facing compound eyes of bees respond most strongly to ultraviolet (350 nm), blue (440 nm) and green (540 nm) light.

In addition to their compound eyes used for sight, bees have three tiny “eyes” called ocelli. Until now we didn’t really know what ocelli were for – they had been assumed to aid flight stability. But we also know that each of the ocelli contains two colour sensors pointing up, sensing the whole sky. This study focused on those sensors.

A model of bee vision

It is relatively easy to measure the colour of the sky in the laboratory with a device called a spectrophotometer, but bees don’t have access to lab-based equipment.

We asked: could a system containing just two colour receptors give bees enough information to interpret the whole spectrum, from red through to ultraviolet? We built a mathematical model to find out. The model showed that ocellar sensitivities centred on 330 nm and 470 nm (ultraviolet and blue) are optimal for the typical illumination conditions bees encounter while foraging.
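
The full model is set out in the paper; the Python sketch below only illustrates the general idea, under illustrative assumptions – Gaussian receptor sensitivities and a simple von Kries-style normalisation – that stand in for the details of the published model.

import numpy as np

WAVELENGTHS = np.arange(300, 701, 5)  # nanometres

def sensitivity(peak_nm, width_nm=40.0):
    # Assumed Gaussian approximation of a photoreceptor's spectral sensitivity.
    return np.exp(-0.5 * ((WAVELENGTHS - peak_nm) / width_nm) ** 2)

OCELLI = [sensitivity(330), sensitivity(470)]    # skyward-facing ocellar sensors
EYE = [sensitivity(p) for p in (350, 440, 540)]  # compound-eye receptors

def corrected_signals(flower_reflectance, illuminant):
    # Raw receptor responses to light reflected from the flower.
    raw = np.array([np.trapz(s * flower_reflectance * illuminant, WAVELENGTHS)
                    for s in EYE])
    # The two ocellar sensors sample the illumination directly from the sky.
    sky = [np.trapz(o * illuminant, WAVELENGTHS) for o in OCELLI]
    # Crude per-receptor illuminant estimate, weighting the two sky readings
    # by spectral overlap (an assumption made purely for illustration).
    est = np.array([sum(r * np.trapz(s * o, WAVELENGTHS) for r, o in zip(sky, OCELLI))
                    for s in EYE])
    # Von Kries-style normalisation: divide out the estimated illuminant.
    return raw / est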

Image: Whether in shade or full sun, bees must be able to reliably use colour to pick out flowers worth visiting. UpPiJ/shutterstock

We then applied our model to a database of 111 flower colours and compared how much each flower’s apparent colour changed under natural variations in daylight. When the ocellar measurements were combined with the compound-eye signals, the model significantly reduced the variation in perceived colour – showing, in theory, how bees could pick out the right flower most of the time.
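
The comparison itself can be sketched in the same spirit, reusing the functions from the sketch above. The spread measure here is a placeholder rather than the metric used in the paper, and the flower reflectances and daylight spectra would come from published measurements.

def colour_spread(flowers, illuminants, use_ocelli=True):
    # How much does each flower's colour signal vary across illuminations?
    spreads = []
    for reflectance in flowers:
        if use_ocelli:
            signals = [corrected_signals(reflectance, ill) for ill in illuminants]
        else:
            signals = [np.array([np.trapz(s * reflectance * ill, WAVELENGTHS)
                                 for s in EYE]) for ill in illuminants]
        spreads.append(np.array(signals).std(axis=0).mean())
    # A lower average spread means more stable perceived colour across daylight.
    return float(np.mean(spreads))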

This mathematical model shows that honeybees could use this information – but it doesn’t show how they do it in reality. To find out, our colleague Yu-Shan Hung (University of Melbourne) carefully stained and traced the neural pathways from bee ocelli, and found that these connected with a region of the brain that performs high-level colour processing.

As well as being consistent with these newly traced connections between the ocelli and the brain, our model explains certain bee behaviours. Honeybees have difficulty foraging under artificial light – our model shows that such light is “too yellowish” for the ocelli to properly interpret colours.

Also, honeybees with blocked ocelli typically start foraging later in the morning and return to the hive earlier. This makes sense: without information from the ocelli, they can’t correct for the highly variable colour of light at the start and end of the day.

Designing artificial visual systems and cameras

Evolution has spent millions of years finding clever solutions to practical problems. Understanding how insects solve these tasks is important because it can hint at ways to avoid complicated solutions that require sophisticated, energy-hungry computers.

Right now, we capture images with cameras and then rely on intensive computer processing to interpret and present the colours in ways that make sense to our brains. If cameras worked more like the bee’s visual system, we could capture images in a way that mimics nature more closely, and the demands of processing the image would be reduced.

Beyond better photos, there are implications for industry too. Machine vision systems working in complex natural environments must reliably find coloured objects – mineral-rich sands, ripe fruit, or defects in buildings – so new, reliable and efficient solutions to the colour constancy problem are vital.

Image: With better cameras, drones could check orchards for ripe fruit. Catalin Petolea/shutterstock

The honeybee shows that two very simple, small and spectrally different sensors are enough to reconstruct true colour under most illumination conditions found on Earth. Similar sensors could be cheaply added to the skyward-facing surface of a device such as a camera to efficiently provide information about the illumination. This would drastically reduce the computational cost of colour constancy.

Author: Adrian Dyer, Associate Professor, RMIT University

Read more http://theconversation.com/want-a-better-camera-just-copy-bees-and-their-extra-light-sensing-eyes-80385
