Animals see the world in different colors than humans – new camera reveals what it looks like

By Vera Vasas and Daniel Hanley | January 25, 2024

Visualizing the colors that birds perceive reveals ultraviolet patterns that are often hidden from us. Vasas et al. (2024) PLOS Biology, Author provided

If you’ve ever wanted to see the world through another animal’s eyes, we have good news for you. We wondered about this, too, and as scientists who specialize in color vision, we built a solution: a camera system and software package that lets you record videos in the colors other animals see.

Many animals, including bees, birds, and even mammals such as reindeer and mice, can detect ultraviolet light. In fact, lack of UV sensitivity in humans is the exception rather than the rule. At the other end of the visible light spectrum, human eyes contain red-sensitive receptors, while many animals, including bees, mice and dogs, are as blind to red as we are to ultraviolet light.

Even in the case of blues and greens, colors that are perceived widely across the animal kingdom, the exact wavelength of light an animal perceives as “pure blue” or “pure green” is species-specific. As a result, no two species see the world in the same colors.

We invite you to look up at the sky and appreciate that its blueness is the joint product of sunlight scattered through the atmosphere and your own sensory system. The color you see is unique to you; in fact, for many animals the sky is ultraviolet-colored.

Now slowly lower your eyes and try to imagine what the rest of the landscape might look like to other species. With our new camera system, we are one step closer to understanding this wonderful, strange world that other animals live in.

Capturing the world in motion

While we can’t possibly imagine what ultraviolet light looks like to animals that can detect it, we can visualize it using false-color images. For example, for honeybees that are sensitive to three types of light (ultraviolet, blue, and green), we can shift their perceptible colors into the human-visible range, with ultraviolet represented as blue, blue becoming green, and green becoming red.
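
To make that channel shift concrete, here is a minimal sketch in Python, assuming the bee-visible scene is already available as separate ultraviolet, blue, and green images with values between 0 and 1 (the function name and channel layout are our illustration, not the published software):

import numpy as np

def bee_false_color(uv, blue, green):
    """Display a honeybee's three channels in human-visible colors:
    ultraviolet is shown as blue, blue as green, and green as red."""
    # Stack in R, G, B display order: green -> red, blue -> green, UV -> blue.
    rgb = np.stack([green, blue, uv], axis=-1)
    return np.clip(rgb, 0.0, 1.0)

Feeding the result to any ordinary image viewer then shows the bee-view scene in colors we can perceive.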

Until now, we could only apply this process to motionless objects. False-color photography relies on taking a series of photographs through a succession of optical filters and then superimposing them, and this sequential method means that everything must stay in exactly the same position across all of the photographs.

A large butterfly in shades of fluorescent orange and purple contrasts with a smaller yellow version of the same image.

This is a serious disadvantage. It makes for a laborious process that limits the range of objects that can realistically be captured. For example, photographing an iridescent peacock feather from hundreds of different angles means repeating the filter sequence hundreds of times.

Worse, all information about movement is discarded. Yet the living world is constantly in motion: trees sway in the wind, leaves flutter, and birds hop along branches in search of the insects wandering through the bushes. We needed a way to visualize all of this movement.

The first challenge was to design a camera that records in ultraviolet and visible light simultaneously. The solution turned out to be a beam splitter. This special optical equipment reflects ultraviolet light as if it were a mirror, but allows visible light to pass through, just like clear glass.

We placed two cameras (nothing fancy, the same kind you can buy in stores and online, but one modified for ultraviolet recording) into a 3D-printed housing: the modified camera received the reflected ultraviolet light, while the stock camera received the transmitted visible light. We overlaid and synchronized the recordings from these two cameras, and a series of conversion steps allowed us to calculate the amount of light reaching each camera’s sensors.
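
For readers curious how such an overlay might look in software, the sketch below uses OpenCV to warp each ultraviolet frame into the visible camera’s pixel coordinates, assuming the rig is rigid and a set of matching calibration points has been measured once (uv_pts and vis_pts are hypothetical inputs, not part of the published pipeline):

import cv2
import numpy as np

def align_uv_to_visible(uv_frame, uv_pts, vis_pts, vis_shape):
    """Warp a UV frame onto the visible camera's image plane.

    uv_pts, vis_pts: N x 2 arrays of matching pixel coordinates, measured once
    from a calibration target seen by both cameras (hypothetical inputs).
    """
    # Estimate the fixed geometric mapping between the two sensors.
    H, _ = cv2.findHomography(np.asarray(uv_pts, dtype=np.float32),
                              np.asarray(vis_pts, dtype=np.float32),
                              cv2.RANSAC)
    height, width = vis_shape[:2]
    # Resample the UV frame so its pixels line up with the visible frame.
    return cv2.warpPerspective(uv_frame, H, (width, height))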

From this, we could estimate the amount of light an animal’s eyes would capture if the animal were viewing the scene from our camera’s point of view.
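
As an illustration of that last step, here is a minimal sketch assuming the camera responses have already been linearized and aligned, and that a matrix mapping the four camera channels (ultraviolet, blue, green, red) onto an animal’s photoreceptor sensitivities has been fitted from spectral calibration data; the numbers below are placeholders, not the published calibration:

import numpy as np

# Placeholder mapping: rows are hypothetical honeybee receptors (UV, blue, green);
# columns are camera channels (UV, B, G, R). Real values come from calibrating
# the cameras' spectral sensitivities against the animal's.
CAMERA_TO_BEE = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.0, 0.8, 0.2, 0.0],
    [0.0, 0.1, 0.7, 0.2],
])

def estimate_receptor_catches(camera_pixels):
    """Map linearized camera responses of shape (..., 4) to estimated
    photoreceptor catches of shape (..., 3)."""
    return np.asarray(camera_pixels) @ CAMERA_TO_BEE.T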

Try it yourself

We’ve made all the code needed to carry out the video conversions, along with the plans for the camera system, freely available online, and we’ve done our best to explain how to build the camera from scratch.

Our goal is for other researchers to build their own cameras and use them to answer their own questions about how other species see the world. There are many possibilities.

We can record peacocks dancing and see how dazzling their feathers look to the peahens they are displaying to. The iridescence of these feathers extends into the ultraviolet; our recordings show that the feathers appear far more colorful to their intended audience than they do to us.

We can describe exactly how caterpillars’ startle displays appear to the birds that prey on them, and ask why a sudden flash of unexpected, colorful patterns scares the birds away. We can ask how animals move between spots on the forest floor to show off or conceal their colors.

We can also film butterflies and other insects held in museum collections and offer animal-view conversions as part of their digital records. And we can check whether glass facades are visible enough to the birds that might otherwise fly into them.

The most exciting questions, though, will be the ones we haven’t thought of yet. Now that we’ve started filming the natural world in the colors animals see, we’re beginning to realize just how much information is out there. Discoveries await in your own backyard.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Daniel Hanley received funding from the National Geographic Society.

Vera Vasas does not work for, consult, own shares in, or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond her academic duties.
