Cosmic Dream Blog



Photo by Greg Rakozy on Unsplash



A person's worth is contingent upon who he is, not upon what he does, or how much he has. The worth of a person, or a thing, or an idea, is in being, not in doing, not in having.
Author: Alice Mary Hilton


On Jan. 15, 2023 by Admin

Artificial vision systems find a wide range of applications, including self-driving cars, object detection, crop monitoring, and smart cameras. Such systems are often inspired by the vision of biological organisms: human and insect eyes have inspired terrestrial artificial vision, while fish eyes have inspired aquatic artificial vision. Despite remarkable progress, current artificial vision systems have some key limitations: they cannot image both land and underwater environments, and they are limited to a hemispherical (180°) field of view (FOV).

To overcome these issues, a group of researchers from Korea and the USA, including Professor Young Min Song of the Gwangju Institute of Science and Technology in Korea, has now designed a novel artificial vision system with omnidirectional imaging ability that works in both aquatic and terrestrial environments. Their study was published in Nature Electronics on 11 July 2022 and made available online on 12 July 2022.

"Research in bio-inspired vision often results in a novel development that did not exist before. This, in turn, enables a deeper understanding of nature and ensures that the developed imaging device is both structurally and functionally effective," says Prof. Song, explaining the motivation behind the study.

The inspiration for the system came from the fiddler crab (Uca arcuata), a semiterrestrial crab species with amphibious imaging ability and a 360° FOV. These remarkable features result from the ellipsoidal eye stalk of the fiddler crab's compound eyes, enabling panoramic imaging, and flat corneas with a graded refractive index profile, allowing for amphibious imaging.

Accordingly, the researchers developed a vision system consisting of an array of flat micro-lenses with a graded refractive index profile that was integrated into a flexible comb-shaped silicon photodiode array and then mounted onto a spherical structure. The graded refractive index and the flat surface of the micro-lens were optimized to offset the defocusing effects due to changes in the external environment. Put simply, light rays traveling in different mediums (corresponding to different refractive indices) were made to focus at the same spot.
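The defocusing problem that the flat, graded-index micro-lenses address can be seen from basic thin-lens math: a conventional lens focuses at a very different distance once the surrounding medium changes from air to water. A minimal sketch of that effect, using the lensmaker's equation with assumed example values for the radius and refractive indices (not the paper's actual lens parameters):

```python
# Illustrative only: focal length of a conventional plano-convex lens
# in air vs. underwater, via the lensmaker's equation for a thin lens
# immersed in a medium: 1/f = (n_lens/n_medium - 1) * (1/R1 - 1/R2).
# The radius and indices below are assumed example values.

def focal_length_mm(n_lens, n_medium, r1_mm, r2_mm=float("inf")):
    """Thin-lens focal length (mm) in a surrounding medium."""
    power = (n_lens / n_medium - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm)
    return 1.0 / power

f_air = focal_length_mm(n_lens=1.5, n_medium=1.000, r1_mm=20.0)
f_water = focal_length_mm(n_lens=1.5, n_medium=1.333, r1_mm=20.0)
print(f"air: {f_air:.0f} mm, water: {f_water:.0f} mm")
# The focal length roughly quadruples underwater, so an ordinary lens
# focused in air is badly defocused in water -- the problem the
# graded-index flat lens is designed to cancel out.
```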

To test the capabilities of their system, the team performed optical simulations and imaging demonstrations in air and water. Amphibious imaging was performed by immersing the device halfway in water. To their delight, the images produced by the system were clear and free of distortions. The team further showed that the system had a panoramic visual field, 300° horizontally and 160° vertically, in both air and water. Additionally, the spherical mount was only 2 cm in diameter, making the system compact and portable.

"Our vision system could pave the way for 360° omnidirectional cameras with applications in virtual or augmented reality or an all-weather vision for autonomous vehicles," speculates Prof. Song excitedly.

Source: GIST (Gwangju Institute of Science and Technology)
Article Credit: Science Daily

On Jul. 14, 2021 by Admin

When people see a toothbrush, a car, a tree -- any individual object -- their brain automatically associates it with other things it naturally occurs with, allowing humans to build context for their surroundings and set expectations for the world.

By using machine-learning and brain imaging, researchers measured the extent of the "co-occurrence" phenomenon and identified the brain region involved. The findings appear in Nature Communications.

"When we see a refrigerator, we think we're just looking at a refrigerator, but in our mind, we're also calling up all the other things in a kitchen that we associate with a refrigerator," said corresponding author Mick Bonner, a Johns Hopkins University cognitive scientist. "This is the first time anyone has quantified this and identified the brain region where it happens."

In a two-part study, Bonner and co-author Russell Epstein, a psychology professor at the University of Pennsylvania, used a database of thousands of scenic photos in which every object was labeled. There were pictures of household scenes, city life, nature -- and the pictures had labels for every mug, car, tree, etc. To quantify object co-occurrences, or how often certain objects appeared with others, they created a statistical model and algorithm that estimated the likelihood of seeing a pen if you saw a keyboard, or a boat if you saw a dishwasher.
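A co-occurrence statistic of this kind can be sketched in a few lines. This is a toy illustration with made-up scene labels, not the authors' actual model or data:

```python
from collections import Counter
from itertools import combinations

# Toy labeled scenes: hypothetical stand-ins for the photo database,
# where each scene is the set of object labels it contains.
scenes = [
    {"keyboard", "pen", "mug", "monitor"},
    {"keyboard", "pen", "monitor"},
    {"keyboard", "mug"},
    {"boat", "water", "dock"},
]

pair_counts = Counter()   # how often each pair of objects shares a scene
obj_counts = Counter()    # how often each object appears at all
for labels in scenes:
    obj_counts.update(labels)
    pair_counts.update(frozenset(p) for p in combinations(sorted(labels), 2))

def p_given(target, cue):
    """Conditional co-occurrence: P(target in scene | cue in scene)."""
    return pair_counts[frozenset((target, cue))] / obj_counts[cue]

print(p_given("pen", "keyboard"))   # pen appears in 2 of 3 keyboard scenes
print(p_given("boat", "keyboard"))  # boat and keyboard never co-occur
```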

With these contextual associations quantified, the researchers next attempted to map the brain region that handles the links.

While subjects were having their brain activity monitored with functional magnetic resonance imaging, or fMRI, the team showed them pictures of individual objects and looked for evidence of a region whose responses tracked this co-occurrence information. The spot they identified was a region in the visual cortex commonly associated with the processing of spatial scenes.

"When you look at a plane, this region signals sky and clouds and all the other things," Bonner said. "This region of the brain long thought to process the spatial environment is also coding information about what things go together in the world."

Researchers have long known that people are slower to recognize objects out of context. The team believes this is the first large-scale experiment to quantify the associations between objects in the visual environment, as well as the first insight into how this visual context is represented in the brain.

"We show in a fine-grained way that the brain actually seems to represent this rich statistical information," Bonner said.

Copied From: Science Daily
Source: Johns Hopkins University

On Jul. 04, 2021 by Artemis

The stink of ammonia in urine, sweat, and rotting meat repels humans, but many insects find ammonia alluring. Now, UConn researchers have figured out how the annoying insects smell it, a discovery that could lead to better ways to make them buzz off.

The sense of smell is enormously important. Mammals devote a large share of their genetic code to the odor receptors found in the nose, with more than 1,000 different kinds that together allow us to smell an estimated trillion different odors.

Flies don't have noses. Instead, they smell with their antennae. Each antenna is covered with tiny hairs called sensilla. Each sensillum contains a few neurons -- fly brain cells. Each neuron expresses one type of odor receptor, and those receptors all fall into two main classes. Or so scientists thought.

But recent work by UConn neuroscientist Karen Menuz and her colleagues, reported online in June in Current Biology, identified a new type of odor neuron devoted to sniffing ammonia. And the receptor it uses is unlike any other odor receptor known.

Flies and other insects use the scent of ammonia to find food sources. Mosquitoes find humans to bite by following the faint scent of ammonia in our sweat, along with other clues. Many crop pests do the same, locating fruit and agricultural products to infest and consume. "When an odor binds to a receptor, the cell depolarizes, and sends a signal saying 'hey, the odor is here!' Insects are small, and odors come in plumes, so most insects will fly straight as long as the concentration is the same or growing. If they lose the odor plume, they'll do a casting behavior, flying in zig zags to find it," Menuz says.
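The plume-tracking behavior Menuz describes amounts to a simple control policy: surge straight ahead while the odor concentration holds or grows, and switch to zig-zag casting when the plume is lost. A hedged sketch of that policy, with an assumed loss threshold:

```python
# Sketch of the surge-and-cast strategy described above: fly straight
# while odor concentration is steady or rising, cast in zig-zags when
# it drops. The threshold is an assumed illustrative value.

def plume_policy(readings, loss_threshold=0.0):
    """Return 'surge' or 'cast' for each successive concentration change."""
    actions = []
    prev = readings[0]
    for c in readings[1:]:
        # Rising or steady concentration -> keep flying straight;
        # a drop below the threshold -> plume lost, start casting.
        actions.append("surge" if c - prev >= loss_threshold else "cast")
        prev = c
    return actions

print(plume_policy([0.2, 0.3, 0.3, 0.1]))  # ['surge', 'surge', 'cast']
```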

Knowing exactly how the insects smell ammonia might yield effective ways to block them from following that scent plume -- and from finding us and our crops.

But figuring out exactly how and what a fly smells is tricky. Menuz and her colleagues are able to gently hold a fly down and use incredibly fine pieces of glass to probe individual neurons in sensilla on the fly's antenna. Then they let the ammonia waft.

They probed all three known types of scent neurons in the flies' sensilla, but none responded to ammonia. Yet the fly was obviously smelling it. So the researchers realized there had to be a fourth scent neuron they hadn't known was there. And they found it -- but it didn't seem to carry the usual odor receptors. Instead, it was covered in ammonia transporter (Amt), a molecule known to let ammonia in and out of cells.

No one had ever known a transporter molecule to also act as an odor receptor. But there it was. When they selectively killed off only that type of neuron, the flies did not respond to ammonia at all. And when the team forced scent neurons that don't normally respond to ammonia to express Amt on their surfaces, those neurons began responding to ammonia, too.

The team hopes to learn whether mosquitoes use the same system to smell ammonia. If it's used by both mosquitoes and flies, it's a good bet the Amt receptor-as-sniffer is used by all insects, and developing ways to block Amt could be an effective way to protect people and crops from pests attracted to ammonia.

Copied From: Science Daily

On Jun. 05, 2021 by Artemis

Humans can observe what and where something happens around them with their hearing, as long as sound frequencies lie between 20 Hz and 20,000 Hz. Researchers at Aalto University have now developed a new audio technique that enables people to also hear ultrasonic sources, which generate sound at frequencies above 20,000 Hz, while simultaneously perceiving their direction. The results were published in Scientific Reports on 2 June 2021.

'In our study, we used bats in their natural habitat as sources of ultrasonic sound. With our new technique, we can now hear the directions-of-arrival of bat sounds, which means we can track bats in flight and hear where they are -- we're essentially giving ourselves super hearing,' says Professor Ville Pulkki from Aalto University.

Small devices have been used before to listen to bats, but previous versions haven't allowed listeners to locate the bats, only hear them. With their device, the researchers record ultrasound using an array of microphones flush-mounted and uniformly distributed over the surface of a small sphere. After the signal has been pitch-shifted to audible frequencies, the sound is played back over headphones in real time. Currently, the pitch-shifting is performed on a computer, but in the future it could be done with electronics attached to the headphones.

'A sound-field analysis is performed on the microphone signals, and as a result we obtain the most prominent direction of the ultrasonic sound field and a parameter that suggests that the sound comes only from a single source. After this, a single microphone signal is brought to the audible frequency range of human hearing and its single-source signal is played back on the headphones so that the listener can perceive the source from the direction the sound was analysed to arrive,' Pulkki says.
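The pitch-shifting step can be illustrated with the simplest possible technique: reinterpreting the sample rate, so that samples captured at a high rate are played back at a lower one, dividing every frequency by a fixed factor. This is a minimal sketch with assumed parameters, not the authors' actual pipeline:

```python
import numpy as np

# Minimal sketch (not the authors' pipeline): shift an ultrasonic tone
# into the audible band by reinterpreting the sample rate. Playing
# samples captured at fs back at fs/shift divides every frequency by
# `shift`. All parameter values below are assumed for illustration.

fs = 192_000                                  # assumed capture rate, Hz
shift = 8                                     # pitch-shift factor
t = np.arange(int(0.01 * fs)) / fs            # 10 ms of signal
ultrasonic = np.sin(2 * np.pi * 40_000 * t)   # 40 kHz bat-like tone

playback_fs = fs / shift                      # played back at 24 kHz

# Verify where the tone lands: the spectrum of the same samples,
# interpreted at the playback rate, peaks at 40 kHz / shift = 5 kHz.
spectrum = np.abs(np.fft.rfft(ultrasonic))
freqs = np.fft.rfftfreq(len(ultrasonic), d=1.0 / playback_fs)
heard = freqs[np.argmax(spectrum)]
print(heard)  # 5000.0 -- well inside the audible band
```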

On top of its popular appeal, the technique has tangible real-world applications.

'In science and art, people have always been interested in how they could improve their senses. Finding sources of ultrasonic sound is also useful in many practical situations, such as finding leaks in pressurized gas pipes. Minor pipe leaks often produce strong ultrasound emissions not detected by normal hearing. The device allows us to spot the sound source quickly,' Pulkki explains.

'Sometimes, damaged electrical equipment also emit ultrasound, and the device could be used for locating faulty equipment faster in places such as data centres,' he continues.

Copied From: Science Daily

On Sep. 27, 2020 by Admin

John Stewart Bell's eponymous theorem and inequalities set out, mathematically, the contrast between quantum mechanical theories and local realism. They are used in quantum information, which has evolving applications in security, cryptography and quantum computing.

The distinguished quantum physicist John Stewart Bell (1928-1990) is best known for the eponymous theorem that proved current understanding of quantum mechanics to be incompatible with local hidden variable theories.

Thirty years after his death, his long-standing collaborator Reinhold Bertlmann of the University of Vienna, Austria, has reviewed his thinking in a paper for EPJ H, "Real or Not Real: That is the question". In this historical and personal account, Bertlmann aims to introduce his readers to Bell's concepts of reality and contrast them with some of his own ideas of virtuality.

Bell spent most of his working life at CERN in Geneva, Switzerland, and Bertlmann first met him when he took up a short-term fellowship there in 1978. Bell had first presented his theorem in a seminal paper published in 1964, but this was largely neglected until the 1980s and the introduction of quantum information.

Bertlmann discusses the concept of Bell inequalities, which arise through thought experiments in which a pair of spin-½ particles propagate in opposite directions and are measured by independent observers, Alice and Bob. The Bell inequality distinguishes between local realism -- the 'common sense' view in which Alice's observations do not depend on Bob's, and vice versa -- and quantum mechanics, or, specifically, quantum entanglement. Two quantum particles, such as those in the Alice-Bob situation, are entangled when the state measured by one observer instantaneously influences that of the other. This theory is the basis of quantum information.
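The thought experiment can be made concrete with the widely used CHSH form of the inequality, stated here as a brief aside:

```latex
% CHSH form of the Bell inequality. E(a,b) denotes the correlation of
% Alice's and Bob's measurement outcomes for settings a and b.
\[
S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
\]
% Local hidden-variable theories require |S| <= 2; quantum mechanics
% on an entangled pair violates this, reaching Tsirelson's bound:
\[
|S| \le 2 \quad \text{(local realism)}, \qquad
|S| = 2\sqrt{2} \approx 2.83 \quad \text{(quantum maximum)}
\]
```

Experimental violations of the bound of 2 are what rule out local realism in favor of quantum entanglement.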

And quantum information is no longer just an abstruse theory. It is finding applications in fields as diverse as security protocols, cryptography and quantum computing. "Bell's scientific legacy can be seen in these, as well as in his contributions to quantum field theory," concludes Bertlmann. "And he will also be remembered for his critical thought, honesty, modesty and support for the underprivileged."

Copied From: Science Daily


Cosmos for Cosmic Dream Blog
Copyright © 2020 - 2023 All Rights Reserved
Powered by PHP-Fusion Copyright © 2023 PHP-Fusion Inc
Released as free software w/o warranties under GNU Affero GPL v3