Artificial intelligence (AI) has made remarkable advances in recent years, but most AI systems lack the emotional aspect of human intelligence. Emotions play a crucial role in how we make decisions, especially when it comes to what we eat. Our sense of taste helps us choose foods that we like and that satisfy our physiological and psychological needs.
A team of researchers from Penn State is developing a novel electronic tongue that mimics how taste influences human eating behavior. The electronic tongue is composed of an array of sensors that can detect different flavors, such as sweet, sour, salty, bitter, and umami. The sensors are connected to a neural network that learns from the sensor data and generates an output signal that represents the preference for a certain food.
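The mapping the article describes — taste-sensor readings fed into a network that outputs a preference — can be sketched in miniature. This is an illustrative toy, not the researchers' actual model: the five taste channels follow the article, but the weights, the single-layer network, and the food vectors are all invented for demonstration.

```python
import numpy as np

# Hypothetical taste channels, mirroring the five sensor types described.
TASTES = ["sweet", "sour", "salty", "bitter", "umami"]

# A tiny one-layer "network": taste intensities in, preference score out.
# These weights are illustrative guesses (sweet liked, bitter disliked),
# not trained values from the study.
weights = np.array([0.9, -0.2, 0.1, -0.8, 0.4])
bias = 0.0

def preference(sensor_readings):
    """Map five taste intensities (0..1) to a preference score in (0, 1)."""
    z = sensor_readings @ weights + bias
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid squashes to a bounded score

# Two made-up foods as taste vectors: [sweet, sour, salty, bitter, umami].
candy = np.array([0.9, 0.1, 0.0, 0.0, 0.0])  # mostly sweet
kale  = np.array([0.1, 0.1, 0.0, 0.7, 0.1])  # mostly bitter

print(preference(candy) > preference(kale))  # True: sweet preferred over bitter
```

With these illustrative weights, the sweet-dominant input scores higher than the bitter-dominant one, echoing the preference behavior reported in the article.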
The researchers tested the electronic tongue on different types of foods, such as fruits, vegetables, meats, cheeses, and candies. They found that the electronic tongue could distinguish between different foods and show a preference for some over others. For example, the electronic tongue preferred sweet foods over bitter foods, even when it was not hungry. This indicates that the electronic tongue has a psychological component that influences its choices.
The researchers also found that the electronic tongue could adapt to changing conditions and learn from experience. For instance, it could adjust its preference for salty foods depending on its hydration level: when dehydrated, it preferred saltier foods to replenish its electrolytes; when hydrated, it preferred less salty foods to avoid excess sodium intake. The device could also learn from feedback and modify its preferences accordingly, increasing its preference for a food after a positive reward and decreasing it after a negative one.
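The reward-driven adaptation described above resembles a simple reinforcement update. The sketch below is a hedged illustration of that idea only — the food names, learning rate, and update rule are assumptions for demonstration, not the learning mechanism used in the study.

```python
# Hypothetical preference table, nudged by reward feedback.
preferences = {"candy": 0.5, "kale": 0.5}  # start neutral
LEARNING_RATE = 0.2  # illustrative step size

def update(food, reward):
    """Move preference toward 1 on positive reward, toward 0 on negative."""
    target = 1.0 if reward > 0 else 0.0
    preferences[food] += LEARNING_RATE * (target - preferences[food])

update("candy", reward=+1)  # positive feedback raises preference
update("kale", reward=-1)   # negative feedback lowers it
print(preferences)  # {'candy': 0.6, 'kale': 0.4}
```

Repeated rewards compound: each positive update moves the preference a fraction of the remaining distance toward 1, so frequently rewarded foods are chosen more often over time, matching the behavior the article describes.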
The researchers believe that their electronic tongue is a possible first step toward artificial emotional intelligence, the ability of AI to understand and express emotions. They hope that their work will inspire more research on how emotions affect AI behavior and how AI can interact with humans in a more natural and empathetic way.
The study was published recently in Nature Communications. The lead author of the study is Saptarshi Das, associate professor of engineering science and mechanics at Penn State. The co-authors are Abhronil Sengupta, assistant professor of electrical engineering at Penn State; Ankit Kumar Yadav, graduate student in engineering science and mechanics at Penn State; and Aaryan Oberoi, undergraduate student in engineering science and mechanics at Penn State.