Seeing Taste (12-16-20)

Research confirms that what we see shapes what we taste. Ueda, Spence, and Okajima, using an augmented reality (AR) system, collected data pairing how a food looks with how it tastes and report that "What we taste is affected by what we see, and that includes the colour, opacity, and shape of the food we consume. . . . We developed a novel AR system capable of modifying the luminance distribution of foods [the light reflected off the food] in real-time using dynamic image processing for simulating actual eating situations. Importantly, this form of dynamic image manipulation does not change the colour on the food (which has been studied extensively previously). . . . Participants looked at a piece of Baumkuchen [a German cake] . . . or a spoonful of tomato ketchup . . . having different luminance distributions and evaluated the taste on sampling the food. Manipulating the SD [standard deviation] of the luminance distribution affected not only the expected taste/flavour of the food (e.g. expected moistness, wateriness and deliciousness), but also the actual taste properties on sampling the food." When the SD of the luminance distribution is smaller, a slice of cake or other food looks smoother; when the SD is larger, the same item looks rougher.
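The core of the manipulation can be sketched in a few lines: rescale each pixel's deviation from the image's mean luminance by a chosen factor, which raises or lowers the SD of the luminance distribution while leaving mean brightness (and, approximately, colour) untouched. The Python sketch below only illustrates that idea; it is not the authors' real-time AR pipeline, and the Pillow/NumPy tooling, the use of the YCbCr luma channel as a stand-in for true luminance, and the file names are all assumptions for illustration.

    # Illustrative sketch only, not the authors' method: change the SD of an
    # image's luminance distribution while preserving its mean luminance and
    # (approximately) its colour, using YCbCr luma as a proxy for luminance.
    import numpy as np
    from PIL import Image

    def scale_luminance_sd(img: Image.Image, factor: float) -> Image.Image:
        """Scale the spread of luma values around their mean by `factor`.

        factor < 1 compresses luminance variation (a smoother-looking
        surface); factor > 1 expands it (a rougher-looking surface).
        """
        ycbcr = np.asarray(img.convert("YCbCr"), dtype=np.float64)
        luma = ycbcr[..., 0]
        mean = luma.mean()
        # Keep the mean fixed; stretch or shrink deviations around it.
        ycbcr[..., 0] = np.clip(mean + factor * (luma - mean), 0, 255)
        return Image.fromarray(ycbcr.astype(np.uint8), "YCbCr").convert("RGB")

    # Hypothetical usage: a smoother- and a rougher-looking version of a photo.
    cake = Image.open("baumkuchen.jpg")  # placeholder filename
    scale_luminance_sd(cake, 0.5).save("cake_smooth.jpg")
    scale_luminance_sd(cake, 1.5).save("cake_rough.jpg")

With a factor of 1 the image is unchanged, which makes the function easy to sanity-check; keeping the mean fixed is what distinguishes this from an ordinary brightness or contrast adjustment.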

Junya Ueda, Charles Spence, and Katsunori Okajima. 2020. "Effects of Varying the Standard Deviation of the Luminance on the Appearance of Food, Flavour Expectations, and Taste/Flavour Perception." Scientific Reports 10: 16175. https://doi.org/10.1038/s41598-020-73189-8