<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>ePanorama.net &#187; Virtual reality</title>
	<atom:link href="http://www.epanorama.net/blog/category/virtual-reality/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.epanorama.net/blog</link>
	<description>All about electronics and circuit design</description>
	<lastBuildDate>Fri, 24 Apr 2026 06:07:11 +0000</lastBuildDate>
	<language>en-US</language>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.9.14</generator>
	<item>
		<title>Animals drive and play for science</title>
		<link>https://www.epanorama.net/blog/2022/02/07/animals-drive-and-play-for-science/</link>
		<comments>https://www.epanorama.net/blog/2022/02/07/animals-drive-and-play-for-science/#comments</comments>
		<pubDate>Mon, 07 Feb 2022 21:33:58 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Computers]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[Science news]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">https://www.epanorama.net/blog/?p=190671</guid>
		<description><![CDATA[<p>There&#8217;s an amusing joke about some fish in a tank, idly wondering how they drive it. This can be tested by building them an FOV (fish-operated vehicle). Fish Discover How To Drive Raspberry Pi Powered Tank and These fish can drive their tank to get treats tell how scientists have taught fish to drive their <a class="moretag" href="https://www.epanorama.net/blog/2022/02/07/animals-drive-and-play-for-science/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p>There&#8217;s an amusing joke about some fish in a tank, idly wondering how they drive it. This can be tested by building them an FOV (fish-operated vehicle). </p>
<p><a href="https://www.tomshardware.com/uk/news/raspberry-pi-powered-fish-tank-driven-by-fish">Fish Discover How To Drive Raspberry Pi Powered Tank</a> and <a href="https://www.raspberrypi.com/news/these-fish-can-drive-their-tank-to-get-treats/">These fish can drive their tank to get treats</a> tell how scientists have taught fish to drive their tank around an enclosure using a motion-detecting algorithm running on Raspberry Pi 3B+.</p>
<p><a href="https://www.raspberrypi.com/news/these-fish-can-drive-their-tank-to-get-treats/"><img src="https://www.raspberrypi.com/app/uploads/2022/01/Screenshot-2022-01-11-at-13.06.00-1536x1007.png" width="1536" height="1007" class="alignnone" /></a></p>
<p>Check <a href="https://www.youtube.com/watch?v=OQ7_6gDx7DI">this video</a>:<br />
<iframe width="560" height="315" src="https://www.youtube.com/embed/OQ7_6gDx7DI" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
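<p>The tracking idea behind the FOV is simple enough to sketch: a camera above the tank locates the fish, and the tank drives toward wherever the fish swims. Below is a minimal, illustrative Python sketch of that loop using plain NumPy; the function name, brightness threshold and command set here are my own assumptions, not the published project code.</p>

```python
import numpy as np

def fish_to_motor_command(frame: np.ndarray, threshold: int = 200) -> str:
    """Find the bright blob (the fish) in a grayscale overhead frame and
    turn its offset from the image centre into a drive command.

    Hypothetical sketch: the real FOV project's tracking code, threshold
    and command names are not published in the linked articles.
    """
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        return "stop"                       # no fish found in this frame
    cx, cy = xs.mean(), ys.mean()           # blob centroid in pixels
    h, w = frame.shape
    dx = (cx - w / 2) / (w / 2)             # -1..1 offset, + = right
    dy = (cy - h / 2) / (h / 2)             # -1..1 offset, + = down
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "backward" if dy > 0 else "forward"

# A bright 10x10 patch on the right side of a dark 100x100 frame:
frame = np.zeros((100, 100), dtype=np.uint8)
frame[45:55, 80:90] = 255
print(fish_to_motor_command(frame))         # prints: right
```

<p>A real setup would feed camera frames in continuously and send the resulting commands to the tank's motor driver.</p>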
<p>Can animals play video games? <a href="https://nerdist.com/article/rats-play-doom-ii-using-mini-vr-rig/">SOMEBODY TRAINED RATS TO PLAY DOOM II USING A MINI ‘VR’ RIG</a> article tells that the engineer had <a href="https://nerdist.com/article/rats-play-doom-ii-using-mini-vr-rig/">“fun building a rodent VR rig and training rats to kinda play” Doom II</a>. The <a href="https://medium.com/mindsoft/rats-in-doom-eb6c52c73aca">engineer has written a blog post</a> that tells how he was able to build his system for the “ridiculously cheap” price of $2,000.</p>
<p><a href="https://nerdist.com/article/rats-play-doom-ii-using-mini-vr-rig/"><img src="https://nerdist.com/wp-content/uploads/2022/01/Rat-Doom-feature-image-01072022.jpg" width="1200" height="676" class="alignnone" /></a></p>
<p><a href="https://www.youtube.com/watch?v=J83oDP7IDVU">VR Setup Overview</a></p>
<p><iframe width="560" height="315" src="https://www.youtube.com/embed/J83oDP7IDVU" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2022/02/07/animals-drive-and-play-for-science/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>Virtual taste and smell</title>
		<link>https://www.epanorama.net/blog/2020/05/26/virtual-taste-and-smell/</link>
		<comments>https://www.epanorama.net/blog/2020/05/26/virtual-taste-and-smell/#comments</comments>
		<pubDate>Tue, 26 May 2020 21:04:48 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Electronics Design]]></category>
		<category><![CDATA[Innovation]]></category>
		<category><![CDATA[Science news]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">https://www.epanorama.net/blog/?p=186325</guid>
		<description><![CDATA[<p>You can now add a virtual taste and smell to a tasteless snack! Meiji University in Japan has developed a taste display that can artificially recreate any flavor by triggering the five different tastes on a user’s tongue. Researchers can also create virtual smells by electrocuting your nose. Electric Chopsticks Add Salty Flavor Where None <a class="moretag" href="https://www.epanorama.net/blog/2020/05/26/virtual-taste-and-smell/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p>You can now add a virtual taste and smell to a tasteless snack!</p>
<p>Meiji University in Japan has developed a taste display that can artificially recreate any flavor by triggering the five different tastes on a user’s tongue. Researchers can also create virtual smells by electrocuting your nose.</p>
<p>Electric Chopsticks Add Salty Flavor Where None Exists article tells how a special pair of chopsticks built by Nimesha Ranasinghe could let you experience that great salty taste without actually consuming any salt. </p>
<p>This Lickable Screen Can Recreate Almost Any Taste or Flavor Without Eating Food<br />
<a href="https://gizmodo.com/this-lickable-screen-can-recreate-almost-any-taste-or-f-1843609903">https://gizmodo.com/this-lickable-screen-can-recreate-almost-any-taste-or-f-1843609903</a></p>
<p>Virtual reality with smells and taste will meet synthetic food<br />
<a href="https://www.epanorama.net/blog/2018/10/22/virtual-reality-with-smells-and-taste-will-meet-synthetic-food/">https://www.epanorama.net/blog/2018/10/22/virtual-reality-with-smells-and-taste-will-meet-synthetic-food/</a></p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2020/05/26/virtual-taste-and-smell/feed/</wfw:commentRss>
		<slash:comments>27</slash:comments>
		</item>
		<item>
		<title>Smartglasses shoot lasers directly into the retina</title>
		<link>https://www.epanorama.net/blog/2020/02/05/smartglasses-shoot-lasers-directly-into-the-retina/</link>
		<comments>https://www.epanorama.net/blog/2020/02/05/smartglasses-shoot-lasers-directly-into-the-retina/#comments</comments>
		<pubDate>Wed, 05 Feb 2020 06:41:26 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Audio and Video]]></category>
		<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=185631</guid>
		<description><![CDATA[<p>Cyberpunk 2020! Omg lasers directly into the retina! Lol! Whew. I think I&#8217;ll skip this one, but it is super rad. Check out this article: Bosch Gets Smartglasses Right With Tiny Eyeball Lasers A tiny laser array paints images directly onto your retina https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/bosch-ar-smartglasses-tiny-eyeball-lasers &#8220;Lightweight and slim, with a completely transparent display that’s brightly visible <a class="moretag" href="https://www.epanorama.net/blog/2020/02/05/smartglasses-shoot-lasers-directly-into-the-retina/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p>Cyberpunk 2020! Omg lasers directly into the retina! Lol! Whew. I think I&#8217;ll skip this one, but it is super rad.</p>
<p>Check out this article:</p>
<p>Bosch Gets Smartglasses Right With Tiny Eyeball Lasers<br />
A tiny laser array paints images directly onto your retina<br />
<a href="https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/bosch-ar-smartglasses-tiny-eyeball-lasers">https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/bosch-ar-smartglasses-tiny-eyeball-lasers</a></p>
<p><a href="https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/bosch-ar-smartglasses-tiny-eyeball-lasers"><img src="https://spectrum.ieee.org/image/MzU2MjA3MQ.jpeg" width="1240" height="827" class="alignnone" /></a></p>
<p>&#8220;Lightweight and slim, with a completely transparent display that’s brightly visible to you and invisible to anyone else, the smart glasses in this video seem like something that could be really useful without making you feel like a huge dork. But concept videos are just that, and until I got a chance to try them out for myself, it was hard to know how excited I should be.&#8221;</p>
<p>&#8220;A custom fitting is necessary because of how the glasses work. Rather than projecting an image onto the lenses of the glasses themselves, the Bosch “Light Drive” uses a tiny microelectromechanical mirror array to direct a trio of lasers (red, green, and blue) across a transparent holographic element embedded in the right lens, which then reflects the light into your right eye and paints an image directly onto your retina.&#8221;</p>
<p>&#8220;If anything gets misaligned, you won’t see the image.&#8221;</p>
<p>&#8220;The concept video is a quite accurate representation of how the glasses look when you’re using them.&#8221;</p>
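<p>The mechanism quoted above boils down to a raster scan: the MEMS mirror sweeps the combined beam across the eye while the three lasers are modulated per pixel. Here is a minimal Python sketch of that pixel ordering, with an assumed back-and-forth sweep; Bosch's actual Light Drive scan pattern and timings are not described in the article.</p>

```python
import numpy as np

def laser_drive_sequence(frame: np.ndarray):
    """Yield the (red, green, blue) intensity triples a scanning-mirror
    display would emit while sweeping across the image row by row, with
    alternate rows reversed (a bidirectional sweep). An illustrative
    sketch of the general raster-scan principle only.
    """
    h, w, _ = frame.shape
    for row in range(h):
        # even rows left-to-right, odd rows right-to-left
        cols = range(w) if row % 2 == 0 else range(w - 1, -1, -1)
        for col in cols:
            yield tuple(int(v) for v in frame[row, col])

# A tiny 2x2 RGB frame, just to show the emission order:
frame = np.arange(12, dtype=np.uint8).reshape(2, 2, 3)
print(list(laser_drive_sequence(frame)))
```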
<p>Here is the concept video </p>
<p><iframe width="1024" height="576" src="https://www.youtube.com/embed/nB_RHkse80k?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<p>Check an overview of the technical details here (text in Finnish, written by me; use Google Translate and check the links)<br />
<a href="https://www.uusiteknologia.fi/2019/12/13/bosch-tahtaa-seuraavan-sukupolven-alylaseihin/">https://www.uusiteknologia.fi/2019/12/13/bosch-tahtaa-seuraavan-sukupolven-alylaseihin/</a></p>
<p><a href="https://www.uusiteknologia.fi/2019/12/13/bosch-tahtaa-seuraavan-sukupolven-alylaseihin/"><img src="https://www.uusiteknologia.fi/wp-content/uploads/2019/12/BoschSensortec_moduuli_www.jpg" width="299" height="408" class="alignnone" /></a></p>
<p>Component manufacturer product page is at <a href="https://www.bosch-sensortec.com/news/smartglasses.html">https://www.bosch-sensortec.com/news/smartglasses.html</a></p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2020/02/05/smartglasses-shoot-lasers-directly-into-the-retina/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Bright spots in the VR market &#124; TechCrunch</title>
		<link>https://www.epanorama.net/blog/2018/12/08/bright-spots-in-the-vr-market-techcrunch/</link>
		<comments>https://www.epanorama.net/blog/2018/12/08/bright-spots-in-the-vr-market-techcrunch/#comments</comments>
		<pubDate>Sat, 08 Dec 2018 19:47:47 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Trends and predictions]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=180727</guid>
		<description><![CDATA[<p>https://techcrunch.com/2018/12/02/bright-spots-in-the-vr-market/ Virtual reality is in a public relations slump. Two years ago the public’s expectations for virtual reality’s potential were at their peak. But even today the holistic VR experience is still a non-starter for most people. Can we extrapolate beyond the current state of affairs to a magnificent future where the utility of virtual <a class="moretag" href="https://www.epanorama.net/blog/2018/12/08/bright-spots-in-the-vr-market-techcrunch/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p><a href="https://techcrunch.com/2018/12/02/bright-spots-in-the-vr-market/">https://techcrunch.com/2018/12/02/bright-spots-in-the-vr-market/</a><br />
Virtual reality is in a public relations slump. Two years ago the public’s expectations for virtual reality’s potential were at their peak.</p>
<p>But even today the holistic VR experience is still a non-starter for most people.</p>
<p>Can we extrapolate beyond the current state of affairs to a magnificent future where the utility of virtual reality technology is pervasive?</p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2018/12/08/bright-spots-in-the-vr-market-techcrunch/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
		</item>
		<item>
		<title>VR/AR/MR, what&#8217;s the difference?</title>
		<link>https://www.epanorama.net/blog/2018/11/13/vrarmr-whats-the-difference/</link>
		<comments>https://www.epanorama.net/blog/2018/11/13/vrarmr-whats-the-difference/#comments</comments>
		<pubDate>Tue, 13 Nov 2018 22:24:51 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Audio and Video]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=180354</guid>
		<description><![CDATA[<p>https://www.foundry.com/industries/virtual-reality/vr-mr-ar-confused Since the beginning of time, most new and emerging technology has nurtured an unhealthy attachment to acronyms, and virtual reality is no different. This article tells the basics of VR, AR and MR. <a class="moretag" href="https://www.epanorama.net/blog/2018/11/13/vrarmr-whats-the-difference/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.foundry.com/industries/virtual-reality/vr-mr-ar-confused">https://www.foundry.com/industries/virtual-reality/vr-mr-ar-confused</a></p>
<h3>Since the beginning of time, most new and emerging technology has nurtured an unhealthy attachment to acronyms, and virtual reality is no different. This article tells the basics of VR, AR and MR.</h3>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2018/11/13/vrarmr-whats-the-difference/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
		</item>
		<item>
		<title>Virtual reality with smells and taste will meet synthetic food</title>
		<link>https://www.epanorama.net/blog/2018/10/22/virtual-reality-with-smells-and-taste-will-meet-synthetic-food/</link>
		<comments>https://www.epanorama.net/blog/2018/10/22/virtual-reality-with-smells-and-taste-will-meet-synthetic-food/#comments</comments>
		<pubDate>Mon, 22 Oct 2018 18:41:03 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Science news]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=180078</guid>
		<description><![CDATA[<p>The 6 Creepiest Lies the Food Industry is Feeding You article claims that the food industry is based almost entirely on a series of lies that most of us just prefer to believe. Everything we love to eat is a scam article tells that pretty much on the same line. So how easy is to <a class="moretag" href="https://www.epanorama.net/blog/2018/10/22/virtual-reality-with-smells-and-taste-will-meet-synthetic-food/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p><a href="http://www.cracked.com/article_19896_the-6-creepiest-lies-food-industry-feeding-you.html">The 6 Creepiest Lies the Food Industry is Feeding You</a> article claims that <a href="http://www.cracked.com/article_19896_the-6-creepiest-lies-food-industry-feeding-you.html">the food industry is based almost entirely on a series of lies that</a> most of us just prefer to believe. <a href="https://nypost.com/2016/07/10/the-truth-behind-how-were-scammed-into-eating-phony-food/">Everything we love to eat is a scam</a> article tells pretty much the same story. So how easily can our tastes be fooled? Quite easily. Can we make artificial food that tastes like traditional food? Yes. Can we fool our senses electrically? Yes.</p>
<p class="article-main-title"><a href="https://blog.hackster.io/electric-chopsticks-add-salty-flavor-where-none-exists-2044906642a2">Electric Chopsticks Add Salty Flavor Where None Exists</a> article tells how <a class="markup--anchor markup--p-anchor" href="https://spectrum.ieee.org/the-human-os/biomedical/devices/hacking-the-flavor-of-food-with-electric-chopsticks-and-soup-bowls" target="_blank" rel="noopener">a special pair of chopsticks built by Nimesha Ranasinghe</a> could let you experience that great salty taste without actually consuming any salt. Nimesha Ranasinghe, an <a href="https://blog.hackster.io/electric-chopsticks-add-salty-flavor-where-none-exists-2044906642a2">assistant professor and director of the University of Maine’s</a> <a href="http://www.mimlab.info/">Multisensory Interactive Media Lab</a>, is researching how to send sensations remotely &#8211; for example <a href="https://blog.hackster.io/electric-chopsticks-add-salty-flavor-where-none-exists-2044906642a2">how to transmit flavors over the internet</a>. Ranasinghe was able to successfully recreate saltiness, sourness, and bitterness tastes using metal chopsticks as electrodes. Tastes are generated by varying the voltage and frequency of the electricity sent from a battery and <a class="markup--anchor markup--p-anchor" href="https://www.hackster.io/arduino" target="_blank" rel="noopener">an Arduino</a> to the chopsticks.
Some more details can be read from the IEEE Spectrum article <a href="https://spectrum.ieee.org/the-human-os/biomedical/devices/hacking-the-flavor-of-food-with-electric-chopsticks-and-soup-bowls">Hacking the Flavor of Food With Electric Chopsticks</a>, which also tells that he <a href="https://spectrum.ieee.org/the-human-os/biomedical/devices/hacking-the-flavor-of-food-with-electric-chopsticks-and-soup-bowls">also made an electric soup bowl that imparts flavors when people slurp directly from the bowl</a> and a virtual cocktail glass. There is also a <a href="https://www.sciencedirect.com/science/article/pii/S0963996918303983">scientific article on the study</a> (behind a paywall).</p>
<p><iframe width="1024" height="576" src="https://www.youtube.com/embed/RQI6UDP1kOQ?feature=oembed" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe></p>
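<p>The stimulation principle described above, varying the voltage and frequency fed to the electrodes, can be sketched in a few lines of Python. Note that the parameter table below is purely illustrative: the study's actual currents and frequencies are in the paywalled paper, so these numbers are made up for the example.</p>

```python
# Illustrative parameter table; the study's actual stimulation currents
# and frequencies are not public, so these numbers are made up.
TASTE_PARAMS = {
    "salty":  {"current_ua": 40,  "freq_hz": 50},
    "sour":   {"current_ua": 80,  "freq_hz": 100},
    "bitter": {"current_ua": 120, "freq_hz": 800},
}

def square_wave(freq_hz: float, sample_rate: int = 1000, duration_s: float = 0.02):
    """One short burst of the on/off stimulation signal (1 = electrode
    energised, 0 = off) that a microcontroller would emit on its pin."""
    samples = int(sample_rate * duration_s)
    period = sample_rate / freq_hz          # samples per full cycle
    return [1 if (i % period) < period / 2 else 0 for i in range(samples)]

wave = square_wave(TASTE_PARAMS["sour"]["freq_hz"])  # 100 Hz burst
```

<p>On a real Arduino-class board the same loop would toggle an output pin at the chosen frequency while a current limiter keeps the electrode drive safe.</p>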
<p class="article-dek"><a href="https://techcrunch.com/2018/10/18/researchers-create-virtual-smells-by-electrocuting-your-nose/">Researchers can also create virtual smells by electrocuting your nose</a>. IEEE Spectrum article <a href="https://spectrum.ieee.org/the-human-os/biomedical/devices/these-researchers-want-to-send-smells-over-the-internet">These Researchers Want to Send Smells Over the Internet</a> tells that researchers have used electrical stimulation of cells in the nasal passages to produce sweet fragrances and chemical odors. <a href="https://spectrum.ieee.org/the-human-os/biomedical/devices/these-researchers-want-to-send-smells-over-the-internet">The researchers who are working on “digital smell” are still a very long way from practical applications</a>—in part because their technology’s form factor leaves something to be desired. <a href="https://spectrum.ieee.org/the-human-os/biomedical/devices/these-researchers-want-to-send-smells-over-the-internet">Right now, catching a whiff of the future means sticking a cable up your nose</a>, and <a href="https://techcrunch.com/2018/10/18/researchers-create-virtual-smells-by-electrocuting-your-nose/">stimulating your olfactory nerve with a system that looks like one of those old-fashioned kids’ electronics kits</a>.</p>
<p><iframe width="1024" height="576" src="https://www.youtube.com/embed/uLWZSFHbrcY?feature=oembed" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe></p>
<p>We know that good stories and a nice visual appearance can make your food taste better. How about using virtual reality for that? <a href="https://techcrunch.com/2018/10/19/virtual-reality-makes-food-taste-better/">Virtual reality makes food taste better</a> article tells that <a href="https://techcrunch.com/2018/10/19/virtual-reality-makes-food-taste-better/">Cornell University food scientists found that cheese eaten in pleasant VR surroundings tasted better</a> than the same cheese eaten in a drab sensory booth. That’s right: <a href="https://techcrunch.com/2018/10/19/virtual-reality-makes-food-taste-better/">cheese tastes better on a virtual farm versus inside a blank, empty cyberia</a>.</p>
<p>If we can make taste and smell artificially, can we make food artificially? Yes we can. Some researchers see that <a href="https://www.bbc.com/news/world-us-canada-45865403?ns_mchannel=social&amp;ocid=socialflow_facebook&amp;ns_source=facebook&amp;ns_campaign=bbcnews">there&#8217;s a looming crisis over the world&#8217;s growing appetite for meat</a>. <a href="https://en.m.wikipedia.org/wiki/Cultured_meat">Cultured meat, also called clean meat, synthetic meat or <i>in vitro</i> meat</a>, is meat produced by <a title="In vitro" href="https://en.m.wikipedia.org/wiki/In_vitro">in vitro</a> <a title="Cell culture" href="https://en.m.wikipedia.org/wiki/Cell_culture">cultivation</a> of animal cells, instead of from slaughtered animals. <a href="https://www.bbc.com/news/world-us-canada-45865403?ns_mchannel=social&amp;ocid=socialflow_facebook&amp;ns_source=facebook&amp;ns_campaign=bbcnews">This is actual meat grown from animal cells and variously described as cultured, synthetic, in-vitro, lab-grown or even &#8220;clean&#8221; meat</a>.</p>
<p><a href="https://en.m.wikipedia.org/wiki/Cultured_meat">On August 5th 2013, the world&#8217;s first lab-grown burger was cooked and eaten at a news conference in London</a>. In 2013 the first lab-grown burger was served up, so where are our synthetic steaks now? <a href="https://www.sciencefocus.com/future-technology/the-artificial-meat-factory-the-science-of-your-synthetic-supper/">The artificial meat factory – the science of your synthetic supper</a> article takes a look at the cultured meat market and the race to mass-produce in-vitro meat. <a href="https://www.bbc.com/news/world-us-canada-45865403?ns_mchannel=social&amp;ocid=socialflow_facebook&amp;ns_source=facebook&amp;ns_campaign=bbcnews">Would you eat slaughter-free meat?</a> Some people have done that already.</p>
<p><a href="https://www.iflscience.com/health-and-medicine/we-tried-the-first-labgrown-sausage-made-without-killing-animals-this-is-what-it-tasted-like/">We Tried The First Lab-Grown Sausage Made Without Killing Animals. This Is What It Tasted Like</a> article tells the experience of tasting a lab-made sausage. <a href="https://www.iflscience.com/health-and-medicine/we-tried-the-first-labgrown-sausage-made-without-killing-animals-this-is-what-it-tasted-like/">New Age Meats&#8217; sausage was the first in history to be made with fat and muscle cells</a> grown in a laboratory. And according to the article it tasted like sausage made of normal meat. <a href="https://www.iflscience.com/health-and-medicine/we-tried-the-first-labgrown-sausage-made-without-killing-animals-this-is-what-it-tasted-like/page-2/">The places where the meat of the future will be produced can look like today&#8217;s breweries</a>. <a href="https://www.bbc.com/news/world-us-canada-45865403?ns_mchannel=social&amp;ocid=socialflow_facebook&amp;ns_source=facebook&amp;ns_campaign=bbcnews">Would you eat slaughter-free meat?</a> article shows <a href="https://www.bbc.com/news/world-us-canada-45865403?ns_mchannel=social&amp;ocid=socialflow_facebook&amp;ns_source=facebook&amp;ns_campaign=bbcnews">chicken nuggets that were grown in a lab from cells taken from a living animal</a>.</p>
<p><a href="https://www.lut.fi/web/en/news/-/asset_publisher/lGh4SAywhcPu/content/protein-produced-from-electricity-to-alleviate-world-hunger">Protein produced from electricity to alleviate world hunger</a> article tells that researchers can generate protein using just air and electricity. <a href="https://www.lut.fi/web/en/news/-/asset_publisher/lGh4SAywhcPu/content/protein-produced-from-electricity-to-alleviate-world-hunger">A batch of single-cell protein has been produced by using electricity and carbon dioxide in a joint study by the Lappeenranta University of Technology (LUT) and VTT Technical Research Centre of Finland. Protein produced in this way can be further developed for use as food and animal feed</a>. Scientists claim that generating protein from CO2 and solar electricity is far more efficient than making meat protein traditionally.</p>
<p>I am thinking this is the time when synthetic food will meet virtual taste. Now that this has been said in public, I expect someone to do it pretty soon.</p>
<p><a href="https://openclipart.org/detail/281091/food-and-drink-icon-meat"><img class="alignnone" src="https://openclipart.org/image/300px/svg_to_png/281091/FoodAndDrinkIconMeat.png" alt="" width="298" height="284" /></a></p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2018/10/22/virtual-reality-with-smells-and-taste-will-meet-synthetic-food/feed/</wfw:commentRss>
		<slash:comments>14</slash:comments>
		</item>
		<item>
		<title>Finnish startup developing ‘human-eye resolution’ VR and XR</title>
		<link>https://www.epanorama.net/blog/2018/10/12/finnish-startup-developing-human-eye-resolution-vr-and-xr/</link>
		<comments>https://www.epanorama.net/blog/2018/10/12/finnish-startup-developing-human-eye-resolution-vr-and-xr/#comments</comments>
		<pubDate>Fri, 12 Oct 2018 05:44:59 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Finland]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=179971</guid>
		<description><![CDATA[<p>https://techcrunch.com/2018/10/07/varjo/ writes: &#8220;Varjo Technologies, the Finnish startup that made a splash earlier this year with news it had developed a virtual reality headset capable of “human-eye resolution,” has raised $31 million in Series B funding.&#8221; Varjo (shadow) aims to bring to market “the world’s first” human-eye resolution virtual reality (VR) and mixed reality (XR) product. <a class="moretag" href="https://www.epanorama.net/blog/2018/10/12/finnish-startup-developing-human-eye-resolution-vr-and-xr/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p><a href="https://techcrunch.com/2018/10/07/varjo/">https://techcrunch.com/2018/10/07/varjo/</a> writes:</p>
<p>&#8220;<a href="https://varjo.com/" style="background-repeat: no-repeat; box-sizing: inherit; background-color: transparent; touch-action: manipulation; text-decoration: none; color: rgb(0, 165, 98); border-bottom: 1px solid rgb(0, 165, 98); transition: color 0s ease 0s, border-color 0.2s linear 0s; font-family: " helvetica="helvetica" neue="neue" arial="arial" sans-serif="sans-serif" _="_-webkit-text-stroke-width:_" _16px="_16px" normal="normal" _400="_400" _-0.1px="_-0.1px" _2="_2" start="start" _0px="_0px" none="none">Varjo Technologies</a><span style="color:rgb(51,51,51); font-family:" helveticaneuehelveticaarialsans-serif="helveticaneuehelveticaarialsans-serif" _16px="font-size:_16px" normal="white-space:normal" _400="font-weight:_400" _-0.1px="letter-spacing:_-0.1px" _2text-indent0px="orphans:_2text-indent0px" none="float:none" _2="widows:_2" _0px="_-webkit-text-stroke-width:_0px" initial="text-decoration-color:initial" inlineimportant="display:inlineimportant" left="text-align:left">, the Finnish startup that made a splash earlier this year with news it had developed a virtual reality headset<span> </span></span><a href="https://techcrunch.com/2017/06/19/this-startup-wants-to-build-vr-headsets-with-human-eye-resolution/" style="background-repeat: no-repeat; box-sizing: inherit; background-color: transparent; touch-action: manipulation; text-decoration: none; color: rgb(0, 165, 98); border-bottom: 1px solid rgb(241, 241, 241); transition: color 0s ease 0s, border-color 0.2s linear 0s; font-family: " helvetica="helvetica" neue="neue" arial="arial" sans-serif="sans-serif" _="_-webkit-text-stroke-width:_" _16px="_16px" normal="normal" _400="_400" _-0.1px="_-0.1px" _2="_2" start="start" _0px="_0px" none="none">capable of “human-eye resolution,”</a><span style="color:rgb(51,51,51); font-family:" helveticaneuehelveticaarialsans-serif="helveticaneuehelveticaarialsans-serif" _16px="font-size:_16px" normal="white-space:normal" _400="font-weight:_400" 
_-0.1px="letter-spacing:_-0.1px" _2text-indent0px="orphans:_2text-indent0px" none="float:none" _2="widows:_2" _0px="_-webkit-text-stroke-width:_0px" initial="text-decoration-color:initial" inlineimportant="display:inlineimportant" left="text-align:left"><span> </span>has raised $31 million in Series B funding.</span>&#8221;</p>
<p>Varjo (Finnish for &#8220;shadow&#8221;) aims to bring to market “the world’s first” human-eye resolution virtual reality (VR) and mixed reality (XR) product.</p>
<p>Based on my own short experience with their product, it is impressive. I had the chance to test one of their prototypes last week and wrote about it at</p>
<p><a href="https://www.uusiteknologia.fi/2018/10/10/suomalaisyhtio-demosi-virtuaalilasejaan-tarkkaa-kuvaa/">https://www.uusiteknologia.fi/2018/10/10/suomalaisyhtio-demosi-virtuaalilasejaan-tarkkaa-kuvaa/</a></p>
<p><img src="http://www.epanorama.net/newepa/wp-content/uploads/2018/10/wpid-screenshot_20181012-0837051532724492.png" class="wp-image-179969 alignnone size-full" width="1080" height="1920"><br />
At the same event I saw their technical presentation on how they combine two displays to deliver super-high resolution (better than the human eye can resolve) at the center of the field of view.</p>
<p><img src="http://www.epanorama.net/newepa/wp-content/uploads/2018/10/wpid-img_20181003_110950_burst0051945151766.jpg" class="failed alignnone wp-image-179970 size-full" width="3000" height="2250"></p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2018/10/12/finnish-startup-developing-human-eye-resolution-vr-and-xr/feed/</wfw:commentRss>
		<slash:comments>26</slash:comments>
		</item>
		<item>
		<title>Google publicly launches ARCore 1.0 on 13 phones, will begin expanding Lens preview &#124; TechCrunch</title>
		<link>https://www.epanorama.net/blog/2018/02/26/google-publicly-launches-arcore-1-0-on-13-phones-will-begin-expanding-lens-preview-techcrunch/</link>
		<comments>https://www.epanorama.net/blog/2018/02/26/google-publicly-launches-arcore-1-0-on-13-phones-will-begin-expanding-lens-preview-techcrunch/#comments</comments>
		<pubDate>Mon, 26 Feb 2018 05:16:36 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=176224</guid>
		<description><![CDATA[<p>https://techcrunch.com/2018/02/23/google-publicly-launches-arcore-1-0-on-13-phones-will-begin-expanding-lens-availability/ With ARCore’s launch, developers are now able to put their own augmented reality (AR) creations into the Play Store, and users are able to take a first shot at phone-based AR, though only on a small subset of all of the Android phones out there today. Google has already been working with companies like Snap, Sony, Wayfair, Porsche <a class="moretag" href="https://www.epanorama.net/blog/2018/02/26/google-publicly-launches-arcore-1-0-on-13-phones-will-begin-expanding-lens-preview-techcrunch/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p><a href="https://techcrunch.com/2018/02/23/google-publicly-launches-arcore-1-0-on-13-phones-will-begin-expanding-lens-availability/">https://techcrunch.com/2018/02/23/google-publicly-launches-arcore-1-0-on-13-phones-will-begin-expanding-lens-availability/</a></p>
<p>With ARCore’s launch, developers are now able to put their own augmented reality (AR) creations into the Play Store, and users are able to take a first shot at phone-based AR, though only on a small subset of all of the Android phones out there today.</p>
<p>Google has already been working with companies like Snap, Sony, Wayfair, Porsche and others to bring ARCore app experiences to life.</p>
<p><img class="alignnone wp-image-176223 size-full" src="http://www.epanorama.net/newepa/wp-content/uploads/2018/02/wpid-screenshot_20180225-1112591434634935.png" width="1080" height="1920"></p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2018/02/26/google-publicly-launches-arcore-1-0-on-13-phones-will-begin-expanding-lens-preview-techcrunch/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
		</item>
		<item>
		<title>Arduino and Unity3D Interactive Experience &#8211; Arduino Project Hub</title>
		<link>https://www.epanorama.net/blog/2017/12/18/arduino-and-unity3d-interactive-experience-arduino-project-hub/</link>
		<comments>https://www.epanorama.net/blog/2017/12/18/arduino-and-unity3d-interactive-experience-arduino-project-hub/#comments</comments>
		<pubDate>Mon, 18 Dec 2017 18:17:06 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Arduino]]></category>
		<category><![CDATA[Computers]]></category>
		<category><![CDATA[Open source software]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=62055</guid>
		<description><![CDATA[<p>https://create.arduino.cc/projecthub/relativty/wrmhl-arduino-and-unity3d-interactive-experience-cc18b3?ref=platform&#38;ref_id=424_recent___&#38;offset=3 This looks like an interesting project to connect any Arduino interface to Unity3D.&#160; In transmitting data from Arduino to Unity3D, usually the main issue is&#160;INSANE LATENCY. With&#160;wrmhl&#160;you should be able to connect any Arduino interface to Unity3D, and it’s completely&#160;open source at&#160;https://github.com/relativty/wrmhl <a class="moretag" href="https://www.epanorama.net/blog/2017/12/18/arduino-and-unity3d-interactive-experience-arduino-project-hub/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p><a href="https://create.arduino.cc/projecthub/relativty/wrmhl-arduino-and-unity3d-interactive-experience-cc18b3?ref=platform&amp;ref_id=424_recent___&amp;offset=3">https://create.arduino.cc/projecthub/relativty/wrmhl-arduino-and-unity3d-interactive-experience-cc18b3?ref=platform&amp;ref_id=424_recent___&amp;offset=3</a></p>
<p>This looks like an interesting project to connect any Arduino interface to Unity3D.</p>
<p>In transmitting data from Arduino to Unity3D, usually the main issue is INSANE LATENCY.</p>
<p>With <a href="https://github.com/relativty/wrmhl" rel="nofollow">wrmhl</a> you should be able to connect any Arduino interface to Unity3D, and it’s completely open source at <a href="https://github.com/relativty/wrmhl">https://github.com/relativty/wrmhl</a></p>
<p><a href="http://www.epanorama.net/newepa/wp-content/uploads/2017/12/wpid-screenshot_20171218-2009371355338643.png"><img src="http://www.epanorama.net/newepa/wp-content/uploads/2017/12/wpid-screenshot_20171218-2009371355338643.png" alt="" class="wp-image-62054 alignnone size-full" width="1080" height="1920"></a></p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2017/12/18/arduino-and-unity3d-interactive-experience-arduino-project-hub/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
		<item>
		<title>5 Google ARCore Experiments That Inject Magic into Everyday Life – Road to VR</title>
		<link>https://www.epanorama.net/blog/2017/09/08/5-google-arcore-experiments-that-inject-magic-into-everyday-life-road-to-vr/</link>
		<comments>https://www.epanorama.net/blog/2017/09/08/5-google-arcore-experiments-that-inject-magic-into-everyday-life-road-to-vr/#comments</comments>
		<pubDate>Fri, 08 Sep 2017 14:05:49 +0000</pubDate>
		<dc:creator><![CDATA[Tomi Engdahl]]></dc:creator>
				<category><![CDATA[Mobile]]></category>
		<category><![CDATA[Virtual reality]]></category>

		<guid isPermaLink="false">http://www.epanorama.net/newepa/?p=59059</guid>
		<description><![CDATA[<p>https://www.roadtovr.com/5-google-arcore-experiments-inject-magic-everyday-life/ Google just released a preview version of&#160;ARCore for Android&#160;as&#160;the company’s answer to Apple’s ARKit.&#160;Since ARKit was released a few months ago, we’ve seen a bevy of&#160;really cool experiments&#160;and potential apps to come from developers from all over the world, but now it’s ARCore’s turn to shine. <a class="moretag" href="https://www.epanorama.net/blog/2017/09/08/5-google-arcore-experiments-that-inject-magic-into-everyday-life-road-to-vr/">&#8594;</a></p>]]></description>
				<content:encoded><![CDATA[<p><a href="https://www.roadtovr.com/5-google-arcore-experiments-inject-magic-everyday-life/">https://www.roadtovr.com/5-google-arcore-experiments-inject-magic-everyday-life/</a></p>
<p>Google just released a preview version of <a href="https://www.roadtovr.com/google-releases-arcore-android-companys-augmented-reality-answer-apple-arkit/" target="_blank" rel="noopener noreferrer">ARCore for Android</a> as the company’s answer to Apple’s ARKit. Since ARKit was released a few months ago, we’ve seen a bevy of <a href="https://www.roadtovr.com/10-coolest-things-built-apples-arkit-right-now/" target="_blank" rel="noopener noreferrer">really cool experiments</a> and potential apps to come from developers from all over the world, but now it’s ARCore’s turn to shine.<br />
<a href="http://www.epanorama.net/newepa/wp-content/uploads/2017/09/wpid-wp-image-2028565809.png"><img src="http://www.epanorama.net/newepa/wp-content/uploads/2017/09/wpid-wp-image-2028565809.png" alt="" class="wp-image-59058 alignnone size-full" width="1080" height="1920"></a></p>
]]></content:encoded>
			<wfw:commentRss>https://www.epanorama.net/blog/2017/09/08/5-google-arcore-experiments-that-inject-magic-into-everyday-life-road-to-vr/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
