<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>
<channel>
	<title>Visual-Experiments.com &#187; gpu</title>
	<atom:link href="http://www.visual-experiments.com/tag/gpu/feed/" rel="self" type="application/rss+xml" />
	<link>http://www.visual-experiments.com</link>
	<description>ASTRE Henri experiments with Ogre3D and web stuff</description>
	<lastBuildDate>Mon, 16 Jan 2017 18:59:35 +0000</lastBuildDate>
	<language>en</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.1.2</generator>
		<item>
		<title>Remote Augmented Reality Prototype</title>
		<link>http://www.visual-experiments.com/2010/07/11/remote-augmented-reality-prototype/</link>
		<comments>http://www.visual-experiments.com/2010/07/11/remote-augmented-reality-prototype/#comments</comments>
		<pubDate>Sun, 11 Jul 2010 17:30:03 +0000</pubDate>
		<dc:creator>Henri</dc:creator>
				<category><![CDATA[ogre3d]]></category>
		<category><![CDATA[artoolkit]]></category>
		<category><![CDATA[augmented reality]]></category>
		<category><![CDATA[boost]]></category>
		<category><![CDATA[gpu]]></category>
		<category><![CDATA[sift]]></category>
		<guid isPermaLink="false">http://www.visual-experiments.com/?p=514</guid>
		<description><![CDATA[I have created a new augmented reality prototype (a 5-day experiment). It uses a client/server approach based on Boost.Asio. The first assumption of this prototype is that you&#8217;ve got a not-so-powerful mobile client and a powerful server with a decent GPU. So the idea is simple: the client uploads a video frame [...]]]></description>
			<content:encoded><![CDATA[<p>I have created a new augmented reality prototype (a 5-day experiment). It uses a client/server approach based on <a href="http://think-async.com/">Boost.Asio</a>. The first assumption of this prototype is that you&#8217;ve got a not-so-powerful mobile client and a powerful server with a decent GPU.<br />
<img src="http://www.visual-experiments.com/blog/wp-content/uploads/2010/07/remoteArToolKit.png" alt="" title="remoteArToolKit" width="467" height="205" class="alignnone size-full wp-image-528" /></p>
<table>
<tbody style="background-color: white; color: #4D4D4D; text-align: left; vertical-align: top;">
<tr>
<td>So the idea is simple: the client uploads a video frame, and the server performs the pose estimation and sends back the augmented rendering to the client. My first prototype uses ArToolKitPlus in almost real-time (15fps), but I&#8217;m also working on a markerless version that would be less interactive (&lt; 1fps). The mobile client was a UMPC (Samsung Q1).</td>
<td><img src="http://www.visual-experiments.com/blog/wp-content/uploads/2010/07/samsung.q1.jpg" alt="" title="samsung.q1" width="150" height="135" class="alignnone size-full wp-image-583" /></td>
</tr>
</tbody>
</table>
<p>Thanks to Boost.Asio I&#8217;ve been able to build a robust client/server very quickly. Then I created two implementations of PoseEstimator:</p>
<pre class="brush: cpp; title: ;">
class PoseEstimator
{
	public:
		bool computePose(const Ogre::PixelBox&amp; videoFrame);
		Ogre::Vector3 getPosition() const;
		Ogre::Quaternion getOrientation() const;
};
</pre>
<ul style="margin-left: 20px">
<li>ArToolKitPoseEstimator <em>(using <a href="http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php">ArToolKitPlus</a> to get pose estimation)</em></li>
<li>SfMPoseEstimator <em>(using <a href="http://cvlab.epfl.ch/software/EPnP/">EPnP</a> and a point cloud generated with <a href="http://phototour.cs.washington.edu/bundler/">Bundler</a>  -Structure from Motion tool- to get pose estimation)</em></li>
</ul>
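<p>The round trip described above (client uploads a frame, server answers with a rendered image) can be sketched with a simple length-prefixed message format. This is only an illustration: the post does not describe the actual wire protocol used over Boost.Asio, and the function names are hypothetical.</p>

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical wire format: a 4-byte payload length (native byte
// order) followed by the raw RGB bytes. The same framing could carry
// the uploaded video frame and the rendered texture sent back.
std::vector<uint8_t> packFrame(const std::vector<uint8_t>& rgb)
{
    std::vector<uint8_t> msg(4 + rgb.size());
    uint32_t len = static_cast<uint32_t>(rgb.size());
    std::memcpy(msg.data(), &len, 4);
    std::memcpy(msg.data() + 4, rgb.data(), rgb.size());
    return msg;
}

std::vector<uint8_t> unpackFrame(const std::vector<uint8_t>& msg)
{
    uint32_t len = 0;
    std::memcpy(&len, msg.data(), 4);
    return std::vector<uint8_t>(msg.begin() + 4, msg.begin() + 4 + len);
}
```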
<h3>ArToolKitPoseEstimator</h3>
<p>There is nothing fancy about this pose estimator; I implemented it as a proof of concept and to check my server performance. In fact, ArToolKit pose estimation is not expensive and can run in real-time on a mobile device.</p>
<h3>SfMPoseEstimator</h3>
<p>I&#8217;ll just introduce the concept of this pose estimator in this post. So the idea is simple: in augmented reality you generally know the object you are looking at, because you want to augment it. The idea was to create a point cloud of the object you want to augment (using Structure from Motion) and keep the link between the 3D points and their 2D descriptors. Thus, when you take a shot of the scene, you can compare the 2D descriptors of your shot with those of the point cloud and create 2D/3D correspondences. Then the pose can be estimated by solving the Perspective-n-Point camera calibration problem (using <a href="http://cvlab.epfl.ch/software/EPnP/index.php">EPnP</a> for example).</p>
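<p>The matching step of that 2D/3D correspondence search can be sketched as a nearest-neighbour lookup with Lowe's ratio test. This is a simplified illustration, not the actual implementation: the names are hypothetical and real SIFT descriptors have 128 dimensions, not the short toy vectors used here.</p>

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// Each 3D point of the Bundler cloud keeps a descriptor; for each
// descriptor of the new shot, find the nearest cloud descriptor and
// accept the match only if it is clearly better than the second best.
using Descriptor = std::vector<float>;

float squaredDistance(const Descriptor& a, const Descriptor& b)
{
    float d = 0.0f;
    for (std::size_t i = 0; i < a.size(); ++i) {
        float diff = a[i] - b[i];
        d += diff * diff;
    }
    return d;
}

// Returns the index of the matched cloud point, or -1 if the match is
// ambiguous (Lowe's ratio test on squared distances, hence ratio^2).
int matchDescriptor(const Descriptor& query,
                    const std::vector<Descriptor>& cloud,
                    float ratio = 0.8f)
{
    float best = std::numeric_limits<float>::max();
    float second = best;
    int bestIdx = -1;
    for (std::size_t i = 0; i < cloud.size(); ++i) {
        float d = squaredDistance(query, cloud[i]);
        if (d < best) {
            second = best;
            best = d;
            bestIdx = static_cast<int>(i);
        } else if (d < second) {
            second = d;
        }
    }
    if (bestIdx >= 0 && best < ratio * ratio * second)
        return bestIdx;
    return -1;
}
```

<p>Each accepted match pairs a 2D feature of the shot with a known 3D point, which is exactly the input EPnP needs.</p>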
<h3>Performance</h3>
<p>The server is very basic; it doesn&#8217;t handle client queuing yet (1 client = 1 thread), but it already does the off-screen rendering and sends back the texture in raw RGB.</p>
<p>The version using ArToolKit is only running at 15fps because I had trouble with the jpeg compression, so I turned it off. This version is therefore only bandwidth limited. I didn&#8217;t investigate this issue that much because I know that the SfMPoseEstimator is going to be limited by the matching step. Furthermore, I&#8217;m not sure that it&#8217;s a good idea to send a highly compressed image to the server (compression artifacts can add extra features).</p>
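<p>A back-of-the-envelope calculation shows why the uncompressed path is bandwidth limited. The post does not give the frame resolution, so 320&#215;240 RGB at 15fps is an assumption:</p>

```cpp
// Bytes per second for an uncompressed RGB video stream.
// 320x240 at 15fps is assumed; the post does not state the resolution.
constexpr long bytesPerSecond(long width, long height,
                              long bytesPerPixel, long fps)
{
    return width * height * bytesPerPixel * fps;
}

// 320 * 240 * 3 = 230400 bytes per frame; at 15fps that is about
// 3.5 MB/s, and the raw rendered texture coming back doubles it.
```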
<p>My SfMPoseEstimator is also working, but it&#8217;s very expensive (~1s using the GPU) and it&#8217;s not always accurate due to some flaws in my original implementation. I&#8217;ll explain how it works in a following post.</p>
]]></content:encoded>
			<wfw:commentRss>http://www.visual-experiments.com/2010/07/11/remote-augmented-reality-prototype/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
		</item>
	</channel>
</rss>
