<h2>Virtual reality tricks: Asynchronous timewarp and late latching</h2>
<p><em>2015-03-10</em></p>
<p><img src="/images/blog/lightsaber.jpg" class="aside-right" alt="Me trying out the Sixense STEM" /></p>
<p>Last week’s <a href="http://www.gdconf.com/">Game Developers Conference</a>
was very much about VR.
It looks like the timing of our new startup
<a href="http://www.resolutiongames.com">Resolution Games</a>
is perfect.</p>
<p>The talks by the two big desktop graphics hardware vendors gave me some
interesting insight into how much more goes into creating a good VR experience
than one might think.
I already knew that creating content for VR requires new thinking.
You can’t just take any game mechanics and make a straight port.
But I had not realized how much work there has been on the graphics hardware and
drivers to support VR well.</p>
<p>The first talk was by Nathan Reed of nVidia and Dean Beeler of Oculus,
and the second one was by Layla Mah of AMD.
What struck me was that although their presentation styles were different,
the information in their talks was very similar.
Apparently both nVidia and AMD have implemented very similar features in
their graphics drivers to support VR.</p>
<p>So why do they even need to update their drivers to support VR at all?
Can’t you just render a right eye and a left eye view and be done with it?
That’s definitely possible, but the main problem is that for VR,
it is extremely important to keep latency as low as possible.
If you move your head, the display has to be updated as soon as possible,
or there is a great risk that the user gets sick.
You can get motion sickness after only a couple of seconds of
high-latency rendering, and it can linger for hours.
To avoid this, they say “motion to photons” latency should be below 20 milliseconds.
Traditionally graphics drivers have been optimized for throughput,
not for low latency, which is why they had to implement new features
to make VR painless.</p>
<p>With a screen update frequency of 60 Hz
(as on the Samsung Gear VR),
one frame is approximately 17 milliseconds.
Traditionally you start rendering right after one vblank,
and if you’re rendering fast enough the result will be visible
after the next vblank 17 milliseconds later.
That is dangerously close to the 20 millisecond limit,
so any frame rate hiccup will easily push you over it.
If I understood it correctly, Mah even suggested a limit of 10 milliseconds, which would be
impossible to achieve if you read the head orientation right after a vsync
and rendered the scene with that camera rotation.
In this case the head may have moved too much before the user sees the new frame
one vsync later.</p>
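<p>To put rough numbers on this, here is a minimal back-of-the-envelope sketch in C++ (my own illustration, not from the talks):</p>
<pre><code>#include &lt;cstdio&gt;

int main() {
    const double refresh_hz = 60.0;               // e.g. the Samsung Gear VR
    const double frame_ms = 1000.0 / refresh_hz;  // ~16.7 ms per refresh

    // Traditional pipeline: sample the head pose right after vblank N,
    // and the result becomes visible at vblank N+1.
    const double best_case_ms = frame_ms;          // ~16.7 ms, close to 20 ms
    const double one_hiccup_ms = 2.0 * frame_ms;   // a missed vblank: ~33 ms

    std::printf("frame: %.1f ms, best case: %.1f ms, one hiccup: %.1f ms\n",
                frame_ms, best_case_ms, one_hiccup_ms);
    return 0;
}
</code></pre>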
<p>The solution to this is <em>asynchronous timewarp</em>.
It works like this: you get the head orientation right before rendering the scene.
Then you render the scene to a frame buffer that is slightly larger than what will be visible.
This is the main rendering pass and can take several milliseconds, during which the user’s head
may move.
Then you get the head orientation again and warp the rendered image to account for the
new head orientation.
(This warping reminded me of good old <a href="http://en.wikipedia.org/wiki/QuickTime_VR">QuickTime VR</a>,
but of course you don’t need to render a full 360 degree panorama for asynchronous timewarp.)</p>
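<p>In code, one timewarped frame might look roughly like the sketch below. All the type and function names are hypothetical, just to illustrate the flow; this is not a real SDK API:</p>
<pre><code>// Hypothetical sketch of one timewarped frame (made-up names).
void render_frame() {
    Pose render_pose = sample_head_pose();  // orientation at scene-render time

    // Render to a buffer slightly larger than the visible area,
    // so the warp has pixels to pull in at the edges.
    Framebuffer scene = render_scene(render_pose);  // may take several ms

    Pose latest_pose = sample_head_pose();  // orientation again, right before display

    // Cheap 2D reprojection of the finished image from render_pose to latest_pose.
    warp_to_display(scene, render_pose, latest_pose);
}
</code></pre>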
<p>To enable asynchronous timewarp without requiring the CPU to wait until right before vsync
to issue the draw calls for the warping, the graphics card vendors implemented
<a href="https://www.oculus.com/blog/optimizing-vr-graphics-with-late-latching/">late latching</a>.
Late latching allows the CPU to update constant buffers in the GPU <em>after</em> the draw calls
have been submitted.
You can queue up the warping draw calls to run after the scene rendering,
while the CPU keeps updating the head rotation matrices continuously.
When the GPU finally executes the warp, it uses the latest values.</p>
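<p>Conceptually, late latching could look like this sketch. The names are made up (each vendor exposes the mechanism differently); the point is that the GPU reads the constant buffer when the warp actually executes, not when the draw calls were submitted:</p>
<pre><code>// Hypothetical late-latching sketch (made-up names).
PoseMatrix* latched = map_persistent_constant_buffer();  // GPU-visible memory

submit_scene_draw_calls();
submit_warp_draw_calls();  // will read *latched when the GPU executes it

// Keep writing the freshest head pose until the GPU consumes it
// right before vsync.
while (!gpu_has_executed_warp()) {
    *latched = sample_head_pose().to_matrix();
}
</code></pre>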
<p>I suppose late latching should be useful not only for head rotation,
but also for getting low latency with hand tracking systems such as
<a href="https://www.leapmotion.com/">Leap Motion</a> or the
<a href="http://sixense.com/">Sixense STEM</a>,
so the GPU renders the user’s hands with data that is as fresh as possible.</p>
<p>A lot of the complexity of all this comes from the old-fashioned way that displays work.
Even if nothing has changed on the screen, the display updates at a fixed frequency
just like an old CRT. And just like on an old CRT, you have to wait for the next vsync
before your latest frame buffer can be visible (unless you accept tearing of course).
There are displays that get rid of this legacy
(using nVidia’s <a href="http://www.geforce.com/hardware/technology/g-sync">G-Sync</a> and
AMD’s <a href="http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync#about">FreeSync</a>),
but none of them are used for VR yet.</p>
<p>Also, late latching doesn’t help if your scene rendering
takes more than one frame.
In that case, you would want the asynchronous timewarp to run right before
vsync even if the new frame isn’t ready yet,
reusing the frame buffer from the previous finished rendering,
so you at least get the rotation updated if not the position and moving objects.
This requires the graphics driver to be able to preempt the current rendering
and run the timewarp before switching back to the original rendering
(like task switching on the CPU).
Both AMD and nVidia mentioned this in their talks, so apparently they are working on it.
Both of them also mentioned the possibility of using two graphics cards,
one for each eye, and submitting the same draw calls to both of them
instead of issuing nearly identical calls twice.
That’s only useful if you have two graphics cards,
but it seems they’re working on making this possible with a single GPU too.</p>
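<p>For the overrun case above, the high-priority warp step might boil down to something like this sketch (again with made-up names):</p>
<pre><code>// Hypothetical compositor step, preempting the GPU just before vsync.
void on_vsync_imminent() {
    // Use the new frame if the scene render finished in time,
    // otherwise fall back to the previous finished frame.
    Framebuffer&amp; frame = new_frame_ready() ? latest_frame()
                                           : previous_finished_frame();

    Pose latest_pose = sample_head_pose();

    // Only the rotation gets corrected; position and moving objects
    // still reflect the frame's original render time.
    warp_to_display(frame, frame.render_pose, latest_pose);
}
</code></pre>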
<p>I saw a comment somewhere that all of these solutions and workarounds are temporary,
and in the future we will not have to worry about frame rate because GPUs will be fast enough
to always render at a high frame rate…
I wouldn’t hold my breath.
For the foreseeable future,
artists won’t think that state-of-the-art consumer graphics cards
are good enough for everything they want to do.
And until then, these tricks are useful to squeeze as much as possible
out of the GPU when doing VR.</p>
<h2>Why I switched from Sublime to Vim</h2>
<p><em>2014-09-24</em></p>
<p>I have been looking for the perfect text editor for quite some time.</p>
<p>I’ve been pretty happy with <a href="http://www.sublimetext.com/">Sublime</a> for a few years,
but it’s not free/open source and doesn’t work over ssh.</p>
<p><a href="http://www.librador.com/2005/11/02/My-acquaintance-Emacs/">Emacs and I are not friends</a>.
Also, in the last 15 years or so, I’ve never found it particularly powerful.</p>
<p>I gave <a href="http://www.vim.org">Vim</a> a chance a few years ago
but was too confused by its behaviour: I kept making off-by-one character and line errors.
So I gave up back then, but a few months ago I gave Vim another chance.
The reason was not so much that I was unhappy with Sublime,
but that I was unhappy with the evolution of keyboards.</p>
<p><img src="/images/blog/thinkpad-keyboard.jpg" alt="Keyboard with home and end keys in the wrong place" />
What were you thinking, Lenovo?
(Photo from <a href="http://www.wpcentral.com/review-thinkpad-x1-carbon-2014">Windows Phone Central</a>)</p>
<p>In a typical text editor, you are dependent on the arrow keys,
Home, End, Page Up, Page Down and backspace.
Unfortunately, the locations of those keys vary between keyboards,
and sometimes the keys are too small or hard to reach.
When I buy a new laptop, I don’t want to limit my choices
to the models with those keys in the right positions,
and such models seem to get fewer every year.</p>
<p>Therefore I need a text editor that only requires me to use the keys that don’t move around:
the letters, numbers and punctuation keys.
Vim, thanks to its modal interface, comes very close to this.
The only other keys I need are Shift, Ctrl and Escape
(and it’s possible to avoid using Escape too, which I’ll probably do soon).</p>
<p>So I bit the bullet and switched.
But I still make off-by-one errors from time to time.</p>
<h2>Book review: The ZX Spectrum ULA: How to Design a Microcomputer</h2>
<p><em>2014-09-05</em></p>
<p><img src="/images/blog/zx-spectrum-ula.jpg" alt="Chris Smith: The ZX Spectrum ULA: How to design a Microcomputer" /></p>
<p>This book was my summer reading.
Maybe it’s the good timing with my relatively new interest in electronics,
but this is one of the best technical books I have read.
The author, Chris Smith, has reverse-engineered the ULA,
the chip that handles the display, audio and I/O
of the ZX Spectrum computer from the early 80s.</p>
<p>As an old assembly language programmer for the ZX Spectrum,
I have had a basic understanding
of what goes on in the circuits of a computer,
but it was only through this book that I understood
how it actually works.
It describes the inner workings of the ULA
with timing diagrams and circuit drawings
at a level that was just right.
The only exception is chapter 2, which assumes familiarity with how semiconductors are constructed
and made me worry whether I’d get through the book.
Later chapters deal mostly with digital electronics and are approachable for an old-school programmer like me.</p>
<p>I learned a lot from this book and it inspired me to try to create my own
very simple Z80 based computer:</p>
<iframe class="vine-embed" src="https://vine.co/v/MLedA6ItDBX/embed/simple" width="480" height="480" frameborder="0"></iframe>
<script async="" src="//platform.vine.co/static/scripts/embed.js" charset="utf-8"></script>
<p>Have a look at
<a href="http://www.amazon.co.uk/The-ZX-Spectrum-Ula-Microcomputer/dp/0956507107/ref=cm_cr_pr_product_top">the book at Amazon</a>
but
buy it from <a href="http://www.zxdesign.info/book/theZXSpectrumULA.shtml">the author’s web site</a>.</p>
<h2>Joining Dramatify</h2>
<p><em>2014-09-04</em></p>
<p>So, I’m in a startup again.</p>
<p>I’ve joined the team behind
<a href="http://dramatify.com/">Dramatify</a>,
an online collaboration and administration tool for film and TV productions.
The product is in beta. It has active users and is almost ready for public launch,
so it’s a bit different from the startups I’ve joined earlier,
where the products were less mature.</p>
<p>After I left <a href="http://www.gootechnologies.com/">Goo Technologies</a>,
I had a contracting gig at <a href="http://loadimpact.com/">Load Impact</a>,
working on the web site for their cloud-based load testing service.
Meanwhile, I’ve been approached by several people with startup ideas.
I like startups, and wish I had the time to work on almost all of them,
but I’ve tried to restrain myself.</p>
<p>Of course I chose Dramatify partly because of my interest in movies.
Also, it’s not entirely unrelated to my work on
<a href="http://www.screenplain.com">Screenplain</a>.
(It’s possible that I’ll find some mutual benefits between these two projects.)
Dramatify works closely with the production companies in the beta tests,
which is very inspiring.</p>
<p>I’m looking forward to improving the service!</p>
<p><a href="http://dramatify.com/welcome-martin-vilcans/">Read about me joining</a>
on Dramatify’s blog.</p>
<h2>Does anyone use RSS/Atom?</h2>
<p><em>2014-09-03</em></p>
<p>Are you reading this in a feed reader?
Maybe you’re the only one.</p>
<p>After Google Reader died,
I tried a few other similar services,
but it never became a routine, so my feed soon turned into an unmanageable mess of unread posts.
Instead, my daily intake of information about what’s happening in the world has come mainly from
Twitter (for work, hobbies, etc.)
and Facebook (for staying in touch with people, events and politics).
As a fan of open standards, I haven’t been happy with that.
Also, the signal-to-noise ratio on these media is pretty bad,
so lately I’ve wanted to go back to staying up to date using feeds
(RSS and Atom).</p>
<p>So a few days ago I installed <a href="http://www.newsbeuter.org/">Newsbeuter</a> and imported my old feeds from Google Reader.
Several of these old feeds were invalid. In some cases the whole website was gone,
but surprisingly often the site was still around and there was still a feed;
only the feed URL had changed (probably because they switched to another CMS or blogging platform).
By not bothering to redirect from the old feed URL,
these webmasters showed that they don’t consider their feed subscribers an asset.</p>
<p>This made me doubt that anyone subscribes to feeds any more.
I use <a href="http://feedburner.google.com/">Feedburner</a> for
<a href="http://feeds.feedburner.com/Librador">my feed</a>,
and in Feedburner I can get reports on the number of subscribers.
The number is bigger than it ever was back in the good old days of the blogosphere and RSS, but I doubt it’s valid.
Actually, I’m worried that Feedburner will be shut down any day now.
By killing off Reader
and <a href="http://techcrunch.com/2013/03/15/google-kills-rss/">removing feed subscription support in Chrome</a>,
Google (which acquired Feedburner in 2007) has shown that it doesn’t care much for feeds.
(Unless they are the proprietary Google+ feeds, of course.)</p>
<p>But maybe I don’t need to worry about Feedburner shutting down. Nobody subscribes to feeds anymore.</p>
<p>But please tell me I’m wrong.</p>