Wednesday, March 22, 2017

Encoding Animated GIFs from a WebRTC Video Stream

We recently released a project for a client where we integrated WebRTC video chat.  The goal was to make an app for both Android and iOS that could cross-connect to run a simple multiplayer game alongside a live video stream.  The details of getting WebRTC up and running on both platforms are for another post; here I'm going to focus on one specific client request for this project: recording video.

For reference, a lot of the original research and experimentation was carried out with Pierre Chabardes' AndroidRTC project and Gregg Ganley's implementation of Google's AppRTC demo.  We used the most recent versions of libjingle_peerconnection at the time of development (Android (Maven): 11139, iOS (CocoaPods): 11177.2.0), which are not actually the most recent WebRTC sources.

Originally, we discussed going the whole nine yards: having a running video buffer with maybe the last minute or so of video, including sound, that we could save off as an H.264 MP4 or some such.  Unfortunately, WebRTC delivers video and audio as two separate streams, and the SDKs for Android and iOS don't easily expose the audio stream.

For the sake of development time, we decided to restrict our video recording to simple animated GIFs.  Even though this was a vast simplification, it still proved to be a large development headache, especially on Android.  On iOS, at least, StackOverflow has some pretty straightforward answers, like this one from rob mayoff.  It was just a matter of getting things threaded and then we were off and running.

Actually, before I get to the GIF encoding, let me take a step back: where are the frames we're going to use coming from?  On both platforms, WebRTC has a public interface that feeds a series of custom I420Frame objects from the backend streaming to the frontend rendering.  The I420Frames are really just YUV images.  Documentation is light, but we were able to dig through the WebRTC source, at least.  For Android, we have the VideoRenderer, which contains both the I420Frame class definition and the VideoRenderer.Callbacks interface, which is what actually gets handed a frame.  On the iOS side, we have the RTCVideoRenderer, which has a renderFrame method that can be overridden to get at the I420Frame (in this case called RTCVideoFrame).  More practically, the UIView you would actually use is an RTCEAGLVideoView, which can simply be subclassed so you can grab the I420Frame when renderFrame is called.

Android is, again, trickier.  When you receive a new remote video stream from WebRTC, you need to have a VideoRenderer.Callbacks implementation wrapped in a VideoRenderer object that you apply to the stream.  The Android SDK provides a helper class (org.webrtc.VideoRendererGui) with static methods to create VideoRenderer.Callbacks implementations that can draw to a GLSurfaceView.  However, this implementation doesn't really play nice with inheritance like things do on iOS.  Fortunately, you can add multiple renderers to a video stream.  So we created our own implementation of VideoRenderer.Callbacks, wrapped it in a VideoRenderer, and added and removed it from the remote video stream as needed.  Now renderFrame would be called on it, and we had access to the I420Frame.  NOTE: We discovered we had to call VideoRenderer.renderFrameDone() at the end of renderFrame to clean things up.  The WebRTC SDK creates a separate I420Frame object for each video renderer, and each is responsible for its own cleanup.  Otherwise, you'll end up with a mysterious memory leak.
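For reference, here's a minimal sketch of that capture renderer.  GifFrameCollector is a hypothetical stand-in for whatever buffer you copy frames into, and the exact shape of the Callbacks interface varies a bit between libjingle builds, so you may need to stub additional methods:

    import org.webrtc.VideoRenderer;
    import org.webrtc.VideoRenderer.I420Frame;

    // Hypothetical frame sink; stands in for whatever you copy frames into.
    interface GifFrameCollector {
        void copyFrame(I420Frame frame);
    }

    // Minimal capture renderer, per the approach described above.
    public class FrameCapturer implements VideoRenderer.Callbacks {
        private final GifFrameCollector collector;

        public FrameCapturer(GifFrameCollector collector) {
            this.collector = collector;
        }

        @Override
        public void renderFrame(I420Frame frame) {
            // Copy what we need before releasing the frame.
            collector.copyFrame(frame);
            // Each renderer gets its own I420Frame and is responsible for
            // cleaning it up; skipping this leaks memory.
            VideoRenderer.renderFrameDone(frame);
        }
    }

    // Usage, per the text above:
    //   VideoRenderer renderer = new VideoRenderer(new FrameCapturer(collector));
    //   remoteStream.videoTracks.get(0).addRenderer(renderer);
    //   ...and removeRenderer(renderer) when recording stops.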

So all of that is done, and now we're getting I420Frame objects as they're sent over the remote video stream, which we can copy to a local streaming buffer, data store, or whatever you like for later.  But again, these are YUV images, not typical RGB, which means they need to be converted before they can actually be encoded using any sort of standard GIF library.  On iOS, this is comparatively easy.  Google developed a YUV converter that lives in the WebRTC library, and we can just use that.  We grabbed the header files, and then we could just use the various functions to copy frames (libyuv::I420Copy) and convert to RGB (libyuv::I420ToABGR).  Note the swapped order of ABGR.  iOS image generation expects RGBA, but empirical testing showed that the endianness was swapped, and converting with ABGR on the WebRTC side resulted in correctly ordered bytes when fed to iOS libraries.  StackOverflow again has answers for getting a usable UIImage out of a byte array, such as this one by Ilanchezhian and Jhaliya.

As is a running theme here, Android was not so easy.  Technically, it has the same YUV converter buried in the native library, but we're operating in Java, and things are not easily exposed at that level.  It turned out to be way easier to write a YUV converter class than to try to get at the internal conversion utility.  Starting from this StackOverflow answer by rics, we created YuvFrame.java, which we've posted here.
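For illustration, here's a simplified sketch of the kind of per-pixel conversion YuvFrame performs, using the standard YUV-to-RGB coefficients.  It assumes tightly packed I420 planes; the real class also has to honor each plane's stride, which WebRTC frames generally carry:

    // Simplified I420 -> ARGB conversion, assuming tightly packed planes.
    public final class I420Converter {

        public static int[] i420ToArgb(byte[] y, byte[] u, byte[] v, int width, int height) {
            int[] argb = new int[width * height];
            for (int row = 0; row < height; row++) {
                for (int col = 0; col < width; col++) {
                    int yVal = y[row * width + col] & 0xFF;
                    // U and V are subsampled 2x2 in I420.
                    int uvIndex = (row / 2) * (width / 2) + (col / 2);
                    int uVal = (u[uvIndex] & 0xFF) - 128;
                    int vVal = (v[uvIndex] & 0xFF) - 128;
                    // Standard YUV -> RGB conversion coefficients.
                    int r = clamp((int) (yVal + 1.402f * vVal));
                    int g = clamp((int) (yVal - 0.344f * uVal - 0.714f * vVal));
                    int b = clamp((int) (yVal + 1.772f * uVal));
                    argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
                }
            }
            return argb;
        }

        private static int clamp(int c) {
            return c < 0 ? 0 : (c > 255 ? 255 : c);
        }
    }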

Finally, we're at the point of actually saving the collection of WebRTC video frames to an animated GIF.  I discussed the iOS method earlier.  I also leave it as an exercise to the reader to record the variable framerate of the video stream and apply the frame timing reasonably to the animated GIF.  The main discussion is once again Android.

We started out with a Java-based GIF encoder with high color accuracy.  This got the job done well, but it had a drawback: on somewhat older devices, like the Nexus 5, encoding 2 seconds of video at 10fps with 480x480px frames (20 of them) could take upwards of 3 minutes to complete (though to be fair, with lots of background processes closed and a fresh boot, it could be down to 1 minute 15 seconds).  Either way, this was unacceptable.  All our tests on iOS, even with an older iPhone 5, showed higher quality encoding finishing in 10-15 seconds.  Step one was to increase the thread priority, since we were using an AsyncTask, which defaults to background thread priority and gets maybe 10% of the CPU.  Bumping this up to normal and even high priority got us around a 40% speed increase.  That's a lot, and given that the majority of phones have multiple CPU cores, it didn't affect the video stream performance.  However, our actual target was a 6 second animated GIF at 15fps, which means 90 frames to encode.  The next step was to dig up an NDK-based GIF encoder.  This got us a further speed increase, and we were looking at just over a minute for the full 90 frame encode.
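The priority bump itself is nearly a one-liner.  Here's a sketch of what it looks like inside an AsyncTask, where encodeFrames is a hypothetical stand-in for the actual encode loop:

    import android.os.AsyncTask;
    import android.os.Process;

    // AsyncTask workers run at background priority by default; raising the
    // priority inside doInBackground gives the encoder a much bigger CPU share.
    class EncodeGifTask extends AsyncTask<Void, Void, Void> {
        @Override
        protected Void doInBackground(Void... params) {
            // THREAD_PRIORITY_DEFAULT is normal priority; THREAD_PRIORITY_DISPLAY
            // is one of the higher levels an app can request.
            Process.setThreadPriority(Process.THREAD_PRIORITY_DEFAULT);
            encodeFrames();  // hypothetical stand-in for the actual encode loop
            return null;
        }

        private void encodeFrames() { /* ... */ }
    }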

I instrumented the whole encoding process, and there were two major time sinks: creating a color palette for each frame, and converting the frame to that palette.  The former was maybe 20% of the frame encode time, while the latter was 70-75%.  I played around a bit with global color palettes and with only generating a new color palette every few frames.  Global palettes caused a pretty bad quality reduction in certain cases, but when I generated the color palette once every 5 frames and stored it for the intervening frames, I got a decent amount of speed back without a serious drop in quality.  Still, this only sped up the smaller of the two time sinks.  Actually going through all the pixels of each frame and finding the best match in the color palette remained the most intensive part.
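In sketch form, the palette reuse looks something like this, where buildPalette and mapToPalette are hypothetical stand-ins for the encoder's quantizer and pixel-mapping steps:

    // Regenerate the color palette only every few frames and reuse it for
    // the frames in between, trading a little quality for a lot of speed.
    class PaletteCache {
        private static final int PALETTE_INTERVAL = 5;
        private int[] cachedPalette;

        void encodeFrame(int[] pixels, int frameIndex) {
            if (frameIndex % PALETTE_INTERVAL == 0) {
                cachedPalette = buildPalette(pixels);  // hypothetical quantizer
            }
            mapToPalette(pixels, cachedPalette);       // hypothetical palette mapping
        }

        private int[] buildPalette(int[] pixels) { /* ... */ return new int[256]; }
        private void mapToPalette(int[] pixels, int[] palette) { /* ... */ }
    }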

I can't say I came up with the idea (that credit belongs to Bill), but I did implement our final solution: we multi-threaded the process of palettizing the frames.  We checked the device to see how many CPU cores it had (combining StackOverflow answers from both David and DanKodi), then set the encoding thread count to one less than that (so the video stream keeps running).  We segmented each frame by rows into however many threads we had to work with, and palettized the segments concurrently.  Now you may be asking: what about dithering?  Strictly speaking, this method produces a slightly lower quality frame, because we can't dither across segment boundaries quite the same way.  We dithered each segment as normal, but each later segment had to start from the un-dithered last row of the segment above it, which on its own would leave artifacts along the lines between segments.  So after all the threads were done, we did one more custom dithering pass along the boundary lines, using the final dithered values from the previous segment to update the first row of the next one.  This pretty much smoothed out all the noticeable artifacts.
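To illustrate the scheme (the real implementation lives in the NDK encoder, not Java), here's a sketch of the segmentation; palettizeRows and reDitherBoundaryRow are hypothetical stand-ins for the per-segment work and the boundary cleanup pass:

    import java.util.concurrent.CountDownLatch;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    // Split the frame into row segments and palettize them concurrently,
    // leaving one core free so the video stream keeps running.
    public final class ParallelPalettizer {

        public static void palettize(final int[] frame, final int width, final int height)
                throws InterruptedException {
            int threads = Math.max(1, Runtime.getRuntime().availableProcessors() - 1);
            final int rowsPerSegment = (height + threads - 1) / threads;
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            final CountDownLatch done = new CountDownLatch(threads);
            for (int t = 0; t < threads; t++) {
                final int startRow = t * rowsPerSegment;
                final int endRow = Math.min(height, startRow + rowsPerSegment);
                pool.execute(new Runnable() {
                    @Override
                    public void run() {
                        // Later segments dither against the un-dithered row
                        // above them; the boundary pass cleans that up.
                        if (startRow < endRow) {
                            palettizeRows(frame, width, startRow, endRow);
                        }
                        done.countDown();
                    }
                });
            }
            done.await();
            pool.shutdown();
            // Re-dither each boundary row now that the segment above it
            // holds its final dithered values.
            for (int row = rowsPerSegment; row < height; row += rowsPerSegment) {
                reDitherBoundaryRow(frame, width, row);
            }
        }

        private static void palettizeRows(int[] frame, int width, int start, int end) { /* ... */ }
        private static void reDitherBoundaryRow(int[] frame, int width, int row) { /* ... */ }
    }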

We forked Wayne Jo's android-ndk-gif project with this new encoding method.  This got us yet another 40% increase in encoding speed, bringing us under 40 seconds on average to encode 90 frames on an old Nexus 5, which we deemed acceptable.  On a modern phone, this actually results in faster speeds than we saw on iOS.

In conclusion, I have failed to talk about other potentially useful pieces of this whole puzzle, including saving animated GIFs to the Android image gallery, saving animated GIFs to the iOS PhotoLibrary, getting WebRTC connections to persist across Android screen rotations, and the whole thing where we actually got the Android app and the iOS app to connect to each other.

Tuesday, March 7, 2017

Bulk Updating Google Play IAPs with a CSV File

I just spent way too long getting a CSV file formatted in such a way that Google Play's bulk in-app-purchase import would accept it.  In theory this is easy, but Google Play's example formatting is not so great.  Since I couldn't find a nice simple example template, I figured I'd provide one here.

Here's an XLS file, and here's a CSV.  I exported as UTF-8, with all text fields quoted and fields separated by commas.

Also, don't forget: Descriptions can't be longer than 80 characters!

Sunday, November 6, 2016

FerroDynamics



A physics puzzle game that's something of a spiritual sequel to our earlier TransPlan, FerroDynamics challenges the player to design their own mechanical environment!  Starting with a clean, minimalist UI for planning your machine, FerroDynamics builds a beautifully rendered 3D model in the background and shows you the results when you push the play button!

We've been working on this on and off for a while, and after finally settling on a name have created a Greenlight listing!  We'd love to get your vote, so we can take the last few steps to shipping a finished product.

FerroDynamics is currently very playable, with a couple of dozen levels designed and plenty of components with different shapes and effects.  In addition to gates, ramps, tracks, and other static elements, we also have switches and buttons, which can be wired to turn on and off features like electromagnets, lasers, and more.  All of this is represented in the resultant 3D view!

We've had good reactions to our earlier titles, and look forward to maintaining those high standards here.  Please give us a thumbs up over at our Greenlight listing!





Friday, August 5, 2016

Anode coming to XBox One

We've been keeping busy despite the silence on our web page here, and one of the upcoming results is Anode appearing on the XBox One!  It goes on sale August 10th for the same price as Steam: $3.99.

I've been busy the last few days getting preview keys sent out to folks and the early response has been very good.  I'm hopeful that the game does well on console, where the couch multiplayer should find an easier home than Steam.

By far the biggest wrinkle in bringing the game over was dealing with controller-specific profiles, which caused a surprising amount of trouble and were never an issue on the previous platforms.  They have their benefits though, as achievements like Anomaly mean a little more when each controller has its own account.  All the achievements and leaderboards made the transition, and the game looks better than ever.  We hope XBox One owners enjoy it.



Saturday, December 12, 2015

Anode Steam update

We've just pushed an update live for Anode that adds support for Turkish! This translation is courtesy of the folks over at OyunCeviri, who have a project dedicated to getting more games in their local language. Enjoy!

Check it out here!

Tuesday, October 27, 2015

Anode, Gentoo Linux, and Unity3D

One of our customers for Anode attempted to run the game on their Gentoo Linux machine and found that it immediately crashed.  Unity did not provide a very helpful stack trace for us to dig into.

It took almost a week and a half of research, but we finally narrowed down what the problem was.  Our customer who discovered the problem was incredibly helpful during the process.

Suffice it to say, when we built Anode using the most current version of Unity we had available (5.2.1f1, also tested more recently with 5.2.2f1 and 5.2.2p1) and had sound assets referenced in the scene file, the game would crash, but only when run on Gentoo Linux through the Steam client.  If we ran it via the command line or a desktop shortcut, the game was fine.  We also confirmed this with a simple Unity project started from scratch.

One symptom we noticed came during the Unity build process, which would warn "Failed to add the asset file size for sharedassets0.resource".  If we referenced a sound asset, this warning would appear, and we'd get the crash on Gentoo.  If we removed the sound reference, the warning would not appear, and the game would not crash.  This also happened when we tried to include sound assets in the Resources folder rather than through a direct scene reference (though in that case the warning was "Failed to add the asset file size for resources.resource").

This Unity forum thread talks a lot about the warning, but in the end concludes that it's harmless; in fact, Patch 4 for Unity 5.2.1 suppresses the warning message during the build.  We had seen this warning earlier during the development of the game, and it did indeed seem harmless, since the game ran fine on Windows, Mac, and Ubuntu Linux.  However, even with 5.2.1p4 suppressing the warning, we still found the same Gentoo crash with the builds it made.

We currently have an issue submitted to Unity about this, and will keep an eye on it.