Badass JavaScript

A showcase of awesome JavaScript that pushes the boundaries of what's possible on the web, by @devongovett.

WebKit.js: It’s happening for real, with Emscripten’s help

January 16th 2014

Remember that April Fools’ post I wrote a couple of years ago about WebKit.js? The one claiming someone had ported the entire WebKit rendering engine to JavaScript so you could browse while you browse? It was going to solve all our compatibility woes. That one was vaporware, but now it’s actually happening for real: Trevor Linton is porting WebKit to JavaScript via Emscripten.  The real WebKit.js is here.

It’s an early build and there’s no demo yet, but you can go see the massive output file it produced, which means it actually does compile.  The goals of the project are:

  • Produce a renderer in pure JavaScript that supports rendering to WebGL/Canvas contexts.
  • Develop frameworks for image/webpage capturing and automated web testing (similar to PhantomJS).
  • Develop a framework for prototyping CSS filters, HTML elements and attributes.
  • Experimental harness for pure javascript rendering performance.
  • Develop a javascript based browser in nodejs (just for fun).

Those are interesting goals. I have no idea how realistic or practical they are, but it’s a great showcase of the power of Emscripten to compile just about anything these days.

A lot isn’t working yet, but it’s interesting to see the vaporware of two years ago become reality today.  Go check it out on Github, and try your hand at getting various WebKit features working.

glsl-transition: iPhoto-like slideshow transitions using WebGL

January 8th 2014

glsl-transition is a project from Gaëtan Renaudeau which provides a very flexible and extensible way to use WebGL shaders to create iPhoto-like slideshow transitions.  It has a promise-based API, supports easing functions, and works with any GLSL fragment shader you can write.


The project is from the same author as slider.js, a general-purpose slideshow framework for JavaScript.  slider.js supports WebGL transitions as well as Canvas 2D and CSS transitions, in an extensible way, and ships with a whole bunch of built-in transitions.  I’m not sure if glsl-transition is planned to be used by slider.js (it isn’t at the moment); it obviously only supports WebGL transitions and is a bit lower level, though it still has a pretty nice API.  glsl-transition provides no built-in transition effects, but there are a number of examples in the Github repo.

It’s clearly meant to be an extensible library, making it fairly easy to write your own transition effects.  You just have to implement a WebGL fragment shader with a couple of special uniforms that the library fills in for you: the starting image, the ending image, the progress in time, and the resolution.  The library then renders the transition with your shader as needed, controlled by a timing function.
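To make that concrete, here’s what a minimal cross-fade transition might look like, written as a JavaScript string you’d hand to the library. The uniform names (from, to, progress, resolution) follow the description above, but treat this as a sketch rather than the library’s documented API:

```javascript
// A hypothetical cross-fade transition for glsl-transition: the library
// is assumed to fill in the from/to textures, the progress (0 to 1),
// and the canvas resolution before each frame is rendered.
var crossFade = [
  'precision mediump float;',
  'uniform sampler2D from, to;',
  'uniform float progress;',
  'uniform vec2 resolution;',
  'void main() {',
  '  vec2 p = gl_FragCoord.xy / resolution.xy;',
  '  gl_FragColor = mix(texture2D(from, p), texture2D(to, p), progress);',
  '}'
].join('\n');
```

The mix() call is the whole effect: at progress 0 you see the starting image, at 1 the ending image, and anything in between is a blend.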

glsl-transition supports not just bezier easing functions, but any function that takes a parameter t representing linear time and returns an eased value.  If you want a bezier curve, there is a library for that which returns compatible easing functions and includes some commonly used ones built in.  If you’re interested in how this works, the author wrote a really good article about implementing bezier curve easing functions.  Apparently it was good enough that even Apple is using the code from that article on their Mac Pro website.
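In other words, an easing function here is just a plain function from linear time t in [0, 1] to an eased value. These particular easings are common examples, not taken from the library:

```javascript
// Any function of this shape works as an easing: take linear time t,
// return the eased progress.
function linear(t) { return t; }
function easeInQuad(t) { return t * t; }       // starts slow, ends fast
function easeOutQuad(t) { return t * (2 - t); } // starts fast, ends slow

console.log(easeInQuad(0.5));  // 0.25
console.log(easeOutQuad(0.5)); // 0.75
```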

I’m glad to see WebGL taking off.  We still have a little way to go on mobile, but WebGL support is now fairly widespread on modern desktop browsers with even Microsoft supporting it in IE11. Only Safari is left with it turned off by default.

You can check out glsl-transition on Github, and a nice demo with lots of example transitions.

ogv.js: An Ogg Theora and Vorbis Video Decoder in JavaScript

January 2nd 2014

Brion Vibber has been working on ogv.js, an Ogg video player in JavaScript, supporting both audio and video. We’ve seen video codecs in JavaScript before, such as Broadway (H.264), Route9 (WebM/VP8), and more, but mostly without audio support to go along with them. We have audio codecs in JS too, just not combined with video yet. That changes with ogv.js.

You can check out the demo, which runs in all modern browsers but works best in Firefox from what I’ve seen, thanks to asm.js optimizations.  I got about 23 FPS at about 40% CPU usage.  Of course, Firefox and Chrome already support Ogg video natively, so the primary targets of the project are Safari 6+ and IE 10+.  ogv.js is an Emscripten compile of libogg, libtheora, and libvorbis.  It’s actually a fork of a project I started to bring Ogg Vorbis audio to the Aurora.js suite of audio codecs.  However, at this point, I’d be hard pressed to find any of my code in the project.

The video decoding process actually starts with a streaming HTTP implementation, which allows videos to start playing before the entire file has been downloaded.  Unfortunately, there isn’t a good cross-browser streaming XHR API, so the project uses a combination of Microsoft’s MSStreamReader API for IE 10+ (hopefully going to become standardized?), Firefox’s proprietary moz-chunked-arraybuffer, and binary strings (ouch!) for Chrome and Safari.  This greatly improves perceived performance, since decoding can happen while the file is being downloaded.
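For a sense of why the binary-string path earns that “ouch,” here’s a hedged sketch of unpacking an XHR responseText chunk into bytes; the function name is made up for illustration and isn’t taken from ogv.js:

```javascript
// The binary-string fallback: an XHR with
// overrideMimeType('text/plain; charset=x-user-defined') delivers bytes
// as a string, one character per byte, which must be unpacked by hand.
function binaryStringToBytes(str) {
  var bytes = new Uint8Array(str.length);
  for (var i = 0; i < str.length; i++) {
    bytes[i] = str.charCodeAt(i) & 0xff; // keep only the low byte
  }
  return bytes;
}
```

Every downloaded chunk has to be copied character by character like this, which is why the dedicated streaming APIs in IE and Firefox are so much nicer.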

Once the video stream has been received, it is fed into the C wrapper around libogg, libtheora, and libvorbis for decoding.  Then the audio goes to the Web Audio API (which means no IE support at this point), and the video is rendered in a canvas element.  At the moment, the project uses a simple 2D canvas for rendering pixels, but the readme says a WebGL implementation could be used in the future.  I’ve seen huge performance increases (up to 20%) from moving the colorspace conversion (the last step of decoding) to the graphics hardware, which can do it in parallel, so I’m looking forward to seeing this.
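As background on that last step, here is the standard BT.601 YUV-to-RGB conversion done per pixel in JavaScript; this is exactly the kind of independent per-pixel math a WebGL shader can do in parallel. Function names are illustrative, not from ogv.js:

```javascript
// Per-pixel BT.601 YUV -> RGB conversion, the final decode step that
// could be moved to the GPU. Constants are the standard BT.601 ones.
function clamp8(x) {
  return Math.max(0, Math.min(255, Math.round(x)));
}

function yuvToRgb(y, u, v) {
  var c = y - 16, d = u - 128, e = v - 128;
  return [
    clamp8(1.164 * c + 1.596 * e),             // R
    clamp8(1.164 * c - 0.392 * d - 0.813 * e), // G
    clamp8(1.164 * c + 2.017 * d)              // B
  ];
}

console.log(yuvToRgb(16, 128, 128));  // black: [ 0, 0, 0 ]
console.log(yuvToRgb(235, 128, 128)); // white: [ 255, 255, 255 ]
```

Doing this in a fragment shader means the GPU converts every pixel of a frame at once instead of the CPU looping over them one at a time.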

Parallel decoding in web workers is also being considered, since decoding both the video and audio on the main thread can make for some stutters on slower machines (read: mobile).  Support for seeking is also on the roadmap.

Video decoding in JavaScript is an interesting problem, and not something one would have expected a few years ago.  But people keep trying it, and with each attempt, we get better.  JavaScript performance keeps increasing, especially with things like asm.js and the new SIMD API (more on that soon!).  One by one we’re checking off the things on this list.  Who knows how practical it really is, especially since HD video or anything large still basically kills any system, and on mobile it can drain your battery really quickly.  But it’s a good benchmark for performance, and it’s darn cool anyway!

You can check out ogv.js on Github, and a demo as well.

P.S. I hope you like the new site design, it should be much more readable on mobile thanks to the responsive layout. Let me know what you think!

TraceGL: A JavaScript Codeflow Visualization and Debugging Tool using WebGL

April 23rd 2013


Rik Arends has just released TraceGL, an interesting JavaScript codeflow debugging tool using WebGL for its UI rendering.  Described as “an oscilloscope, for code”, TraceGL is an improvement on the familiar step debuggers that can be found in browser dev tools like Chrome, Firebug, and now Firefox itself.

TraceGL works by instrumenting all of your code so it knows when calls took place, and all of the boolean logic that determined which code path to take.  Then it visualizes all of this, using WebGL for performance, showing you a high level overview called the “mini map” in the top left, a log of function calls in the top right, the call stack in the bottom left, and finally the code for the function in the bottom right.
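A toy version of that instrumentation idea, wrapping a single function instead of rewriting source the way TraceGL does (all names here are made up for illustration):

```javascript
// Record every call to a wrapped function: name, arguments, and return
// value. TraceGL instruments whole programs this way (by rewriting the
// source), then visualizes the resulting trace in real time.
var trace = [];

function instrument(name, fn) {
  return function () {
    var entry = { name: name, args: Array.prototype.slice.call(arguments) };
    trace.push(entry);
    entry.result = fn.apply(this, arguments);
    return entry.result;
  };
}

var add = instrument('add', function (a, b) { return a + b; });
add(1, 2);
console.log(trace[0].name, trace[0].result); // add 3
```

The hard parts TraceGL solves beyond this sketch are instrumenting branches (the boolean logic mentioned above) and doing it for all code automatically, without you wrapping anything by hand.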

As your code runs, TraceGL visualizes all of this data in real time.  The mini map is useful to see the ebbs and flows of the code, i.e. where the stack gets deeper and shallower again.  In this way, you can see where events are being processed, like mouse or keyboard events in the browser, or HTTP requests in a Node.js application, and then get to a section of the potentially very long call stack very quickly.  TraceGL even works over asynchronous events, unlike most step debuggers, which means that these operations are still shown as part of a single call stack under their originating calls, rather than as separate events.

Here is a video showing TraceGL in use:

TraceGL can instrument both browser-based and Node.js applications, and integrates with various editors so that double clicking a line can open your favorite editor.  An interesting aspect of the UI is that it is written entirely using WebGL, apparently for performance reasons.  Of course, all of the text rendering (most of the UI) must have been done in a 2D canvas and then uploaded to WebGL as a texture, since WebGL has no native text rendering capabilities, but clever rendering tricks like only re-rendering what has changed can make things fast.  And once the textures are on the GPU, moving them around, scaling them, etc. using shaders is very fast.

I think we’re probably going to see more and more WebGL user interfaces soon.  We’ve seen a lot of 3D stuff written on top of WebGL, and it is certainly good for that, but I’m betting that normal 2D user interfaces on the web will start being written with it too, thanks to its great performance characteristics.  HTML and CSS are great for documents and applications, to a point, but for web apps to compete with native on performance, hardware-accelerated UIs on top of WebGL will be important.

Of course, building user interfaces using WebGL means that any text rendering won’t be selectable, copyable, or accessible to screen readers without lots of additional work, so I can see frameworks being developed to facilitate this.  I’ve already been working on and off on something similar to Apple’s Core Animation framework on top of WebGL (not public yet), and other interesting 2D frameworks like Pixi.js have been released recently.  Especially with WebGL’s likely support in Internet Explorer 11, I think the age of WebGL user interfaces is upon us, and it’s exciting!

You can check out TraceGL on their website.  It costs $15 to buy, but not all good tools are free and it’s nice to support good developers, so give it a shot and let me know what you think in the comments!

Link: Excellent Article Clarifying Mozilla’s asm.js Project by John Resig

April 3rd 2013

John Resig has written an excellent article clarifying some of the questions that have arisen over the past few weeks regarding Mozilla’s asm.js project. If you haven’t heard of it already, asm.js is a highly optimizable subset of JavaScript designed mostly for compilers like Emscripten. I wrote an article about it a few weeks ago when the spec was first released right here on this very blog, so check that out if you want some more introductory details.
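To give a flavor of the subset, here’s a tiny hand-written asm.js-style module. Real asm.js is almost always compiler output, so treat this as an illustration of the annotations (the "use asm" directive and the |0 integer coercions) rather than idiomatic usage:

```javascript
// A minimal asm.js-style module: "use asm" plus |0 coercions give the
// engine enough static type information to compile the body ahead of
// time. Hand-written for illustration; Emscripten would normally emit
// this kind of code. In engines without asm.js it runs as plain JS.
function AdderModule(stdlib, foreign, heap) {
  "use asm";
  function add(a, b) {
    a = a | 0; // coerce argument to a 32-bit integer
    b = b | 0;
    return (a + b) | 0; // result is also a 32-bit integer
  }
  return { add: add };
}

var adder = AdderModule({}, {}, new ArrayBuffer(0x10000));
console.log(adder.add(20, 22)); // 42
```

Because every value’s type is statically knowable from coercions like these, an asm.js-aware engine can skip most of the dynamic machinery ordinary JavaScript needs.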

John’s article talks about some of the use cases for asm.js, some of the common misconceptions about it, and finally includes a question and answer section with Mozilla’s compiler engineer David Herman, who is one of the authors of the asm.js specification. It’s definitely a good read, so check it out!

I think asm.js will be really important over the coming months and years, and I’m excited to see other browser vendors already getting on board. I got even more excited about it when I saw Mozilla and Epic Games’ demo showing the Unreal Engine running in the browser at very good performance, thanks to Emscripten and asm.js last week.

I’m looking forward to trying out asm.js myself very soon as well, especially for the JavaScript audio codecs that I worked on as part of Audiocogs (née Official.fm Labs). Unfortunately, asm.js isn’t really designed for human authors, so it would be a very big task to convert one of our existing codecs to use it. However, it is very promising for new ports, and I’ve already started playing around with using Emscripten to compile libogg and libvorbis to JavaScript to use with the Aurora.js framework. It will be interesting to see the performance and code size differences between the hand ports we’ve done and the Emscripten-generated ones. Now I just have to find the time to actually do it! :)