Tag Archives: WebGL

Video: Rendering HTML via WebGL


Recently the web development community has had a big discussion around the “DOM is slow” topic, and the claim holds up: the DOM is a fairly complex model in which every modification can set off a chain reaction of events and recalculations across the document. HTML GL tackles “the slow DOM problem” by creating WebGL representations of DOM elements and hiding the actual DOM behind them. This speeds up HTML/CSS animations and transformations through 3D hardware acceleration and makes it possible to apply the kind of OpenGL effects found in modern 3D games.
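The core idea is easier to see in code: take a snapshot of a DOM element, turn it into a GPU texture, and from then on animate that texture with WebGL instead of touching layout. The sketch below is a rough illustration of that technique using standard browser APIs (SVG foreignObject rasterization plus a plain WebGL texture upload); it is not HTML GL's actual implementation or API, and it ignores the fonts, images and cross-origin details a real library has to handle.

```typescript
// Conceptual sketch of the "DOM as WebGL texture" idea (not HTML GL's API):
// rasterize an element to an image via an SVG <foreignObject>, then upload
// it as a texture that a WebGL scene can transform on the GPU.

function rasterizeElement(el: HTMLElement): Promise<HTMLImageElement> {
  const { width, height } = el.getBoundingClientRect();
  const xhtml = new XMLSerializer().serializeToString(el);
  const svg =
    `<svg xmlns="http://www.w3.org/2000/svg" width="${width}" height="${height}">` +
    `<foreignObject width="100%" height="100%">${xhtml}</foreignObject></svg>`;
  return new Promise((resolve, reject) => {
    const img = new Image();
    img.onload = () => resolve(img);
    img.onerror = reject;
    img.src = "data:image/svg+xml;charset=utf-8," + encodeURIComponent(svg);
  });
}

async function uploadAsTexture(
  gl: WebGLRenderingContext,
  el: HTMLElement
): Promise<WebGLTexture> {
  const image = await rasterizeElement(el);
  const texture = gl.createTexture()!;
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Flip Y so the DOM snapshot is not drawn upside down.
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  // Settings that work for non-power-of-two textures (typical DOM sizes).
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  return texture;
}
```

Once the snapshot lives on the GPU, moving, scaling or applying shader effects to it no longer touches layout at all, which is where the speedup comes from.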

In this talk:

– The “Slow DOM problem”
– Possible solutions
– DOM optimization vs. alternative rendering approaches (React-canvas, the Netflix methodology)
– Seeking an ideal solution
– Rendering content via WebGL using HTML GL
– Limitations and recommendations
– Where to go next?

Experiment with ECMAScript 6 on BabylonJS with TypeScript 1.5


via Experiment with ECMAScript 6 on Babylon.js with TypeScript 1.5.

This article is part of a web development series from Microsoft. Thank you for supporting the partners who make SitePoint possible.

Since releasing babylon.js, the open-source WebGL gaming framework, a couple of years ago, we (with help from the community) have been constantly exploring ways to make it even better. I’m definitely more than happy that we decided, more than a year ago, to switch over to TypeScript. For more on that decision, read why we decided to move from plain JavaScript to TypeScript for Babylon.js.

Thanks to TypeScript, we’ve been able to improve the quality of our code, improve our productivity and create the fabulous Playground we’re so proud of: http://www.babylonjs-playground.com/, which provides auto-completion in the browser! We’ve also been able to welcome, with no pain, new team members coming from a C# background with few JavaScript skills. But thanks to the TypeScript compiler, we can also test the future without rewriting a single line of code!

We are still coding babylon.js using Visual Studio and TFS, while regularly pushing our code to the GitHub repo. By upgrading our project to Visual Studio 2015 RTM, we’ve been able to upgrade it to TypeScript 1.5.

Creating an Accessible Breakout Game Using Web Audio and SVG


via Creating an Accessible Breakout Game Using Web Audio and SVG.

This article is part of a web development series from Microsoft. Thank you for supporting the partners who make SitePoint possible.

As the co-author of Babylon.js, a WebGL gaming engine, I’ve always felt a little uneasy listening to folks discuss accessibility best practices at web conferences. The content created with Babylon.js is indeed completely inaccessible to blind people. Making the web accessible to everyone is very important, and I’m more convinced of that than ever, being personally affected through my own son. So I wanted to contribute to the accessibility of the web in some way.

Build a High-Performance Mobile App With Famo.us and ManifoldJS


via Build a High-Performance Mobile App With Famo.us and Manifold.js – Tuts+ Code Tutorial.

For the last few months I’ve been wanting to dive into this new JavaScript framework, ever since I saw its launch event in October 2014. Famo.us includes an open-source 3D layout engine fully integrated with a 3D physics animation engine that can render to DOM, Canvas, or WebGL. In short, you can get native performance out of a web application, largely because of the way Famo.us handles the rendering of its content.

Jenn Simmons of the Web Platform Podcast recently had Famo.us CEO Steve Newcomb on the podcast to discuss mobile performance and their upcoming mixed mode. This was perfect timing, as Microsoft had just released ManifoldJS, a tool which allows you to package your web experience as native apps across Android, iOS, and Windows. I wanted to put these two technologies to the test.

In short, I wanted to determine whether Famo.us really does deliver great mobile performance, and to get a sense of how straightforward it is to package my web application as a mobile app.

Creating PIXI.js filters using WebGL


via Creating PIXI.js filters using WebGL | Tizen Developers.

PIXI.js 3.0 is a great library for creating 2D canvas games and animations. It’s one of the most widely used canvas renderers on the web. It also supports WebGL rendering, thanks to which most operations are executed by the GPU instead of the CPU. It comes with a nice filters feature that has great possibilities and is quite easy to use and extend. In this article we will focus on how to create a custom filter using fragment shaders.
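To make the idea concrete, here is a minimal sketch of a custom grayscale filter driven by a fragment shader. It assumes the PIXI v3-style `PIXI.AbstractFilter(vertexSrc, fragmentSrc, uniforms)` constructor and the conventional uSampler/vTextureCoord names; the exact API differs between PIXI versions, so treat it as an outline rather than a drop-in snippet.

```typescript
declare const PIXI: any; // pixi.js loaded globally via a <script> tag

// Fragment shader: sample the object's texture (uSampler) at the
// interpolated coordinate (vTextureCoord) and blend towards grayscale.
const fragmentSrc = `
  precision mediump float;
  varying vec2 vTextureCoord;
  uniform sampler2D uSampler;
  uniform float strength;

  void main(void) {
    vec4 color = texture2D(uSampler, vTextureCoord);
    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(mix(color.rgb, vec3(gray), strength), color.a);
  }
`;

// Passing null for the vertex shader falls back to PIXI's default one;
// the uniforms object exposes `strength` to JavaScript for tweaking.
const grayscale = new PIXI.AbstractFilter(null, fragmentSrc, {
  strength: { type: '1f', value: 1.0 },
});

// Usage: attach the filter to any display object, e.g.
// sprite.filters = [grayscale];
```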

Introducing Four: It’s WebGL, but Easier


via Introducing Four: It’s WebGL, but Easier.

WebGL has been around for a few years now, and we have watched it mature into the reliable and widely supported graphics technology it is today. With big companies like Google, Mozilla, and Microsoft advocating for its use, it’s hard not to be curious about it.

Since its specifications were finalized in 2011, it has gained a lot of traction. With the help of frameworks like ThreeJS, BabylonJS and Play Canvas, the field has become less daunting. Thanks to them it’s much easier to pick up, but it still requires a good learning effort, as it is a different discipline altogether.

This article will briefly introduce you to what WebGL is and then I’ll cover Four, a framework I created to help developers delve quickly into the WebGL world. In case you want to see what Four and WebGL can do for you, take a look at this simple demo I built.

What Do You Mean by “Shaders”? How to Create Them with HTML5 and WebGL


via What Do You Mean by “Shaders”? How to Create Them with HTML5 and WebGL.

This article is part of a web dev tech series from Microsoft. Thank you for supporting the partners who make SitePoint possible.

You may have noticed that we talked a lot about babylon.js last year, and most recently we released babylon.js v2.0 with 3D sound positioning (with WebAudio) and volumetric light scattering.

If you missed the v1.0 announcement, you can first catch up with the day two keynote here, going directly to 2:24–2:28. In it, Microsoft evangelists Steven Guggenheimer and John Shewchuk demoed how Oculus Rift support was added to Babylon.js. One of the key things for that demo was the work we did on a specific shader to simulate lenses, as you can see in this picture:
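For a rough sense of what a lens-simulation shader involves, the fragment shader below applies a simple barrel distortion: each texture coordinate is pushed away from the center in proportion to its distance, so straight lines bow the way they do through a headset lens. This is a generic sketch, not the actual Babylon.js shader from the demo, and the vUV/textureSampler names are assumed post-process conventions.

```typescript
// Generic barrel-distortion fragment shader (illustrative only).
const lensFragmentSrc = `
  precision highp float;
  varying vec2 vUV;                  // texture coordinate from the vertex shader
  uniform sampler2D textureSampler;  // the rendered scene
  uniform float distortion;          // e.g. 0.2; 0.0 means no distortion

  void main(void) {
    vec2 centered = vUV * 2.0 - 1.0;          // move origin to the screen center
    float r2 = dot(centered, centered);       // squared distance from the center
    vec2 distorted = centered * (1.0 + distortion * r2);
    vec2 uv = (distorted + 1.0) * 0.5;        // back to the 0..1 range
    if (uv.x < 0.0 || uv.x > 1.0 || uv.y < 0.0 || uv.y > 1.0) {
      gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); // outside the lens: black
    } else {
      gl_FragColor = texture2D(textureSampler, uv);
    }
  }
`;
```

Running a shader like this as a full-screen post-process over the rendered scene, once per eye, is the general shape of the lens-correction pass a VR headset needs.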