In the previous post of this two-part series, I analyzed the technical features of the VP9 codec and concluded that, on paper, VP9 has the foundations to compete with HEVC in terms of encoding efficiency.
But theory and practice are different things, and in video encoding a large part of the final efficiency comes from the encoder implementation rather than from the codec specification. VP9 is no exception: what I see from my tests is that vpxenc (the open-source, command-line encoder provided by Google) is not yet fully mature and optimized for every scenario. I’ll come back to this distinction later.
The VP9 specification has many features that can be used to enhance perceptually aware encoding (such as “segmentation”, which modulates quantization and filters within a frame according to how the different areas of each frame are perceived). But those features are not yet used by vpxenc, and this is clearly visible in the results.
At the beginning of 2015 I evaluated the performance of several H.265 encoders for my clients and published a quick summary of the advantages and problems I found in the HEVC encoders of that time compared to optimized H.264. The main problem that emerged in that evaluation was the inefficiency of “adaptive quantization” and the other psychovisual techniques implemented in the encoders under test. The situation has partially changed for HEVC encoders over the last year (thanks to better psychovisual encoding, especially in x265), but grain and noise retention, especially in dark areas, remains a challenge for codecs that use large transforms, like H.265 and, indeed, VP9.
VP9 today shows the same inefficiencies HEVC had a year and a half ago. It is quite good at handling motion-related complexity, thanks to advanced motion estimation and compensation, and it reconstructs low and medium spatial frequencies with high fidelity, but it has difficulty retaining very high frequencies. Fine film grain disappears even at medium bitrates, and banding artifacts are very visible in flat areas, gradients and dark areas even at high bitrates. In this regard H.264 is still much better, at least at medium-high bitrates. These kinds of artifacts are quite common on YouTube, which now uses VP9 whenever it can: try a 1080p or 2160p video in Chrome and take a look at the gradients and shadows.
The sad thing is that common quality metrics like PSNR and SSIM (but also the more sophisticated VQM) are happier with a flat encoding than with a psychovisually pleasant, but less exact, encoding. In the end, VP9 may score higher than H.264/H.265 in PSNR or SSIM even in a comparison like that of Picture 2 below, where the banding or “posterization” effect is very evident.
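To see why a pixel-exactness metric like PSNR can reward a flat encode, here is a toy numerical sketch (my own illustration, not data from the tests in this post): a gradient plus synthetic grain is “encoded” once by stripping the grain entirely (the banding-prone case) and once by substituting a different but statistically identical grain (perceptually much closer to the source). PSNR prefers the flat version.

```python
import numpy as np

def psnr(ref, test, peak=255.0):
    """PSNR in dB between a reference and a test image."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)

# "Source": a horizontal gradient plus fine grain (high-frequency, low-power noise)
gradient = np.tile(np.linspace(0.0, 255.0, 256), (256, 1))
source = gradient + rng.normal(0.0, 4.0, gradient.shape)

# Encode A: grain stripped entirely (numerically close, visually banding-prone)
flat = gradient
# Encode B: a different but statistically identical grain (visually close to the source)
regrained = gradient + rng.normal(0.0, 4.0, gradient.shape)

print("flat:      %.2f dB" % psnr(source, flat))       # the flat encode scores higher
print("regrained: %.2f dB" % psnr(source, regrained))
```

The grain-preserving encode loses roughly 3 dB of PSNR against the flat one, even though a viewer on a large display would almost certainly prefer it.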
VP9 profile 2 – 10 bits per component
Until now I have discussed traditional 8-bit-per-component encoding in H.264, H.265 and VP9. But vpxenc also supports 10-bit-per-component encoding, known as VP9 profile 2.
Even if your content is 8-bit and everything remains BT.709 compliant, several studies have demonstrated that 10-bit encoding consistently achieves better quality/bitrate ratios thanks to higher internal accuracy. The benefits are particularly visible in the accuracy of gradients and dark areas. See this example of VP9 8-bit vs 10-bit:
In the picture above we can see the better rendering of soft gradients when encoding at 10 bits, even though the source is 8-bit. Grain (a high-frequency, low-power signal) is still not retained compared to the source, but banding is greatly reduced. Note also that with VP9 profile 0 we need to raise the bitrate well above 3 Mbps to encode gradients well (at 1080p), while with profile 2 the result is sufficient in this case at only 1 Mbps.
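For reference, a profile-2 encode with vpxenc looks roughly like this (flag names are from the vpxenc help; the resolution and bitrate are just the 1080p/1 Mbps example values discussed above, the file names are placeholders, and libvpx must be built with high-bit-depth support):

```shell
# 10-bit VP9 (profile 2) encode of an 8-bit raw 1080p source at ~1 Mbps
vpxenc --codec=vp9 --profile=2 \
       --bit-depth=10 --input-bit-depth=8 \
       --width=1920 --height=1080 --fps=25/1 \
       --end-usage=vbr --target-bitrate=1000 \
       --good --cpu-used=1 \
       -o output_10bit.webm input_1080p.yuv
```

An 8-bit profile-0 encode is the same command without the `--profile`/`--bit-depth` options, which makes A/B comparisons like the one above easy to reproduce.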
The superiority of 10-bit encoding has always applied to H.264 as well (the High 10 profile), so why has 10-bit started to gain momentum only with HDR and not before?
The answer is a lack of decoders on consumer devices. Remember that H.264 became the standard for internet video relatively early only because Adobe decided to embed (at its own expense) a decoder inside Flash Player 9 (2007). This enabled a billion desktops to play back the Baseline, Main and High AVC profiles. Few know that it was originally supposed to support High 10 as well, but a bug ruined the opportunity to actually use that capability.
Apart from this missed opportunity, the H.264 decoders in modern browsers, mobile devices, TVs and STBs are not capable of decoding the H.264 High 10 profile, and the same is true for VP9.
Where is VP9 available now?
Today VP9 is supported by the latest Chrome, Firefox and Opera browsers (and Edge in preview) on desktop (PC and Mac), and on Android from version 4.4 onward (with software or hardware decoding depending on the device). It is also available on an increasing number of connected TVs, but all the current (significant) decoders support only VP9 profile 0, i.e. 8-bit.
The same problem applies to H.265. On the mobile devices that support it, you can only deliver 8-bit H.265, but in this case it is also true that the large majority of 4K TVs support the HEVC Main 10 profile as well.
So, when is it convenient to use VP9?
The banding artifact problem is directly proportional to the size of the display. It is irrelevant on small displays like those of smartphones and tablets; on laptops it starts to become visible, and it is pretty bad on big TVs.
So, to conclude, I think that today VP9 is an interesting option for anyone who wants:
– The maximum quality/bitrate ratio on desktop, even with some compromises in terms of quality. HEVC decoding will probably not appear on desktop for a long time, so VP9 is the only viable improvement over H.264. The live streaming use case fits these compromises particularly well.
– High efficiency on Android with a wide support base (Android 4.4+). On an old $100 Android phone I own, VP9 decoding works and HEVC does not. This is an interesting option for developing markets, where bandwidth is scarce and Android has a bigger base than iOS.
If the current situation doesn’t change, I doubt that players like Netflix will deliver high-quality content on desktop or TV using VP9 profile 0, especially for 4K. Indeed, David Ronca of Netflix has said that they are evaluating VP9 mainly to lower the barrier to entry for mobile devices (they already use HEVC for HDR-10).
But fortunately the scenario is probably about to change quickly, if it’s true that YouTube is planning to deliver HDR (i.e. 10-bit) with VP9 this summer. This means that TVs with VP9 profile 2 decoding capabilities are becoming a reality, and this should open the way for profile 2 in desktop browsers too. In that case (and I’m optimistic), VP9 has a really good chance of definitively becoming the successor of H.264, at least for internet video on desktop and Android.
It remains to be seen what Apple will decide to do. In the meantime I’m starting to push VP9 in my strategies, because I actually think their choice is irrelevant: if we want to optimize a video delivery service, it is increasingly clear that we will have to optimize for all three codecs.
I have already talked (perhaps too much) about the future of Flash in this post, where I didn’t hide my perplexity about the market position of Flash compared to alternative technologies. After Flash Player for mobile was dropped, there was a strong decline in confidence in the Flash platform. But now the scenario is beginning to emerge sharply, and I am starting to understand the purpose of Adobe’s strategy.
Yesterday Adobe released a public beta of AIR 3.2 for mobile application development. This version implements the promised support for Stage3D on mobile platforms like iOS and Android. A number of demo videos have appeared on the web showing excellent 3D performance, and there is a lot of renewed interest in mobile game development using AIR:
Square Enix’s [Barts] running on Android
Time will tell, but AIR has the potential to become a leading platform for 2D/3D game development. A single code base is sufficient to create a game for desktop (AIR’s captive runtime), the browser (Facebook, anyone?) and now iOS and Android. With connected TV and STB support to come (already shown during MAX), the dream of the Open Screen Project is becoming reality, at least in game development (though graphics- and media-intensive applications can also leverage 2D/3D acceleration).
Therefore Adobe has concentrated its resources on a promising field where Flash could easily become the leader. In 2D/3D browser gaming it already is the leader (500 million players on Facebook should be a sufficient business card). Search YouTube for Stage3D demos and you will see the huge amount of interest in this technology from game developers big and small.
The second strong commitment of the platform is to video delivery, where Flash has been the leader for the past 5 years and still is today. The performance of video decoding in the browser has been widely improved with a completely redesigned pipeline that now exploits multi-threading heavily. Most important, support for accelerated H.264 streaming has been added to AIR for iOS using standard Apple HLS (already supported by FMS 4.5 and Wowza Server).
During the spring Adobe will release the new version of Flash Access (now Adobe Access 4), which will include content protection for iOS devices (both in AIR and in native applications) in the form of DRM on HLS. This move has the potential to help Adobe regain the favor of the majors and big content providers, who would gain the possibility of unifying DRM across Android, iOS, desktop apps, browsers, Google TV and some STBs.
The support for HW-accelerated 2D, 3D and video playback on mobile, plus improved performance for Flex applications, plus the possibility of integrating HTML5 content with StageWebView, plus the DRM, plus native extensions, **finally** makes AIR for Mobile an interesting, efficient, effective and valuable solution for cross-platform application development.
(Updated 03 March 2012)
I think the platform is 99% complete now, which is very good, but I would like to see the following issues addressed ASAP to complete the feature list of AIR for Mobile:
- H.264/AAC over RTMP: necessary for efficient real-time video applications, especially now that FP supports H.264 encoding.
- Echo cancellation: see the previous point.
- Effective and robust support for key native features like in-app purchase and notifications. I like the idea of native extensions, but I’d prefer an official API for critical features like these.
- Better integration/communication between AS3 and JS in StageWebView. No more hacks, please.
Leave a comment if you think there’s something else important to add to AIR for Mobile/AIR for iOS.
A long time has passed since my last post on this blog. I have been very busy with an important video streaming project, but that is not the only reason for my absence. I also wanted to wait and take all the time necessary to analyze, ponder and “digest” the infamous Flash affair.
I will not hide my bitterness about the affair, but now that I have seen the real consequences and have had time to reflect on the future scenario, I am less pessimistic. It’s not all a bed of roses, but I’m somewhat optimistic.
First of all, fortunately, I’m not limited to Flash technology in my consulting work. I have worked with .NET technologies for many years, and I have designed and deployed successful HLS streaming services with both Wowza Server and FMS 4.5.
You may also know that I’m an encoding expert with important success stories and a deep knowledge of commercial and open-source encoders such as FFmpeg, x264, FlipFactory, Telestream Vantage, ATEME KFE, Rhozet Carbon Coder and Digital Rapids, to name a few.
I have created encoding pipelines and optimized existing ones for delivery platforms based on HLS, Flash HDS, MS Silverlight and IPTV, and I have designed decoding and delivery optimizations for Flash and Silverlight.
So when I talk about my bitterness, it is not driven by fear for the future but by awareness of the big mistake Adobe has made in stabbing Flash in the back. I want to focus this post on the future prospects for Flash and not on Adobe’s disastrous announcement (a masterpiece of masochism, at least from a PR point of view); still, a brief summary of my thoughts on the topic is worthwhile. Two short considerations:
1. Adobe may well have had good, long-term strategic reasons for dropping Flash for the mobile browser, but it could have chosen modes and terms with much less collateral damage. Why not reduce the commitments and investments progressively across the lifespan of FP11, to avoid harming the Flash community? After all, FP11 has been released for Android and QNX, and it has brought important improvements in performance and stability. I know that Flash for mobile browsing has a lot of problems, and those problems are due to the excessive amount of bad Flash coding done over time, especially for advertising. Obviously, if a page with 5-6 Flash banners can kill an old desktop computer, how can a tablet be expected to handle it?
A simple solution could have been to put every SWF on a page into an idle mode, with a clickable poster image that activates the SWF only when touched. Simple, clear and always better than having no Flash support in mobile browsing.
2. Adobe just does not realize that it is killing the goose that lays the golden eggs. Have you ever thought about the fact that Flash is used every day by 2 billion people? It’s probably the most pervasive piece of software after MS Windows. A giant like Steve Jobs would have exploited such a competitive advantage in ways that the current Adobe management is not even able to imagine. Yet it is not difficult to imagine, for example, a marketplace of Flash and AIR apps on the model of the Mac App Store (but with 20 times more potential customers). What is this kind of power worth? Evidently near zero, for Adobe.
But now the damage is done and complaining is worthless, so let’s look at the short-, medium- and long-term consequences. The short-term consequences are paradoxically positive for experienced Flash developers. This is because new developers, creative shops and consulting firms are shifting their interest to HTML5, both because of the poor medium- and long-term outlook for Flash and for marketing reasons. But the demand for Flash technology is not decreasing as fast as the supply, so there is a burst in the amount of work available for skilled developers.
In the medium term I see a tighter convergence between demand and supply for Flash-based projects in general. Flash will maintain or increase its penetration in web gaming thanks to 3D (remember that the casual game market on the internet is completely Flash-centric today; 200+ million people play Flash games on Facebook every day), and it will probably remain the reference for video streaming. In the RIA and creative markets, though, HTML5 will definitely gain momentum (in real terms, not like now, when only a few important creative, video or gaming projects have migrated from Flash to HTML5).
Flash in the mobile market, as a cross-platform mobile development technology, does not, in my opinion, have a clear outlook. The sudden drop of Flash for the mobile browser and the drastic reduction of commitment to Flex have been perceived as a betrayal by Adobe’s loyal base of supporters and developers, and as a definitive change in the wind by customers and stakeholders. How can you blame them? A lack of support from its own creator is a mortal stab for a technology, and the message from Adobe is clear: in the long term we’ll substitute Flash with HTML5. And not only that: we will focus more on tools than on technologies (Flex docet).
No place for developers in the future of Adobe? I don’t know, but the long-term perspective of Flash, Flex and other Flash-related technologies (FMS?) has been heavily perturbed by the infamous move. Flex is now an Apache-backed project, but is that a guarantee of evolution and support? Who will invest time and credibility with customers in a technology for mobile development that has no clear commitment from its creator and controller?
To conclude, what do I intend to do as a Flash developer? In the short term I have a lot of Flash-related projects to do, so no problem. In the medium term I plan to continue using Flash/AIR for mobile development. This is a clear path for me: I can capitalize on my AS3, Flash and Flex platform skills to develop desktop, browser and mobile apps. The feature level for Android and iOS has now become good enough to develop any kind of app without adding Java and Objective-C to your skill portfolio (in my opinion, the recent support for notifications, in-app purchase and HLS has cleared the top three entries of the most-wanted-features list).
And in the long term? I don’t have an answer; I think I’ll simply wait and see.
PS: A very interesting article about “migrating” from Flex to JS (thanks to Anna Karim) – https://plus.google.com/109047477151984864676/posts/CVGJKLMMehs
Finally, the applications I have developed with Flex and AIR for PlayBook are officially on the RIM App World market.
They are six media apps developed for Finelco, the owner of the biggest radio network in Italy, offering:
– A selection of thematic web radios plus the live broadcast of the main radio channel
– A selection of podcasts (MP3) from the radio’s main programs
– Charts/playlists created by the sound designers or voted on by users
– A multi-touch photo gallery
– A selection of VOD content such as video clips, interviews and concerts
I’m very proud because the apps are collecting a lot of 5-star reviews. Flex and AIR can deliver an excellent UX, especially for multimedia (live audio, live video, VOD and so on), along with easy customization.
Pictures from the Virgin Radio app:
If you don’t have a PlayBook, take a look at the UX in this video:
In a few weeks (20-21 May) a Flash Camp dedicated to mobile app development with the Flash Platform will start in Milan. The camp is hosted by whymca, a mobile developer conference that covers several mobile platforms and development tools. This year, thanks to the efforts of my friend Andrea Trento, an entire conference track will be dedicated to mobile development using the Flash Platform.
I’m part of the crew (4 Adobe Community Professionals + 1 evangelist) that will speak at the camp.
First, Mihai Corlan (Adobe evangelist) will open the track, talking about Flex on mobile and the new possibilities for cross-platform development offered by the Flash Platform. Andrea Trento will show how to create a cross-platform game with Flash, Luca Mezzalira will cover the use of design patterns in mobile development, Piergiorgio Niero will focus on code optimization for mobile, and I’ll speak about video encoding and delivery optimization for mobile.
It will be a great event for learning about the cutting edge of mobile development technology and for keeping in touch with the ever-growing Flash community.
The event will take place in Milanofiori – Assago – Milan (more info on the whymca website) and is completely free. Due to the limited number of seats, I suggest you book your place early.
If you work with audio and video streaming, one of the worst limitations of AIR 2.6 for iOS is that it is not possible to stream video encoded in H.264 (or audio in AAC) inside your AIR application. AIR 2.6 for iOS supports NetConnection and NetStream but can decode only the Spark, VP6, MP3, NellyMoser and Speex formats. So no H.264 and no AAC (don’t ask me why).
This is a real problem. Who uses VP6 for video streaming or MP3 for audio today? On top of that, the performance of audio/video streaming in AIR for mobile is not perfect today (even on Android), especially because you get a lot of dropped frames and no frame interpolation. Delivering a stream in VP6 and MP3 to iOS devices is therefore a very suboptimal solution that cannot compete with the very good native streaming capabilities of iOS to which users are accustomed.
The AIR for iOS documentation mentions that it is possible to launch the native iOS video player by pointing it to an .mp4 (or .m3u8) video file, but this is not handy, because the video opens outside the AIR application, and on iPad especially the user experience is really bad.
Fortunately there is a workable solution for integrating the seamless experience of the native player into the AIR app: use the StageWebView object.
The StageWebView object is a powerful way to integrate the elements that AIR lacks today. Do you need a list with perfect scrolling? AAC audio streaming? H.264 streaming? You can use a StageWebView to load “HTML5” code inside the AIR app and integrate that kind of content. Let’s look at a simple example:
var webView:StageWebView = new StageWebView();
webView.stage = this.stage;
webView.viewPort = new Rectangle(0, 0, stage.stageWidth, stage.stageHeight);
var path:String = new File(new File("app:/html/service.htm").nativePath).url;
webView.loadURL(path);
This code opens a fullscreen instance of the native browser (without UI) and loads a locally stored .html file (notice that on iOS it is necessary to use a hack like the one at line 4 to obtain a valid URL for accessing local HTML). As you can easily see, the best approach is to mix the AIR UI with this windowed browser to exploit HTML5 capabilities, especially for media streaming. A simple <video> tag inside the HTML code can do the job and offer perfect H.264 progressive and streaming playback in your AIR apps, both windowed and fullscreen.
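As an illustration, the locally stored page can be as minimal as this (the file name matches the snippet above; the stream URL is a placeholder, not a real service):

```html
<!-- service.htm: minimal page loaded into the StageWebView -->
<body style="margin:0; background:#000;">
  <!-- On iOS the native <video> element handles both progressive .mp4 and HLS .m3u8 -->
  <video src="http://example.com/live/stream.m3u8"
         width="100%" height="100%" controls autoplay></video>
</body>
```

Pointing `src` at an .m3u8 playlist gives you adaptive HLS streaming through the native iOS player, rendered inside the AIR app’s window.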
Communicating with StageWebView
It is not so easy to communicate with the page loaded inside the StageWebView: the object does not provide specific APIs for that. Fortunately there is a class (StageWebViewBridge) developed to overcome the limitations of the standard StageWebView object. With StageWebViewBridge it is possible to communicate bidirectionally with the hosted HTML page, and so create something similar to PhoneGap with Flash AIR.
While waiting for the future AIR 2.7 (which could still have problems in the video area if it does not implement StageVideo on iOS), this is the best solution I have found to overcome the limitations of AIR for iOS.
Adobe announced today, during the Mobile World Congress, that Flash Player 10.2 will soon be available for Android devices. This is very important because FP 10.2 introduced the Stage Video object, which offers direct control over hardware acceleration in video decoding. In my opinion the worst weakness of FP 10.1 for Android is video decoding performance, so I’m very happy to get Stage Video ASAP.
In Flash Player 10.1 for Android the decoding of H.264 can be hardware accelerated (depending on the device’s HW), but the color conversion, blending and compositing of the video onto the display are still delegated to a software layer. This is because the canonical Video object is part of the display list and is therefore injected into the display-list rendering pipeline. Stage Video is an alternative way to access the video layer, and it is not part of the display list. At the cost of some flexibility, you get direct access to hardware acceleration, from bitstream decoding to video compositing.
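A minimal AS3 sketch of how Stage Video is used, including the fallback to a classic Video object when no hardware video plane is available (the stream URL is a placeholder and error handling is omitted):

```actionscript
var nc:NetConnection = new NetConnection();
nc.connect(null); // progressive / HTTP playback, no server connection
var ns:NetStream = new NetStream(nc);
ns.client = { onMetaData: function(info:Object):void {} };

if (stage.stageVideos.length > 0) {
    // A hardware video plane is available: fully accelerated path
    var sv:StageVideo = stage.stageVideos[0];
    sv.addEventListener(StageVideoEvent.RENDER_STATE,
        function(e:StageVideoEvent):void {
            // e.status reports whether rendering is accelerated or software
        });
    sv.viewPort = new Rectangle(0, 0, stage.stageWidth, stage.stageHeight);
    sv.attachNetStream(ns);
} else {
    // Fallback: classic Video object inside the display list
    var video:Video = new Video(stage.stageWidth, stage.stageHeight);
    video.attachNetStream(ns);
    addChild(video);
}
ns.play("http://example.com/video.mp4"); // placeholder URL
```

Note that Stage Video renders behind the stage, so any display objects you add will be composited on top of the video, which is exactly what you want for custom player controls.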
Stage Video will be available only on Android 3.0, which means tablets like the Motorola Xoom, Samsung Galaxy Tab 10.1 and so on. Full hardware acceleration is much more important for a tablet, which has a big screen compared to a smartphone, but I suppose we will see new smartphones equipped with Android 3.0 soon.
Now all I need is a new, efficient, accelerated iOS packager…