You can download a free version for trial or non-commercial projects which currently has 3 limitations:
This is how easy it is to integrate FWVideoEncoder:

package {
    import com.rainbowcreatures.*;
    import com.rainbowcreatures.swf.*;
    import flash.events.*;
    import flash.net.FileReference;
    import flash.display.MovieClip;

    public class Helloworld extends MovieClip {

        var myEncoder:FWVideoEncoder;
        var frameIndex:Number = 0;
        var maxFrames:Number = 50;

        public function Helloworld() {
            // init FlashyWrappers!
            myEncoder = FWVideoEncoder.getInstance(this);
            myEncoder.addEventListener(StatusEvent.STATUS, onStatus);
            myEncoder.load();
        }

        private function onStatus(e:StatusEvent):void {
            if (e.code == "ready") {
                // FlashyWrappers is ready, init the encoder and start capturing
                myEncoder.init(stage.stageWidth, stage.stageHeight, 24, 1000000);
                myEncoder.setFrames(maxFrames);
                addEventListener(Event.ENTER_FRAME, onFrame);
            }
            if (e.code == "encoded") {
                // save the video
                var saveFile:FileReference = new FileReference();
                saveFile.save(myEncoder.getVideo(), "video.ogv");
            }
        }

        private function onFrame(e:Event):void {
            // animate (somethingInstance is any MovieClip on the stage)
            somethingInstance.x++;
            // capture the whole stage every frame and send it to FlashyWrappers
            myEncoder.captureMC(this);
            frameIndex++;
            if (frameIndex >= maxFrames) {
                // we've had enough of that, let's finish up!
                removeEventListener(Event.ENTER_FRAME, onFrame);
                myEncoder.encodeIt();
            }
        }
    }
}
This is how easy it is to integrate FWSoundMixer:
package {
    import flash.display.MovieClip;
    import flash.events.Event;
    import flash.events.MouseEvent;
    import flash.events.StatusEvent;
    import flash.net.URLRequest;
    import com.rainbowcreatures.*;

    public class Document extends MovieClip {

        // our FW SoundMixer sound classes
        var beep:FWSound;
        var track:FWSound;
        // the FW SoundMixer
        var mySoundMixer:FWSoundMixer;

        public function Document() {
            // init the FW SoundMixer
            mySoundMixer = FWSoundMixer.getInstance();
            mySoundMixer.addEventListener(StatusEvent.STATUS, onStatus);
            mySoundMixer.load();
        }

        private function onStatus(e:StatusEvent):void {
            if (e.code == "ready") {
                // play from the ByteArray sound buffer to prove we have mixed the sounds
                mySoundMixer.playSounds = true;
                mySoundMixer.init();
                mySoundMixer.startCapture(true);
                // init the sounds; notice the third parameter is the Flash Sound which our class encapsulates.
                // The last (false) parameter tells the sound NOT to play "natively" (using Flash), since we're
                // going to play it from our custom sound mixer buffer (set with playSounds = true). Normally you
                // would not do this and would rather let Flash play all sounds natively while mixing them
                // silently in the background. This is just for the test, to "prove" that mixing actually works.
                beep = new FWSound(null, null, new BeepSound(), mySoundMixer, false);
                // since FWSound extends the Sound class, we can load it dynamically as well, just like the regular Sound class
                track = new FWSound(null, null, null, mySoundMixer, false);
                track.addEventListener(Event.COMPLETE, onComplete);
                track.load(new URLRequest("piano.mp3"));
            }
        }

        private function onComplete(e:Event):void {
            soundButton.addEventListener(MouseEvent.CLICK, onSoundButtonClick);
            // specify the starting position and number of loops, just like with the Flash Sound class
            track.play(0, 100);
        }

        private function onSoundButtonClick(e:MouseEvent):void {
            beep.play();
        }
    }
}
As you probably know, there is no easy way to capture video of Flash / AIR content. The most you can do is capture the webcam. But what if you want to record MovieClips, gameplay, or build a jukebox app with a video sharing option? Bad luck. The only real solution was Lee Felarca's FLV encoder (or stunts like interfacing with a command-line FFmpeg). The problem is that Lee's FLV encoder produces uncompressed videos (100+ MB for a 30-40 second clip), so sending videos over the internet was impractical.
FlashyWrappers tries to tackle this issue on several fronts. First of all, FlashyWrappers is not itself a video encoder. It is, as its name suggests, a wrapper that lets other encoders / codecs talk to Flash / AIR through native extensions or an SWC (Flash library). It began based solely on the Theora / Vorbis codecs, then moved to FFmpeg, and now, as we go native and mobile, to other encoding technologies as well: AVFoundation on iOS and MediaCodec for the upcoming HW-accelerated Android version.
The "easiest" platforms for us are Windows and Mac (AIR Desktop). There we "just" compile and run on native processors with plenty of power and speed, so it was only a matter of creating the wrappers that get images out of AIR / Flash and send videos back. The real challenges lie in the other three platforms: Flash Player, iOS and Android.
Flash Player's VM (virtual machine) is quite slow - the challenges are comparable to running on a mobile device rather than at native speed. We couldn't just knock together an AS3 encoder, not only because there are no AS3 ports of any serious codecs, but also because of the horrible speed that would result.
If you're a long-time Flash developer, you've surely heard of and used Alchemy libraries, famous for their speed. Alchemy was usually used for intensive tasks such as JPEG encoding. Alchemy is essentially a C/C++ compiler that compiles straight into AVM2 bytecode (the Flash Player VM's language). So instead of writing AS3, you can compile C/C++ code and it will run inside an SWF.
Why would you do that? The first advantage is obvious: C/C++ is a de facto standard for this kind of programming, and the majority of codecs and video encoding frameworks are written in it. The less obvious advantage: LLVM compiler optimizations plus fast memory access. Together these make almost any memory-intensive task roughly 2.5x - 3x faster when written in Alchemy than in AS3.
So that's what we did: we used "Alchemy 2", then called the "FlasCC compiler" (not to be confused with "Flash CC"). Knowing the Theora/Vorbis codec settings quite intimately from FlashyWrappers 1, we even tweaked them for maximum speed. You'd think that would guarantee realtime HD encoding even on older machines. Unfortunately, it doesn't. Even with all of this you'll mostly be able to capture at half the resolution of your Flash stage, and you'll need FlashyWrappers' multithreading to spread the load. With that, it works in realtime on a 1-2 year old laptop and an even older desktop. But it works! And the technology is improving all the time, of course.
The challenge on all mobile platforms is the same as in Flash Player: speed. Luckily, mobile platforms offer HW acceleration. Your iPad or iPhone contains a special chip dedicated to encoding mp4 video, and on iOS the API that talks to this chip is called AVFoundation. While in the past all of our code was based on FFmpeg, we felt it would be better to have absolute control over the code and, especially on mobile devices, write everything from the ground up on top of the device APIs. Not to mention FFmpeg's LGPL license, which in our opinion brings more bad than good, especially on platforms like iOS.
Once the FlashyWrappers AVFoundation prototype was working, it was a great improvement: encoding speed went up several times over. As a bonus, FlashyWrappers produces mp4 videos on iOS - normally mp4 is a royalty-encumbered codec, but if you're using the chip inside an iPad / iPhone you don't have to deal with that; Apple has already paid for the encoders embedded in those chips.
However, we did not stop there. One encoder based on AVFoundation can be 3x slower than another encoder based on AVFoundation - we know, because we went through both versions. There are many techniques for optimizing a video encoder, beginning with pixel buffer pools, texture caches and color format conversions (or better, avoiding those, or doing them at the hardware level in GPU shaders).
Once we had optimized everything, it worked pretty well on the latest iPad mini Retina but horribly on the iPad 3. What was going on? There was a bottleneck we had almost forgotten about: frame capturing in AIR. Taking a "screenshot" in AIR and sending it to the video encoder takes a while even on a desktop PC, so imagine a mobile device! You need to re-render your whole scene into BitmapData, including all the slow effects (by far the worst part), then convert it to a ByteArray, then send it to the ANE, then convert the colors from ARGB to BGRA... our AVFoundation optimizations seemed useless next to this huge bottleneck, and we could do nothing to improve it, as it was entirely in AIR's hands.
We started looking for a way to capture the "screenshot" natively. We learnt about such a feature in iOS 7 (the important part is that it needs to screenshot the OpenGL layer). Unfortunately, that solution was even slower than taking the screenshot in AIR. We then tried to make glReadPixels work - our first attempt at attaching to OpenGL's rendering - but that didn't work either: all the image data was gone by the time we called glReadPixels. Luckily, that was not the end of it.
We found out how to redirect AIR's rendering to our own target, which is also a texture cache. With that in place, video frames automatically flow into the pixel buffer pools and get converted to BGRA. AIR is essentially drawing your app and drawing the frames into the video almost at the same time (without knowing it!). An interesting side effect is that AIR of course stops rendering anything to the screen while doing this, so we had to learn how to render the temporarily "crippled" AIR content to the screen ourselves. After that we finally had all the pieces, and the result is FlashyWrappers 2.2's trickiest iOS feature (still in beta): accelerated "fullscreen" capturing. It supports Retina as well!
Android is the next challenge - we've already done some actual work there. The approach will be similar to iOS; the problem is Android's heavy fragmentation. We essentially gave up on supporting anything below Android 4.3 in the first HW-accelerated release: the mess is too big, and MediaCodec didn't include a muxer (the ability to mix audio and video into one file) prior to 4.3. As older devices phase out it might be too much work for nothing, though even in the future we are not ruling out HW acceleration for some specific, popular pre-4.3 devices.
The current Android version is what FW 1.0 was on iOS - an FFmpeg-based solution identical to the desktop one. That's not great (realtime encoding is practically out of the question), and our next priority is to introduce hardware acceleration for Android 4.3+. The accelerated version will be treated as a separate platform, since it will produce mp4 files (while the current one produces ogv files), so making it a "fallback" version would be too incompatible.
The answer is: yes and no. The FFmpeg-based FlashyWrappers - i.e. Flash Player, AIR Desktop and unaccelerated Android - can support any codec that FFmpeg supports. iOS already supports mp4 exclusively. Accelerated Android will also support mp4 exclusively in the near future. As for the rest, there are 2 issues: