Wayfire Animations

Wayfire is maturing nicely, with an animation infrastructure that makes it relatively easy to create new animations. However, even though wayfire has the nice fire animation, it only has a few other basic ones. Recently, wayfire gained support for a new option type, ‘animation’, which lets you select an easing as well as a duration. One animation that wayfire users have been talking about for years is the classic magic lamp minimize animation. And secretly, I’ve been wanting to write this animation. After a comment from a user in wayfire chat about this animation being the only thing missing from their wayfire-dots setup, I figured it was time to try my hand at implementing the effect. I decided to call it ‘squeezimize’ and started coding.

At first, I rendered the view in a loop, scissoring the transformed surface into horizontal lines. This worked OK, but the sigmoid curve seemed to get lost in the math, and worst of all, the implementation was slow. After bouncing around some ideas to make it faster, the way forward seemed to be a fragment shader that renders the effect in a single pass per frame. After a lot more math and coffee, I came up with exactly that: a simple fragment shader that requires only a minimal GLSL version and uses nothing but basic math. It uses no built-in shader functions other than clamp(), which means it’s fast. And the shader renders better curves than the slower line-by-line proof of concept.

TL;DR: Wayfire has a new animation called squeezimize that renders the magic lamp effect. Yes, it’s been done time and time again, but because it’s a single-pass shader with very little input, it’s light on resources. After writing squeezimize, I wrote two more window open/close animations for wayfire, called zap and spin. Now for a video:

EDIT: There are now two more animations in addition to the above, helix and blinds:

 

Streaming RTSP directly to browsers

Let’s not sugar-coat it: streaming video is a complex topic. If every browser had the same video player, things might be a bit easier. But of course, they don’t. So I’ll try to dumb this down, keep it short, and make it as simple as possible.

If you have some RTSP H.264 streams that you want to stream to Firefox or Chrome with minimal latency, you might be in for a treat. Using Node.js and ffmpeg, we can serve up the streams and convince the browser to start playing immediately, buffering with minimal latency. How? As I found out, it is not easy. But let’s act like it was, shall we?

In a nutshell, the browser first makes an initial GET for the URL, and the server replies with a video/mp4 MIME type header and a 200 status. We start up ffmpeg and have it write to stdout, but we don’t send any of the data on this request. This prompts the client to make a range request. On the range request, the client is expecting a file, so it asks for portions of it. Here, we always tell it the start is 0, do some math to tell it what range and length to expect based on the bytes we have so far, and send a 206 status. Then we start sending the data. Getting this right was only half the battle, though.
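
Concretely, the exchange looks something like this on the wire. The /stream path is just a placeholder, and the Content-Range numbers assume the 32 MiB floor used in the code further down; the real values are whatever the server computes at that moment:

GET /stream HTTP/1.1

HTTP/1.1 200 OK
Accept-Ranges: bytes
Content-Type: video/mp4
Connection: keep-alive
(no body; the browser now issues a range request)

GET /stream HTTP/1.1
Range: bytes=0-

HTTP/1.1 206 Partial Content
Accept-Ranges: bytes
Content-Range: bytes 0-33554432/33554433
Content-Length: 33554433
Content-Type: video/mp4
Connection: keep-alive
(ffmpeg output is piped into this response from here on)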

TL;DR: I checked the user-agent string to see whether the browser is Chrome. If it’s Chrome, we use `ffmpeg -f matroska`, but for Firefox we have to use `ffmpeg -f mp4`, since Firefox doesn’t have Matroska support. The confusing thing was that the MIME type could not be video/x-matroska, because that caused the browser to try to download the file instead of playing it. Now on to the code.

The following is a snippet to stream RTSP directly to Firefox or Chrome:

function serve_rtsp(req, res, ip, user, pass) {
    const user_agent_string = req.get('user-agent') || '';
    const range = req.headers.range;
    if (range)
    {
        if (!running)
        {
            // No ffmpeg process is running yet, so there is nothing to serve.
            res.end();
            return;
        }
        // Always report a range starting at 0. Advertise at least 32 MiB
        // (or the bytes produced so far, whichever is larger) so the browser
        // keeps reading from the live stream.
        const start = 0;
        const end = Math.max(total_length, 1024 * 1024 * 32);
        const chunk_size = end + 1;
        res.writeHead(206, {
            'Accept-Ranges': 'bytes',
            'Content-Range': 'bytes ' + start + '-' + end + '/' + chunk_size,
            'Content-Length': chunk_size,
            'Content-Type': 'video/mp4',
            'Connection': 'keep-alive'
        });
        // Hand the live ffmpeg output to this response.
        ffmpeg.stdout.unpipe();
        ffmpeg.stdout.pipe(res);
        return;
    } else
    {
        // Initial request: send headers only (no body); this nudges the
        // browser into making a range request.
        res.writeHead(200, {'Accept-Ranges': 'bytes', 'Content-Type': 'video/mp4', 'Connection': 'keep-alive'});
    }
    if (running)
    {
        // A previous ffmpeg process is still running; stop it and reset.
        ffmpeg.kill("SIGINT");
        total_length = 0;
        clearTimeout(ffmpeg_timer);
    }
    if (user_agent_string.includes("Chrome"))
    {
        // Chrome can play a Matroska container served as video/mp4.
        ffmpeg = child_process.spawn("ffmpeg",[
        "-i", 'rtsp://' + user + ':' + pass + '@' + ip + ':80',
        "-f", "matroska",  // Container format
        "-c", "copy",      // No transcoding
        "-t", "60",        // Stop after 60 seconds
        "-"                // Output to stdout
        ]);
    }
    else
    {
        // Firefox has no Matroska support, so send fragmented MP4 instead.
        ffmpeg = child_process.spawn("ffmpeg",[
        "-i", 'rtsp://' + user + ':' + pass + '@' + ip + ':80',
        "-f", "mp4",       // Container format
        "-movflags", "empty_moov+frag_keyframe",  // Fragmented MP4 for streaming
        "-c", "copy",      // No transcoding
        "-t", "60",        // Stop after 60 seconds
        "-"                // Output to stdout
        ]);
    }
    running = true;
    // End the initial 200 response without a body; the data goes out on the
    // upcoming range request.
    res.end();
    // Count bytes as ffmpeg produces them (see ffmpeg_handle_data below).
    ffmpeg.stdout.on('data', ffmpeg_handle_data);
    // After the 60 second clip ends, mark the stream as stopped.
    ffmpeg_timer = setTimeout(function() {
        running = false;
    }, 60000);
}
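
The snippet leans on a few pieces that aren’t shown: the child_process require, the shared state (running, total_length, ffmpeg, ffmpeg_timer), the ffmpeg_handle_data callback that counts the bytes ffmpeg has produced so far, and a route that actually calls serve_rtsp. Here is a minimal sketch of that scaffolding, assuming Express; the route path, camera IP and credentials are placeholders:

const express = require('express');
const child_process = require('child_process');

// Shared state used by serve_rtsp() above.
let running = false;      // Is an ffmpeg process currently producing data?
let total_length = 0;     // Bytes received from ffmpeg so far
let ffmpeg = null;        // The spawned ffmpeg child process
let ffmpeg_timer = null;  // Marks the stream as stopped after the clip ends

// One plausible implementation: just count the bytes, so the range reply
// can advertise a sensible Content-Range.
function ffmpeg_handle_data(chunk) {
    total_length += chunk.length;
}

const app = express();

// Placeholder route, camera address and credentials.
app.get('/stream', function(req, res) {
    serve_rtsp(req, res, '192.168.1.64', 'admin', 'secret');
});

app.listen(8080);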

I hope this is helpful to someone out there. If you do find it useful and want to say thanks, you can donate a coffee on northfield.ws.