Overlaying Frames-per-Second on a Benchmark Video Using R, ffmpeg, and Kdenlive


Feral Interactive is a UK-based porting house that specializes in bringing Windows games to other platforms like Linux and macOS. One of their most recent projects was bringing Shadow of the Tomb Raider to Linux. I wanted to compare the performance of their native Linux version of the game against the Windows version running on Linux through a popular compatibility layer called Wine. Running games on Linux with Wine often incurs a performance cost compared to Windows, so there is still a market for native Linux ports that can recover some of that lost performance.

Conveniently, Shadow of the Tomb Raider contains a built-in benchmark tool that spits out its results to a text file, where they can then be analyzed with R. The raw data looks a little like this:

  frame  time delta memory
  <int> <dbl> <dbl>  <dbl>
1     1   0     0     2341
2     2  14.4  14.4   4462
3     3  35.7  21.3   4462
4     4  53    17.3   4462
5     5  72.1  19.1   4462
6     6  91.6  19.5   4462

Frame is the ID of the current frame, time is the milliseconds since the start of the benchmark, and delta is the amount of time it took to draw the frame. Most gamers don't really care about these numbers though; the most relatable metric is frames-per-second, the number of frames that can be drawn in one second. To calculate this I look at the time it took to draw the previous 50 frames; 50 divided by that time gives the rolling FPS.
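As a rough sketch of that calculation (the data frame and column names are my own assumptions rather than the original script), the rolling FPS can be computed with dplyr along these lines:

    library(dplyr)

    window <- 50

    # `bench` is assumed to hold the benchmark output shown above,
    # with `time` measured in milliseconds
    bench <- bench %>%
      mutate(
        # seconds taken to draw the previous `window` frames
        window_secs = (time - lag(time, window)) / 1000,
        # rolling FPS: frames drawn divided by the time they took
        fps = window / window_secs
      )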

With FPS calculated, it's easy to use R and ggplot2 to make a nice graph showing the performance of the benchmark over time.
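Something along these lines will produce that kind of chart, reusing the bench data frame from the sketch above; the exact styling is a guess at the original:

    library(ggplot2)

    ggplot(bench, aes(x = time / 1000, y = fps)) +
      geom_line() +
      labs(x = "Benchmark time (s)", y = "Frames per second")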


That's neat, but what I really wanted was to overlay the chart on footage of the actual benchmark so that people could see how different in-game scenes affect the frames-per-second. To do this I used a few tools: R again for the chart generation, ffmpeg to turn the pictures into a video, and then Kdenlive to edit the video.

Generating Charts:

To embed a moving chart in a video, I used R and ggplot2 to generate one chart per video frame. That works out to 4000 individual charts, since the benchmark is 160 seconds long and I wanted 25 frames per second. Each new frame shifts the window forward by 1/25th of a second, with 10 seconds' worth of data visible in each image.

To make things look a bit nicer in the final video, the background of the charts had to be a colour that could easily be chroma keyed out. Chroma keying removes a chosen colour from a video layer, basically green screening. So all 4000 charts looked something like the following beautiful image.
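The chart-per-frame loop probably looked something like the sketch below. The data frame and plot styling are my own assumptions, but the output file pattern matches the ffmpeg command used in the next step:

    library(ggplot2)

    fps_rate    <- 25   # video frame rate
    window_secs <- 10   # seconds of data visible in each chart

    for (i in seq_len(160 * fps_rate)) {
      window_end   <- i / fps_rate * 1000             # current position in ms
      window_start <- window_end - window_secs * 1000

      p <- ggplot(subset(bench, time >= window_start & time <= window_end),
                  aes(x = time / 1000, y = fps)) +
        geom_line() +
        labs(x = NULL, y = "FPS") +
        # solid green background so it can be chroma keyed out later
        theme(panel.background = element_rect(fill = "#00FF00"),
              plot.background  = element_rect(fill = "#00FF00"))

      ggsave(sprintf("plots/fps_%d.png", i), p, width = 16, height = 3, dpi = 72)
    }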


Turning Charts into a Video:

Thankfully, turning a series of images into a video is a rather common problem and there are a lot of examples online of using ffmpeg to do this conversion. So I shamelessly borrowed the following command to turn all 4000 charts into a video. I won't pretend to know what all of the arguments do, but importantly it is set to 25 frames-per-second to match the timing of the generated charts. Without this the scrolling chart would run too fast or too slow and would not line up with the benchmark footage.

ffmpeg -r 25 -f image2 -start_number 1 -i plots/fps_%d.png -vcodec libx264 -profile:v high444 -crf 0  -pix_fmt yuv420p sottr_fps.mp4

Overlay FPS Video on top of Benchmark:

Kdenlive is an open-source video editor for Linux. Video editing is one of the areas where desktop Linux is still a bit lacking, but Kdenlive crucially has a chroma key feature, which is the key component in this step. The video generated from the bright green charts is overlaid on the footage of the Tomb Raider benchmark and then the chroma key is applied.

In this screenshot you can see the chroma key effect being applied to the bright green of the chart video. It removes the background and turns it into a very nice-looking overlay.


So that's it. I really enjoyed this little project because it was the combination of several tools (R, ffmpeg, and Kdenlive) that made it possible. Each had a specific task and it all came together nicely.

Check out the final result on YouTube.




More bittyblog Updates and My New Site


A lot of changes have happened to bittyblog over the past couple of months. It has finally gotten to a place that I'm happy with, so there probably won't be any more updates for a while.

So what has changed? Here's a quick list:

  • Tags: both posts and pages can have tags associated with them. Adding a tag to a page will have that page show all posts containing that tag. A pretty handy feature for creating subpages on a blog for specific topics.
  • FastCGI: everything supports FastCGI now for faster response times.
  • RSS: adding 'rss' to the query string will return the page results in RSS instead of HTML. RSS isn't something that I really use, but I think it's still pretty popular so it's a good feature to have on blogs.
  • Caching: bittyblog now has a built-in cache that can be activated for extra fast response times. I got about a 5-fold increase in the number of processed requests when testing on my desktop.
  • Misc: lots of other miscellaneous changes and refactors to improve the code and speed.

So with bittyblog in a good spot, I've finally launched my new site: LinuxGameNetwork (logo at the top of this post), a blog focusing on all topics related to Linux, gaming, and Linux gaming. My plan is to keep up frequent updates for 6 months and see how the readership changes; after that I will probably re-evaluate what my goals for the site should be.

In the meantime please check it out if you are interested in Linux gaming and subscribe to the RSS feed if you're into that.




bittyblog - Big Updates to a Small Blog


Lately I've had a lot of motivation to upgrade bittyblog. It started as a simple CGI app to host my personal weblog; however, I have a larger blogging project in mind that I would also like to use bittyblog for. So to get it ready, I've made several nice improvements.

Template Support

Previously, all the HTML had been hard-coded into bittyblog's C source files. I tried to abstract this away as much as possible but it just became too much. There is a very nice mustache implementation that somebody wrote in C and that I was able to import into my project. Now I can manage the HTML completely separately from the C code, which greatly speeds up development and layout changes since a recompile is no longer necessary for HTML changes.

Primitive CMS

Old bittyblog had no way to manage posts or images from the browser. Everything was done by manually uploading pictures and editing the database over ssh, which was very time consuming and clunky. Now I can add new posts and images easily from the new bbadmin.cgi page.

Setup Script

For a project that aims for simplicity, setting up bittyblog was actually quite a hassle. In an effort to help with this, I made a small install script that sets up the database and fills in global variables automatically. This has been really useful when setting up bittyblog on different machines. I had gotten used to having a bunch of hard-coded variables that I never had to touch while developing but that were a nightmare when redeploying.

Of course, I have to eat what I grow, so my personal blog is now running the new version of bittyblog. Even though it looks the same to you in the browser, behind the scenes there have been a lot of nice improvements that will feed nicely into my other blogging project. Next on the list is a round of code cleanup and then improving the CMS to make working with a large number of posts and media easier.

Check out the new updates on github.




Mini Review - Conrad Raspberry Pi Advent Calendar


Please note: this article spoils the calendar's contents. Don't read this review if you plan on buying this product.

Living in Germany, there are a lot of choices when it comes to Advent calendars. Of course, there are lots of styles of chocolate calendars, but also calendars with beer, tea, Lego, and even a 'couples' calendar from the local drug store. One of the more interesting themes was a Raspberry Pi Advent calendar from Conrad.

Conrad is a chain of electronics stores in Germany that sells a lot of classic electronic bits: all the small things you typically associate with electronics, like resistors, buttons, etc. They also have a decent selection of Raspberry Pi products, one of them being an Advent calendar. Naturally I couldn't resist and decided to get one for the holidays.

So what do you get for your 29 EUR? Well, there isn't much to say about the presentation of the box itself. It looks like a typical calendar with 24 doors, one for each day in December leading up to Christmas. They're all well separated into small compartments, so you won't see anything from the other doors while opening each one. You also get a large cardboard nativity-scene cutout, so you can probably guess what the final project will look like. It's also worth noting that the calendar doesn't contain the Raspberry Pi itself, so you will need to supply your own Pi as well as a keyboard, mouse, and screen.

Each little door contains a few electronic bits that you will use in the day's project. The variety of parts is pretty limited and most days there won't be anything except cables. I think the last 4-5 days of opening doors yielded nothing but cables, which was really disappointing. Even the larger doors that allude to something more interesting just contain cables. This was probably the biggest disappointment of the calendar. That said, there were some slightly more interesting pieces, like a couple of tri-colour LEDs that get used for some neat effects.

The programming is all done using a visual language called Scratch. You can build simple programs using basic visual blocks that represent things like loops and if-else statements. For basic things it works alright, but programs can quickly grow very large and become difficult to work with. By the end of the calendar the programs were so large that my wife and I stopped building everything ourselves and just used the pre-made ones.


The included instructions are only in German, but several other languages are available online. Beware: the English instructions are badly Google-translated. My wife and I had to refer to the German instructions several times to clear things up. Annoyingly, all versions of the instructions contained some very obvious mistakes. On one day the project diagram showed all wires going to ground. If this is somebody's first experience with electronics, then stupid mistakes like this can be really frustrating.

I would steer far and clear of this calendar in the future. There are just too many negatives between the mistakes in the instructions, the programming language being difficult to work with, and the disappointing selection of components.




Bitcoin Prices and Hidden Markov Models

Lately, there's been a lot of interest in Bitcoin, probably sparked by its almost unbelievable growth in December 2017. However, this past week we saw the price of Bitcoin drop to just above $6000, the lowest it has been since November 2017. So I wanted to take a closer look at Bitcoin prices through the lens of Hidden Markov Models (HMMs) to see what conclusions, if any, can be drawn.

Hidden Markov Models are similar to a standard Markov chain model, but where the current state is unknown. Instead of observing the actual state of the process, the only information available is the realization of some other output that depends on the current internal state. A somewhat contrived example would be trying to detect whether or not it is raining based on how many people you see with umbrellas. The hidden, unobservable state is the weather (raining or not), while the observable realization of that state is the proportion of people carrying umbrellas (more people carry umbrellas if it's raining).

Applying this concept to Bitcoin prices, there could be some internal state driving the change in price, with different states producing different expected price changes. I assumed that the daily change in price follows a log-normal distribution, which means that the logged daily returns should be normally distributed. This made the model slightly easier to interpret. I also used three internal states in an attempt to capture bear and bull states with differing volatility.
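The actual fitting code is on GitHub, but a minimal sketch of a 3-state Gaussian HMM on the logged daily returns, using the depmixS4 package, might look like this (the price vector and column names are assumptions):

    library(depmixS4)

    # `close` is assumed to be the vector of daily closing prices
    returns <- data.frame(log_ret = diff(log(close)))

    # three hidden states, each emitting normally distributed log returns
    hmm     <- depmix(log_ret ~ 1, data = returns, nstates = 3, family = gaussian())
    hmm_fit <- fit(hmm)

    # most likely state for each day, used to colour the chart below
    states  <- posterior(hmm_fit)
    summary(hmm_fit)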

Below is a chart showing the most likely states during the 2017 and into the 2018 calendar years:


Here each of the three states is coloured. The blue state was characterized by positive average returns and low volatility. The red state also had generally positive returns but higher volatility. Finally, the green state had mostly negative returns and also high volatility.

I also ran a quick Shapiro-Wilk test on the logged daily returns, which was unable to reject the null hypothesis that they come from a normal distribution. In other words, there wasn't enough evidence to disprove the assumption that price changes follow a log-normal distribution.
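That check is a one-liner in R, assuming the same returns data frame as in the sketch above:

    # Shapiro-Wilk test for normality of the logged daily returns
    shapiro.test(returns$log_ret)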

This is all fine and good, but what would be really cool is if the fitted model could be used to predict the future price of Bitcoin. So I ran 10,000 30-day simulations to get an expected future price and a confidence interval. This is what it looks like:


This shows the predicted Bitcoin price, and the actual price change during the prediction interval. The shaded regions also represent the 95% and 80% confidence intervals, based on the 10,000 simulations. In this instance, the HMM was not exactly a great predictor. Bitcoin has been incredibly volatile and I think it’s extremely difficult to make any meaningful predictions using closing price alone.
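For reference, the simulation step could be sketched roughly as follows. It assumes the transition matrix and per-state mean/sd of the log returns have already been extracted from the fitted model into trans_mat, state_mu, and state_sigma, and that last_price and last_state hold the final observed price and inferred state:

    set.seed(42)
    n_sims  <- 10000
    horizon <- 30

    sim_prices <- replicate(n_sims, {
      state     <- last_state
      log_price <- log(last_price)
      path      <- numeric(horizon)
      for (d in seq_len(horizon)) {
        # move to the next hidden state, then draw a log return from it
        state     <- sample(1:3, 1, prob = trans_mat[state, ])
        log_price <- log_price + rnorm(1, state_mu[state], state_sigma[state])
        path[d]   <- exp(log_price)
      }
      path
    })

    # median path plus the 80% and 95% bands across the 10,000 simulations
    apply(sim_prices, 1, quantile, probs = c(0.025, 0.1, 0.5, 0.9, 0.975))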

If you're interested in taking a closer look at the R code used to fit the HMM and generate the charts, you can find it on my Github.

