Coding a REAPER Projects Generator Desktop App for Electronic Arts with Electron

Introduction

This summer, I was contacted by Pedro Alfageme, Loc Audio Engineering Manager at Electronic Arts, as a REAPER ReaScript expert, to create a script that could generate new REAPER project files and update existing ones based on CSV sheets and thousands of media files. It was an occasion for me to learn a lot of new things, from programming practices to tech solutions: this is how I created my first multi-platform desktop app! Here is what I learned during this contract.

ReaScript VS Desktop App

As ReaScript functions are not accessible for projects that aren't open in REAPER, I knew from the start that a script solution would have no advantage. And when I saw the complex GUI elements described in the app guidelines, I knew that I would have to go for an external app. I initially thought about a web-based solution, with HTML, CSS and JavaScript powering the whole thing. But I faced a problem: JavaScript in browsers is sandboxed. This means that, for security reasons, JavaScript in browsers can't perform complex OS operations, like scanning files in a directory; exactly the kind of thing I needed to do. The kind of thing desktop apps can do nicely.

The thing is… I have made tons of scripts, but I had never made a desktop app before. I had an idea in mind, Electron, which allows making cross-platform apps with HTML, CSS and JavaScript, but I wanted to be sure it was the best choice before proposing a solution. So I did a lot of research to compare languages and frameworks. I had to find something not too far from what I already knew (no low-level programming for me) that could be deployed quickly, with a simple GUI library.

This research phase was complex. For Lua, for example, there are various GUI frameworks, but they are not all cross-platform, most are outdated, some are really complex, and in most cases the documentation is minimal. It is possible to build nice desktop apps with Lua (like ZeroBrane), but it is a bit overwhelming when you don't know where to start.

I took a look at Python (The 7 Top Python GUI Frameworks for 2017): Kivy looks very interesting and Tkinter seems accessible, but as I didn't know much about Python, it seemed too complex for me to handle at that time. I even considered PHP-based solutions (this would have been a pretty unusual choice, but it can work, according to this article: 3 Ways to Develop Cross Platform Desktop Apps with PHP, on SitePoint). Any other solution, in fact (Cross-platform GUI Toolkit Trainwreck, 2016 Edition). There are interesting options, but most of them would have required too much new learning; I was ready to learn new frameworks, but not new languages, considering the deadline imposed by the project.

In the end, I fell back on my original idea: Electron. Indeed, one of the taglines for Electron is:

If you can build a website, you can build a desktop app.

I read a lot about it. It didn't seem to have limitations that could impact the project. My code editor, Atom, is based on Electron; the GitHub Desktop app I use is based on Electron too, and so is the Slack app I use for chatting… I was already using Electron apps at every step of my workflow. Time to make my own. There was still a lot to learn, but I was ready to jump!

Electron

To make an Electron app, you must have a clear understanding of what Node.js and NPM are. I had only experimented a little with the latter, and only had vague notions of the former. But when I understood their potential, by reading articles and documentation and watching various video tutorials, I understood why they became popular solutions: they are both effective and quick to deploy.

A good thing with Electron and NPM is that the documentation is very well done and that the community is huge and dynamic. Some package developers even have free chatrooms to support users in real time.

Electron lets you code with HTML, CSS and JS because it ships a full web engine (Chromium, the same engine that powers Google Chrome) on the desktop and gives it OS-level functions (like file access, recursive folder scanning…) thanks to Node.js. Coding for Electron is a bit like coding for a website, but you have access to extra functions. If you miss some feature, maybe someone has already made it and shared the result on NPM. Installing these extra packages with NPM takes a single line in a CLI. The only downside of embedding a web engine is that even a minimalist application (displaying Hello World on a page) is very heavy (it can weigh up to 40 MB, where it is only one line of code in other languages; though it's fair to note that the Electron Hello World page is a bit more than just a message, as you can resize the window, select the text, open a development console, etc. It doesn't just return a simple string in a popup or console).

As the development setup was very fast to put in place, thanks to the electron-quick-start boilerplate app and its documentation, I could quickly start working on the core of the project: the RPP parser.

Parsing RPPs

REAPER projects (.rpp, or let's call them RPPs) are simply human-readable text files (contrary to other DAWs such as Pro Tools or Cubase, whose project files are binary). When you create a new empty project and save it from REAPER, you get a pseudo-XML document of about 100 lines containing the settings of the project. But you can also create one just by making a .rpp file and adding a single line to it. Missing properties fall back to the user's default settings when the project is opened in REAPER. This is a killer feature for generating RPPs from scratch: you only have to write the info you need.
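As a sketch of that idea, generating a bare-bones RPP string can be as simple as assembling a few lines of text. Note that the tag and property names below are illustrative only; check them against a project saved by your own REAPER version before relying on them:

```javascript
// Sketch: build a minimal RPP string from scratch. Only the properties
// we care about are written; REAPER falls back to user defaults for
// everything else. Tag/property names here are illustrative, not an
// authoritative RPP specification.
function minimalProject (trackNames) {
  const lines = ['<REAPER_PROJECT 0.1']
  for (const name of trackNames) {
    lines.push('  <TRACK', `    NAME "${name}"`, '  >')
  }
  lines.push('>')
  return lines.join('\n')
}
```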

However, parsing RPPs is a whole other story. The app needed to be able to update properties of existing projects. This means it had to abstract the RPP text so the data could be transformed, through functions like "insert an item named sound.wav as the second item of track 3", and then serialize the data back to the file. The difficulty is that RPP doesn't use a standard data format like XML (sadly, though there is surely a good reason for that), for which the parsing could have been handled by native JavaScript functions in one line of code. Instead, it uses a non-standard pseudo-XML format with a custom syntax, where some tags behave differently than others. Even more complicated, the information isn't laid out in a nice data tree as you might expect (regions, for example, are in fact named MARKER tags, which can share the same ID as regular MARKER tags, are followed by an empty-name MARKER tag with the same ID, and aren't nested in a tree at all. Find it confusing? That's normal, it is).

Reverse engineering was pretty challenging. As it was critical for the app, it was the first thing I started to work on. I found some RPP parsers online for other languages like C# or Python, but they were all works in progress, so I decided to make my own rather than convert them. After quite some intensive coding, I successfully built a basic RPP parser, but it didn't abstract the data as a tree, just as a flat list of tags. By trying to manipulate them, I understood that this wouldn't be practical from a dev perspective. I found ways to build a tree from the linear hierarchy (thanks to NPM), but right before I tried to implement one of them, someone came to the rescue…

John Baker is the developer of Vordio, a third-party app that converts XML exported from an NLE video project into a REAPER audio project for audio post-production. RPP parsing and generation is the core of his app. It is safe to say that he has made the most advanced RPP parser out there, as it has been tested on complex projects by hundreds of users over years of evolution.

When I told him this was something I was trying to make, he kindly decided to help me! He wanted to port his parser to Lua, and as I know Lua, I could help him with that. So we put most of his Java-based code in an online collaborative editor, Kobra.io (note that finding a collaborative web app matching our needs was already an adventure in itself), and for one full week, every afternoon, we converted his parser from Java to Lua together. He explained to me how his parser uses objects and methods, how it manages to build the data tree directly from a line-per-line reading of the file (very efficient), and how most of the tags, properties and values I needed work. He showed me how to manage complex data like VST effects, extensions, project ExtState, notes… all the pitfalls of RPP parsing. His years of expertise with the RPP format saved me a lot of development time and spared me a lot of headaches (not all of them 😛 this was a pretty intensive week). It is impressive how modular his code is; once you get the logic, adding new project manipulation functions is easy and efficient.
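To give an idea of the line-per-line tree-building approach, here is my own deliberately simplified illustration (not John's actual parser; it ignores real-world pitfalls like quoted multi-word values and embedded data blocks): a line starting with "<" opens a child node, a lone ">" closes the current node, and anything else is a property line.

```javascript
// Simplified sketch of line-per-line RPP parsing straight into a tree,
// using a stack of open nodes. Real RPPs have quoting and embedded-data
// pitfalls this sketch deliberately ignores.
function parseRpp (text) {
  const root = { tag: 'ROOT', props: [], children: [] }
  const stack = [root]
  for (const raw of text.split('\n')) {
    const line = raw.trim()
    if (!line) continue
    if (line.startsWith('<')) {
      // Opening line, e.g. "<TRACK": push a new child node.
      const node = { tag: line.slice(1).split(' ')[0], props: [], children: [] }
      stack[stack.length - 1].children.push(node)
      stack.push(node)
    } else if (line === '>') {
      stack.pop() // closing line: back to the parent node
    } else {
      stack[stack.length - 1].props.push(line) // property line
    }
  }
  return root
}
```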

But what I needed was a JavaScript parser, not Lua! Fortunately, converting from Lua to JavaScript, despite a very different syntax, isn't too difficult, as they are both high-level scripting languages, so I was able to make the conversion on the fly in the mornings. Lua and JavaScript also share a quite similar design: Lua is a table-based language, JavaScript is object-based.

I want to publicly thank John Baker for his generous help. He didn't only provide me with code from his main app; he also took the time to help me understand it, along with his advanced object-oriented design for RPP projects. I learned a lot, and being able to ship the most advanced RPP parser logic out there in my app was incredibly stimulating. (Note that we have no plans yet to release it as open source, as it is very specific and doesn't answer common needs; maybe it is better to keep it private and build custom solutions with it for other clients. I already share a lot of more common-use things for free and open source 😛 ).

Integration

Another difficulty… Remember when I said that REAPER handles missing properties? Well, for audio items, the default value for length is… 0. This means that to insert media items into RPPs, you have to know the length of the source media, or your items will be invisible. And getting the media length from a file isn't something JavaScript can do natively, apart from loading the file in the page (which would be a performance issue; we are talking about thousands of files).

The solution was to integrate the FFmpeg and FFprobe binaries right into the app to analyze media files. There are packages for that on NPM. Nice!

So, now I had a way to scan files, a way to parse, update and generate RPPs, and a way to analyze sounds; all major concerns were gone. At least, that's what I thought.

On the performance side, I had to manage file disk access operations as efficiently as possible, as they were the bottleneck of the app. This meant complex sorting algorithms during the form processing and CSV analysis, so that each file would be read or written the minimum number of times.
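One easy win, for example (field names here are illustrative, not the app's actual data model), is grouping CSV rows by their target project, so each RPP is read and written exactly once no matter how many rows touch it:

```javascript
// Group CSV rows by their target project file, so each RPP is opened,
// updated for all of its rows, then written back exactly once,
// instead of one read/write cycle per row.
function groupByProject (rows) {
  const groups = new Map()
  for (const row of rows) {
    if (!groups.has(row.project)) groups.set(row.project, [])
    groups.get(row.project).push(row)
  }
  return groups
}
```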

That's where I faced a really complex issue I had never met before, and to solve it, I really had to push my understanding of JavaScript callbacks a step higher. Because all file-related functions (indexing, scanning, updating, writing, getting metadata…) are asynchronous (some actions take time, and you have to trigger the next part of your code only when they are finished, which lets the GUI stay responsive while the app does other things in the background), I fell right into the famous JavaScript callback hell. I had some very complex nested async loops, and async functions that needed their own callbacks… Quite tricky. I had to learn new ES6/ES7 syntax (promises…), watch conference talks, and use an external library to make everything work.

As the app is able to modify existing projects, I took some initiative and implemented a backup system, just in case. I also implemented a log system to let users know every action the app performed after the form was processed… Even more async functions, but ultimately, it worked.

GUI

The app is basically one long single-page form to fill with info, which it then processes accordingly. I hesitated to use one of the famous CSS frameworks, but I finally decided to code my own styles the old-fashioned way; it is way more concise, and reducing the dependency count is always good. You might think that because the HTML render engine is embedded in the app, you no longer have to care about cross-platform display disparities, but that is not the case. Windows and Mac form inputs render very differently… I decided to fix this anyway, with some form resets and style definitions to make it consistent across platforms.

Considering my recent readings about design and UX, I made the form interactive, so that not all elements are present from the start (it guides users through the process without overwhelming them), by preventing form submission if some fields are empty, and by preventing invalid data from being entered in the first place.
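The gating logic itself can stay very small. Here is a sketch (field names and rules were project-specific in the real app): the next part of the form is only revealed, and submission only allowed, once every required field holds a non-empty value.

```javascript
// Returns true only if every required field holds a non-empty string,
// so the UI can decide whether to reveal the next step or enable the
// submit button. Field names and rules are illustrative.
function canProceed (fields) {
  return Object.values(fields).every(
    v => typeof v === 'string' && v.trim() !== ''
  )
}
```

In the renderer, this would be called from the form's input listeners, and `event.preventDefault()` on submit when it returns false.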

I also included OS notifications for the end of the process, so that a satisfying sound and popup appear if the RPP generation went fine (there's an NPM package for that too). Automatic scrolling to the log helps the user see what the app did, right after the progress bar reaches 100%. I even added a way to cancel the process. Paths in log entries can be clicked to open the files directly from the app.

Polishing these user-interaction features was pretty fun and has, IMHO, a tremendous impact on how satisfied users feel about the app.


Here is the full page after the form has been processed. Note that some fields have been rewritten to more generic values (like "option") to respect EA's privacy policy.

Simple UI, but effective UX!

Delivery

Electron app packaging is not straightforward if you have never compiled apps before (I tried to make macOS builds on Windows for testing until I understood it was simply not possible). There are various packages that aim to make it easier, but they require quite some configuration. I didn't succeed on the first try… my FFmpeg integration broke at every compilation. As the electron-builder developer has a support chat, I got in touch with him, and he helped me solve my issue pretty quickly. Of course, I made him a donation. This is how open projects work!

In about a month of coding, the app was ready for testing. After the first test session with the EA team, I only had to adjust a couple of things to make it work perfectly for their workflow: all their requests were satisfied, even optional ones (new requests always appear while using an app; things we couldn't think of in the first draft). I also added a lot of bonus features to make it even nicer to use (backup, log, progress bar, aliases for CSV headers, etc. The list is pretty long). I only skipped digital signing of the app, because it is both very complex and very expensive without bringing new features to the app.

To polish the thing even further, and as I had to give my source code to EA, I wrote JSDoc documentation and made all my JavaScript code Standard JS compliant, to respect the Electron code style, so that EA developers can tweak it in the future.

Conclusion

I learned a lot during this contract. It was very interesting, and I'm glad the results will be useful for EA localization audio teams (this surely means that some upcoming localized versions of EA games will have used my app during their development process :P). I have played a non-negligible amount of Electronic Arts games, so this idea is quite satisfying to me! 🙂

Once again, I want to thank all the open source developers who elaborated and shared their very nice solutions for free, which allow anyone to build very powerful apps quickly, and John Baker for his personal assistance. Cheers!

  • PanozK

    What a great read. It’s very open minded that you share such experiences with all of us. Nice!

    Game audio has always had a culture of openness and collaboration, and tools that speed up the workflow and free the artists are paramount to any quality production.

    Cheers!

    • Thank you very much for your positive feedback 🙂
      Indeed, the game audio people I have met are very nice and always ready to share tips and tricks (especially in indie games), and so are REAPER users in general!

      • PanozK

        Thank you for sharing your experiences with the community! 🙂