Nvidia-XRun: An Alternative to Bumblebee

Despite the many advances the GNU/Linux operating system has made over the past few years, there’s one area with which I have frequently struggled: hybrid Nvidia graphics. As the owner of a laptop using Nvidia Optimus technology (i.e. having both an integrated Intel GPU and a discrete Nvidia GPU), I’ve spent more time than I’d like to admit getting the best performance possible out of both cards. Unlike on Windows, where Nvidia’s drivers and software make it easy to launch applications with either GPU, accomplishing the same isn’t quite as easy on Linux. Recently, I discovered a project that can do this: Nvidia-XRun.

Many years ago, when I was still using Ubuntu, I spent several months of trial, error, and research before discovering what I still consider a good solution for hybrid graphics on Linux: the Bumblebee Project. It lets you run an application on the Nvidia GPU by prefacing the command with optirun (or, more recently, primusrun). This works by running the program on a separate X11 server instance hidden from the user and (more or less) copying everything it renders over to the main server. Because at the time it was very tricky to have more than one set of graphics drivers installed, let alone use them both, the project assisted with driver installation too. Finally I had a way to run games and other graphically demanding applications with better performance. I became so dependent on it that I ended up migrating to Arch Linux (with no regrets, mind you) when the setup I had on Ubuntu broke and I could not find a way to fix it.
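As a quick illustration (glxgears is just a convenient stand-in for any graphical program, and assumes mesa-utils is installed), running something on the discrete GPU with Bumblebee looks like this:

```shell
# Run a program on the Nvidia GPU via Bumblebee's original wrapper
optirun glxgears

# Or with the newer primus backend, which has lower copy overhead
primusrun glxgears

# Anything launched without a wrapper stays on the integrated Intel GPU
```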

Over time, improvements were made. Support for natively installing multiple GPU drivers at the same time arrived, and some desktop environments now have partial support for hybrid graphics through the DRI_PRIME environment variable. Unfortunately this only works with open source drivers, and it’s no secret that the proprietary Nvidia drivers generally offer better performance than nouveau. Even so, Bumblebee has remained my pick as the best solution to this problem, despite not having seen any updates in over four years.
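For reference, DRI_PRIME is just an environment variable read by the open source Mesa drivers. A quick way to see which GPU ends up rendering (assuming glxinfo from mesa-utils is available) is:

```shell
# Default: renders on the integrated GPU
glxinfo | grep "OpenGL renderer"

# DRI_PRIME=1 asks Mesa to offload rendering to the discrete GPU
# (open source drivers only; has no effect on the proprietary Nvidia driver)
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
```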

Recently, I stumbled upon Nvidia-XRun, which, unlike Bumblebee, doesn’t let you run applications on either GPU within the same session. Instead, it starts up a fresh X11 session containing the application you wish to run, driven natively by the discrete GPU. The claim is that, since everything runs directly on the GPU, there should be less overhead, and thus better performance than with solutions like Bumblebee. Another benefit is that, unlike Bumblebee, it allows Vulkan applications to run. I figured it would be worth trying, seeing as I wouldn’t mind squeezing a bit of extra performance out of my GPU.

Installing and using nvidia-xrun

Thanks to a handy guide on this subreddit, installation was fairly simple. As I already had Bumblebee installed, it was just a matter of installing the nvidia-xrun script (found in the Arch User Repository), along with openbox. The latter is recommended for running games through Steam. It wasn’t even necessary to reboot; the script was immediately ready for use. In a separate tty (press Ctrl + Alt + F# to switch), run the command nvidia-xrun openbox. This starts an openbox session, which is essentially a black screen with a cursor. Right-click on the background to bring up a context menu, from which a terminal such as xterm can be opened. From there, applications can be run, e.g. steam (or steam-runtime on Arch Linux).
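On Arch, the steps above boil down to something like the following (I'm assuming an AUR helper such as yay here; adjust the install step to however you normally build AUR packages):

```shell
# Install the nvidia-xrun script from the AUR, plus openbox
yay -S nvidia-xrun openbox

# Switch to a free tty (e.g. Ctrl + Alt + F3), log in, then start the session
nvidia-xrun openbox

# Inside the openbox session: right-click the desktop, open a terminal, then e.g.
steam-runtime
```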

Performance comparison

One thing that should be noted is that it’s difficult to compare performance with Bumblebee, since Bumblebee automatically caps your maximum FPS at your monitor’s refresh rate. This means benchmarks aren’t generally useful for comparing it with nvidia-xrun. What I can do instead is comment on the apparent performance of games, particularly ones that don’t reach 60 FPS in my case.

For my tests, I am using my normal setup with GNOME 3. I also tested with both the X.org and Wayland sessions, as these make a difference in performance as well. There are two categories of games I’d say are relevant here: ones that are power hungry, and ones that are not. The more demanding game I tested with is 7 Days to Die. On GNOME X.org with Bumblebee, the game does not perform well. There is a significant amount of input lag (approaching half a second, it seems), and the FPS doesn’t go above 30. On Wayland, however, the game performs significantly better: input lag is hardly noticeable, and my average FPS sits at around 50. In an nvidia-xrun session, the results were essentially identical to Bumblebee over Wayland. I felt there was slightly less input lag (not measurably so) and the experience was a bit smoother, but there was nothing objective to indicate it was significantly better.

With lower-powered games, such as 2D platformers, the story is quite different. Apart from a minor increase in input lag with Bumblebee over X.org, there isn’t much difference in the overall experience. There are a few edge cases, such as when the desktop environment hits a lag spike and drags the game down with it; such things don’t happen in an isolated session.

Final thoughts

Seeing as I’ve been using Wayland for about half a year now, I’m on the fence as to whether I’ll switch to nvidia-xrun, stick with Bumblebee, or use a combination of the two. One limiting factor is a current bug in systemd that prevents the nvidia-xrun session from fully unloading once it terminates. This means that, once a session has been started, Bumblebee cannot be used without restarting the systemd-logind service, which terminates all active sessions.
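Until that bug is fixed, the only workaround I’m aware of is restarting logind, with the drastic side effect noted above:

```shell
# WARNING: this kills every active login session, graphical or otherwise
sudo systemctl restart systemd-logind
```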

One other benefit I’d hoped for was the ability to use my HDMI output, which currently cannot be used since it’s wired to the discrete GPU. Sadly, it doesn’t appear that nvidia-xrun can drive it (at the very least, I’ve had no success in trying). (See my followup for details.) I’ll most likely stick with Bumblebee for the most part, and use nvidia-xrun when input latency is a bigger factor. If, however, I were still on the X.org session, I’d switch without hesitation. The difference in experience is noticeable in most cases, and I find it enough to merit switching.
