[Academic Tech] Surface Laptop Studio with an eGPU – performance and review.

My last post compared the raw performance of my new Surface Laptop Studio (SLS) against my old Desktop and my wife’s M1 MacBook Pro. The results were not particularly exhilarating, to say the least.

The SLS was mostly ok when plugged in, holding its own against the MacBook Pro in simple rendering/photogrammetry, but losing out to the Desktop with its dedicated GPU. The exception was in a super heavy scene where the Desktop was limited by available RAM and the MacBook Pro absolutely dominated.

I also briefly looked at gaming where the SLS was not good, losing out to the Desktop by a vast margin in the Hitman III physics-based benchmark (Dartmoor).

I ended that post by saying I wasn’t sure if I wanted to keep the SLS. Well, over a month later, the SLS has become my main machine, and my old Desktop is in bits, waiting to be eBayed off. I just love the screen, keyboard, touch and pen of the SLS, and when I took it on an undergrad field course last week, it let me make photogrammetric models in a matter of minutes, on site, and present them in a lecture that day. It really is a stunning machine, and it takes the place of both my old Desktop and my old Surface Pro X for moving around the house.

But, I do a lot of rendering/photogrammetry and enjoy gaming, so it needed more power. Enter the external GPU enclosure (eGPU).

The Lenovo BoostStation

I picked up a brand-new Lenovo BoostStation from eBay for ~£100. It’s a Thunderbolt 3 enclosure, a little longer and about 3x as wide as a normal graphics card (let’s ignore the Nvidia 40xx series for now). It’s a fantastically constructed machine, with relatively thick metal surrounding it, making it feel very robust, although that construction makes it pretty heavy, at around 8 kg.

On the back, you get the ports that come with your GPU, and the ports specific to the BoostStation, which comprise four USB-A ports and a USB-C Thunderbolt port for connecting to the laptop:

The handle at the back lifts, releasing a catch, and allowing the insides to smoothly slide out so you can fit a GPU:

There’s also room (and fixtures) inside for a full-size HDD or a couple of SSDs, with SATA connections, but I’m not using that. The PSU inside is 500W, and I gather from other reviews that [obviously] the more you stick in there and plug into the USB ports, the more power you’re going to draw. The BoostStation will charge the SLS when in use, so there’s 100W power delivery (note: normally 100W would be less than the total usage of the SLS – mine came with a 127W charger – but when you’re running the eGPU, you may not be taxing the internal GPU or screen so much; in my usage so far, I’ve not seen the battery drop at all, even during heavy usage, with the eGPU providing enough power at all times). The BoostStation is pretty quiet when in use. Obviously, if you really push the GPU, the fans on the card itself will rev up and determine how loud it is, but at idle or low usage it’s just a small PSU fan in the enclosure making any sound.

I will say that the 2060 Super just dropped straight in (though the design of that card makes it really difficult to get to the PCI-E release). For the 3070, I was stuck for power – the card requires two 8-pin leads, but the BoostStation only has one 8-pin and one 6-pin available. To solve this, I just grabbed a 6-pin to 8-pin adapter from Amazon, and everything has worked perfectly fine.

Installation and Setup

WindowsCentral wrote about their experiences with an eGPU and the SLS, and reported that they experienced quite a few black screens and crashes. I did too, at first. But then I used DDU (Display Driver Uninstaller) to clear all the Nvidia drivers and prevent automatic updates. I then plugged in and powered on the eGPU, and ran the generic Nvidia installer for my A2000 – it then installed both the laptop and eGPU drivers in one go.

The one caveat here is that I wasn’t easily able to just switch out the GPU (I used a 2060 Super and a 3070 below); to ensure everything went smoothly, I ran DDU to clear all the drivers and then started again with the new GPU inserted.

After that, everything has been exceptionally smooth – just turn on the BoostStation and plug in. Two seconds later, I get a notification that the eGPU is enabled. If I’m running a second monitor from the eGPU, it comes to life within a second or two.

It really is quite a seamless experience – just turn on the eGPU, plug it in, and you’ve got a full desktop experience. I leave my monitor plugged into the eGPU, and have my Logitech Craft and MX Master sat on the desk. I plug the eGPU into the laptop, and it’s exactly like working at a desktop computer.
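If you want a quick sanity check that Windows can actually see both the internal A2000 and whatever card is sitting in the enclosure (beyond the notification), you can query nvidia-smi, which ships with the Nvidia driver. A minimal sketch in Python, just wrapping the command-line tool:

```python
# Minimal sketch: list every Nvidia GPU the driver can currently see.
# Once the eGPU is connected, both the internal A2000 and the card in the
# BoostStation should show up here.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=index,name,memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.strip().splitlines():
    print(line)   # e.g. "0, NVIDIA RTX A2000 Laptop GPU, 4096 MiB"
```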

The one awkwardness in all this is that the SLS only has USB-C Thunderbolt ports on the left side, and the Thunderbolt cable that comes with the BoostStation is pretty short (maybe 30-40 cm), so I’m limited in where I can put things. I may look into a longer cable at some point.

Performance

Ok, the meat of this post is in the real-world performance. I tested the eGPU with a 2060 super (from my old Desktop) and with a 3070 to see what differences there were, and how any bottlenecks were holding things up. I ran the same tests as in my last post, and I include the previous results for comparison. I also made more stringent tests of gaming performance.

Photogrammetry and rendering.

As before, my 53-image Styrac dataset was run through Metashape. I chose Metashape just because it’s easier to time everything – Reality Capture also works super well via the eGPU.

If you go to the settings in Metashape, you can enable all GPUs – so I enabled the A2000 in the SLS and the external GPU, leaving the integrated Intel GPU unchecked, as recommended.
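For what it’s worth, the same thing can be done from Metashape’s Python console rather than the preferences dialog. A rough sketch using the standard Metashape Python module (the exact device fields may differ slightly between versions, so treat this as illustrative):

```python
# Rough sketch: enable every discrete GPU for processing via a bitmask,
# leaving the integrated Intel GPU out, as recommended.
import Metashape

devices = Metashape.app.enumGPUDevices()        # all GPUs Metashape can see
mask = 0
for i, device in enumerate(devices):
    print(i, device["name"])                    # e.g. the A2000 and the 3070
    if "Intel" not in device["name"]:
        mask |= 1 << i                          # set the bit for this device

Metashape.app.gpu_mask = mask                   # same as ticking the boxes in Preferences
```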

Lower numbers are better.

Ok, now we’re getting somewhere – with either GPU there was a decent decrease in the total time taken, most of that on the alignment and meshing stages. The 3070 predictably outperformed the 2060 Super (though not in texturing, for some reason). I mean… good, we’re using two relatively modern GPUs to do this.

The next test was Blender rendering, with both simple and complex scenes. I only did these with the 3070, sorry.

Again, lower numbers are better.

As with Metashape, Blender can leverage more than one GPU at once. We saw a 30% performance improvement between my old Desktop and the SLS with the 3070 eGPU.
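As an aside, if you render from a script rather than the UI, the device selection lives in the Cycles add-on preferences. A hedged sketch using Blender’s bpy API (OPTIX would also work on these RTX cards):

```python
# Sketch: tick both the A2000 and the eGPU for Cycles rendering via bpy.
import bpy

cycles_prefs = bpy.context.preferences.addons["cycles"].preferences
cycles_prefs.compute_device_type = "CUDA"       # or "OPTIX" on RTX cards
cycles_prefs.get_devices()                      # refresh the device list

for device in cycles_prefs.devices:
    device.use = device.type != "CPU"           # enable every GPU, leave the CPU off
    print(device.name, device.type, device.use)

bpy.context.scene.cycles.device = "GPU"         # make the active scene render on GPU
```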

The heavy scene consisted of a 40-million polygon mesh:

As before, the Desktop could not render the scene, because it didn’t have enough RAM.

Now the SLS was really able to leverage the power of the 3070, and rendered in less than half the time of the MacBook Pro, and less than a quarter of the time it takes on its own. However: I did notice in testing that these times were much more variable with the eGPU on repeated runs, taking anywhere from 7 to 30 seconds. I don’t know why.

Either way, I’ve been really impressed by the performance of the SLS with eGPU for the kind of workflows I use.

Gaming

I also game quite a lot, so I wanted to take a look at how well things worked via the eGPU for that. I tested three games – Hitman III, Shadow of the Tomb Raider, and Cyberpunk 2077. For each, I [tried] to run it natively on the SLS’s A2000, on the 2060 Super, and on the 3070. Note that unlike Blender and Metashape, these games do not take advantage of multiple GPUs.

Also worth noting is that because of bandwidth limitations over Thunderbolt 3, you can get higher framerates if you display to an external screen only, so that you’re not sending data to the eGPU, processing it, then receiving it back over the same cable to display it on the laptop screen. We’ll come back to this a bit later.
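A rough back-of-envelope of why that matters, assuming uncompressed 8-bit RGBA frames and the commonly quoted ~22 Gbit/s of usable PCIe bandwidth on a Thunderbolt 3 link (treat the numbers as ballpark only):

```python
# Back-of-envelope: traffic needed just to send finished frames back to the
# laptop screen over the same Thunderbolt 3 cable that feeds the eGPU.
width, height = 2400, 1600      # SLS native resolution
bytes_per_pixel = 4             # uncompressed 8-bit RGBA (assumption)
fps = 60

return_traffic_gbit = width * height * bytes_per_pixel * fps * 8 / 1e9
usable_tb3_gbit = 22            # commonly quoted usable PCIe bandwidth for TB3

print(f"{return_traffic_gbit:.1f} Gbit/s returned to the laptop "
      f"({100 * return_traffic_gbit / usable_tb3_gbit:.0f}% of usable bandwidth)")
# -> roughly 7.4 Gbit/s, about a third of the link, before any game data
#    goes the other way
```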

I first ran everything on max settings, running at native resolution (2400×1600) on the SLS laptop screen:

                           Hitman III   Shadow of the Tomb Raider   Cyberpunk 2077
SLS A2000                  4.37         13                           7
2060 Super eGPU            19.6         48                           35
3070 eGPU                  27.25        48                           35.25

Framerates, in frames per second. Bigger numbers = better.

The A2000 was appalling. I mean, it’s a low-ish powered laptop GPU, built specifically for CAD, not gaming (I don’t know if the 3050 Ti would be much better). But you would not want to game on that card. True, this was on ultra settings with ray-tracing on in all titles, but still…

I then ran the 3070 tests outputting only to my external monitor (a 1440p monitor, so about 4% fewer pixels than the SLS screen, and of course output straight from the GPU):

                           Hitman III   Shadow of the Tomb Raider   Cyberpunk 2077
3070 (external monitor)    27.25        51                           39.79
As you can see, there was a small improvement in framerate for Tomb Raider and Cyberpunk, but Hitman was steadfast in not improving.

Anecdotal “I reckon”s

What follows here was not rigorously tested (not that the last lot was rigorous, but this is even less so).

The Hitman benchmark I was running was the Dartmoor scene, which is very particle heavy, and those particles rely on the CPU, not the GPU. If I ran the Dubai scene, which is much less particle heavy, framerates were in the 50s. That tells me pretty conclusively that the 4-core i7-11370H in the SLS just isn’t that great.

The same effect was seen with Cyberpunk – whatever settings I used, from Low to Ultra, the framerate was about the same. And while it averaged ~40, it clearly went higher and smoother when there were fewer NPCs/cars on screen. Even turning off DLSS made very little impact (it may even have improved framerates by 1-2 frames, as some of the DLSS work is done on the CPU). Tomb Raider, being the oldest game of the three, happily hit 100+ frames per second, but again, in crowd-heavy benchmark scenes, it slowed down to the 30s.

For what it’s worth, I ran some tests with Task Manager open, and while the CPU was frequently maxed out, it didn’t throttle in any significant way, because the cooling on the SLS is excellent (and the SLS wasn’t using its own GPU for this).
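If you want something more repeatable than eyeballing Task Manager, a quick sampler like the sketch below (psutil is a third-party package; nvidia-smi ships with the driver) logs CPU and GPU utilisation once a second while a benchmark runs:

```python
# Quick-and-dirty utilisation logger: CPU load via psutil, GPU load via nvidia-smi.
import subprocess
import psutil                      # pip install psutil

def gpu_utils() -> str:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return " | ".join(out.stdout.strip().splitlines())   # one entry per Nvidia GPU

for _ in range(60):                                       # roughly a minute of samples
    cpu = psutil.cpu_percent(interval=1)                  # average over the last second
    print(f"CPU {cpu:5.1f}%   {gpu_utils()}")
```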

Surface Laptop Studio running Cyberpunk 2077 benchmark via the Nvidia 3070 eGPU.

Summary

I really love using the SLS, and now that I’ve got a powerful eGPU, I’m much more confident that I can make this work as a complete replacement for my Desktop and Pro X. I have enough power to move around the house, or work in the field, and do what I usually do, but I can very easily hook up to a lot more power to render more quickly, or to play games.

For work, this is ideal – I can leverage the eGPU and the A2000 and get my work done much quicker than I otherwise could. Much of my more visual work is GPU-limited, so this works well. However, for gaming, the CPU in the SLS is really holding things back. I would love to try an SLS with a 12th-gen Intel chip, and fingers crossed that comes in the next 12 months (then I can wait a further 12 months and get one second-hand at half the cost!).

Final points then:

  • The Surface Laptop Studio is a superb laptop – there are more powerful laptops available at less cost, but few with its build quality, excellent keyboard and trackpad, and pen + touch support.
  • The Lenovo BoostStation is also a quality machine, and available at a good price. Once you clean out old drivers and start fresh, it’s super reliable and plug-and-play.
  • For rendering/photogrammetry, there’s a direct benefit from putting a more powerful card in the eGPU.
  • But for gaming, the CPU in the SLS is the weak point of the chain, and unfortunately nothing can be done to improve that.

7 thoughts on “[Academic Tech] Surface Laptop Studio with an eGPU – performance and review.”


  1. I have often thought that an eGPU with CUDA cards might be a way to bypass the fact that Reality Capture won’t work on a combination of Apple Silicon and a Windows virtual machine. It’s a long way to go to avoid using a windows box….but there ya go.

    1. Trouble is, there are no Nvidia drivers for macOS or for Windows on Arm, so you wouldn’t be able to get it to work, whether natively or through emulation on Apple silicon.

  2. Hello! I have a dilemma: I am currently considering an SLS for purchase – a version with the i7-11370H, 32 GB and the A2000. The price they are offering me, converted into pounds, is about £1,800. The device is officially refurbished – consider it as new.

    Is it worth it to buy now at this price, or is it better to wait for an update?
