64 Comments

  • tipoo - Thursday, March 10, 2016 - link

    Pretty exciting that after three generations of Thunderbolt, it looks like this promise is actually becoming serious.

    Shame that those of us with TB2 will miss out; even if bandwidth capped performance somewhere, it could still be a large enough upgrade. My laptop's 4770HQ has enough CPU grunt for nearly every modern game, if only it had a better GPU.
  • mdriftmeyer - Thursday, March 10, 2016 - link

    Toss this in as a BTO option across OS X hardware on the Apple Store and you'll get widespread adoption in Apple market segments.

    Oculus's excuse that OS X isn't VR-ready will fall flat on its face.
  • dsumanik - Thursday, March 10, 2016 - link

    Um, I'm pure Apple; I only use Windows via Boot Camp for app testing and games... OS X is a nightmare for gaming. I can directly compare using Steam on the OS X side vs. the Windows side: Windows has faster FPS, half the games don't work at all on Mac, and another 30% are just plain glitchy. Don't get butthurt over it; Apple doesn't want you gaming on anything but iPhones and iPads.
  • Oxford Guy - Friday, March 11, 2016 - link

    They're probably poorly done ports and/or software running in emulation.
  • Alexvrb - Saturday, March 12, 2016 - link

    Running in emulation? Hardly. We're talking about Steam titles which have OSX ports. OSX has its strong points but gaming isn't one of them.
  • TomAnon - Thursday, March 17, 2016 - link

    It's because OS X doesn't support DirectX 12. Without access to a proper graphics API, all ports have to be done using OpenGL, which is just a mess and results in poor performance and lots of glitches.
  • PainfulByte - Friday, March 11, 2016 - link

    Currently, it is not an excuse but a cold hard fact. However, it was not so much OS X that was called out as lacking, but rather the hardware (read: GPUs) shipped with Apple products. If Apple implements this tech, that situation might change.
  • AS118 - Friday, March 11, 2016 - link

    Yeah, since Apple doesn't like to make laptops big enough for discrete graphics, this would be a good idea. They use Radeon stuff already, so why not?
  • TomAnon - Thursday, March 17, 2016 - link

    Except that we don't know whether Vulkan will be supported on OS X, or whether Apple will push it, especially when they already have their own branded graphics API, Metal. Vulkan would make Metal obsolete: it could run on the iPhone as well, and it would be infinitely preferable to Metal because developers could use a single graphics API to target both Android and iOS.
    Without DirectX 12, OS X is a joke for gaming. It doesn't have anything to do with hardware; people have been installing Boot Camp for a decade now to get around that, and some of the iMacs are real killers with it. The main problem is the lack of a proper graphics API on OS X, as OpenGL is just a mess and no one wants to touch it - hence the very few, poorly performing ports.
  • Viddy - Sunday, May 22, 2016 - link

    You won't be entirely missing out. You can mod an AKiTiO Thunder2 or a HighPoint RocketStor 6361a to run a desktop GPU for only $200, and from personal experience I can say it works pretty damn well, though I do get BSODs every once in a while. But still, pretty worth it IMO. Grab one of those with a GTX 960 and you'd be paying a hundred dollars less than the Razer Core chassis alone.
  • watzupken - Thursday, March 10, 2016 - link

    I think the concept is good, but I am not very certain this idea will fly. The price of the chassis is definitely one consideration for its success, since one probably needs to spend extra on a GPU too. And as correctly raised, a TB connection is not something found on many laptops.
  • WinterCharm - Thursday, March 10, 2016 - link

    Cheaper than building a separate gaming desktop.
  • protomech - Thursday, March 10, 2016 - link

    And more importantly, more convenient.

    After you've dropped ~$1500 on a high-end ultrabook and ~$500 on a GPU, a few hundred bucks to build a gaming desktop vs a few hundred bucks to buy a TB3 chassis is a bit of a wash.

    But being able to use a single system will be a huge win, for some people.
  • ishould - Friday, March 11, 2016 - link

    Typically people will only have to make the initial investment in the high-end ultrabook once, and can upgrade the graphics as needed. Any system with an i7 and 16GB of RAM should suffice for a long while.
  • jospoortvliet - Saturday, March 12, 2016 - link

    An i7 in an ultrabook isn't exactly in the same league as a desktop i7, but if you're talking midrange, it's true that if you start today with an i7 and a midrange GPU, you can probably upgrade your way through 2019 or so at least without the CPU becoming the bottleneck.
  • frostyfiredude - Thursday, March 10, 2016 - link

    Yeah, it'll be a niche for sure. Costs are going down though and with mainstream support awareness will rise too. Once dedicated boxes can be bought that are tailored to a specific card and essentially plug and play for the customer I think this will have a decent market.
  • andrewaggb - Thursday, March 10, 2016 - link

    I'm hoping a vendor will come out with a box with the card built in. I don't see any reason it needs to be a standalone graphics card in a small case. I would think you could build something smaller and less expensive if it were a single unit.
  • frostyfiredude - Thursday, March 10, 2016 - link

    I feel like it's only a matter of time before someone does that, especially with a midrange GPU.
  • dj_aris - Friday, March 11, 2016 - link

    Why not even an all-in-one 4K monitor with an embedded beefy GPU which just connects over a single TB cable?
  • rev3rsor - Thursday, March 10, 2016 - link

    I think this is something that a lot of people have been looking forward to, so it's good to see a manufacturer actually preparing the eGPU idea as an accessible solution for more people. The way I see it, the target is basically the same as Razer's idea with the Core, but expanding support to all laptops rather than just Razer's Stealth (for example, the XPS 13, which is as good as, if not better than, the Stealth in some respects).

    ULV processors in Ultrabooks may be a bottleneck, but a product category I'd like to see, and that could be feasible now, is 13", 14" or 15" laptops with a strong i5/i7 (maybe even with Iris Pro graphics) and supporting this. This is sort of what I liked about the MSI GS30 (but that came with its flaws), and for those of us who don't need high performance on the go, if it's an affordable package then it could be a good solution for people who need to travel but want to do more intensive tasks at home/work. The way I see it, ultrabooks can be suitable for mobile productivity, while these laptops (probably a more niche market) can make use of higher performance CPUs at the cost of power efficiency, which is improving today anyway. Sort of like the rMBP without the dGPU, but hopefully cheaper.
  • Murloc - Thursday, March 10, 2016 - link

    Well, if ultrabooks tend to be GPU-bottlenecked when gaming as of now (I'm not sure to what extent), it's still an improvement, at least when it comes to the cheaper docks with a dGPU inside; not the Razer Core with a top desktop part, of course.
  • DanNeely - Thursday, March 10, 2016 - link

    Maybe we'll see an uptick in the use of Intel's dual core 28W mobile parts. Adjust the TDP down to the more common 17W baseline on battery to extend lifetime per charge; but when plugged into the wall let them open up again so they're running at about the same clockspeed as the 45W mobile chips. They'd still be somewhat limited in only having 2 physical cores; but with significant chunks of gaming code still being single thread performance dependent this would go a large way to narrowing the gap.
  • rhysiam - Thursday, March 10, 2016 - link

    And while we're at it, why not build an external GPU box that also incorporates additional cooling for the laptop, to enable those higher CPU TDPs without excessive noise? Truly effective solutions would probably have to be proprietary, I suppose. I'm not sure how effective those laptop coolers that just blow air on the chassis actually are... I'm guessing not very?

    Back to the main topic though, I sure hope these solutions are more stable and receive better long-term support than previous iterations of switchable graphics, particularly from AMD.
  • Oxford Guy - Friday, March 11, 2016 - link

    Or get rid of the stupid obsession over thinness.

    Light weight? Yes. Thin? Why?
  • guidryp - Thursday, March 10, 2016 - link

    I wonder if we will ever see Mac Support for this. GPU is one of the biggest weaknesses in Macs.
  • jsntech - Thursday, March 10, 2016 - link

    It seems such support would need to come from Apple. As awesome as that would be (and it would be reeeeeeally awesome), I'm having a hard time imagining Apple getting behind it. We can hope.
  • tipoo - Thursday, March 10, 2016 - link

    Under Boot Camp it may work; some people got external graphics to work on TB2. Under OS X, probably not. But for a gaming external GPU you want Windows anyway, so you don't lose 30% of your performance to the five-year-old OpenGL version in OS X.
  • mdriftmeyer - Thursday, March 10, 2016 - link

    Apple could add OpenGL 4.5 support in a few months, if they wanted to do so. They focused on bringing Metal API to both iOS and OS X. Time will tell. But with all the work Game Engine devs have done with Apple on Metal API for iOS the vast bulk of that work will just transfer over to OS X.
  • mdriftmeyer - Thursday, March 10, 2016 - link

    It just needs a kext and how Apple's Metal API would leverage the GCN hardware is up to them.
  • sovking - Thursday, March 10, 2016 - link

    Why should such a solution be limited to laptops?
    Think about people who use a desktop for work and game casually: they can use a modern CPU with integrated graphics (or an APU), without the need for discrete graphics, saving power during the long workday. In the evening, when they want to play a recent game at high detail, they plug their desktop into the eGPU, which has the horsepower needed. The next day they continue to use their desktop without the eGPU.
    This way they can use a smaller case that produces less heat and less noise.

    For many desktop users it could be nice and useful to plug and play with an eGPU ;-)
  • Seikent - Thursday, March 10, 2016 - link

    In a desktop, internal switchable graphics (like in notebooks) would make much more sense (why would I want an extra box?). I guess the problem is that desktops are customizable, and for Enduro or Optimus to work, they need to exist at a motherboard level (uneducated guess).
  • Murloc - Thursday, March 10, 2016 - link

    Casual gamers stuck with a brand-name desktop PC that is unbalanced against the GPU (the average situation, even for gaming-oriented PCs) will never be given support for this by the system builders, precisely because it would stop them from buying an entirely new PC.
    And desktop gaming PCs have huge margins, unlike the cheap office ones (which can have enough CPU power, though).

    The power consumption argument is moot IMHO, because an external solution costs more money, probably more than the energy you'd save is worth, if it's an average graphics card.
  • DanNeely - Thursday, March 10, 2016 - link

    There's no need of an enclosure for that; and it'd just add considerable expense for no real gain. An internal card and switchable graphics would achieve all the power savings; and 2 boxes are more clutter on the desk than just one.
  • Oxford Guy - Friday, March 11, 2016 - link

    Not when people demand thin laptops.
  • WithoutWeakness - Thursday, March 10, 2016 - link

    I'm not sure where you live, but power in the US is pretty cheap. You would need significant power savings from using your iGPU over your dedicated external GPU to actually save any money, given the $200+ upfront cost of an external Thunderbolt graphics card dock. Even the highest-end enthusiast GPUs sip power at idle or while performing light GPU-accelerated tasks like video playback. There likely wouldn't be a big enough power-savings gap for it to make sense.

    Not that I'm saying it's pointless and they should never do it, but I'd like to see it polished on laptops before any significant development effort goes into the changes needed to support external graphics on desktop machines.
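    The break-even argument above can be made concrete with a quick sketch. All figures here are illustrative assumptions (idle power delta, usage hours, electricity price), not measurements:

    ```python
    # Back-of-envelope check: how long would iGPU power savings take to
    # repay a ~$200 external GPU enclosure? All inputs are assumptions.
    ENCLOSURE_COST_USD = 200.0
    IDLE_DELTA_W = 10.0        # assumed extra idle draw of a dGPU vs. an iGPU
    HOURS_PER_DAY = 8.0        # assumed daily usage
    PRICE_PER_KWH_USD = 0.13   # rough US average electricity price

    kwh_per_year = IDLE_DELTA_W / 1000.0 * HOURS_PER_DAY * 365
    savings_per_year = kwh_per_year * PRICE_PER_KWH_USD
    years_to_break_even = ENCLOSURE_COST_USD / savings_per_year

    print(f"Savings: ~${savings_per_year:.2f}/year")
    print(f"Break-even: ~{years_to_break_even:.0f} years")
    ```

    Under these assumptions the savings come to a few dollars a year, so the enclosure would take decades to pay for itself, which is the commenter's point.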
  • MadAd - Friday, March 11, 2016 - link

    Yes, finally some traction in the external graphics area. I thought it was only me with a vision of a small ITX-type desktop unit with the keyboard, mouse and a bit of storage, an external graphics unit in a Thunderbolt box for some hours of gaming, and the main storage in a NAS router for all the network and data-saving needs.

    At that point the traditional ATX/uATX format doesn't need to exist any more, and we can make better use of our energy: not just firing up the graphics only when we need gaming, but also saving in non-gaming idle modes while AFK, plus less heat (in areas where that's a problem).
  • ET - Thursday, March 10, 2016 - link

    I just happened to read the GeForce driver 364.51 release notes and it says: "Added external graphics UI (Windows 10)". So I'm guessing NVIDIA cards will work out of the box with this.

    Anyway, any hope that this Intel-AMD collaboration will also result in Thunderbolt being available on AMD motherboards?
  • Murloc - Thursday, March 10, 2016 - link

    If they come out with cheap docks and get all ultrabooks to support it, this could sell.
    A dock is useful on its own but not enough to warrant buying one for most people; if there's a dGPU in it that lets you play MOBAs and simple FPS games on a road-warrior kind of notebook (i.e. a light 13''), then that's attractive and can help justify the cost.
    I mean, an Ethernet port and USB ports galore to leave cheap external storage, keyboard, mouse, LAN and monitor connected is pretty good.
  • DanNeely - Thursday, March 10, 2016 - link

    "The catch, I suspect, will be getting laptop vendors to include Thunderbolt 3 support, as historically Thunderbolt has seen little traction outside of Apple laptops. The switch from the mini-Displayport connector to the USB Type-C connector is likely to help some with that, but a big consideration will be the space and power requirements of Intel’s Alpine Ridge controller, along with the costs of buying and integrating it."

    In the short/medium term Alpine Ridge also being one of the very few USB3.1 controllers on the market, when there's no chipset support for the newer protocol version, should help this by reducing the BOM cost.
  • Ryan Smith - Thursday, March 10, 2016 - link

    Laptop makers have so far not been tripping over themselves to add USB 3.1. Space and power are at a premium, so they like to avoid adding additional controllers. Though I agree that Alpine Ridge has a solid place even just as a USB 3.1 controller, I wouldn't consider it a given that it can do any more to convince laptop vendors to include it.
  • Laxaa - Thursday, March 10, 2016 - link

    I wonder if MS will ever make an external GPU for the Surface Pro 4, seeing as the Surface Connect port is PCI-E based (at least the one on the Surface Book is).
  • tipoo - Thursday, March 10, 2016 - link

    I also wonder if they'll upgrade the GPU in the keyboard of the Surface Book, that would be neat.
  • Laxaa - Thursday, March 10, 2016 - link

    Yeah. Like being able to just buy the base if you need more computing power. I don't think they'll get rid of the Surface Connect port for a while.
  • KoolAidMan1 - Thursday, March 10, 2016 - link

    I expect it would be Thunderbolt 3 using a USB 3.1 port. Anything proprietary will be DOA. The dGPU in the Surface Book keyboard is a buggy, underpowered, overpriced mess that shouldn't have happened in the first place. Those external GPU enclosures are a much better solution.
  • BrokenCrayons - Thursday, March 10, 2016 - link

    This is good news from AMD, since standard desktop systems are becoming progressively less commonplace, but dedicated graphics are still important to a lot of people who don't want to wait for iGPU performance to increase with each processor release before they can game without the burden of additional hardware. If external GPU case prices fall to more practical levels and TB3 interfaces take off in laptops, AMD might be able to land additional sales. Good for them, and good for those of us who would like a professional-looking, lightweight laptop with good battery life, but might also want to dabble in a game or two without buying a second computer that looks like it's being marketed at a teenage boy.
  • hubick - Thursday, March 10, 2016 - link

    A lot of people attempting to use their Oculus Rift on a laptop are discovering that secondary GPU switching technology like Optimus is largely incompatible with the new Direct Mode drivers required by newer Oculus SDK's (you'll get an HDMI error and the HMD won't even connect to the PC). I imagine VR becoming one of the main drivers for people wanting technology like this XConnect, so I'm wondering if it's also incompatible?
  • 06GTOSC - Thursday, March 10, 2016 - link

    External graphics have taken far too long to finally get focus. I'm glad they finally are, though. I've always wanted to have a good CPU and a lot of memory in a laptop with decent graphics, and then be able to plug in a graphics card when I want to do serious gaming. I realize that mobile CPUs won't match a desktop CPU, but most of us can live with that. I'll take a few FPS less, or slightly less detail, to not have to maintain two machines.
  • yannigr2 - Thursday, March 10, 2016 - link

    Take a 2 in 1.

    Remove its keyboard, and you have a tablet.

    Add an external keyboard/mouse, and you have a nice desktop.

    Also add an external GPU, and you have a full gaming desktop.

    On the go, just attach its own keyboard and you have a light notebook.
  • Wolfpup - Thursday, March 10, 2016 - link

    Switchable graphics do not work. Even Nvidia can't do it, much less AMD. So the whole point of this is with an external monitor, and for that, it would be really cool IF it's actually supported for real.

    I find it bizarre that Intel wasn't pushing for this from day 1. It's not only the primary POINT of the Thunderbolt technology, but higher-end GPUs also need higher-end CPUs to drive them... and who stands to gain from selling better CPUs?

    Just bizarre this wasn't their focus when Thunderbolt 1 rolled out.
  • rev3rsor - Thursday, March 10, 2016 - link

    I imagine that if they've announced it already then they're confident of its implementation, and Razer AFAIK is aiming for how-swapping capability.

    As for TB 1 and 2, I'm not too familiar but Intel seemed to be against the idea for a long time, for some reason. I suppose with enough bandwidth it would've worked, but it would need something like today's technology to make it seamless (or at least mostly smooth).
  • rev3rsor - Thursday, March 10, 2016 - link

    Sorry, hot-swapping*. And while I'm nowhere near a power user, Optimus (I think?) seems to work for me as far as being able to game on GPU without problems (on a laptop, HD4600 and 740M).
  • YukaKun - Thursday, March 10, 2016 - link

    Well, let's see:

    PCIe 3.0 x4 (at best), added latency (how bad is it?), a less performant CPU+RAM combo (than in a desktop), a USD $300+ desktop GPU, and a custom-made chassis from a vendor (assuming Razer won't let other TB3 devices use it, for example). Uhm... nope. Does not tempt me in the slightest.

    Cheers!
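    For context on the "PCIe 3.0 x4 (at best)" objection, here is a quick sketch of the theoretical link rates. The PCIe figures follow from the spec (8 GT/s per lane with 128b/130b encoding); the ~22 Gbit/s TB3 PCIe ceiling is a commonly cited figure rather than a spec guarantee, so treat it as an assumption:

    ```python
    # Rough theoretical bandwidth comparison. Real-world throughput is
    # lower, and TB3 shares its 40 Gbit/s link with DisplayPort traffic.
    def pcie3_gbps(lanes):
        # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> usable Gbit/s
        return 8.0 * lanes * 128 / 130

    for lanes in (4, 16):
        print(f"PCIe 3.0 x{lanes}: {pcie3_gbps(lanes):.1f} Gbit/s")
    print("TB3 PCIe tunnel (assumed cap): 22.0 Gbit/s")
    ```

    So an eGPU over TB3 gets well under a quarter of a desktop x16 slot's bandwidth on paper, which is the trade-off the comment is weighing.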
  • ruthan - Thursday, March 10, 2016 - link

    Maybe too late. Now, with fast internet, cloud saves and Dropbox-like services, with my work data in the cloud I really don't need to carry a notebook home at all... and I can use multiple desktops in different locations quite seamlessly.
  • MrSpadge - Thursday, March 10, 2016 - link

    Let's build PCIe cards for desktops to use this. x4 is nice, but a x16 slot could house 4 external GPUs with this technology!

    Now take the smallest socket 2011-3 CPU with 40 PCIe lanes and outfit it with cards for 10 GPUs. That would be one hell of a number cruncher, at least for tasks which only require moderate communication. Usually such a setup wouldn't work due to space constraints, even with riser cards. But with TB cables one could use proper GPU housings (cooling & power supply).
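    The lane math behind this idea can be sketched directly (lane counts as stated in the comment; the x4-per-link split is the assumption):

    ```python
    # Sketch: a 40-lane socket 2011-3 CPU carved into x4 links,
    # each feeding one external GPU enclosure over a TB3-style cable.
    TOTAL_LANES = 40
    LANES_PER_LINK = 4  # PCIe 3.0 x4 per external GPU

    max_gpus = TOTAL_LANES // LANES_PER_LINK
    print(f"{max_gpus} external GPUs from {TOTAL_LANES} lanes at x{LANES_PER_LINK} each")
    ```

    That gives the 10-GPU figure the comment mentions, ignoring lanes the platform would realistically reserve for storage or other I/O.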
  • unrulycow - Thursday, March 10, 2016 - link

    Does anyone know if this can work with an Optimus laptop? So, turning off both the Intel integrated video and the dedicated Nvidia card to have better performance from an external card?
  • Oxford Guy - Friday, March 11, 2016 - link

    Some of those may be using kludgy emulation. OS X should be able to handle games just fine, only with a bit less performance because OpenGL is not as high-performance as DX as far as I've seen.

    But, if games aren't ported well (especially if they're using emulation) then the best OS in the world can't fix that.
  • Khenglish - Friday, March 11, 2016 - link

    The only successful eGPU systems to date were the hwtools PE4H and PE4L, and the EXP GDC. What made these products successful is that they were enclosureless and cheap. You grab an old desktop PSU from a closet, get a desktop GPU, and for under $50 you have the desktop card running off your laptop. The VillageTronic, Alienware, MSI, and soon this Razer Core solution all missed the point of an eGPU, and that point is spending a small amount to turn your laptop into a gaming rig when at home. When you include a PSU and a fancy enclosure, the system becomes too expensive to justify over just building a cheap desktop.
  • beginner99 - Friday, March 11, 2016 - link

    Cool from a technology standpoint, but I don't see myself buying such a thing any time in the next 5 years. For thin laptops it will be the CPU limiting games, and many of these ULV CPUs are still only dual-cores. Intel really needs to up the core count: 4 on laptops and 4-8 on desktops. And even a quad would probably throttle a lot. It would just be a huge compromise. I bet you could get a better-performing desktop plus a cheap, thin laptop for the same price as an expensive laptop with TB3 and the external GPU dock, at which point it makes no sense to go the latter way.
  • beginner99 - Friday, March 11, 2016 - link

    Instead of such a GPU dock, I would find some kind of smartphone dock much cooler. It would add some external connectors (e.g. for a display) and cool the smartphone so it throttles less. For email and web browsing this could completely eliminate the need for a laptop or PC altogether.
  • zodiacfml - Friday, March 11, 2016 - link

    Very interesting. I could see this becoming more prevalent if the chassis and Thunderbolt interface come down in price. For me, we need a GPU that can be used and shared between our multiple devices, e.g. desktops, laptops, and tablets.

    Aside from that, there is a growing TDP gap between modern CPUs and GPUs, such that having them in separate power supplies and enclosures is an elegant idea.

    Looking forward, I hope a variant of Thunderbolt will finally use an optical medium.
  • tmr3 - Sunday, March 13, 2016 - link

    I feel like this has such a huge amount of potential, it just remains to be seen whether that potential is put to use.

    At this point in time I'm envisioning a 15.6" class notebook like the Dell XPS 15. Drop the GTX 960M, offer some chips with Iris graphics like the 6870HQ and tidy up the internals so you have a quad-core 45W laptop that's fairly thin and fairly light with longer battery life, an exceptional screen and M.2 PCIe SSD storage.

    Then reconfigure the ports so you've got maybe two Thunderbolt 3 ports on either side, 2-3 USB 3.1 ports, an SD card slot and a headset jack. Then, if it's compatible with eGFX you have the flexibility of deciding whether you need dedicated graphics, and what level of performance you want.

    Personally I'd be jumping for a dock/enclosure with a few USB ports and Ethernet to connect it up to a permanent desk setup. Then I'd nearly have the best of both worlds - a high performance, portable laptop with enough battery life to last a day, and then a reasonably powerful system when docked at home.
  • uhh - Sunday, March 13, 2016 - link

    Isn't this just like the Alienware Graphics Amplifier?
  • seananamous - Wednesday, March 30, 2016 - link

    I just bought the Acer Predator 15, which has a TB3 slot... am I safe with that?
  • sortep2 - Friday, June 17, 2016 - link

    Will I be able to connect an external GPU to a late-2012 Mac mini with Thunderbolt, but only if I'm running Windows using Boot Camp?
