External GPUs can be a good idea. If an enclosure is affordable, easy to use, and not too bulky, users can augment their laptop's CPU, which is probably good enough to run most tasks, with a high-end GPU. While GPUs are more efficient than CPUs, the workloads they are expected to handle are so much larger that a decent graphics chip is difficult to cram into a laptop form factor… for the most part.
Image Credit: Tom's Hardware
Preamble aside, external graphics has been tried and dropped numerous times over the last decade, but the latest generation seems to be getting a little traction. Razer added the feature to their relatively popular Blade line of laptops, and AMD, one of the companies that tried it several years ago, is pushing it now with their XConnect technology. Even Microsoft sort of does this with the Surface Book, and it's been a small source of problems for them.
Now Gigabyte, at Computex, announced that they are investigating the concept with prototypes. According to Tom's Hardware, their current attempt stands upright, which should take up less desk space. Looking at it, I could see it hiding in the space between my monitors and the corner of the room (because my desk slides into the corner). Of course, in my case I have a desktop PC, so I'm not the target demographic, but who knows? A laptop user might have a similar setup. It's still pretty big, though.
Currently, Gigabyte limits the power supply to 250W, which caps GPU support at under 175W TDP. In other words? Too small for a GeForce GTX 1080, whose Founders Edition is rated at 180W. The company did tell Tom's Hardware that they are considering upping that to 350W, which would allow 260W of GPU load. That is enough for any card with a single PCIe 8-pin power connector, and thus many (but not all) GTX 1080s.
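To make that arithmetic concrete, here's a quick back-of-the-envelope sketch in Python. The overhead split is my assumption, not Gigabyte's published math, but the connector limits come from the PCIe spec: the slot supplies up to 75W and each 8-pin connector up to 150W, so a single-8-pin card tops out at 225W.

```python
# Sanity-checking the article's power numbers against PCIe spec limits.
SLOT_W = 75        # PCIe x16 slot power limit (spec)
EIGHT_PIN_W = 150  # PCIe 8-pin connector limit (spec)

def max_card_draw(eight_pins: int) -> int:
    """Maximum spec power draw of a card with the given 8-pin count."""
    return SLOT_W + eight_pins * EIGHT_PIN_W

# (PSU rating, GPU power budget) pairs quoted in the article; the gap
# presumably covers conversion losses, cooling, and the enclosure's
# own electronics (my assumption).
configs = [(250, 175), (350, 260)]

for psu_w, budget_w in configs:
    one_pin = max_card_draw(1)  # 225 W
    fits = "fits" if budget_w >= one_pin else "does not fit"
    print(f"{psu_w} W PSU -> {budget_w} W GPU budget: "
          f"a 1x 8-pin card (up to {one_pin} W) {fits}")
```

Running it shows why the bump matters: 175W can't cover even a single-8-pin card at full spec draw, while 260W clears the 225W ceiling with room to spare.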
No pricing or availability yet, of course. It's just a prototype.
So was there that much cost and engineering hindrance on cooling that Gigabyte opted for such a low TDP?? Barely supporting the latest FinFET reference… sorry… ahem… I meant Founders Edition cards seems like a very limiting factor…