23 Comments
xinthius - Monday, May 13, 2013 - link
What an ugly laptop. Granted, it's got to have a lot of thermal dissipation.
cairbram - Monday, May 13, 2013 - link
This laptop (I'm guessing) is actually available at Newegg, but under the old GX60 name; it has the new A10-5750M, but for the GPU its spec sheet says HD 7970M.
JarredWalton - Monday, May 13, 2013 - link
I assume you're talking about this GX60 model, which isn't the same as the above:
http://www.newegg.com/Product/Product.aspx?Item=N8...
GX70 3BE will have 8970M only, according to MSI. I'll let you know in a couple days what that actually means. :-)
cairbram - Monday, May 13, 2013 - link
Looking at Meaker10's comment I now understand: MSI came out with a slightly upgraded version of their current GX60, but will replace it with a new version that has a 17.3" screen and new hardware in June or so.
nigelpreece - Monday, May 13, 2013 - link
I just pre-ordered my GX70 3BE. You are correct, Jarred Walton; my vendor said it will ship from the factory 05/31/13. If anyone wants to pre-order one, just do a Google search for "GX70 3BE-007US". Mine ran me $1,329.61.
Alexvrb - Tuesday, May 14, 2013 - link
Wow, that's a good price for this kind of graphical punch in a 17" laptop.

Jarred: Your idea that CPU-side performance improvements are limited to the 10% clock bump alone may be flawed. Don't forget that they've also made improvements to their Turbo. The main additions are temperature sensors that let it factor the actual temperature into its calculations, and the much-needed improved granularity. These changes alone will allow Richland to run at/near its maximum in more scenarios and stay at higher turbo speeds more often.
This may apply even more so when running games, as the integrated GPU is powered down in favor of the discrete chip. This will give the CPU cores of the APU more thermal headroom than a laptop relying on the APU to do everything by itself.
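For what it's worth, whether a chip actually sustains its turbo clocks is something you can check by sampling the OS-reported frequency under load. A minimal, stdlib-only sketch (Linux-only; it assumes the kernel exposes per-core "cpu MHz" lines in /proc/cpuinfo, whereas on the Windows machines discussed here you'd lean on a monitoring tool instead):

```python
def read_cpu_mhz():
    """Parse per-core 'cpu MHz' lines from /proc/cpuinfo (Linux only).

    Returns a list of floats, one per logical core; returns an empty
    list if the file is missing or has no frequency lines.
    """
    freqs = []
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.lower().startswith("cpu mhz"):
                    freqs.append(float(line.split(":")[1]))
    except OSError:
        pass
    return freqs
```

Calling this in a loop while a benchmark runs would show whether the cores are parked at the base clock or actually boosting, which was exactly the visibility Trinity-era utilities lacked.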
JarredWalton - Tuesday, May 14, 2013 - link
It's possible. One of the major issues with Trinity and Llano is that even though there was the potential to Turbo, there was no way to monitor how fast the CPU was actually running. On both Llano and Trinity, CPU-Z and other utilities (including AMD's own System Monitor, or whatever it's called) would just report the base clock speed all the time -- so 2.3GHz on the A10-4600M, even though it might be running at 2.5 or 2.7 or even 3.1GHz. Most laptops also provided no way to disable Turbo Core. Best-case, Richland may end up having ~20% more performance than Trinity, but I'd guess 10-15% is a far better estimate.
Alexvrb - Tuesday, May 14, 2013 - link
I'm OK with 15%. Better than 10%. Anyway, I just wanted to point out that there's more to Richland than just a clock bump. As far as stopgap chips go, I think they did OK, especially since they seem to be hitting the same price points as Trinity.
silverblue - Tuesday, May 14, 2013 - link
Perhaps, but unless AMD have tweaked Piledriver noticeably, I wouldn't expect more than 15% faster in lightly threaded titles, falling to less than 10% under duress. It'll bottleneck the 8970M less, but it will still bottleneck.

200MHz is a nice incremental jump; however, they should've considered at least a 2.7/2.8GHz base, in my opinion. With an 8970M on board, I think they can afford a slightly higher CPU TDP. The iGPU clock increase is also quite marginal; we're talking 35MHz faster, which is 5%. Every little helps, I guess.
Alexvrb - Tuesday, May 14, 2013 - link
The iGPU bump is irrelevant to the laptop in question, so I didn't bother addressing it. I suppose it's a nice freebie coming from Trinity. With that being said, bumping the base clocks isn't quite as necessary now with the improved Turbo I already mentioned above. If the iGPU is sitting idle, then the chip has more TDP headroom and it will increase clocks to whatever extent it can, even with all cores loaded. Plus, the thermal sensors allow it to actually see what kind of headroom it has, so they don't have to be as conservative as with Trinity. The base clocks are 200MHz higher, the max turbo is 300MHz higher, and if the slides are anything to go on, it will be better at sustaining higher speeds - especially with the iGPU idle.

Now with that being said, will this amount to more than a 15% boost? No. But for a budget gaming machine that already had potential, this is a nice improvement. It's hard to beat this machine for the money.
Personally I always thought they should sell a small number of GPU-less mobile variants, like the "Athlon" FM chips on the desktop. They would lose Enduro, though, and that's not acceptable to them. I wouldn't care though. They could make them 45 watt chips, clock them aggressively, and sell them as "FX" chips or something. The complete lack of a GPU and the extra TDP would allow for some nice clock improvements. I wouldn't run heavy games on it away from a wall socket anyway.
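For reference, the clock deltas being tossed around above work out roughly as follows (assuming the commonly listed A10-4600M clocks of 2.3GHz base / 3.2GHz turbo, A10-5750M at 2.5/3.5GHz, and iGPU clocks of about 686MHz vs 720MHz, which matches the ~35MHz / 5% figure mentioned):

```python
def pct_gain(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

# Assumed spec-sheet clocks; all figures are approximate.
base_gain = pct_gain(2.3, 2.5)   # CPU base clock, ~8.7%
turbo_gain = pct_gain(3.2, 3.5)  # CPU max turbo, ~9.4%
igpu_gain = pct_gain(686, 720)   # iGPU clock, ~5.0%
```

So the raw clock bumps are in the high single digits on the CPU side, which is why estimates beyond ~10% hinge on the improved Turbo behavior rather than the clocks themselves.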
Meaker10 - Monday, May 13, 2013 - link
Note: GX60 = 15.6"
GX70 = 17.3"
Bob Todd - Tuesday, May 14, 2013 - link
Any chance of reviewing an SLI-equipped Lenovo Y500? It's in the same budget range, but judging from Dustin's comment in your CES post about it, I'm guessing the answer is no :(. It's too bad, because the older 650M SLI versions seem like a decently powerful bit of gaming kit for $999. I'm on a bit of a Lenovo kick lately, since they seem to be the only one supporting mSATA (and now NGFF) across most of their lineup.
Raniz - Tuesday, May 14, 2013 - link
I currently own an MSI gaming laptop and I won't be getting another one when it's time to upgrade.

The build feels very plastic, the touchpad is useless (no multitouch, and scrolling is done by touching the corners), and to actually get any performance out of it I have to use MSI's power profiles.
I've only ever benchmarked it in 3DMark, but I get around 1.5x the score when I turn on 'gaming' mode. This switches the power profile in Windows to something predefined by MSI, which means that any changes I've made to the power profile are lost.
I regularly hook up the laptop to my TV to watch movies or play some games. If I switch to the gaming profile, hook it up to the TV, and then close the lid, it goes into sleep mode. I have to change the power settings in the advanced profile settings to not sleep when I close the lid, and if I restart the laptop or switch power profiles I have to do it all over again, since the power profiles are hard-coded in some MSI application and are loaded and then removed again when you switch. The only power profile I can control is the standard one, which has gimped performance.
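For anyone hitting the same lid-close annoyance: the lid action can be scripted against whichever power scheme is currently active, rather than re-editing the advanced settings by hand after every profile switch. A sketch that shells out to powercfg (Windows only; SUB_BUTTONS and LIDACTION are the stock aliases, which you can verify with `powercfg /aliases`, and 0 means "do nothing"):

```python
import subprocess

# Assumed stock powercfg aliases; 0 = lid close does nothing.
CMDS = [
    # Plugged in (AC) lid action
    ["powercfg", "/setacvalueindex", "SCHEME_CURRENT", "SUB_BUTTONS", "LIDACTION", "0"],
    # On battery (DC) lid action
    ["powercfg", "/setdcvalueindex", "SCHEME_CURRENT", "SUB_BUTTONS", "LIDACTION", "0"],
    # Re-apply the current scheme so the change takes effect
    ["powercfg", "/setactive", "SCHEME_CURRENT"],
]

def disable_lid_sleep():
    """Set the lid-close action to 'do nothing' on the active scheme."""
    for cmd in CMDS:
        subprocess.run(cmd, check=True)
```

Running disable_lid_sleep() (from an elevated prompt) right after switching to the MSI gaming profile would re-apply the lid setting each time, which sidesteps the hard-coded profile problem without touching MSI's software.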
HisDivineOrder - Tuesday, May 14, 2013 - link
That pricing is way too high. To consider a laptop with that CPU at above $1k, you'd have to be brain dead. If you go above $1k, you shouldn't be considering an AMD CPU of any kind unless you are just a diehard fanboy who can't imagine a world where you bought an Intel CPU. And even then you should be reconsidering.
Alexvrb - Sunday, May 19, 2013 - link
You'd have to be braindead to declare such a thing without the full review. I'd bet there aren't any 17" gaming laptops out there at this price that can outperform it... in games.
Laststop311 - Wednesday, May 15, 2013 - link
I highly discourage anyone from getting a mobile Radeon card. I have an M18x R1 and I have done a TON of experimenting with 6990Ms and 7970Ms, single and Crossfire. One story I will tell is how AMD did not have a driver that supported Crossfire 7970Ms for almost 6 months after their release. Crossfire could barely even function for almost 6 months, and when it finally somewhat functioned it micro-stutters WAY more than SLI GTX 680Ms. This will essentially be a Crossfire setup if it combines the 8650G integrated with the 8970M discrete, which means terrible, god-awful driver support for 2 connected AMD GPUs, and terrible micro-stuttering and frame lag spikes.

Do yourself a favor, save all the driver headaches and stutters, and go with a mobile NVIDIA card. NVIDIA just has smoother gameplay. If this comment saves even 1 person from buying a Radeon GPU, it was worth it. Yes, you will pay more for the NVIDIA GTX 680M/780M over the 7970M/8970M, but as the ole saying goes, you get what you pay for. NVIDIA spends much more money and labor hours on developing top-of-the-line drivers. AMD cuts corners and employees in driver development to cut costs and push their GPUs out cheaper, and you suffer for that.
After the hell I went through with 7970Ms in Crossfire, and how much smoother and more enjoyable gameplay became when I put in SLI GTX 680Ms, I will NEVER EVER EVER EVERRRR use Radeon products again.
just4U - Friday, May 17, 2013 - link
hmm.. I will NEVER EVER EVER EVERRRR understand people like you. I've worked with and used 100s of video cards by both companies, and both have had their fair share of issues thru the years. What happens when (not if..) you have a bad experience with Nvidia? Will you only use Intel? When that doesn't work out so well.. then what? ViRGE, Cirrus, Matrox are not really in the game any longer, so hmmm..
just4U - Friday, May 17, 2013 - link
edit: Cyrix (gahh.. it's been so long..). VIA should have been mentioned as well.. they're still kickin'.
Bothai - Sunday, May 19, 2013 - link
Would a GX60 be okay for a uni student wanting to play some games like SWTOR on the go at above 40FPS? Or am I asking too much with that bottleneck?
D.O.A. - Wednesday, May 22, 2013 - link
I have researched this laptop extensively (and I am going to buy). One of the reviews I read specifically stated that this was the perfect machine for a student who wanted to play almost any game at 40FPS on a budget. As for SWTOR, the linked article indicates you'd have no issues with the game on the GX60:
http://www.tomshardware.com/reviews/star-wars-gami...
Manch78 - Monday, July 1, 2013 - link
It's good you highlighted the shortcomings of the APU, but how come you didn't test other games like Crysis, or any of those? They aren't as CPU-bound as SC or GTA. The APU's shortcomings are definitely apparent when running at lower details, but at max settings, when the GPU is getting pounded, not so much. Also, are y'all going to review the Lenovo Y510p with the dual 750M? I would like to see how this stacks up to the higher-end single-GPU laptops, and whether it's worth dealing with the hassle of SLI or Crossfire. I haven't really read much into SLI or Crossfire. Does this laptop allow you to utilize the iGP and dGPU at the same time?
rburnham - Wednesday, September 25, 2013 - link
The spec lists two HDMI ports, but the photo of the back of the unit shows one full-sized HDMI port and one Mini DisplayPort. That makes me wonder if I can hook this up to a 1440p monitor via DisplayPort and output at 1440p. That would make this a heck of a workstation.