13 Comments
camelNotation - Tuesday, January 7, 2014 - link
Here's an article about LibreOffice APU acceleration: http://www.datamation.com/applications/libreoffice...
The spreadsheet software Calc should be a great way to showcase the advantages of HSA in easy-to-grasp terms. Probably a good investment for AMD.
kyuu - Tuesday, January 7, 2014 - link
Is there any information on those Beema/Mullins tablet design wins: brand, specs, pricing, availability?
DanNeely - Tuesday, January 7, 2014 - link
Don't forget power consumption; AMD's perennial Achilles' heel in mobile.
schizoide - Tuesday, January 7, 2014 - link
Any announcements about BRIX/NUC-type small form factor PCs with Kaveri?
Kaveri unfortunately doesn't sound like what we hoped for, essentially the Xbox One/PS4 APU on an open platform, but it does look like it might be appropriate for an HTPC or _maybe_ a low-end, primarily in-home-streaming Steam box. Bay Trail Atom is fast enough for HTPC use, but its GPU is far too slow for any gaming at all.
silverblue - Tuesday, January 7, 2014 - link
Kaveri was never targeted as an Xbox One/PS4 APU; its GPU is far more modest than even that of the former. Even with Mantle dragging up performance, it couldn't compete on an equal footing. The key here is that Kaveri features far more powerful CPU cores than those in the consoles, meaning you can have a capable work platform along with decent performance in relatively new games, provided you don't go mad with the higher settings. At $170-ish, it was never meant to compete with the consoles in the first place: the GPU in the Xbox One alone is roughly equivalent to an R7 260 and thus about $130, and that says nothing of the eSRAM or CPU cores. But try computing with eight 1.75GHz Jaguar cores: you'd hate it, though their weakness doesn't matter on the consoles.
schizoide - Tuesday, January 7, 2014 - link
From an HTPC/Steam box standpoint, I would be very happy indeed with an Xbox One/PS4-type box available for $500. CPU power doesn't really matter.
abufrejoval - Monday, January 27, 2014 - link
Kaveri is currently one of the best generic all-rounders, but isn't that the fastest-shrinking market in terms of *purchases*? IMHO it competes mostly with the hardware people already own and aren't likely to replace in the near term.
I keep wondering: why on earth did they miss the chance to capture the high-end gamer desktop and kick Intel where it hurts most?
I'm running Trinity, Richland, Phenom II X4/X6, a Sandy Bridge i7-2600, and Intel Core 2 on desktops and gaming PCs in the family, and it is quite clear that CPU performance on Kaveri will be good enough for just about any game out there, at least up to 1080p. But GPU performance runs into a wall at 720p.
So in terms of pure CPU performance there is basically nothing out there that requires more than Kaveri: all the *really* compute-intensive desktop tasks, like real-time video editing/transcoding, are better dealt with by GPU compute or a real VPU (video processing unit).
However, once you add a dGPU to enable today's minimum game resolution (1080p or 2K), you pretty much lose the benefit of half of the silicon real estate on Kaveri APUs and might as well go with a Haswell i3. And I simply can't imagine game engine builders starting to partition their code into three distinct pools (dGPU, APU-GPU and APU-CPU).
In other words, you waste the first €150 of your dGPU just to catch up with the APU.
What's missing is the ability to stack Kaveris to do 2K and 4K resolutions properly.
We know that GDDR5 support is in the chip, which means you could build Kaveri "blades" in a GPU form factor that would be much more capable at 1080p.
And then you'd just need the ability to stack as many of these as your target resolution requires: two GDDR5-equipped Kaveris running CrossFire might do quite OK at 2K, and four at 4K, costing perhaps €300 each with 8GB of GDDR5 and 100W maximum power.
Most important would be the easy programmability of these slices for game developers, with CPU and GPU power growing proportionally.
Sprinkle in Mantle and SteamOS and you've got me drooling.
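For what it's worth, the blade counts suggested above roughly check out on pixel arithmetic alone. A quick sketch; note it treats "2K" as 2560x1440, which is an assumption about what the commenter meant:

```python
# Sanity check on the stacking idea: how many 1080p-class Kaveri "blades"
# would each resolution need, going purely by pixel count?
base = 1920 * 1080  # 1080p, what one GDDR5-equipped Kaveri is assumed to drive

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: {w * h / base:.2f}x the pixels of 1080p")
```

This prints roughly 1.78x for 1440p and exactly 4.00x for 4K, so two blades for 2K and four for 4K is at least consistent on paper, ignoring CrossFire scaling losses.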
We know they can already do something quite similar within a single SoC on the PS4 and Xbox One, though today that would probably result in uneconomic die sizes for a 2x Kaveri part (and still miss the 4K target).
Running cache coherency across 2-8 Kaveri-type APUs at GDDR5 bandwidths may be a little rough, but games should be extremely NUMA-friendly.
AMD needs critical mass for HSA to pay off, and without a scale-up vision I can't see how they'll get it.
AndrewJacksonZA - Tuesday, January 7, 2014 - link
"though this was apparently with a /huge/ spreadsheet"
What do you consider to be a "huge" spreadsheet? I work in Information Management and the largest spreadsheet that I've seen was 700MB. A SEVEN HUNDRED MEGABYTE XLSX file that took about five minutes to open and consumed a gig and a half of memory when opened.
Oh, and also, thanks for the highlights. ;-)
Gordon Chan - Tuesday, January 7, 2014 - link
I know what is meant by huge spreadsheets. I'm currently a postgraduate student working on liver cancer genomics, and the bioinformatics data generated from next-generation sequencing are gigantic. Some Excel files reach sizes of around 7XXMB, with 3XX,XXX rows across 1X sheets; each file took 5-10 minutes for our laptops to open. Simply using VLOOKUP and Filter > Sort will make my laptop churn for several minutes, and sometimes Excel even crashed. I'm using a Core i5-430M laptop, and these simple tasks on that colossal amount of data pushed CPU usage to 100% and brought my laptop to its knees. If there were acceleration that used the GPU (ATI Mobility Radeon HD 5470) too, it should help a bit.
moltentofu - Wednesday, January 8, 2014 - link
Good lord, spreadsheet programs are the thing I love to hate... It's probably going to make me sound like a snob, but what is in Excel that isn't in Python / R / MATLAB / Python extensions?
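To make the comparison concrete: the Excel steps described above (a VLOOKUP join, then Filter > Sort) translate directly. Here's a minimal sketch using only Python's standard library, assuming the sheets were exported to CSV; all gene names and column names are made up for illustration:

```python
import csv
import io

# Two tiny stand-in "sheets"; in practice these would be open CSV files
# exported from Excel. Columns and values here are purely illustrative.
results_csv = io.StringIO(
    "gene,expression\nTP53,8.1\nCTNNB1,2.4\nTERT,9.7\n")
annot_csv = io.StringIO(
    "gene,annotation\nTP53,tumor suppressor\nTERT,telomerase\nCTNNB1,Wnt pathway\n")

# The "VLOOKUP": index one table by the shared key column, then join.
annotations = {r["gene"]: r["annotation"] for r in csv.DictReader(annot_csv)}
rows = [dict(r, annotation=annotations.get(r["gene"], ""))
        for r in csv.DictReader(results_csv)]

# The "Filter > Sort": keep highly expressed genes, sort descending.
candidates = sorted((r for r in rows if float(r["expression"]) > 5.0),
                    key=lambda r: float(r["expression"]), reverse=True)

for r in candidates:
    print(r["gene"], r["expression"], r["annotation"])
```

The same few lines replace a lookup and sort that take minutes in Excel, and should scale to hundreds of thousands of rows without loading a GUI.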
Gordon Chan - Wednesday, January 8, 2014 - link
The thing is, the bioinformatics data are Excel files generated by the bioinformaticians from the raw next-generation sequencing files, after using R on their high-performance computers to run various bioinformatic analysis algorithms. As cancer biologists, we do wet lab work most of the time, so we don't have time to learn doing the bioinformatic analysis with R/Python or whatever in either a Windows or Linux environment; we have to fall back on Excel to do some simple compilation/sorting work to find our candidate genes for study.
Actually, we know how inefficient Excel is for this kind of bioinformatic/statistical analysis, especially running on our staff's or students' laptops, since the amount of data is huge, but asking scientists with a Life Science/Medicine background to learn R/Python/MATLAB is rather difficult. Some of us may not even have heard of MATLAB, since most of the time we're busy doing wet lab work for validation and functional studies of the genetic aberrations found in cancer, unless we have the time and our supervisor asks us to learn these tools. If Excel and other spreadsheet programs can expand their functions and hardware utilization, why not? Most consumers do not have proper IT training, and it benefits everyone when commonly used programs can utilize extra hardware resources such as the GPU.
hortonj - Thursday, January 9, 2014 - link
Sounds like your university's coursework should include programming. My college even required undergraduate biology students to learn programming. Maybe spend some money to work with the engineering department to write software for you; all the time saved by not using slow spreadsheets may outweigh the costs.
hortonj - Thursday, January 9, 2014 - link
How's Linux support for Beema and Mullins?