Power Consumption and Thermal Performance

The power consumption at the wall was measured with a 1080p display being driven through the HDMI port. In the graphs below, we compare the idle and load power of the GIGABYTE GB-BNi7HG4-950 with other gaming mini-PCs evaluated before. For load power consumption, we ran both our custom stress test and the AIDA64 System Stability Test with various stress components, and noted the maximum sustained power consumption at the wall.

Idle Power Consumption

Load Power Consumption (AIDA64 SST)

The power consumption numbers tally well with the capabilities of the system.

Our thermal stress routine starts with the system at idle, followed by four stages of different system loading profiles using the AIDA64 System Stability Test (each of 30 minutes duration). In the first stage, we stress the CPU, caches and RAM. In the second stage, we add the GPU to the above list. In the third stage, we stress the GPU standalone. In the final stage, we stress all the system components (including the disks). Beyond this, we leave the unit idle in order to determine how quickly the various temperatures in the system can come back to normal idling range. The various clocks, temperatures and power consumption numbers for the system during the above routine are presented in the graphs below.
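The staged loading described above is easy to express as a simple schedule. The sketch below just encodes the four 30-minute stages of our routine as data and computes each stage's start/end offsets; the function and variable names are illustrative, not part of any test tool:

```python
# Timeline for the four-stage AIDA64-style stress routine described
# above: each stage runs for 30 minutes, with idle before and after.

STAGE_MINUTES = 30

# (stage name, components stressed) -- taken from the routine above
STAGES = [
    ("CPU + caches + RAM", {"cpu", "caches", "ram"}),
    ("CPU + caches + RAM + GPU", {"cpu", "caches", "ram", "gpu"}),
    ("GPU only", {"gpu"}),
    ("Full system (incl. disks)", {"cpu", "caches", "ram", "gpu", "disks"}),
]

def build_schedule(stages, stage_minutes=STAGE_MINUTES):
    """Return (name, start_min, end_min) tuples for each stage."""
    schedule, t = [], 0
    for name, _components in stages:
        schedule.append((name, t, t + stage_minutes))
        t += stage_minutes
    return schedule

if __name__ == "__main__":
    for name, start, end in build_schedule(STAGES):
        print(f"{start:3d}-{end:3d} min: {name}")
```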

According to the official specifications, the junction temperature of the Core i7-6700HQ is 100C. We see that the package temperature is kept well below that number, without any throttling of the clocks. To make sure that we weren't overestimating the cooling capabilities of the system, we also ran our custom stress test, which presents a more strenuous workload for the CPU, RAM and GPU (but not the other parts of the system).

It is heartening to note that the thermal design is indeed very effective even in our unnatural power-virus test. The cores keep running higher than the rated base clock (3.1 GHz instead of 2.6 GHz). The other interesting aspect is that the temperatures go down to below 30C for all the components in less than 30 minutes after the load is removed. The drives also maintain very reasonable temperatures in the system. On the whole, the thermal design of the unit is very impressive.
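The throttling check described above amounts to scanning logged (temperature, clock) samples for intervals where the package nears Tjunction while the cores drop below base clock. A minimal sketch of that check; the sample log, margin value and helper name are illustrative, not data from the review:

```python
TJUNCTION_C = 100.0     # Core i7-6700HQ junction temperature (official spec)
BASE_CLOCK_MHZ = 2600   # rated base clock (per Intel ARK)
TEMP_MARGIN_C = 3.0     # "close enough to Tjmax" margin (illustrative)

def find_throttling(samples, tjmax=TJUNCTION_C, base=BASE_CLOCK_MHZ,
                    margin=TEMP_MARGIN_C):
    """Return indices of samples where the package is near Tjmax while
    the cores run below base clock -- i.e. likely thermal throttling.
    `samples` is a list of (temp_C, clock_MHz) tuples."""
    return [i for i, (temp, clock) in enumerate(samples)
            if temp >= tjmax - margin and clock < base]

# Illustrative log: the unit stays well below Tjmax and above base
# clock throughout, so nothing is flagged.
log = [(62.0, 3100), (71.5, 3100), (74.0, 3100), (73.0, 3100)]
print(find_throttling(log))  # -> []
```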

Comments

  • StevoLincolnite - Friday, October 28, 2016 - link

    NVIDIA must be giving these GPUs away. Such a missed opportunity not going with Pascal.
  • aj654987 - Wednesday, November 2, 2016 - link

    Alienware Alpha r2 with the gtx 960 desktop GPU is a better deal than this.
  • Samus - Wednesday, November 2, 2016 - link

    I don't think you can get an i7 in the Alpha r2...not that it really matters for gaming, but the extra horsepower of the i7-6700HQ in the Brix might help its GTX950 creep up on the GTX960 in the Alpha r2.

    But I agree, they are similar in almost every other aspect (even size) and the Alpha r2 is cheaper.
  • setzer - Friday, October 28, 2016 - link

    Regarding the last comment about going with the Skull Canyon NUC + External GPU.
    I'm not sure that is really a better solution.
    It's true that it gives the user the option of adding more graphics power (and easy upgradability); on the other hand, it also requires buying a discrete graphics card, which is not as straightforward as on desktops. This is because you are restricted on one side by the soldered CPU (which you cannot change, though the Skull Canyon NUC's CPU should not be a problem for some time) and on the other side by the bandwidth between the system and the external enclosure (just 4 lanes of PCIe 3.0 bandwidth).
    This last point makes it hard to figure out which graphics card is actually the best for your constraints. So instead of choosing from all the graphics cards up to the power limit of the enclosure, you have to figure out which ones actually offer the best price-performance. I.e., of course you can drop a Titan in there, but will the difference to a GTX 965M (over its 16 lanes of PCIe) be significant?

    Regarding this last point, would it be possible to test external enclosures and figure out actual metrics for the performance gains?
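The lane restriction setzer raises is easy to put theoretical numbers on: PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, so a back-of-the-envelope comparison (protocol overheads ignored; the function name is ours) looks like this:

```python
# Theoretical PCIe 3.0 link bandwidth: 8 GT/s per lane with
# 128b/130b line encoding; packet/protocol overhead is ignored.
GT_PER_LANE = 8e9        # transfers (bits) per second per lane
ENCODING = 128 / 130     # usable fraction after line encoding

def pcie3_gbytes_per_s(lanes):
    """Theoretical one-way PCIe 3.0 bandwidth in GB/s."""
    return lanes * GT_PER_LANE * ENCODING / 8 / 1e9  # bits -> bytes

print(f"x4 : {pcie3_gbytes_per_s(4):.2f} GB/s")   # ~3.94 GB/s
print(f"x16: {pcie3_gbytes_per_s(16):.2f} GB/s")  # ~15.75 GB/s
```

So an external enclosure's x4 link offers roughly a quarter of the raw bandwidth of a desktop's x16 slot; whether that translates into a meaningful frame-rate difference is exactly the empirical question asked above.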
  • wavetrex - Friday, October 28, 2016 - link

    I wonder if I can build a house out of these bricks ... excuse me, Brix :)

    Joking aside, very few people would know it's an actual computer.
  • nico_mach - Monday, October 31, 2016 - link

    It's a SQUARE trash can! Progress! Where's the pedal, tho?
  • hubick - Friday, October 28, 2016 - link

    I'm typing this on my Skull Canyon NUC, and have a Razer Core, and having read the benchmarks before buying, the PCIe x4 limitation is surprisingly small. IIRC, it's somewhere in the ballpark of 10-15% or so, and that doesn't really change when going from a 980 to a 1080 either. It makes sense when you think about it... you're essentially transferring textures, shaders, and a bunch of vector information to the GPU for rendering... and that will be pretty much constant regardless of whether you're rendering the output at 720p@30Hz or 4K@60Hz.
  • aj654987 - Wednesday, November 2, 2016 - link

    Why would you even bother with that? Might as well build an ITX for less money and less clutter.
  • hojnikb - Friday, October 28, 2016 - link

    Guys, are there any passive mini pcs coming out with kaby lake ?
  • TheinsanegamerN - Friday, October 28, 2016 - link

    There are no PCs with Kaby Lake yet, period. Kaby Lake isn't out yet.
