This is my new workstation, aimed at being a jack-of-all-trades. It's used for security research, reverse engineering, development, music production, video editing, streaming, and a bit of gaming.
Quick spec overview:
- 2x Intel Xeon Platinum 8276L (28c/56t, 2.2GHz base, 4.0GHz boost, large memory support)
- 192GB (12x16GB) ECC DDR4 PC4-21300 2666MHz CL19
- EVGA RTX 2080 Ti 11GB FTW3 Ultra
- Asus WS C621E SAGE Dual Socket 3647 Workstation Motherboard
- Samsung 970 Evo Plus 1TB (OS) + 4x Corsair MP510 960GB in RAID0 (applications)
In total I have 56 cores (112 threads) and 192GB of six-channel ECC DDR4, across two NUMA nodes. This is sufficient to run a lot of VMs, which comes in handy for the kind of work and research I do. Just for fun, I created a full Windows 10 Pro x64 1903 VM install on a RAM disk, and didn't even use half of my installed memory in the process.
The Xeon processors are qualification samples (QS), which operate exactly the same as the retail chips except with unlocked clocks, and they cost significantly less. Cursory overclocking attempts did not yield stable results, but I have not exhausted this avenue. I will report back if I manage to get a stable overclock out of them.
The 4x MP510 960GB NVMe SSDs are mounted in an Asus HYPER M.2 card, operating as a RAID0 array via Intel VROC. The motherboard supports 4-way bifurcation (splitting an x16 slot into four x4 links) in order to make this happen. Performance is excellent under multi-threaded IO loads, exceeding 12.4GB/s sequential reads and 11.9GB/s sequential writes.
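For anyone wanting to sanity-check similar numbers on a Linux host, a roughly equivalent sequential test can be sketched as a fio job file. This is just a sketch under assumptions: the `filename=` path is hypothetical (point it at wherever your array is mounted), and the queue depth and worker count only approximate CrystalDiskMark's settings.

```ini
; seq.fio -- approximate CrystalDiskMark-style sequential test with fio.
; Run with: fio seq.fio
[global]
ioengine=libaio        ; Linux async IO
direct=1               ; bypass the page cache
bs=1M                  ; 1MiB sequential blocks
iodepth=64             ; queue depth 64
numjobs=4              ; worker threads; adjust to taste
size=4G
runtime=30
time_based
filename=/mnt/vroc/fio.test   ; hypothetical mount point for the array

[seq-read]
rw=read

[seq-write]
stonewall              ; start only after the read job finishes
rw=write
```

The `stonewall` flag keeps the read and write phases separate, so each job reports a clean sequential figure rather than a mixed workload.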
Additional storage (several SSDs and a couple of HDDs) has been re-used from a previous build, but is not listed.
For monitors I'm running a Dell P2715Q and a pair of Acer CB240HYK panels (one on either side of the Dell). All are running at native 3840x2160 with no OS scaling.
Fan configuration is as follows:
- Front: 8x 120mm NF-S12B redux-1200 PWM, in a 2x4 top/bottom control split. All configured as intakes.
- Rear: 2x 120mm NF-S12B redux-1200 PWM, controlled together. Configured as exhausts.
- Top: 3x 140mm NF-P14s redux-1500 PWM. Rear two are configured as exhausts, front one is an intake. Separately controlled. This avoids exhausting cool air from the front of the case before it can reach the CPUs.
A few minor problems during the build:
- I almost put liquid metal thermal compound on an aluminium cooler, before realising at the last second that it was gallium-based. If you're not sure why this is a bad idea, search "gallium and aluminium" on YouTube. I had to stick to the pre-applied paste, which seems to do a good job anyway.
- The case does not have proper support for SSI CEB form factor motherboards. Only around two thirds of the motherboard's mounting holes line up with screw holes in the case. This is workable but not ideal.
- The case is HUGE and requires extension cables for the ATX 24-pin and both 6-pin CPU power cables to reach all the way around. Discovering this after getting the board into the case was frustrating, as it delayed me by a couple of days until I could get cables delivered.
- The whole system weighs about 40kg, mostly because the case is made of steel and tempered glass. Moving it from my build area to where it's living now was a two-person job, and a task which I would prefer not to do again. No wonder most retailers deliver the case on a pallet. (Fun fact: I can climb inside the box it came in and close the lid)
- I originally bought a Gigabyte RTX 2080 Ti Aorus Xtreme, which was either faulty or incompatible with the Intel C621E chipset, as DisplayPort functionality was intermittent. Props to CCL, who refunded me for the card despite being unable to replicate the issue during the RMA process. I'm honestly glad to be rid of the Gigabyte card, though, because my experience with RGB Fusion was not an enjoyable one. EVGA Precision X1 is far superior.
- The Asus C621E SAGE WS board appears to have a minor bug wherein the "Intel C620 series chipset PCI Express Root Port #1 - A190" device fails to start under Windows. This does not appear to impact functionality in any way, which is rather surprising but I'm not complaining. I suspect this is a firmware bug. I have filed a bug report with Asus but do not expect much to be done about it without me having to RMA at least one board first, and I really don't want the hassle.
As fast as advertised. Works with Intel Virtual RAID on CPU (VROC). Four of these in VROC RAID0 with an Asus HYPER M.2 card, in a PCIe x16 slot with bifurcation enabled, get me to around 12.5GB/s sequential reads and 11.9GB/s writes on CrystalDiskMark's Q64T64 test.
Upgraded to this from the Corsair 900D because I needed a case with SSI CEB form factor support. It has a ton of space and a very premium feel, but it's incredibly heavy once you've built a system in it and put the side panels on. There are 10-year-old kids that weigh less. The built-in fan/LED controller is very useful.
A couple of words of warning:
First, if you buy this case, buy an ATX 24-pin extender and some 6-pin CPU power extenders. You need almost a whole metre of cable to get from the PSU, round the back of the cable management area, up through the top, and back down to the board.
Second, while SSI CEB form factor support is technically available in this case, it comes at the expense of missing standoffs for quite a few of the motherboard's screw holes, which leaves a fair amount of the board unsupported. It's not a problem for general operation, but it does leave you with the risk of flexing the board when installing or removing memory.
Works perfectly and I've never heard it spin the fan up. Overkill even for a dual-CPU workstation with 192GB of RAM, a 2080 Ti FTW3 Ultra, five NVMe drives, 17 fans, a bunch of SSDs, and a couple of HDDs.