Thanks, I'll study these things a bit more.
Well, the manuals are not perfect. For example, in several places they say "Matisse Ridge", even though there is no such processor line. The real name is just "Matisse", and its predecessor was "Pinnacle Ridge".
I'm not an expert in this, but there might be some key differences in these two similar boards. U2-2T uses 10GbE and the performance needs to come from somewhere. Also the max RAM is 64GB, while the other board can take 128GB.
Thank you for your comments, but let's make something clear here. RAID10 is a dramatically faster and better configuration than RAID6.
With N drives, the approximate throughput gains versus a single drive are:
RAID10 = N times faster read, (N/2) times faster write.
RAID6 = N-2 times faster read, no write improvement.
For write IOPS, with N = 8 drives and a single drive delivering X = 125 IOPS:
RAID10 = NX/2 = 500 write IOPS.
RAID6 = NX/6 = ~167 write IOPS.
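The same arithmetic in a few lines of Python, if you want to plug in your own drive counts (a simplified model, not a benchmark; the function names are mine):

```python
# Rough throughput multipliers and write IOPS for an N-drive array,
# using the simplified model above (8 drives, 125 write IOPS per drive).
# Real-world results will vary with workload and controller.

def raid10(n, drive_iops):
    # Striped mirrors: reads can hit every disk, each write lands on a mirror pair.
    return {
        "read_multiplier": n,
        "write_multiplier": n / 2,
        "write_iops": n * drive_iops / 2,
    }

def raid6(n, drive_iops):
    # Dual parity: reads can use the data disks, but each small write costs
    # ~6 disk operations (read data + 2 parities, write data + 2 parities).
    return {
        "read_multiplier": n - 2,
        "write_multiplier": 1,
        "write_iops": n * drive_iops / 6,
    }

print(raid10(8, 125))  # write_iops: 500.0
print(raid6(8, 125))   # write_iops: ~166.7
```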
In RAID6 every write operation requires the disks to read the data and both parities, then write back the data and both parities. That is where the performance loss comes from. RAID10 also has no parity computation overhead, so it has effectively no latency impact on the system. RAID6 adds computational latency; it is small (as you said, CPUs are getting faster), but it's still measurable.
Due to parity complexity, rebuilding a RAID6 array is hard on the remaining drives, so a second disk failure during rebuild is surprisingly common. So it's not actually twice as safe as RAID10; it's only slightly safer in the best of circumstances. With some luck RAID10 can survive one drive failure per mirror pair. There are also a lot of things that can take out all the drives at once, so living without a backup is simply dangerous. Cloud backup is so cheap nowadays compared to the build price of this NAS, for example, that there's very little reason not to go for it. (I pay ~$2/month for Backblaze B2 for important files, plus an external HDD in a fire safe at an offsite location.)
More information is on page 22 of the motherboard manual: https://download.asrock.com/Manual/X470D4U.pdf
With my Ryzen 3600 (Matisse) the motherboard supports 2400MHz on Dual Rank or 2666MHz on Single Rank when 4 DIMMs are installed. On Raven Ridge/Picasso/Pinnacle Ridge the frequencies are a bit lower. The maximum speed with Matisse is 3200MHz, with one Single Rank DIMM installed in the A1 slot.
I have a couple of Ubuntu Server virtual machines running Pi-hole, random Python/Nginx/Flask projects, environmental sensor stuff (temperature etc.), databases, Grafana and so on. And a Minecraft server.
I actually got the idea for the motherboard, the case, and in fact the whole project from a GamersNexus video where Wendell from Level1Techs was a guest. Great guy.
You are correct, a USB stick is not the recommended way to go. The main reason so many people still do it dates back to when SSDs were more expensive. I just mentioned it as an option if every penny counts. Two mirrored USB drives might be a reasonable approach if you already have them.
Converter brackets and adapters are fine, but one should be careful choosing them, because FreeBSD (FreeNAS) doesn't natively support as many adapters as Windows or Linux do. Even my 10Gb network card required some manual poking to get the drivers installed.
RAID6 has lower read/write performance than RAID10, which was one of the main reasons for me. Also, RAID is NOT a backup, so it doesn't matter much if the very unlikely worst-case scenario happens and a second drive fails within a few days.
Also it should be noted that all RAID is software RAID; the only difference is whether the software runs in the operating system or on a separate controller card.
After some more thinking, if I had to save some money on this build, there are some things that could be changed. The case is the easiest choice, since you could save almost 300€/$ by using just a big but basic ATX case. If noise doesn't matter, you can keep the noisier but more powerful default case fans (unless you went for another case, of course), and you could use the stock AMD CPU cooler instead of the Noctua. If you don't edit photos/videos directly from the NAS, you could just use the 1Gbps network integrated on the motherboard instead of buying 10Gbps NICs. A USB thumb drive as the boot device instead of mirrored M.2s would save some money too, but make sure you have a backup.
Yes, RAID or an equivalent disk pool costs disk space, because part of the capacity is used for fault tolerance. With 8 drives the capacity penalties are approximately 50% for RAID10, 25% for RAID6 (dual parity) and 12.5% for RAID5 (single parity). RAID10 is the closest equivalent to my setup, where 8 drives are arranged in 4 vdevs (virtual devices) of 2 mirrored disks each. In theory this gives me an 8x read and 4x write speed gain (versus one drive's normal speed), and guaranteed tolerance of 1 drive failure out of 8 without losing data (more with luck, as long as no mirror pair loses both disks).
If you want to keep more storage space, you could go for RAID5 or an equivalent, for example. With 8x 3TB drives like my build, it would give you roughly 21TB of usable space, 7x read speed and tolerance of 1 drive failure, but no write speed gain at all.
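If you want to play with the capacity numbers yourself, here's a tiny Python sketch of the same math (a simplified model; filesystem/ZFS overhead is ignored, and the function name is mine):

```python
# Usable capacity for an N-drive pool under different layouts,
# matching the 8 x 3 TB figures above. Simplified model only.

def usable_tb(n_drives, drive_tb, layout):
    # Data drives left after redundancy:
    # "raid10" = mirror pairs, "raid6" = dual parity, "raid5" = single parity.
    data_drives = {
        "raid10": n_drives // 2,
        "raid6": n_drives - 2,
        "raid5": n_drives - 1,
    }[layout]
    return data_drives * drive_tb

for layout in ("raid10", "raid6", "raid5"):
    print(layout, usable_tb(8, 3, layout), "TB")
# raid10 12 TB, raid6 18 TB, raid5 21 TB
```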
Thanks for the comment! The build went surprisingly well, so I probably wouldn't change anything. However, I would make sure the WD Red drives are not SMR, since SMR drives have inferior performance. My drives are the EFRX model, so they are not SMR, but at the time of building this I didn't know the difference, so that was just luck. Also the processor (3600) might be a bit overkill, or there might be some newer AMD CPU with an even better price-to-performance ratio.
The motherboard has 8 normal SATA ports, which are combined using two 4-to-1 SATA-to-Mini-SAS (reverse breakout) cables, one for each backplane of 4 drives. There is also a non-reversed (forward breakout) model of the cable, which should be used if your board has Mini-SAS ports and you connect 4 SATA drives to each of them.
If I want to connect any more drives, I'm going to need a PCI-E controller.
And if a Dremel is not cheap enough, search for a no-name clone with the keywords "rotary tool" or "rotary multitool". Of course the quality may vary.
Good luck with your build!
There are 4 expansion slots so a 2-slot GPU should fit if it's not too long. It's hard to take any measurements because the case is now in a tight spot. I would have to take out other devices in the rack to be able to access the NAS case.
I'm actually referring more to the one photo and one sentence description. That's just lazy.
Thanks for the comment! The NH-L9a-AM4 is 37mm high with the fan, so it fits perfectly. Perhaps you checked some other CPU cooler.
The other dimensions of my cooler are 114 x 92mm, and I didn't have problems installing it. The stock motherboard backplate is glued on, so it needs to be removed with a hair dryer or heat gun before the Noctua backplate can be installed. The screw holes match, but the screws were not compatible.
Here is another photo of the cooler installed, so you can see the distance to other components: https://www.dropbox.com/s/z2e971vcwba6w7u/20200312_193807.jpg?dl=0
This build is so bland, it doesn't even have a capital letter in the title.
Like others said, the build is unbalanced. You could have gone for a less powerful Seasonic Prime and gotten a better SSD. The GPU is great but overkill for your 1080p monitor, 144Hz or not.
I have the same monitor and it's great. The only downsides: it's too heavy for most Ergotrons, and the resolution beats the crap out of your GPU while gaming. Also, the image quality and colors for graphics work are great, but not the best available.
If you're wondering why I have a hair dryer on one photo, the CPU cooler backplate was glued on the motherboard and I had to remove it to install the Noctua.
Just came back to tell you I finally understood this after 20 months.
Indeed, I just wish I had a workshop with a laser cutter and a ton of wood :)
The screen does not need a raspberry pi. It's basically just a small HDMI-display.
No complaints, it's a great monitor. I can't keep my fingers off of those curved displays for long though...
11 years since Breaking Bad started, feeling old yet?
Thanks for the input!
Laser or waterjet, both are fine for me. I just found a place that laser cuts brushed and sandblasted stainless steel (316) in 1-3mm thickness, so it might be quite nice for a case side panel, even if the rest of the case is aluminum. I'll test it in the near future.
Temps and load only, at least at the moment. The touch part of the screen is broken and the glass panel of the case prevents input anyway.
CAM is a bit annoying, yes, but you don't have to let it autorun on startup after you have set up your cooler. The only time I run it is when I want to change the cooler's LED colors.
The easiest approach is to clone the drive. The simplest way to do that is to get a hard drive dock which supports both M.2 and your old drive (SATA?). These docks usually cost around $30/€30. Then you just plug your old drive into the source slot and the new M.2 drive into the target slot, and either clone in the dock itself or use some software, depending on your dock model.
This is usually done when both drives are the same size, but it can also be done from a smaller to a larger drive, or larger to smaller, as long as the used space on the source drive fits on the target drive.
After the process is complete (it only takes a few minutes), you should have two drives with identical contents; just install the M.2 into your PC and make it the boot drive. Then continue like nothing happened.
The Ergotron LX dual side-by-side arm might be good for you. It officially supports 2x 27" displays, but I think there is room for even bigger ones. Another option is two single-display arms.
Thanks for the comments! The screen is a touch screen, but I didn't install the USB cable so touch doesn't work (and it's behind the glass panel of the case).
Absolutely go for 16GB.
I'm using 8GB right now with only Chrome (5 tabs) and some background processes open.
I have a 298mm video card and Kraken X62 in the S340 Elite and still have about 30mm space between the radiator and GPU. You will be fine.
Here you go :)
(Notice the price versus Amazon)
Almost as easy as putting your socks on. The AIO isn't opened in any way; you just slide the sleeve over the tubes and close it with the plastic parts at the ends.
Perhaps Antlion ModMic 5? (or 4 if you need even cheaper)
It's a headset mic that attaches to your existing headphones. You can choose between directional and omni-directional input.
Another vote for the NZXT S340 Elite.
Considering you are preparing to spend nearly $1000 on a CPU and $2000-$5000 on GPUs, not to mention other components, I'm guessing you have already verified that GPU rendering is faster for your particular type of work and that your renderer supports that many simultaneous GPUs. Or supports GPU rendering at all.
What GPU is best for you depends a bit on your specific need. Usually the most important characteristic in a rendering card is the number of cores; second is sufficient VRAM to fit all the geometry and textures. If the VRAM is not large enough, some assets will overflow into main system RAM, which is much slower.
You should start looking at Titans. Even the older Titan Xp slightly beats the GTX 1080 Ti, and the Titan V absolutely murders them both in V-Ray, for example.
That being said, such money would buy you some serious rental time/credits on cloud rendering services.
If there are no unusual sounds coming from your PSU, you don't need to change it just yet. The fan might be louder than it used to be, but unless there are whines or squeals that weren't there before, I don't see any reason for replacement.
The HX series is targeted at advanced users for gaming, overclocking and other demanding environments, so it should be a solid PSU. That said, even unknown brands can last 7-10 years, and some PSUs have a 12-year warranty.
A new PSU won't give you anything new for the money, unless you are seeking something specific like fanless operation. I'd give your Corsair a couple more years of service.
Backup, backup, backup!
I've had HDDs with long warranties, which still broke down 6 months after purchase. New noises are never a good sign.
SSDs are quite excellent in durability and shock resistance (not to mention speed), though nothing lasts forever.
Yes. Most likely it will work fine, but it cannot be guaranteed.
If you're interested in the (a bit technical and boring) explanation why this happens, you can read more in the link below :)