Storage, Performance, and Conclusion
Since we're not using an OS like FreeNAS, and since we don't want to use something like Windows Storage Spaces, we elected to use a hardware RAID card to build our primary storage array. We went with Allyn's favorite RAID company, Areca, and picked up a used 16-port ARC-1261 on eBay for only about $150. We upgraded its onboard RAM to the maximum of 2GB.
We got lucky with the timing of our build. Just as we were planning out and budgeting for storage, the Western Digital easystore line of external drives went on its first big sale. These 8TB drives, available exclusively from Best Buy, carry a retail price of about $300 each, but we picked up ours during that first sale for about $190 each, a huge discount (and they've since fallen to as low as $150 in subsequent sales — don't pay retail price for these drives!).
The key is that these external hard drives contained a standard Western Digital Red drive inside (at least, they did at the time; more recent models now contain white label drives that share the Red's performance characteristics but may have compatibility issues with certain devices due to the power pin layout). So we canvassed all of the Best Buy stores in our area, shucked 'em all, and ended up with a nice big stack of 8TB Reds.
To hold all of our drives and hardware, we chose the iStarUSA M-4160-ATX, a 4U rackmount chassis with room for 16 3.5-inch drives. One nice feature of this chassis is that it uses SFF-8087 miniSAS connectors for the storage backplane, the same connectors used by our Areca RAID card. That meant only four data cables were needed to connect all 16 drives, helping keep our case neat and tidy.
With our drives installed, we used the Areca management interface to configure all 16 drives into a single RAID 6 array. Two drives' worth of capacity is reserved for parity, leaving 112TB of usable space from the 128TB of raw capacity. From there, we used Windows sharing and permissions settings to separate access to the Plex data and our PCPer files. And yes, we know, RAID is not backup. So we also have a Synology NAS onsite for local nightly backups (using the handy app Bvckup 2), as well as a Backblaze account for a second, cloud-based backup.
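The capacity math for the array is simple enough to sketch. Here's a minimal illustration in Python (the helper function is ours, not part of any Areca tooling; it just encodes the rule that RAID 6 sets aside two drives' worth of space for parity):

```python
def raid6_usable_tb(num_drives: int, drive_tb: float) -> float:
    """RAID 6 writes two parity blocks per stripe, so two drives'
    worth of raw capacity is unavailable for data."""
    if num_drives < 4:
        raise ValueError("RAID 6 requires at least 4 drives")
    return (num_drives - 2) * drive_tb

raw = 16 * 8                     # 128TB of raw capacity
usable = raid6_usable_tb(16, 8)  # 112TB left after parity
print(raw, usable)               # -> 128 112
```

The same formula also shows why RAID 6 gets more space-efficient as the array grows: the two-drive parity overhead is fixed, so it shrinks as a fraction of the total.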
Our primary storage array isn't going to be the absolute best performer due to its use of slower WD Red drives and an older RAID card, but it's still more than adequate for our needs. In terms of sequential transfers, we can achieve average speeds of about 640MB/s reads and 720MB/s writes. That's both locally on the server as well as for large sequential transfers via the 10GbE network.
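For context against the network, 10GbE tops out at 1,250MB/s in decimal units before protocol overhead. A quick sketch of how much of the link our measured speeds occupy (the conversion helper is ours, purely illustrative):

```python
def line_rate_mb_s(gigabits_per_s: float) -> float:
    """Convert a network line rate in Gb/s to MB/s (decimal units),
    ignoring Ethernet/TCP framing overhead."""
    return gigabits_per_s * 1000 / 8

ceiling = line_rate_mb_s(10)           # 1250.0 MB/s theoretical 10GbE ceiling
print(f"reads:  {640 / ceiling:.0%}")  # our 640MB/s reads use about half the link
print(f"writes: {720 / ceiling:.0%}")  # writes fare a bit better
```

In other words, the array, not the network, is the bottleneck for large transfers; a faster pool could still roughly double throughput before saturating 10GbE.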
Such speeds are overkill for Plex alone, but they make accessing our PCPer video and data files a much more pleasant experience when combined with the faster network.
As for our processor, we couldn't be happier with the Threadripper 1950X. When it comes to measuring streaming performance capability, Plex gives general guidance of about a 2000 PassMark score per simultaneous 1080p stream. The actual requirements will of course vary based on the complexity of the specific media file and your transcoding settings, but the "2000 per stream" rule is a good place to start.
We ran the PassMark Performance Test on our completed build and received a CPU score of 23,602 at stock frequencies. According to the PassMark database, the average score for the 1950X is 21,941, so we're sitting quite pretty.
Based on Plex's guidance, our score means that we should be able to handle at least 10 simultaneous 1080p transcodes, and we're unlikely to ever approach that number since at least some of our Plex clients will be direct streaming or direct playing media with little hit on the CPU. In short, when your Plex server is powered by a Threadripper 1950X, the limiting factor quickly becomes your Internet bandwidth, or even the speed of your storage array, rather than your CPU's transcoding horsepower.
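Plex's rule of thumb translates directly into a capacity estimate. A quick sketch using our measured score (the ~2,000-points-per-stream figure comes from Plex's guidance; the function itself is just our illustration):

```python
def max_1080p_transcodes(passmark_score: int, per_stream: int = 2000) -> int:
    """Rough ceiling on simultaneous 1080p transcodes, using Plex's
    ~2000-PassMark-points-per-stream rule of thumb."""
    return passmark_score // per_stream

print(max_1080p_transcodes(23602))  # our 1950X score -> 11 streams
```

Remember this is a floor-of-the-envelope estimate: lighter source files and direct play push the real number higher, while 4K or high-bitrate transcodes pull it lower.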
Why Not a Used Xeon Server?
The recent trend among Plex enthusiasts is to pick up used Xeon-based servers in lieu of building a custom system from scratch. As companies both large and small upgrade to newer hardware, it's not uncommon for them to try to recoup some of their costs by selling their old hardware. The servers on the market now are generally V1 and V2-era E5 Xeons, and many of these servers, including dual-processor models, can be had for well under the cost of a Threadripper 1950X alone.
So why not go this route? While the used server approach is a great option, it has a few drawbacks that we wanted to address. First, these Xeon processors are now several generations old and obviously don't offer the same level of performance as their modern counterparts. For example, here's a server listed for $700 that includes two first-generation 8-core Xeon E5-2660 processors. With a PassMark score of 11,107 each, even the combined pair falls just short of our 1950X's score, and that assumes perfect scaling between the two processors, which isn't achievable in most workloads.
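The scaling caveat is worth quantifying. A rough sketch (the 80% second-socket yield below is our illustrative assumption, not a benchmark result; real multi-socket efficiency varies by workload):

```python
def dual_socket_estimate(single_score: int, second_socket_yield: float = 1.0) -> float:
    """Combined score for two identical CPUs; a yield of 1.0 is the
    perfect 2x scaling that real multi-socket systems never reach."""
    return single_score * (1 + second_socket_yield)

print(dual_socket_estimate(11107))       # 22214.0 at impossible perfect scaling
print(dual_socket_estimate(11107, 0.8))  # roughly 19993 with an assumed 80% yield
```

Either way the pair lands below our single 1950X's 23,602, and the realistic figure widens the gap further.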
Another issue is reliability. Enterprise-class components are certainly built for reliability, but the used processors and systems being sold have already given a "lifetime" of service. When buying these parts and systems, you likely won't know what types of workloads they were given, how adequately they were cooled or maintained, or any other issues that could affect their performance and longevity. While new components aren't immune to technical issues, we don't have to worry about any abusive past they may have experienced, and we have the protection of a manufacturer's warranty that will at least get us through the first few years.
Finally, there's the issue of noise. These used servers were tuned for cooling performance and intended for dedicated datacenters; it wasn't an issue in their former life if they sounded like jet engines. The same probably isn't true of your home or small office Plex server. In our new custom-built server, the Noctua cooler runs at just over 20dB, and we opted for Noctua case fans for the drive array as well. It's certainly possible to modify a used server for quieter operation, but ours was designed for quiet operation from the start.
So, in summary, there's nothing wrong with pursuing the used server route for your own Plex server build, but we found a single, modern, faster processor to be the best fit for our needs.
Quirks and Conclusion
Our new server has now been running for several months and has proven itself to be a fantastic upgrade, both in terms of productivity as well as entertainment. But it's still not perfect, and there are some changes and upgrades that we may consider in the future.
First, while Windows 10 Pro meets our needs, it's not the ideal operating system for this type of server. Those Windows Updates that we thought we could handle continue to be a pain, with the recent Fall Creators Update causing a significant amount of frustration when it unexpectedly broke compatibility with some of our apps and workflows. A better solution, and one that we just didn't have time to deal with initially, would be to use a storage-focused OS like FreeNAS or a Linux server distribution, and then virtualize any other operating systems we may need. There are no immediate plans to take that step, but it's something we know we'll need to take care of sooner or later.
Our storage performance could also be better. Newer RAID cards, faster drives, solid state caching, and the latest PCIe 3.0 10GbE NICs could all help in this regard. For comparison, we're currently testing a QNAP NAS that, when populated with 7200rpm datacenter drives, can max out our 10GbE network with real-world speeds of over 1GB/s.
In all other respects, however, our new Ryzen Threadripper-based server performs like a champ and has significantly improved things here at the office when compared to our old, slower server confined to a gigabit network. We've learned that even though AMD's Threadripper line is primarily targeted at high end workstations, it can make a heck of a server platform for small offices like ours.