isoblog

Building a Relatively Sensible Server

Posted on 2024-10-03

It has become clear to me that the laptop NAS I built in December of last year, while a fun project and a good learning experience, is unfit for the purpose that I need it to fulfill and is best stripped for parts and replaced. Here I'll detail the main reasons why it sucks and my plan to build something just a little bit more sensible.

What In God's Name Have I Done

To get you up to speed in case the limit of your attention span is somewhere between one blog post and two blog posts, as of late last year I had been using an old gaming laptop as a home server for Minecraft and such things to great effect for a while, and had decided that I also wanted a proper NAS. Rather than buy a whole new CPU, motherboard, and RAM for this purpose, I saw fit to attempt to upgrade, or rather modify, or rather pervert, bastardize, and generally corrupt my existing laptop to have the storage capacity I wanted.

The end result of this initiative was a gaming laptop with a hole cut in the bottom to accept connections to a 2-port SATA controller card occupying the space where the WiFi card used to be, perched atop a custom chassis hewn from hardware-store aluminum rods and 3D-printed brackets, within which were contained four high-capacity 3.5" HDDs, a pair of dubious SATA "splitters" to connect them all to the laptop, an ATX PC power supply to power the drives, and the pared-down mainboard out of a Supermicro enterprise-grade drive shelf to interface the power supply with a power button.

It was beautiful, and for reasons that are entirely beyond my ability to grasp, it didn't end up working very well.

This is where we find our intrepid hero today.

This Whole Enterprise Was Fundamentally Ill-Conceived

I was expecting this thing to have problems, but I was unable to anticipate the specific problems it ended up having. Performance is... fine, actually. Better than expected, even. That horrible little 2-port SATA controller card, or else the SATA port multipliers, are almost certainly bottlenecking the bandwidth to the HDDs, but it's fast enough to saturate the gigabit network infrastructure in my house, so it doesn't really matter.

In fact, this little fucker is absolutely perfect for hosting an SMB share. It's been great for backups and the extensive collection of Linux ISOs that I install to machines on my network using Jellyfin.

The real problem is stability. Crashing has been a constant problem, especially when I touch anything software-wise. I try to update an app and it crashes. I try to install Syncthing and it crashes. I browse for new apps to install in the TrueNAS web UI and it bloody crashes. It crashed while running a system update once. I briefly attempted to spin up a Minecraft server on this thing and it crashed when I went to turn Minecraft off. Sometimes I'll go to use Jellyfin and it just won't be there and I'll have to troubleshoot it, which usually involves a good couple reboots. This must be how people felt dealing with old versions of Windows. Put shrimply, this is fucking unusable.

Like the Windows Vista users of old, I have no clue where to even begin diagnosing this. The one lead I've got is that it could have something to do with the fact that my laptop keeps throwing "SMART errors" during boot, but like, I'd expect TrueNAS to complain if something were actually wrong, and besides, I've tried and failed to diagnose the SMART errors anyway!
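
For the record, here's roughly the extent of my diagnostic attempts, sketched in Python for legibility. In practice these are just smartctl invocations (from smartmontools, which TrueNAS ships) typed into a shell, and /dev/sda is a placeholder device, not necessarily yours:

```python
#!/usr/bin/env python3
"""A sketch of the SMART poking I attempted, via smartctl from
smartmontools. /dev/sda is a placeholder -- substitute whichever
drive is actually throwing errors at boot."""
import subprocess

DISK = "/dev/sda"  # hypothetical device node; `lsblk` will list yours

# Dump everything: the overall health verdict, the attribute table,
# and the error log that those boot-time complaints should land in.
subprocess.run(["smartctl", "-a", DISK], check=False)

# Queue a short (roughly two-minute) self-test; the result shows up
# in the self-test log on the next `smartctl -a` pass.
subprocess.run(["smartctl", "-t", "short", DISK], check=False)
```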

Whatever this is, I feel pretty well hopeless to fix it. On top of that, the laptop has been giving some worrying temperature readings as of late, and when I tried to re-paste it I stripped one of the screws holding the cooler on, so short of some surgical Dremel-ing of tiny screws, that's never gonna improve. With all that to contend with, I'm about ready to give up on this thing and start from scratch, properly this time.

However, despite all the problems with the laptop itself, the rest of the hardware has been nothing short of rock-solid. All of it has worked as advertised, and I'll be carrying the drives and, if possible, the PSU[1] over into the new system.

Something Just A Little Bit More Sensible

Angle of Attack

Firstly, as previously mentioned, the PSU and storage are accounted for already. I have eight total drives I want to put in the system: two 256GB SSDs as mirrored boot devices, two 1TB SSDs, also mirrored, for TrueNAS apps and such things, and four big boy hard disks for bulk storage. Most of that stuff was bought for the original build, but the mirrored SSDs proved infeasible in the end; I don't strictly need them, but they'd be very nice to have. Anything in serious consideration will need some way to accommodate my preponderance of SATA devices.
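
For concreteness, here's the storage layout I'm aiming for, sketched as the equivalent zpool commands. This is purely illustrative: TrueNAS builds its pools through the installer and web UI, every device name below is a placeholder, and the RAIDZ level on the HDD pool is an example rather than a decision I've made:

```python
#!/usr/bin/env python3
"""Illustrative sketch of the target ZFS topology for my eight SATA
drives. Nobody should run this: TrueNAS creates these pools itself,
and every device name here is a placeholder."""

def zpool_create(*args: str) -> None:
    # Print the command instead of executing it; this is a paper sketch.
    print("zpool create", " ".join(args))

# Boot pool: the two 256GB SSDs, mirrored (the TrueNAS installer sets
# this one up when you select both disks at install time).
zpool_create("boot-pool", "mirror", "/dev/sda", "/dev/sdb")

# App pool: the two 1TB SSDs, mirrored, for TrueNAS apps, VMs, and such.
zpool_create("apps", "mirror", "/dev/sdc", "/dev/sdd")

# Bulk pool: the four big HDDs carried over from the old build.
# RAIDZ1 is shown as an example vdev layout, not gospel.
zpool_create("tank", "raidz1", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh")
```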

This narrows down the possibilities considerably. Namely, a lot of the common advice for cheap or beginner home servers doesn't apply to me. A mini PC or almost any used consumer desktop wouldn't have enough space for drives, and anything new is right out for the same reasons plus cost — half the reason I did the laptop NAS project in the first place was to save money.

I briefly considered getting a used workstation PC of some description. A used Lenovo ThinkStation P520 would have mounting space for all my drives and plenty of performance, but the proprietary motherboard, PSU, and power-delivery system in general gave me pause, both for repairability reasons and because I wasn't confident I could actually run power to all the drives. And that's the best option I've found.

I could get a used rack-mount server, like a Dell PowerEdge or similar. This is also a decent option, and it would be quite cost-effective, but I've got a few good reasons not to. First, most of these servers have old Xeon datacenter chips in them that are still good for many home server tasks, but aren't particularly fast or efficient. Take this PowerEdge R730 for instance:

An eBay listing for a Dell PowerEdge R730, with two Xeon E5-2623 v4 CPUs and 128GB of RAM, for $187 CAD plus $57 CAD in shipping.

This has two Xeon E5-2623 v4 datacenter CPUs. At 4 cores/8 threads and 85 watts apiece, this server would have the same core count as the Ryzen 5700G in my desktop (8 cores/16 threads) for almost three times the power draw (170W of combined TDP against 65W), and half the single-threaded performance. In many cases, especially for servers, lackluster single-core performance isn't a problem, but if I want to run game servers on this, which I do, these chips may struggle.

Plus, rack-mount servers kind of suck for home use. They're very unwieldy if you don't have a rack, and I don't, nor do I have anywhere to put one. They also tend to be loud. If you think the gaming laptop you've been using in your bed for the past three years and never cleaned sounds like a jet engine, you ain't seen nothing yet. Having one of these in my room would be intolerable. Of course, the R730 is a 2U server with no provisions for my 3.5" drives anyway, but a unit that could accommodate them wouldn't be much better.

Logically, then, the only remaining option is to build a system from scratch, out of used parts, to my specifications. That, therefore, is what I shall do, and I think I can do it for a similar amount of money to the GPU upgrade I was planning to get before this whole thing really started becoming an issue. Here's my plan:

My Grand Design

In spite of the machine it's in, I've found that the i7-9750H in the laptop performs admirably for my use case. I don't really need a CPU upgrade here — hence my efforts to repurpose the existing one. Unfortunately, the BGA form factor of this particular chip is incompatible with 100% of consumer desktop motherboards, so I'll be needing a new CPU regardless. Though I don't want or need a major upgrade, I also don't want a downgrade if I can avoid it, so I'll primarily be looking at other high-end gaming chips from the era.

I initially gravitated towards the 9750H's desktop counterpart, the i7-9700K, for these reasons, but the significantly lower price, improved energy efficiency, superior upgrade potential, support for unregistered ECC RAM, and vastly superior multicore performance[2] of its contemporary, the Ryzen 7 3700X, made the latter a bit of a no-brainer. I found one on eBay for a cool 100 Canadian Prize Tokens, and a matching X370 motherboard with eight whole SATA ports, which astute readers will recall is the exact number I need, for about a hundred of God's True Imperial Dollars. As for cooling, the 3700X is a 65-watt chip, so the stock cooler I've got lying around from my 5700G should suffice.

The 3700X lacks an integrated GPU, so I'll need a dedicated one in some form. Since I'm installing a server OS that's gonna output plain text to a display exactly once per install, I won't need a ton of graphics horsepower. I considered something like a GT 730 for a token display output and nothing more, but I'd like some transcoding capability for Jellyfin as well. After some research into cheap, low-power GPUs for the job, I settled on an Nvidia Quadro P400 for about 60 CanuckBucks™ after shipping.

I've seen some Redditors complaining about the P400's performance doing two 4K HDR streams at once, but all that tells me is that it can handle at least four simultaneous 1080p SDR streams (a 4K frame is four times the pixels of a 1080p one, and HDR tone-mapping only adds overhead), which is what I intend to use it for, without breaking a sweat. If I really need more, I can always upgrade to an Arc A310 once driver support for those in TrueNAS is less sketchy.

TrueNAS likes a lot of RAM for various ZFS-related purposes, chiefly the ARC read cache. The laptop seems happy with 32 gigs, so I'll start there and upgrade down the road if need be. I'll grab something cheap from a reputable brand, and I'll probably get it new, since used prices on DDR4 aren't really any better right now.

I'd also like to get a pair of 10Gb network cards at some point to attach this thing directly to my main computer for maximum transfer speeds between the two, but that's not really critical and I'll worry about it later.

Finally, I need a case. Fortunately, this time around I don't have to build one myself. After a cursory search on PCPartPicker for cases with four each of 2.5" and 3.5" drive mounts didn't turn up anything particularly cheap, I went to look at used ones and didn't initially have much luck there either. I scrolled through about a dozen listings for off-brand gamer-spec swill that wasn't even priced better than the name-brand stuff before filtering by used, at which point I found a bunch of listings just labeled "PC case", none of which I could confirm met my exacting needs.

The first case I found that obviously fit my use case was this thing, the weathered old shell of a Pentium III tower from the dawn of the ATX standard:

A beige PC tower with four 5.25" bays and two external 3.5" bays, populated with an optical drive and a floppy drive respectively. The only visible branding is a Pentium III sticker and a badge that reads "TOUCH".
Based on my research, this is an Enlight 7237, or something closely related, but the front panel seems unique. I'm curious about the branding here. If anyone recognizes that "TOUCH" badge, get in the comments.

Honestly, I kind of love it. I'll transfer the 5.25"-to-3.5" adapter cage I bought for mounting the drives in my homebrew chassis straight into this thing's bays, and I can 3D print some brackets to mount the SSDs in the 3.5" bays. And, somewhat counterintuitively, it might actually be easier to build in than those modern RGB-glass-side-panel-power-supply-basement-having-ass Gaming™ cases:

The inside of the same old PC case. Its layout is extremely pragmatic: motherboard in the very bottom corner, PSU in the top corner above it, and all the drive bay mounts are in a cage lining the front side. In other words, it looks like an old PC case.

Just look at it. Everything's right there. I can take the side panel off and instantly get to every component with my hands without spinning the case around, and I don't need to run any cables through any fucking grommets to get them where they're going. It looks supremely convenient. As the type of person who enjoys building computers and not just using them as decoration, this is beautiful to me.

The biggest issue with this case is the cooling. It's absolutely pitiful by modern standards. Based on these photos, it looks like there's a single 92mm, possibly 80mm, fan each for intake and exhaust, with a passive vent in the side. I'm already bringing some extra cooling to the table — my drive cage has a mount for a 120mm fan, and I can use the fan in my much better ventilated modern PSU for extra exhaust. With a 65W CPU and no meaningfully powerful GPU, that might just suffice. If not, I've got some ideas involving 3D printing and/or a hole saw to make this thing a little less stuffy.

Actually Building The Sucker

Everything above that heading there was in large part written a few weeks ago, when I was ordering all this stuff and it was all fresh in my mind. We now rejoin our intrepid hero as she receives the last of her eBay orders and prepares for initial testing.

When I got it, the case was pre-loaded with all the stuff in those photos up there: a CD-ROM drive, a floppy drive, a power supply, a single 80mm intake fan[3], and a generous helping of, to quote the PC repair technician from my YouTube Shorts feed, Swamp Gooch™. The first thing I did was take the case apart and remove all that stuff, because I certainly wasn't going to be using any of it, least of all the dubious PSU. The next thing I did was hose it down to get rid of the almost thirty years of Home/Office Residue that had accumulated inside.

Once it was clean, the case did, admittedly, need some updating. The set of motherboard standoffs in the case was incomplete — six of the usual nine were present — so I ordered new ones. I also had to design that 3D-printed adapter bracket for my 2.5" SSDs myself, as well as a mount for another 80mm fan to eke some more airflow out of those unused front expansion bays[4]. After the case was prepared and the mounting brackets printed, there wasn't much to do until all the other parts arrived. I did learn about some interesting limitations of the motherboard in my main computer while testing to make sure the GPU worked. I think I might be the sort of person who buys motherboards with the high-end chipset now.

The last major component to arrive was the motherboard, and it arrived missing some important hardware, namely the M.2 standoff and the stock mounting bracket for the CPU cooler, the latter of which was in the photo on the eBay listing! That's what I get for buying used, I guess. Fortunately, both parts are pretty cheap to order online.

Assembly went smoothly; not much to say here that you couldn't get from Linus Tech Tips. It turned out that both the original standoffs and the ones I ordered were the wrong height — the original ones were too long, and mine were too short. I ended up putting the original standoffs back in, because neither set was so far off as to render the rear I/O unusable, and my standoffs made installing the GPU properly impossible. I also had to use a couple of breadboard wires to hook up the power LED, since the pins on my motherboard were right next to each other and the connector in the case was three pins wide with a blank one in the middle, for some reason. However, unlike some cases I've worked in, accessing the modular PSU cables was an absolute joy[5].

What didn't go smoothly was actually booting the thing. On first power-on, it failed to POST and got stuck on a code "8" on the little seven-segment diagnostic display. I initially assumed it was training the memory, since that can take a while on first boot, but after about five minutes of no change I started getting worried. I read a lot of very worrying forum posts suggesting that either the CPU or the motherboard was straight-up dead, and I very nearly bought another CPU to see if that was the case. First, though, I decided to try updating the BIOS using the "BIOS Flashback" feature, and that fixed it immediately. In hindsight, that makes sense: an X370 board needs a BIOS update to even recognize a Zen 2 chip like the 3700X. I wasn't done yet, though.

Astute readers will have figured out by now that I was using TrueNAS on the old machine and intended to continue using it on the new one. Installation went extremely smoothly, except for one small wrinkle: I had decided to take the NVMe boot SSD from the old laptop to use as storage for my game servers[6], and even though I had done a fresh install of TrueNAS to a pair of mirrored SATA SSDs, the NVMe one, naturally, still had TrueNAS on it. This, from what I understand, caused the OS to get stuck trying to boot because there were two ZFS pools marked as the boot pool; it would just give up and drop me into an initramfs shell instead. Booting into a live environment and using GParted to clear all the partitions on the drive didn't fix it, and after a bit of googling I learned I had to use the initramfs shell to clear the ZFS label data off the disk directly.
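
For anyone who hits the same wall: as far as I can reconstruct it, the relevant incantation is zpool labelclear, which wipes the pool metadata that ordinary partition deletion leaves behind (hence GParted not helping). Here's a sketch, written in Python for consistency with the other snippets; in the actual initramfs shell you'd type the zpool commands directly, and the device path is an assumption:

```python
#!/usr/bin/env python3
"""Sketch of clearing a stale ZFS boot-pool label off a repurposed drive.
Assumes the old TrueNAS install lives on /dev/nvme0n1; labelclear is
destructive, so verify the device (e.g. with `lsblk`) before anything."""
import subprocess

STALE_DISK = "/dev/nvme0n1"  # placeholder for the old laptop boot SSD

# See which pools are still advertising themselves; this confirms that
# the old drive really is carrying a second boot-pool label.
subprocess.run(["zpool", "import"], check=False)

# Wipe the ZFS label metadata. Deleting partitions doesn't touch these
# labels, which is why the new install kept finding two boot pools. You
# may need to target the specific partition that held the pool
# (e.g. /dev/nvme0n1p3) rather than the whole disk.
subprocess.run(["zpool", "labelclear", "-f", STALE_DISK], check=True)
```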

After that was cleared up and I was booting into TrueNAS, setup went off without a hitch. I imported the config I had saved from the old server and everything was instantly up and running like nothing had ever happened, and suddenly, I had a working server again.

Postmortem

The first natural question is: Did it work? Was I right? Did our intrepid hero succeed in her quest to finally have a server that's actually stable enough to use? And the answer is yes. Tremendously so. In the couple of days it's been running, this thing hasn't crashed once, even after I installed Syncthing[7] and set up an entire VM to run Minecraft. Whatever was causing all that instability with the old server didn't make it over to the new one, and I should bloody well hope not, considering the only parts I carried over were a bunch of drives that were in perfect health according to both ZFS and the SMART tests they kept passing.

There is one major issue, though. See, since I was planning to run a stock cooler in a case whose airflow was subpar at best short of cutting holes in it, I made sure to get the strongest fans I could find to fill the available mounts, which for me meant three 80mm Arctic P8 Maxes — one in front, one in back, and one in the 3.5" cage, plus a 120mm fan in the 5.25" bays.

These fans have a maximum speed of 5000 RPM, which makes them quite capable of compensating for the sub-optimal airflow conditions, but this was also the basis of my one fatal miscalculation: these suckers are loud. Not 1U-rackmount, borderline-jet-engine loud; in fact, the noise is more or less tolerable at low loads. But under significant load, such as, say, the Minecraft server I've been so fixated on, they get distractingly loud, which is a problem, since the only feasible place for me to put this thing at the moment is on my desk.

I soon plan to implement a multi-step fix for this issue: First, in my typical fashion when I find my equipment is slightly inadequate, I'll replace the AMD stock cooler I used with something gratuitously overbuilt — I'm thinking a Thermalright Peerless Assassin. Then, if that doesn't work, I'll tear the whole thing down, go at it with the hole saw, and mount a couple of nice, big, quiet fans in my fresh new speed holes. If the 80mm fans are still causing a ruckus after all that, I'll just remove them; they probably won't be of critical importance at that point anyway.

Overall, the case was a bit of an impulsive and perhaps sub-optimal choice, but I'd do it again in a heartbeat. I love this thing. I might be a little more liberal with the hole-making, though.

I don't regret my choice of core components at all, at least not yet. The CPU, despite the noise, runs Minecraft as spectacularly as I'd hoped, at least after some fiddling with my VM configuration. Its efficiency certainly also helps with cooling, given the circumstances. The motherboard, being a fancy X-series one, has various Gamer Lights and a myriad of overclocking features that are a bit silly to have in a server, but it also has the expansion I need, and that's what really counts. Plus, that fancy seven-segment display was very helpful in diagnosing the boot issue.

The RAM is RAM. It's fast enough for my porpoises with or without XMP, which I left off to optimize for stability. I might get more down the road. The graphics card is exactly what I need, and Jellyfin seems to like it fine. Couldn't have asked for a better value there. I'm still waiting on the GT 730 I foolishly ordered before looking into it properly. That's fine, though, it was even cheaper than the P400 and it's always nice to have spares.

I ended up getting those 10-gig Ethernet cards, but I couldn't use them like I wanted. As it turns out, a GPU and two NVMe drives are all the PCIe connectivity the B550 platform in my main rig has to offer. My next upgrade might just be a used X570 board. I did still install one in the server, though, and since the fan on it was very loud on account of being very dying, I replaced the shambling corpse of the stock fan with a 40mm Noctua, spliced onto the original header and mounted using one of my best zip-ties:

A 40mm wide, 10mm thick Noctua fan zip-tied to the heatsink on a PCIe Ethernet card. The zip-tie is screwed into one of the original fan mounting holes on the heatsink. As in, the screw is going through the zip-tie.
Yes, I attached the zip-tie to the heatsink by screwing it in. Clever, right?

Would I recommend you follow in my footsteps? Absolutely. This thing is great. In order to make this work, I had to build a normal computer. Short of a rack-mount case and a rack in which to mount it, this is how you build a home server. Frankly, I was absolutely right that violently contorting my existing gaming laptop into a TrueNAS box would be cheaper than this, but I can now say with complete certainty that the savings were not worth it. This is less entertaining, sure, but I'm glad I finally did it properly. Well, at least to the greatest extent that I seem to be capable of "properly".


[1] I'm not referring to the Seasonic S12III from the last installment. That thing is honestly a little too sketchy to be using in a proper server if I don't have to, and I don't. When my PC's power supply, a Seasonic GX650, exploded, I RMA'd it and then bought a beefier one to replace it; the replacement unit Seasonic sent me went into my server, and will be going into the new one. The S12III is serving as an improvised benchtop PSU using one of these bad boys at the moment.

[2] If cpubenchmark.net is to be believed, anyway.

[3] I could, in theory, reuse this fan, but I don't want to. It's so nasty that attempting to run it seems like a good way to invent a new respiratory illness, and even if I cleaned it, the bearing is geriatric and the performance probably isn't very good; plus, my modern motherboard might not even have a header for it. Its replacement will suit me much better.

[4] You can download the model for the fan mount and corresponding grille I designed here. I'd publish the drive bracket too, but the one I printed needed some modification in order to work, and I don't want to spend the time and filament to refine it right now.

[5] Mind you, this case is well north of 20 years old. It shipped to me with a Pentium III sticker on it. This thing may well predate modular power supplies as a concept, and somehow it accommodates them infinitely better than the 2-year-old case on my personal machine, which barely has room to insert a modular PSU, let alone actually get at the cables afterwards. I don't understand how this is possible.

[6] Generally you'd use an SSD like this as L2ARC for ZFS, but from my research I don't have enough RAM for that to be useful anyway, and game servers are the only thing I currently want to run that's likely to benefit from the added speed of an NVMe SSD.

[7] Finally, at long last, I have automated backups to the enormous HDD array in my NAS! Took me bloody long enough.

