My New Laptop Finally Showed Up

It feels like forever ago that I mentioned I ordered a new laptop. I was expecting to get it before Christmas, and initially at least, the post office agreed. It quickly made its way from Long Island to a post office distribution center in New Jersey, where it then sat for almost two weeks, apparently lost in transit. Eventually, someone must have tripped over it and got it on the right truck, because it showed up on Saturday. So far, it’s pretty much what I was expecting.

I picked an excellent-condition open-box option, which saved about $70. The computer arrived in its original box and appeared to include all the original items, which were just the charger and a couple of papers. My HP Envy x360 15″ is equipped with an AMD Ryzen 4500U, 8 GB of RAM, a 256 GB NVMe SSD, and a 250-nit 1080p display. I ordered a 16 GB RAM kit, which should be in some time next week. I picked a Ryzen system to take advantage of the excellent on-board graphics, and I’ve been pleased with that so far.

I’ve only tried out Civ 6 and Lego Star Wars: The Force Awakens, but both have been completely playable at 1080p on medium settings, getting 30-45 fps with no trouble. Neither of those is a terribly intense game though, so I’ll have to try out something heavier in the near future. I should also try them out on the old Surface Book to see how big a leap it is.

Other things I like: the keyboard is excellent, and it has a number pad, which is nice. It has a little electronically controlled cover for the webcam. The display takes up almost the entire lid. The battery life is good, and it’s very quiet. It’s a new computer, so of course it’s still speedy and uncluttered. I almost love the trackpad. Its clicks are great, and it’s a nice large size. It would be perfect if it were glass instead of plastic.

Which brings me to things I don’t care for: the overall construction, while solid, appears to use metal only on the lid exterior, and it doesn’t feel nearly as nice as the Surface or my wife’s old Lenovo Yoga, which are both all-metal. The screen, which does have nice colors, seems kind of dim to me, but it’s not problematic indoors. This particular laptop has the lowest-end 250-nit display; there are 300, 400, and 1000-nit versions available, but I couldn’t find a 400-nit version anywhere for a reasonable price. Maybe some day in the future I can swap in a used one. The display is of course in the 16:9 format, which frankly sucks for a laptop. It’s way too wide. The 3:2 of the Surface is maybe a tad narrow, but it’s so much better than 16:9, especially for writing. My Surface pen works perfectly with this computer, but with the screen so narrow and ridiculously long in portrait mode, it’s almost pointless to write on it. I don’t think there are any Windows laptops other than the Surface line with a 3:2 screen, and they’re just too expensive now. Dell offers 16:10 displays on their higher-end stuff, but those are a little too expensive for me. Personally, I think the 4:3 format of the iPad is probably the ideal for non-television screens.

Anyways, I really like it so far. I’m eager to get the new RAM installed and see if that makes any difference to gaming. Task Manager showed all 8 GB in use with Lego Star Wars going. Despite its foibles, I think it would be hard to do better for the price.

Some Computer News

Over the last couple days, I’ve acquired a whole bunch of TV shows encoded with the HEVC codec. Not really a problem, but a lot of devices, like a Roku or Fire TV thing, can’t play it directly. That means it has to be transcoded. I use VAAPI in Jellyfin to transcode, but it doesn’t really work with HEVC. I found out the drivers that Debian provides are version 18.something, and I need at least version 20.1 for VAAPI to transcode HEVC video on an AMD graphics card. Those drivers are available in the testing repositories, but the dependency requirements are a little too complicated to make installing them worthwhile. I also put together a VM with Ubuntu 20.10, which has those drivers as standard, but it failed to boot up when I had the GPU passed through to it. The LTS version worked, but like Debian, its drivers are too old. Hopefully the newer drivers make it to Debian’s stable repos sooner rather than later. In the meantime, I’m fine for two reasons: first, I usually watch stuff through Kodi on a computer, which means I can direct play everything; second, my CPU can handle transcoding a couple streams at once, so it’s not a huge problem.
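If you want to check what your VAAPI stack can actually decode before blaming Jellyfin, a quick sketch like this works; it assumes the vainfo utility is installed, and the exact output format varies a bit between driver versions:

```python
# Print the VAAPI driver version and any HEVC profiles it exposes.
# Assumes the `vainfo` utility is installed; output format varies
# between driver versions, so treat this as a rough sketch.
import subprocess

result = subprocess.run(["vainfo"], capture_output=True, text=True)
output = result.stdout + result.stderr  # vainfo splits its output across both

for line in output.splitlines():
    # The driver version line tells you whether you're on Mesa 20.1+.
    if "Driver version" in line:
        print(line.strip())
    # HEVC profiles with a VLD entrypoint mean hardware decode is available.
    elif "HEVC" in line:
        print(line.strip())
```

If no HEVC lines show up, the driver is too old and the transcode falls back to the CPU.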

In other news, I ordered a new laptop today. I was eyeing a few during the Black Friday week things a couple weeks ago, but decided against it. Today, I was at my parents’ house taking care of some school work. I had my Surface Book (first gen) hooked up to a 1080p monitor for some extra work space. I had Excel, Word, and about a dozen Firefox tabs open while playing music on Spotify. It felt a little sluggish and not as responsive as it should be. In fact, when I scrolled through my RSS feeds, the music skipped while images were loading. CPU usage was 80-95% when watching a 1080p YouTube video with the other stuff open in the background. This isn’t something I do often, and this semester is coming to an end, but I still have two more to go, and it is nice to take my work with me if I want to.

The touch screen on the Surface has also been broken since the summer. It sometimes experiences phantom touches along the bottom inch of the screen. That’s disappointing, but livable; I don’t really use the touchscreen. More unforgivable, though, is the stylus situation. The pen won’t work along the edges of the screen, even after repeated calibrations, rendering it useless. I bought an iPad and Apple Pencil in September to pick up this slack, but I haven’t used them much. I haven’t felt the need to take notes in my classes. It feels like I kind of wasted the money on that, but eBay shows that I should be able to sell the stuff for almost as much as I paid for it if I want.

These issues got me looking for laptops this afternoon. I first turned to Slickdeals to see if there were any good deals out there today. The first thing I found was an HP Pavilion that seemed like a good deal: a Ryzen 4700U CPU, 8 GB of RAM, a 128 GB NVMe SSD, and a 1080p display for $450. Someone left a comment comparing it to a similar Dell. The Dell seemed like a better deal, with a better charging system, two M.2 slots, and a single 8 GB RAM stick (making the upgrade to 16 GB easier). I found out from another Slickdeals post that there was 12% off Dell stuff with a sign-up at a third-party site. I did that and was ready to order a 15″ Inspiron 5000 for about $500. But after reading and watching a few reviews, I decided the display and build quality would be too big a step down from the Surface, so I moved on.

I gave Best Buy a look and set my only criteria to an AMD processor and a 1080p display. Another HP popped up, but this time it was an Envy x360 (meaning the screen flips all the way around). This one was equipped with a Ryzen 4500U, a 256 GB NVMe SSD, and 8 GB of RAM for $629 new. I learned in the Best Buy questions and through some research that the RAM and SSD can be upgraded, and that it’s compatible with an active stylus, like the Surface pen. The upgradability is a must-have for me, and the stylus compatibility is a huge plus. I waffled for a couple hours, but decided to buy an excellent-condition open-box one for about $570. They say it should be in by December 22, but of course I’m hoping it arrives earlier. The AMD processor bests more expensive Intel chips and has pretty good integrated graphics, so I should be able to run some games at 1080p medium settings. It totally murders the i5 6300U in the Surface Book (11,286 PassMark score vs. 3,269). The Verge called the 13″ version the best sub-$1000 laptop. I think I’ll probably keep the 256 GB SSD for now, but I’ll definitely be upgrading to 16 GB of RAM as soon as possible. I’m really looking forward to it, and I’ll update when it comes in.

Taking a New Direction with the Server

As I said a couple weeks ago, I got my DL380 server going. At least for a little while, anyways. I started testing some services on it, like WordPress and Grocy (the latter of which will be a post of its own in a couple weeks). I was satisfied with the web services, so I decided to try getting my stupid Ceton TV tuner card set up in the server. I got Proxmox ready to do a PCI passthrough of the card to a Windows 10 VM and then installed the card. To do so, I had to detach the SAS cables from the RAID card. Unfortunately, the server won’t boot up correctly with the tuner card installed. So I took the tuner out, which meant detaching and reattaching the SAS cables again. I made sure to connect them to the same ports as before. But, to my annoyance, when I started the server, it couldn’t boot from the hard drives anymore. I don’t know if disconnecting the cables ruined my arrays, or if I mistakenly connected the cables to the wrong ports on the RAID card, or what. This enterprise server seems so touchy. I guess part of that might be because I’m not really using it as it was intended. Anyways, I’m going to cut my losses and use some of the parts to put together my own “white box” server.

I should have built my own to begin with, but I couldn’t find any parts that could beat the price-to-performance ratio of the DL380 on paper. I think part of the reason for the DL380’s excellent price/performance is the relatively unloved LGA 1356 socket for the processors. LGA 2011 is from about the same era, but was used in a lot of servers and desktops, so those processors and motherboards, even many years later, are a decent amount more expensive than similar LGA 1356 parts. At the beginning of this, I was unaware of the relative rarity of 1356 parts.

I found a dual-socket Intel motherboard that should do the trick. I’m teaming that with an HP NC365T NIC and a Dell H310 RAID card, which I’ll be flashing to IT mode so I can use ZFS. This motherboard has enough slots to accept my TV tuner card, so hopefully it’ll boot with it installed. There’s also a PCIe x16 slot, so if I’m really lucky I might be able to put my RX 480 GPU in there too. The motherboard is CEB size, which the internet tells me is between regular ATX and E-ATX, and uses the same IO shield size and screw holes as ATX. I picked an Antec P101 case to hold everything. It’s rated to hold an E-ATX board and eight 3.5″ hard drives. Plus, it has an external 5.25″ drive bay, so I can move my Blu-ray drive from my desktop over there and set up an automatic ripper.

Right now the only part I don’t have on order is a power supply. I want something that’s at least 750 watts, and I’d probably go up to 1000 watts. I definitely want at least 80+ Gold efficiency, and it has to have two CPU power connectors for the dual-socket board. This narrows down my selection, but not severely so; both EVGA power supplies I have in my house right now meet those criteria. Unfortunately, it appears the coronavirus has totally wrecked the supply of power supplies. Hardly anything is available, let alone decent units, and what is available is two to three times more expensive than usual. I’m going to keep my eyes peeled for reasonably priced used ones, but I may have to wait a while before I get this server going. I’ll update when I get it built.

The Server is Running

Long story short, I couldn’t figure out how to get the DL380e going again. I decided it was best to just get another because it was so cheap. The new one came in over the weekend, and other than some damage to the hard drive cage, it’s working perfectly. I switched the damaged cage out for the good one on the broken server. Setup was fairly easy. I used the HP Smart Array tool to set up two arrays: a single SATA drive for the hypervisor, and six 3 TB SAS drives in RAID 6 for my VMs. That gives me 12 TB of storage. I got some virtual machines running today, and they’re working great.
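The RAID 6 arithmetic is straightforward: two drives’ worth of capacity goes to parity, and the rest is usable. A quick sanity check in Python:

```python
# RAID 6 usable capacity: two drives' worth of space goes to parity,
# whatever the array size, so usable space is (n - 2) * drive size.
def raid6_usable_tb(drive_count: int, drive_size_tb: float) -> float:
    assert drive_count >= 4, "RAID 6 needs at least four drives"
    return (drive_count - 2) * drive_size_tb

print(raid6_usable_tb(6, 3.0))  # six 3 TB drives -> 12.0 TB usable
```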

One thing I’m not too thrilled about is the power usage. With a couple of VMs going, iLO reports power usage at around 165 watts. I was hoping for more like 100 or 120 watts, and it’s hard to justify that kind of power draw. On the other hand, it’s hard to beat the performance per dollar of a used server. The server itself, processors, and memory were around $250, for a machine with 16 cores, 32 threads, and 48 GB of memory. I could get something that uses less power, but it would cost more for less performance. My ideal machine would probably be a 16-core Threadripper, but the processor and motherboard separately cost more than my whole server. Maybe I can get some used stuff in a few years when it’s time to retire the HP server.
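For perspective, here’s a rough sketch of what that 165-watt draw costs over a year; the electric rate is just an assumed round number, not my actual bill:

```python
# Rough annual electricity cost for an always-on server.
# The $/kWh rate below is an assumption; substitute your own.
RATE_PER_KWH = 0.13  # USD, assumed

def annual_cost(watts: float) -> float:
    kwh_per_year = watts / 1000 * 24 * 365  # continuous draw
    return kwh_per_year * RATE_PER_KWH

print(f"165 W: ${annual_cost(165):.0f}/yr")  # about $188
print(f"110 W: ${annual_cost(110):.0f}/yr")  # about $125, the ballpark I hoped for
```

So the gap between what I hoped for and what I got is roughly $60 a year at that assumed rate. Annoying, but not fatal.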

Homelab Underway

There’s been a flurry of activity, and a false start, in the homelab in the last week. I made my shit-tier vertical mount rack system, put it in my office closet, and got just about everything set up. On the other hand, I’m having some trouble with the HP server I bought.

The first step to getting everything working was to run some cables into the office: two ethernet cables and one RG6 coaxial. All three cables go from the basement to the office along the outside of the house. The coax and one ethernet cable are attached to the ONT (optical network terminal, basically a modem but for fiber optic) in the basement. This supplies the main internet connection to the office. The coax is disconnected for now, but I might hook it up and put my TV tuner in the office. The second ethernet cable ends up connected to the Orbi satellite in the living room for wired backhaul.

The new cables are on the left. I would have run them with the electrical service and fiber optic cables, but I don’t have a ladder tall enough. Just ignore all the garbage on the ground. It’s not a crack house, I promise.

I added a wall plate under the office window to nicely terminate the wires coming in. I used a backless retrofit/old work box to hold everything in place.

Not quite straight, but it gets the job done.

The upper ethernet cable is the internet supply from the ONT, and it goes to the new pfSense router across the room.

The HP desktop is the “router.” The Orbi is now acting just as a wifi access point. The gold thing is a Raspberry Pi 4 B 2GB, which is currently serving up this website.

The HP ProDesk 400 G1 (what a name 🙄) desktop has an HP NC365T four-port NIC that handles the in and out for pfSense. Speaking of the software, I’m actually virtualizing pfSense, using Proxmox as the hypervisor. Proxmox is a common choice for homelabbers, but it doesn’t seem to be as popular as ESXi; most homelabbers use the same hardware and software as their workplaces do, and almost no business uses Proxmox. I picked Proxmox because it’s free and open source with no limitations on its capabilities. ESXi places limitations on what you can do with the free version of the software, and I don’t want to pay the yearly subscription to use everything. On the other hand, I probably don’t need everything in the paid version. Anyways, it’s Proxmox for now. I set up pfSense as a virtual machine within Proxmox and assigned it two ports from the NC365T to do the routing.
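One nice side effect of Proxmox being open source is that the whole thing is scriptable over its REST API. Here’s a minimal sketch using the third-party proxmoxer Python library to list the VMs on the box; the host address, node name, and credentials are all placeholders:

```python
# List the VMs on a Proxmox node over its REST API.
# Uses the third-party proxmoxer library (pip install proxmoxer requests).
# The host, node name, and credentials here are placeholders.
from proxmoxer import ProxmoxAPI

proxmox = ProxmoxAPI("192.168.1.2", user="root@pam",
                     password="secret", verify_ssl=False)

for vm in proxmox.nodes("prodesk").qemu.get():
    print(vm["vmid"], vm["name"], vm["status"])  # e.g. 100 pfsense running
```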

I’m also running a Pi-Hole on the HP desktop inside an Ubuntu virtual machine. I was initially using Debian, but I ran into problems that I may have erroneously attributed to Debian. I still have trouble with the Chrome browser on my desktop with Pi-Hole running on Ubuntu; Firefox on the same computer works perfectly. I never had any problems with any browser when Pi-Hole was running on a Raspberry Pi. Pi-Hole had a big 5.0 update a couple weeks ago, so I might have to try Debian again sometime over the summer. For now, it gets the job done; the ad blocking is working normally.

The LAN port on the router is connected to a Cisco 3560G switch. I just finished a semester-long networking class with curriculum provided by Cisco (I got an A, by the way), so it seemed like a good idea to get a switch I was already familiar with. The switch basically distributes the LAN (and thus internet) access wherever it’s needed. The Orbi base station is plugged into the switch, and the base station is then connected to the satellite in the living room. These provide the wifi coverage for the house.

The shitty “rack” I put together in the closet. It works though, and the things are so much quieter in there than out in the open.
Pretty lights.

The network side of things is going great. The server, on the other hand, is not. I installed the hard drives I ordered and put it in my rack, and now the RAID card doesn’t work. No matter what I do, I can’t get it going. It gives me an error as if the card itself were defective or not plugged in properly. I initially thought a dead battery for the card was causing the problem, so I bought a new card and battery, but got the same result. The only difference was adding the hard drives and moving the server; it worked perfectly fine two weeks ago on my shelf. The server was pretty cheap, so I ordered another identical one. Hopefully it doesn’t get killed too. If you think you might be able to help me with my P420 controller woes, drop me a line here.

Overall, I’m happy with the setup so far. The only thing I’m a tad dissatisfied with is the wifi solution. The Orbi is a great mesh system for the consumer, but I find it a little lacking from my somewhat more knowledgeable perspective. The big thing that’s missing is support for virtual LANs. I’d like to have three wifi networks: one for guests; one for things, like printers and smart speakers and the thermostat; and one for trusted devices, like personal laptops and phones. VLANs would make this possible by putting the three wifi networks on separate VLANs with separate routing and firewall rules to keep traffic out of the home network if needed. Commercial wifi gear like Ubiquiti is all about that stuff, and if I hadn’t purchased the Orbi stuff relatively recently, I’d probably look into some of those commercial access points. Maybe I’ll cruise around for some used ones on eBay some time.

Home Lab Update

I’ve acquired everything I need to start my setup, and I’ve been playing with it for about a week now.

The HP server is great so far, except for the noise. I know enterprise servers are probably designed with no thought given to noise levels, but Jesus, this thing is ridiculous. During the entire minute-plus POST process, the six fans run at their maximum speed of something like 12,000 rpm. It’s loud. If I were near a rack full of them all day, I’d definitely be wearing some ear protection. Once the fans settle down to 35-40% at idle, they’re bearable, but still too loud to have on an open shelf in the office. I’ve decided to make a redneck “rack” to suspend the server and the switch vertically in the office closet. I’ll be making it this week, so I’ll be sure to post some pictures. In other news, the server turned out to take 3.5″ drives, not 2.5″, which really pleases me. I found some used 3 TB HGST SAS drives and got six of them, plus some drive trays. I’ll be doing a RAID 6 array, so I should have 12 TB of total storage while being able to recover from two drives failing simultaneously.

I got a Cisco 3560G 48-port switch to connect everything. I decided I had to update it, and killed it somehow; the flash memory appears to be wrecked. I used the web admin page to do the update, and the image might have been too large for the flash. I tried to format the memory and install a new OS over the serial connection, but had no success. So I bought another switch: the same model, but this one has already been updated to the latest supported version of IOS, and it has a one-year warranty. I won’t even be thinking about updating this one.

The SFF HP desktop as a router is coming along just fine. I dug up an old hard drive from a MacBook Pro I flipped a few years ago to use as the storage for it. At 250 GB, it should be more than enough. The system came with only 4 GB of RAM, so I ordered another 4 GB stick that should be in this week. I could probably get by on 4 GB, but why not double it for like $12? I installed an HP NC365T NIC to give that computer a total of five gigabit ethernet ports. I’ll be using the built-in port for the Proxmox admin console, then three of the four ports on the HP card for pfSense and Pi-Hole. I also need to get a VPN running, and I’d like to use WireGuard, which I may be able to do right in pfSense. If not, I’ll get a third VM going on this router box to handle VPN duties and use up the last ethernet port.

I’ll be running some wires to get wired internet and cable TV from the basement to the second-floor office. It shouldn’t be too difficult, but I’ve never run wires on the outside of a house before. Theoretically, all I need to do is add a couple holes where the electric service and cable/fiber connections enter the house, feed the wires through there, and then drill a couple holes in a wall of the office. I’m hoping not to have to terminate my own ethernet cables, because it’s a real pain. Monoprice has some outdoor-rated cables with RJ45 connectors already attached for a great price, so I’m going to try to drill a hole big enough to let the connector through. The coax cable for the TV signal is no problem, though; I’ve used compression-fit connectors on those a million times.

I’ll be ordering all my cables early in the week, so hopefully I can get drilling on the weekend.

Starting the Home Lab

A little while ago, I stumbled upon the r/homelab subreddit. There, users gather to discuss their home network setups, which they often use for experimentation, like a laboratory. I finally have enough money to put together a modest setup, so I ordered some pieces this week to get started.

I’m currently taking classes to get a degree in computer networking. One class is an introduction to computing class that has us using virtual machines for something. Another is a networking class that should prepare me for the Cisco CCENT certification exam. I’ve become a lot more interested in the subjects thanks to the classes. At the same time, my home network needs have changed and I could use some more power and storage.

Enter the homelab. Following some guidance from the r/homelab wiki, I decided on a basic setup. For hardware, I decided to get an HP DL380e Gen8 server, a Cisco Catalyst 3560G 48-port switch, and an HP ProDesk desktop. The plan is to use a hypervisor (probably Proxmox) on the ProDesk so it can act as a pfSense router, Pi-Hole ad blocker, VPN (hopefully with WireGuard), and reverse proxy (probably Caddy) all at the same time. This router will be connected to the Cisco switch, where I may set up some virtual LANs. I’ll have to see how everything works together.

I got the DL380e as a barebones unit, so the specs were up to me. I decided to go with dual Xeon E5-2450 processors, and I’ll be getting 48 GB of RAM (the maximum is 384 GB); both are pretty cheap options. I think I found a good deal on some 1 TB 2.5″ SAS hard drives, so I will probably start with six of them in a RAID 6 configuration and add more as needed. I’m not totally certain about this though.

The server will be home to a few virtual machines. I’ll be moving my TV tuner card there, so there will be at least one Windows 10 VM. I’m also going to be running my normal website from there, and I’ll probably run an OctoPrint setup so I can control my 3D printer, so I’ll need at least two Linux VMs for that. I’ll probably also throw in one for Arch Linux, just because I like to tinker with it. Beyond that, I’m not sure what else I’m going to do with it yet. Maybe host some game servers for Minecraft or get a media library going. I’d really, really love to put my spare RX 480 graphics card in it and stream some games. This is totally possible, but I need a very specific PCI riser card to fit a double-slot graphics card, plus a power wire adapter and maybe an extra power supply. We’ll see how it goes, and maybe I’ll try to track down the special riser in the future.

There’s no real purpose to doing this other than that I want to. The experience with Cisco networking and virtual machines might help me get a job some time in the future. It’s not super expensive, at least, and it’ll be fun to have a ton of computing power at my disposal.