My Homelab Hardware Journey

The Beginning

Even before I called it a “homelab,” I found uses for home servers – mostly to replace multiple external USB hard drives. Although I grew up with Mac desktops, I became a laptop user in high school. Imagine being able to access my data from anywhere using my PowerBook G4! Or even better, being able to have tasks running without tying up my main computer. I bought a used Power Mac G3 tower (I later upgraded it to a G4 tower), removed the optical and Zip drives, then added ATA/IDE expansion cards and additional hard drive bays. That worked well for a few years. As a repair technician, I even used this method to build NetBoot and DeployStudio servers at work.

At some point, I decided to complicate things a bit more. I picked up a liquid-cooled Power Mac G5, a Sonnet Tempo E4P eSATA card, and a couple of eSATA drive enclosures. Port multipliers were such cool technology. I later moved that to a secondhand Mac Pro.

Mac Pro tower with two eSATA enclosures (March 2010)

Consolidation

After that, I consolidated everything onto a brand new 2010 Mac Pro – the idea was that it’d be my primary Mac, my gaming PC (booting to Windows via Boot Camp), and my file server via the eSATA enclosures.

Transferring data from 2007 iMac to 2010 Mac Pro tower with two eSATA enclosures (August 2010)

After a couple of years of that, I realized consolidation had too many drawbacks – for example, if I spent several days at a time playing Borderlands in Windows, I couldn’t easily browse the web, check my email, or access my storage without rebooting into macOS. I needed to split things up.

Un-Consolidation

First, after a lot of research, I purchased a Synology DS1815+. Although I had dabbled with RAID on macOS, this was much more stable – SHR and SHR2 meant that if a drive failed, I could remove it and replace it with no data loss. In addition, I could access my storage via SMB, as well as Synology’s included apps. The OS, DSM (DiskStation Manager), is Linux-based – built on top of BusyBox. After a couple of years, I bought a DS1817+, and kept the DS1815+ for backups.

My basement homelab (October 2020)

I also built a gaming PC from discarded parts. Through that experience, I learned that most games don’t demand a lot of CPU or RAM – a fast SSD and a decent GPU are generally enough. I connected it to my TV via HDMI, then used Steam’s Big Picture mode and a Steam Controller to play games from my couch. Finally, I was able to downsize my Mac to a MacBook Pro, then a Mac mini.

After becoming familiar with running Docker containers on my Synology NAS, I hit yet another ceiling – the Intel Atom processor just couldn’t keep up with the number of containers I had accumulated. In fact, Synology’s Docker UI eventually refused to load because of the number of containers, so I had to manage Docker entirely through the command line.
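For anyone curious what that looked like in practice, this is roughly the kind of thing I was doing over SSH – the container name and compose file path here are just placeholders:

# List the running containers and check their resource usage
docker ps
docker stats --no-stream

# Restart a single misbehaving container
docker restart plex

# Recreate everything defined in a compose file after editing it
docker-compose -f /volume1/docker/docker-compose.yml up -d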

Application Server vs. File Server

In 2021, I obtained a Dell PowerEdge R720 for learning VMware ESXi and vSphere. At the time, there was a strong homelab culture at Saint Joe’s, so we traded ideas and helped each other learn new skills. Matt Ferro (Mateo) helped me configure ESXi, as well as iDRAC for Lights Out Management. While I kept my data on the Synology DS1817+, I moved Docker to an Ubuntu VM on the Dell, which increased performance considerably. I used NFS and autofs to keep things working seamlessly. I bought some plastic shelving at Home Depot that was wide enough to accommodate the R720, but I was never comfortable with how much it swayed (though it never collapsed, thankfully). I also repurposed a few Mac minis for AutoPkg, App Store caching, and uptime monitoring.
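For reference, the NFS / autofs piece on the Ubuntu VM was minimal – something along these lines, where the mount point, map file, and NAS hostname are placeholders rather than my actual values:

# /etc/auto.master – hand the /mnt/nas directory over to autofs
/mnt/nas  /etc/auto.nas  --timeout=300

# /etc/auto.nas – mount each Synology share on demand over NFS
media    -fstype=nfs4,rw  synology.local:/volume1/media
backups  -fstype=nfs4,rw  synology.local:/volume1/backups

After restarting autofs (sudo systemctl restart autofs), the shares mount automatically the first time anything touches those paths.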

My basement homelab (September 2021)

After a couple of years, I realized I had outgrown both the R720 and the DS1817+. Three separate systems (ESXi, Ubuntu, and Synology’s DSM) made patching difficult – I had to take things down and bring them back up in a certain order, so it couldn’t be fully automated. In 2014, the Synology NAS’s 8 bays seemed limitless, but a decade later I was almost out of disk space. I calculated that replacing half of the drives wouldn’t be worth the cost for the amount of disk space I’d gain. I really just needed more bays, so I could buy cheaper drives. It’d be smarter to put that money towards a new build instead.

The Redesign

I started off with the approach that I’d buy a rack and mount everything in there. When I looked at cases, I found some that could hold 30+ drives! The idea of being able to buy so many cheap drives was enticing. However, those cases are huge and heavy, and it could be hard to access the disks if I needed to swap one out.

I also had to decide whether I was going to use Unraid or TrueNAS. I had dabbled with TrueNAS back when it was called FreeNAS, but had a couple of bad experiences on the forums with a (now deactivated) moderator, so I didn’t have fond memories of the project. On top of that, I used the software during a period when it suddenly received a major redesign, and I was frustrated for a while as I tried to figure out where everything had moved. On the other hand, I’d heard nothing but good things about Unraid, and I wanted an OS that made it easy to expand my disk array or replace failing drives. TrueNAS’s ZFS support sounded great, but I couldn’t tell if the OS would be flexible enough for my Docker requirements. It really helped that the LinuxServer.io crew frequently recommends Unraid in their Docker image README files.

I posted to the Unraid Discord server about buying another old server, and received strong feedback that I should consider building things myself instead. Mateo suggested I build a “proof of concept” Unraid server, just to see how it works. I had a spare PC tower lying around, so I installed Unraid on a USB stick and experimented with the OS. It was very easy to get up and running, and seemed to do what I needed without much modification. This could definitely work.

Building the New Server

I remember reading a few years ago that John Carmack has an interesting approach to developing games – it takes years to build a game, but he wants the game to require cutting-edge technology when it’s released. To do that, he has to plan for hardware that doesn’t exist yet.

For computing projects like this, I’ve found that if I spec to my current needs, I’ll outgrow the build faster. On the other hand, if I spec more than I need, I’ll find new use cases that push my setup farther than I had originally planned. My goal was to make this server as future-proof as possible.

While gathering ideas, I searched PCPartPicker for Unraid builds. I found a couple of excellent ones that really helped shape my project. One was also local to the Philly burbs and mentioned the nearby Micro Center. The timing was excellent, as they were having a sale on motherboard / CPU / RAM bundles for gaming PCs. I hadn’t anticipated that I’d use an Intel i9 processor here, but I was replacing dual Xeon processors, so I had hoped the difference in age would make up for any performance gaps. Later, I found a benchmark website that confirmed my hunch. Not only is the i9 more powerful, but it also supports Intel’s Quick Sync, so video transcoding tasks could be offloaded onto the built-in GPU.

Another build mentioned the Fractal Design Meshify 2 XL case, which is surprisingly flexible. Things were starting to come together. This case holds sixteen 2.5″ or 3.5″ drives, with room for two more 2.5″ drives mounted to the back. While that’s not the 36 bays I was originally hoping for, it’s still more than I’d actually need. I ended up using both 2.5″ bays on the back, and eight of the sixteen 2.5″/3.5″ bays. B&H is located in New York City, so shipping the case and extra drive carriers was fast and convenient.

Since the motherboard had slots for M.2 SSDs, I added a few as a cache pool in Unraid, speeding up access to recently added files (the “mover” task seamlessly offloads them to the disk array overnight). I had put together something similar on my Synology NAS, but it required manual work – Unraid’s automated approach is significantly better.

Lastly, I bought new shelves. These shelves are incredibly sturdy and have a very clean look – I highly recommend them. I even added a $20 monitor from Facebook Marketplace!

My basement homelab (January 2024)

Please take a look at my completed build, which includes part links, prices, and pictures. Overall, I’m very happy with this setup, and hope it’ll last for years to come!

Smart Home, Part 3

I can’t believe it’s been almost 3 years since I wrote about this! Things have settled down a bit, so I figured I’d post another follow-up.

First, I built Mike and Joyce’s Smart Home Inventory if you’d like a quick glance at what we have installed. I’ll keep it updated as we make changes.

Things are mostly stable here, though. Apple has spent the past couple of years working on Matter. The Home app was redesigned, but most of the changes have been behind the scenes. I’m hopeful we’ll see more improvements as Matter matures.

Almost two years ago, our router (a Synology RT2600ac) added support for multiple VLANs, so I added a separate network specifically for smart home devices. This allowed me to broadcast dedicated 2.4 GHz and 5 GHz SSIDs for this network, since many smart home devices are only compatible with 2.4 GHz. Eventually, I hope to firewall it off from the rest of the network, but I’m not sure how I’m going to accomplish that with the Apple TVs needing to connect to both this network and the primary VLAN.

Plugs and Bulbs

We’re still very happy with Philips Hue! We use Adaptive Lighting extensively, and the HomeKit integration is top notch.

Here’s one great feature that was added recently: we have a Philips Hue Dimmer Switch in the bathroom, which controls three bulbs. Philips recently added the ability to do time-based lighting, which I absolutely love. Here’s how it works: when you turn on the lights, you can set a different brightness based on the time of day. For the bathroom, we have bright lights during the day, warmer lights in the evening, and very dim lights from bedtime until sunrise. Of course, if you need to override that, you can easily use the buttons to change the brightness. Next time you turn on the lights, it’ll go back to the defaults. It’s really nice to use the bathroom at night without being blinded!

Joyce wrote a Python script for rotating the light strips in our front windows through Christmas colors. We’re planning to expand that to other holidays, too. Once I’ve got that Dockerized, I’ll post here!
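In the meantime, here’s a rough sketch of the general idea using the Hue bridge’s local REST API – this isn’t her script, just an illustration of the API calls involved, and the bridge IP, API username, and light IDs are placeholders:

#!/bin/bash
# Rotate two light strips between red and green (placeholder IDs and credentials)
bridge="192.168.1.2"             # Hue bridge IP
user="replace-with-api-username" # API username created on the bridge
lights=(7 8)                     # IDs of the light strips
colors=(0 25500)                 # Hue values: 0 = red, 25500 = green

while true; do
  for color in "${colors[@]}"; do
    for light in "${lights[@]}"; do
      curl -s -X PUT "http://${bridge}/api/${user}/lights/${light}/state" \
        -d "{\"on\":true,\"sat\":254,\"bri\":200,\"hue\":${color}}" > /dev/null
    done
    sleep 30
  done
done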

We also phased out all of our WeMo devices. It’s unfortunate, because Belkin / WeMo was one of the first companies to do this kind of thing, but they haven’t kept up with the times. For one thing, they’ve all but dropped support for our light switches, dimmer switch, and smart plugs – I had a lot of difficulty adding them back to HomeKit once they were wiped. For the ones that support HomeKit natively, I couldn’t get the WeMo app to recognize them for firmware updates. They initially announced Matter support, then backed out. It’s time to move on.

While TP-Link has been hit or miss over the years, their Kasa line of smart home products is really good. We bought a TP-Link / Kasa dimmer switch and several smart plugs to replace the WeMos. I’m impressed that they already support Matter, are easily added to HomeKit, are very stable, and reliably (and automatically) update their firmware through the Kasa app.

As mentioned in Finding Balance While Working Remotely, we added Nanoleaf Shapes to our home offices. We’ve found that although they support HomeKit, it’s best to just control the lights through the Nanoleaf app. The LED panels have added a lot of light and color to our home.

Also, it’s not exactly a plug or a bulb, but we have several sensors for triggering lights throughout the house and in the garage. I’ve been very impressed with the Philips Hue Motion Sensor – we have one in our stairwell that gets triggered multiple times a day. I can’t remember the last time I changed the batteries, but it’s been at least a year or two. They’re tiny, have a magnet on the back for sticking to surfaces, and you can drive a screw through a hole in the magnet to mount one anywhere. More devices like this, please!

We also have a few Eve Motion sensors, but batteries don’t last nearly as long in those. At this point, I’m leaning towards replacing those with more Philips Hue sensors instead.

Water Leak Detectors

We still have the Flo by Moen water leak detectors. Although we haven’t had any incidents, we sleep better at night knowing that we’ll be alerted if something were to happen.

In fact, we purchased the Flo by Moen smart water shutoff shortly after the last blog post. It’s easy to integrate with the water leak detectors – for example, if the toilet exploded, the water leak detector would screech, the app would send a push notification to our phones, we’d get an email and phone call, and Moen would instruct the shutoff in the basement to stop the flow of water to the rest of the house. Water leaks have the potential to do significant damage, but this setup minimizes the effects.

When we go away on vacation, we set the Moen shutoff to “away” mode, which means that any water used will trigger an alert, shutting off the water for the whole house in the process. Our homeowners’ insurance gives us a yearly discount for having the system installed, and we can download a certificate from the Flo by Moen web console.

The water shutoff also has its own logic to determine if you’re using an unusual amount of water, but pairing it with a water leak detector is significantly more accurate. Our humidifier consistently tricks the water shutoff into thinking we have a leak somewhere, and I’ve had the water turn off during a shower too many times to count.

If you’re thinking about buying the smart water shutoff, the extended warranty is absolutely worth the monthly cost. Ours stopped working due to mineral deposits building up inside, and support sent me a replacement part right away.

Unfortunately, the water leak detectors chew through batteries a little too quickly for my liking. The app shows the battery at 100%, then suddenly it’s dropped to 40%, then it’s offline. It’s impossible to know when we’ll need to replace a battery, making them way less useful when they die while we’re traveling.

One other downside: the Flo app hasn’t been updated by Moen for a very long time. There are numerous improvements that Moen could make to modernize the app, such as Time Sensitive Notifications, Siri support, or allowing multiple users on the account. At the moment, Moen only supports a single phone number for emergency calls, which makes it tough for the two of us to respond to notifications quickly.

It looks like Moen is building a brand new app, but it’s not compatible with our Flo devices yet. I hope they’re working to add the improvements I mentioned above. I’d love to use more of these kinds of devices in the future. The smart shower controls look awesome, and they have a very fancy toilet…if only it sounded like Jon Hamm.

Cameras

I’m hoping to see some improvement in this space soon. We’re still with Arlo, but their support is absolutely terrible. In the past few years, we’ve replaced all of the Arlo cameras a couple of times, hoping for better stability, but it hasn’t quite happened yet.

Right now, we have the Arlo Pro 5S, the solar panel, the Video Doorbell, and the Chime 2. We replaced everything because newer Arlo devices connect directly to WiFi, rather than to the old wired base station, so I figured they’d be more reliable. Instead, one (or all) of the cameras will randomly stop recording until it’s rebooted through the app. It’s not great.

Arlo promised that the Pro 5S cameras would gain HomeKit support in 2022, but as of now, that still hasn’t shipped. There are lots of “coming soon” promises in their forums, posted every few months (including last month). No word on whether they’ll get around to adding HomeKit support for the new Video Doorbell, either. Although I didn’t stream the camera feeds in the Home app, I liked being able to use the cameras’ sensors to trigger outdoor lights.

One camera has a solar panel connected, but it just doesn’t provide enough power to keep the camera topped off, so we manually charge both cameras as needed. In the future, I’d like to connect them permanently to power, but that’ll have to wait until we can add some outlets outside. At least the batteries last long enough that we only need to charge them every couple of months.

Every time I’ve thought about switching away from Arlo, I’ve found the competition is much worse: Ring apparently works well, but I don’t want to send our footage to Amazon. Similarly, Nest requires that we send our footage to Google. Eufy, despite being an Anker brand, had all sorts of terrible security issues (and lied about them to The Verge). Logitech outdoor cameras have a reputation for melting in direct sunlight, even on mild 70 degree days. HomeKit Secure Video cameras are limited to 1080p video, and Wirecutter found that they miss important events such as detecting people or packages. I think we’re stuck with Arlo for a bit longer.

Thermostat

We absolutely love our ecobee thermostat. It integrates with HomeKit, but the app also works well on its own. Our energy bill skyrocketed as Russia invaded Ukraine, but we’re pretty sure it’d be worse if we didn’t have this thermostat to keep things as efficient as possible.

We also bought a bunch of ecobee room sensors, which immediately paid for themselves. Having these sensors in nearly every room has allowed us to fine tune temperatures for the whole house. I can’t recommend ecobee enough.

Follow-Ups

Some things haven’t changed, but I can give some detail on how well they’ve held up over the past few years:

We’ve still got all of the Sonos speakers, though we’ve had frequent stability issues. I’ll probably need to factory reset the entire system again. When it works, it works well, but when it doesn’t work, it’s very frustrating. Their phone support is surprisingly good, however. If I had to do it all again, and we used Apple Music instead of Spotify, I’d take a hard look at the HomePod minis.

We still have our Yale / August door locks. They mostly work fine, though I can’t recommend their support team at all. The August Connect for one of our doors stopped working (this connects the lock to HomeKit), and it took weeks of emailing back and forth to determine that they weren’t going to fix or replace it. Each reply came from a different person, who’d suggest yet another factory reset.

Otherwise, it’s been nice to have a keypad on the front door. I haven’t had to use the physical key once in the past 5 years – the door unlocks automatically via Bluetooth, manually via the app, or with my PIN on the keypad. We’ve also been able to generate emergency codes for family. I can’t help but feel that HomeKey would be a downgrade, as I’d have to tap my phone or my watch to the door lock. That’s hard to do when your hands are full of groceries!

We still have the Roborock vacuums. We run them every day, and our floors are noticeably clean. Each vacuum has required a few replacement parts, which are easy to buy on Amazon. No complaints there. Newer models also mop, empty their own dustbin, and are hopefully quieter, but it’s hard to justify the cost of replacing two fully working vacuums.

The Future

I’m hopeful that Matter will bring all kinds of improvements: more devices from other manufacturers that now integrate with HomeKit, as well as new types of devices that HomeKit doesn’t currently support. More competition generally means lower prices, too.

Plus, being able to integrate devices with each other is the best part! There are so many possibilities. People have been talking about Matter for a couple of years now, and I’m looking forward to seeing it finally take off soon.

Finding Balance While Working Remotely

Alright, back to the technical stuff. Well, sort of.

Something that’s been new to me is working remotely for a company where many of my coworkers are in different time zones. Although I was fully remote at SJU for the last few years of my time there, everyone I worked with started and ended their day at around the same time. That doesn’t happen when you’re working for a global company! To have a work / life balance these days, I need to be mindful of my own schedule. Here’s how I’ve used technology to help me do that.

July 2024: I’m updating this post without making a new one, just to keep things simple. Below each section, I’ve added some additional tips that I’ve learned since originally publishing this blog post.

Focus

macOS Ventura, iOS 16, and iPadOS 16 arrived at exactly the right time for me. I had just started at DoorDash, and was already familiar with Do Not Disturb mode and using the Health app to set a sleep schedule. I’m really glad Apple gave this feature so much attention with the Fall 2022 releases.

To get started, Apple has excellent documentation for iOS / iPadOS and macOS. You have a lot of flexibility to create different Focus modes, but I’ve settled on four: Sleep, Do Not Disturb, Personal, and Off. I work from 10 AM until 6 PM Monday through Friday, so I’ve built my Focus modes around those times.

Sleep: Sleep is a good place to start, since it has to be set up in the Health app on your iPhone. Pick what time you want to go to sleep, and what time you want to wake up. On the weekends, I give myself a slightly later bedtime, and a later wake time. You can pick an alarm if you want to, but I rely on our bedroom Sonos speaker for that, instead, so I can wake up to music. 😄

I’d recommend setting “wind down” to 0 minutes. It just activates Sleep focus early, which is somewhat unnecessary.

In Settings > Focus > Sleep, you can customize a number of things. For me, Sleep is my most restrictive focus – I have a custom Lock Screen (I’m using “Astronomy,” which looks great at night), and the brightness is significantly dimmed. I only allow a few apps to send push notifications – mostly ones like 1Password, in case I need an MFA code. I also made a Home Screen page composed solely of apps I’d need if I woke up at night or was getting ready for bed, plus shortcuts for actions such as the “good night” scene in Home or quickly creating a new to-do item in OmniFocus. I filter out my work email, too. Lastly, all badges are disabled.

Do Not Disturb: I want this to activate at 10 PM on weeknights, and 11 PM on weekends, well ahead of my actual bedtime. The end time doesn’t matter, since Sleep focus will take over. This is my own “wind down” time, where all notifications are silenced (again, except for apps like 1Password). I have a custom Lock Screen here too, so I can tell at a glance that I’ve activated Do Not Disturb. I picked an excellent wallpaper from Wallaroo and set it to greyscale, taking a colorful beach scene and turning it into a snowy evening. I also filter out my work email here, so I only see my personal email.

Personal: For obvious reasons, this is my favorite. I have a custom Lock Screen with a picture of my wife. It activates at 6 PM each weekday, but also in the mornings – my wake up time is at 9 AM, so it also covers from 9 AM until 10 AM (so I’m not hit with work emails as soon as I get out of bed).

Off: This is what’s in place during my work hours. “Off” is simply no focus activated – the default behavior for an iPhone. Since I manage Macs, I have an Apple-themed Lock Screen and Home Screen. All email accounts are shown in a unified inbox, and no notifications are silenced. I experimented with creating a “Work” focus, but for my purposes, it was kind of overkill to create a separate focus just for that.

Off Lock Screen

Outside of those schedules, I’ll frequently toggle Do Not Disturb during the work day if I’m joining a Zoom call and don’t want to be distracted by notifications. When I’m on vacation, I manually toggle Personal on, so I don’t see any work emails. I used to fully remove my work account from my phone while on vacation, but this is significantly easier!

One of the best additions to macOS Ventura is that you can add a menu bar icon for Focus mode, allowing you to quickly switch to a different mode. All of your iCloud-connected devices will instantly adopt the same mode.

July 2024: Also, while you’re configuring notifications, disable your email “ding” noise. It’s more disruptive than you probably think. I leave the notification banners in place, but the noise pulls me out of flow unnecessarily.

Slack

Slack has an excellent guide to configuring notifications. I set my work hours in there, so I don’t receive any notifications in my off-hours. Coworkers can still push DM notifications through if it’s an emergency, but otherwise, it’s all silenced at the end of the day.

One additional consideration: since I have both my work Slack and the Mac Admins Slack on my phone, I found that I was still seeing badge notifications for DMs on my work Slack, even in my off-hours. This became hard to ignore, so my solution was to disable badges for Slack on iOS altogether. For similar reasons, I don’t have my work Slack on my home computer, as I found myself checking work notifications in my off-hours just to clear the badge.

July 2024: My colleague, Sam Keeley, offered an even better suggestion: install the Slack EMM app for my work Slack instance, and use the standard Slack app for the Mac Admins Slack. He also suggested removing Slack EMM’s Home Screen icon while leaving the app installed, so I can still open it via notifications (or Spotlight).

This also gives me the flexibility of blocking work Slack notifications with Focus mode, which has been great for when I stop for lunch and temporarily switch back to Personal focus.

As with the mail “ding” noise, I’ve also disabled all of Slack’s notification sounds (except for incoming calls via huddles).

Google

You can set your work hours in Google Calendar, too. My main recommendation here is to pad the time – in my case, I set my work hours from 10:30 AM to 5:30 PM. That gives me 30 minutes at the start of the day to catch up, as well as 30 minutes at the end of my day to wind things down.

Note that I’m not signed into my work email on my personal computer, and I’m not signed into my personal email on my work computer. However, I am signed into all of my calendars on both computers and my phone – this prevents me from double-booking events and makes it easy to block time on my work calendar as necessary.

July 2024: I’ve found it important to also schedule regular “meetings” for lunch and regular breaks. I shouldn’t be sitting for more than two hours at a time, so I’ve got items on my calendar (which can be moved if necessary) so I’m not scheduled in back-to-back meetings all day.

Smart Home

I’m extremely lucky to have my own home office – that was one of the reasons we bought our house in the first place. Even though that’s where I work from during the day, it’s also where I keep my personal computer and video game systems. I typically spend a lot of non-work time in my home office.

We picked up some Nanoleaf Shapes LED panels on sale a year or two ago, and I’ve grown really attached to them. I made an ugly fish with big teeth! They provide a lot of great light, but since they’re so customizable, I’ve set them to change on a schedule:

9:30 AM (30 minutes before I start work): Be Productive

6:00 PM: Jungle

10:00 PM (or 11:00 PM on the weekends): Starlight

This helps provide visual signals when my day has changed. The moment the panels go from light blue to green, I know my work day is over. Since Nanoleaf supports HomeKit, I also have the panels turn off as part of the “good night” scene when I go to bed.
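As an aside, the panels also expose a local REST API (Nanoleaf’s Open API), so the same scene changes can be scripted. Here’s a minimal sketch, where the IP address, auth token, and scene name are placeholders:

#!/bin/bash
# Switch the Nanoleaf panels to a named scene via the local Open API (placeholder values)
panel_ip="192.168.1.50"
auth_token="replace-with-auth-token"   # generated by holding the power button, then POSTing to /api/v1/new
scene="Be Productive"

curl -s -X PUT "http://${panel_ip}:16021/api/v1/${auth_token}/effects" \
  -d "{\"select\": \"${scene}\"}"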

July 2024: Shortly after I published this post, I decided that weekends should be special – I wanted my office to have a different feel on Friday and Saturday evenings, and during the day Saturdays and Sundays. I added additional schedules with completely different color schemes, so I can fully relax on the weekends.

Also, I added a HomeKit scene that dims every light in my home office 10% further at midnight, so I can more easily wind down for bedtime. That’s worked very well.

Conclusion

If you’re working remotely, I hope this helps give you ideas on how you can use technology to have a better work / life balance. It’s certainly helped me!

Making History

I’ve been super lax about posting here. I wrote something last week about Mr. McCormick’s retirement from the Historical Society of Riverton. I’ve copied it below:


Hi everyone –

Mike Solin here. It’s hard to believe this is actually the first time I’ve posted! If you’ve been reading this website for a while, you might have caught mention of me throughout the years. Though Mr. McCormick has been posting this entire time, I’ve handled the technical aspects – silently keeping things running in the background, but also, doing my best to implement any site improvements that Mr. McC has requested.

I want to direct your attention to the latest edition of the Gaslight News, which Mr. McC published at the end of June. Amongst other items, it contains a farewell from outgoing HSR President Bill Brown, a recap of the HSR Awards Night, and an article about Ada E. Price coauthored by Patricia Smith Solin (my Mom!).

With so much packed into a single issue, you could be forgiven for not scrolling towards the end. However, you’d miss Mr. McC’s announcement that he’s retiring from the Historical Society of Riverton as of July 1st, 2023. His bio now reads:

Teacher at Riverton School 1974-2019, author, amateur historian, Historical Society of Riverton Board Member 2007-2023, newsletter editor 2007-2023, website editor 2011-2023

I’ve been very, very lucky to have such a partnership with Mr. McCormick. At Riverton Public School, he was my 5th grade teacher, tag teaming with Mrs. Dechnik. Between the two of them, they covered nearly every subject. During my 7th and 8th grade years, Mr. McC moved up to the third floor, and taught History. Mr. McCormick helped me develop an appreciation for both science and history, and Mrs. McCormick fostered my love for technology in the computer lab.

Years later, in spring 2010, both HSR President Gerald Weaber and Gaslight News editor Mr. McC reached out to my Mom for help with a revamp of the website. Having already built the first website for Riverton Public School, she had recently rebuilt the Riverton Free Library’s website, and they were seeking her experience. Here’s a fun email from that era:

From: John McCormick
Date: Mon, May 17, 2010 at 11:24 AM
To: Pat Solin

Hi, Pat

I hand delivered most of the Gaslight News issues myself. I’m glad that I won’t have to think about that again until August. Gerald has been so busy with his new job that I seldom see him. I jogged his memory about the website last night when I emailed him asking if he’d be able to post the pdf file of the most recent issue. I was just perusing the HSR website and thinking what a huge undertaking it will be to re-do that job. I am available whenever things quiet down for you. Say when, and I will come with mass quantities of files.

John

“Huge undertaking,” eh? He wasn’t kidding.

Months later, on the Fourth of July, Mr. McC stopped by my parents’ house to discuss the website redesign project. Realizing the complexity of the website that the Historical Society required, I volunteered to build something brand new with WordPress, a free and open source publishing tool. We spent months on the first iteration – uploading old Gaslight News back issues, building photo galleries, and more. By January, we’d have a fully redesigned website. In February 2011, we held a meeting to discuss the new website at the Riverton Public School library.

In addition to the many functional improvements associated with the new website, we brought the cost of running everything down from $99/year (what Homestead charged) to $0/year (thanks to the continued generosity of DreamHost). At that time, we also launched our Facebook page, which has helped keep us connected with the community at large.

Of course, Mr. McC hasn’t been “sitting around, eating bonbons” (as he’d put it) since this website launched in 2011. Besides writing 587 posts, he’s also produced numerous editions of the Gaslight News, scanned too many postcards and photos to count, designed and printed custom mugs, and so much more to support Riverton history.

I’m immensely proud to have worked with Mr. McCormick on this “huge undertaking” for the past 13 years. Please join me in expressing appreciation for all that he’s done for the Historical Society of Riverton for nearly two decades!

Some personal news

It’s been a while since I’ve posted anything non-technical here, but I have some news! I’m excited to announce that next week, I’ll be joining DoorDash’s IT team! I’ll be working as a Client Platform Engineer, helping to manage all of the devices. I seriously can’t wait!

I was at Saint Joseph’s University for almost nine years, and I couldn’t be more proud of the work I’ve done there. My CIO, Fran DiSanti, sent this to everyone in the Office of Information Technology (and gave permission for me to repost it publicly):

Hello Colleagues,

Many of you may already know that Mike Solin will be leaving SJU this Friday, October 21 to pursue a new job opportunity as a Client Platform Engineer with DoorDash.  Mike is very excited to be joining a new team of client engineers which has been taking shape at DoorDash for the past year.  I’m confident that Mike will do great things for DoorDash just as he has for SJU over the past 9 years. 

Mike started as Technology Support Specialist in OIT and over time was promoted to his current role as Senior Client Platform Engineer.  Throughout his tenure in OIT, Mike has contributed much to our organization and to the University community. He completely reimagined and reengineered the way that we manage our macOS and iOS environments.  When he started at SJU, Mike envisioned a zero-touch, modern approach to device management and he successfully realized this vision by delivering an out-of-the-box deployment experience for Mac users.  His approach was secure, highly automated and allowed users to select and install pre-packaged apps from a software catalog.  In addition to his Mac expertise, Mike became very proficient with our Windows environments and Active Directory.  

Mike has been instrumental in the design, development and deployment of a number of strategic technologies which have had a significant impact on the way in which we manage our endpoint devices, including:

  • An automated data-backup solution (Code42)
  • Endpoint detection and response software (Malwarebytes)
  • Adobe Creative Cloud implementations
  • Mobile Device Management software (Workspace One)
  • Computer encryption
  • Microsoft Azure environment
  • Automated delivery of iPads to users

Clearly, Mike has made many important contributions through the years and along the way, he has continually developed his knowledge and skills.  I am truly grateful for all that Mike has done for our division and the community.  Please join me in thanking Mike and wishing him well in the next chapter of his chapter. 

Fran

Prior to joining SJU, I had moved from Philadelphia to State College, PA, then Richmond, VA. Being a Mac admin is very specialized, and at the time, remote work wasn’t as common as it is now. I missed my family terribly, and regularly used all of my vacation time to drive back for visits. I was incredibly lucky that the opportunity at Saint Joe’s opened up – it brought me home.

Moreover, it gave me the chance to work with a great team. One of the best things about working at SJU was that nothing was off-limits – I was encouraged to learn anything that interested me, and to use those skills to make things better for the university. When the position is posted, I absolutely recommend applying.

Going forward, I’ll still be local to the Philly area! I’m still involved with Greater Philadelphia Mac Admins, and plan on continuing to post to this blog, present at conferences, and participate in the MacAdmins Slack. 😄

Controlling Munki via Workspace ONE and Active Directory

I got something working recently, and I thought it was interesting enough that it’d be worth sharing.

Our MDM server is a SaaS instance of Workspace ONE UEM, and we have the AirWatch Cloud Connector installed in an on-prem VM to provide integration with Active Directory. Although WS1 bundles its own (modified) version of Munki, we don’t use it – we have a separate on-prem VM for our vanilla Munki server.

Unfortunately, this post is partially about printers (sorry). The challenge with setting up LPD printers on macOS is that the drivers need to be installed before the printer is added (otherwise, the printer is added with a generic driver and must be removed and reinstalled). This is an excellent use case for Munki, as the requires and update_for pkginfo keys are perfect for expressing dependencies.
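As a quick illustration (the item names here are hypothetical), the printer’s pkginfo can require its driver, so Munki always installs the driver first:

<key>name</key>
<string>MyPrinter</string>
<key>requires</key>
<array>
    <string>MyPrinterDriver</string>
</array>

(update_for works in the opposite direction, marking an item as an update for something that’s already managed.)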

For several years, I used Graham Gilbert’s printer-pkginfo script to deploy printers with Munki. That, combined with my NoMAD group condition script, allowed me to deploy printers only to certain people’s devices – their AD user accounts needed to be members of a particular group.

With macOS 12.3 dropping Python 2 from the OS, I needed another solution. I landed on wyncomco’s fork of Nick McSpadden’s PrinterGenerator script. It works well, but with our move from NoMAD to Jamf Connect, how would we be able to leverage our AD groups to deploy these printers?

Thanks to the AirWatch Cloud Connector, I was able to add the AD security group to WS1 (in Accounts > User Groups > List View). The group in WS1 syncs periodically with AD, so users added to the AD group will appear in the WS1 group after a few hours.

In my case, though, I needed a Smart Group (sometimes called an “Assignment Group”) to actually make use of the user group. In Groups & Settings > Groups > Assignment Groups, add a new Smart Group where the first criterion is the Organization Group that contains your devices. Scroll down to User Group, and select the group you’re syncing from AD. Name your Smart Group and click Save.

The last piece was how I’d get the printer to these users. Around the same time, VMware added the ability to run scripts through Workspace ONE. I had remembered Nick McSpadden’s post about Local-Only Manifests in Munki, which was perfect for this. I’d set up a separate manifest for WS1 to write to, and Munki would install the printer driver and the printer automatically.

First, in your Munki configuration profile, add this:

<key>LocalOnlyManifest</key>
<string>LocalOnlyManifest.plist</string>

This tells Munki to check the additional manifest for items to install. There’s no need to create the file – if it doesn’t exist, Munki proceeds as normal, without printing any warnings or errors.

Lastly, add this script to WS1 (in Resources > Scripts), and assign it to your Smart Group. Set the language to Bash, and the execution context to System.

#!/bin/bash

defaults="/usr/bin/defaults"
grep="/usr/bin/grep"
manifest="/Library/Managed Installs/manifests/LocalOnlyManifest"

# Check whether the Munki item is already listed in the local-only manifest.
# If the manifest doesn't exist yet, defaults prints an error we can safely ignore.
printer_installed=$(${defaults} read "${manifest}" managed_installs 2>/dev/null | ${grep} "MyPrinter")

# Add the item only if it isn't there already; Munki installs it on its next run.
if [ -z "${printer_installed}" ]; then
    ${defaults} write "${manifest}" managed_installs -array-add "MyPrinter"
fi

exit 0

In my case, I have it run immediately upon device enrollment, as well as when the network interface changes. The script checks whether the Munki item MyPrinter is in the LocalOnlyManifest, and adds it if it isn’t. The next time Munki runs a background check, it will install the driver and the printer automatically.
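For reference, the manifest the script builds is just a tiny plist. Conceptually, it ends up containing something like this (defaults may write it in binary form, which Munki reads just fine):

<key>managed_installs</key>
<array>
    <string>MyPrinter</string>
</array>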

The end result is that when a user requires our printer, any AD admin can add the user to a particular group. Some time later, the user will receive the printer without needing to do anything. If the user already has our printer, but receives a new computer, the printer will be added as soon as the computer is set up – no additional admin work necessary.

I hope someone finds this useful for more than just printers!

MunkiReport in Azure

Following up on my last post – up until a couple of months ago, our production MunkiReport server was running Windows Server 2012 R2. Yep, MunkiReport was running in IIS, and MySQL was installed in the same VM. The server was about 8 years old, and while it had served us well, it was time to migrate to something more modern.

As we’re pushing to move more stuff into Azure, and containers are the future of these types of deployments, I spent a bunch of time figuring out how to get MunkiReport running as a Docker container in Azure. Even better: I automated it, so you can do it, too!

Please check out my GitHub repo for the script.
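If you just want a feel for the end result before digging into the repo, here’s a deliberately simplified sketch using the Azure CLI. This is not the actual script – the image name, resource group, and environment variables below are assumptions/placeholders, and the repo is the source of truth:

#!/bin/bash
# Rough illustration: run a MunkiReport container in Azure Container Instances
az group create --name munkireport-rg --location eastus

az container create \
  --resource-group munkireport-rg \
  --name munkireport \
  --image ghcr.io/munkireport/munkireport-php:latest \
  --dns-name-label my-munkireport \
  --ports 80 \
  --environment-variables \
      CONNECTION_DRIVER=mysql \
      CONNECTION_HOST=my-mysql.mysql.database.azure.com \
      CONNECTION_DATABASE=munkireport \
      CONNECTION_USERNAME=munkireport \
  --secure-environment-variables \
      CONNECTION_PASSWORD=replace-me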

We’ve had this running in production for a couple of months now, and it’s averaged out to about $4.60/day for ~700 clients.

Due to some upcoming life changes, I’m not sure how much further development the script will receive from me. I intend to add some documentation, but there are definitely improvements that could be made (such as migrating to an ARM/Bicep template, or making some portions of the script optional). Please check out the script and let me know what you think!

MacDevOpsYVR 2022 Workshop

It’s been really quiet here, but that’s because I’ve been busy!

For starters, I participated in a workshop in June for the consistently excellent MacDevOpsYVR conference. We discussed various ways of deploying MunkiReport. I strongly encouraged everyone to take a look at Docker!

Many, many thanks to Mat X for inviting me to share my experiences, and for his skillful editing of the video recording.

My diagrams are included in the video, but I’m posting them here for posterity. 😎

More to come on this topic!

Modern Bootstrapping Presentation

I had the honor of presenting at the University of Utah’s May 2021 MacAdmins Meeting this week.

The slides and video are already up – check them out here!

Modern Bootstrapping: Part 2 (Building the Packages)

This is the second post in my multi-part series on modern bootstrapping with Workspace ONE UEM. If you haven’t read the first one, you can find it here.
