So on the surface there is nothing wrong with your deployment model. As they say, if it works, don't fix it.
Now, Apple is trying to fix it. A common statement I've heard over and over is "why replace perfectly good bits with the same bits?" Meaning, why replace an OS and core apps with the same OS and core apps? That practice has been given the term monolithic imaging. To that end, Apple has introduced the Device Enrollment Program (DEP) and the Volume Purchase Program (VPP), giving fleet managers the ability to link hardware to their organization and thus to an MDM solution (including Profile Manager). By doing so, you can cut down the steps required to deploy your fleet by automating enrollment and thus policy enforcement and, possibly, app delivery. This is now referred to as thin imaging. But DEP is really only effective during original deployment, and VPP is ineffective when you are supporting 3rd party apps that are not available in the App Store (Microsoft and Adobe, to name two big ones in corporate deployments). Once the unit is initially deployed, you have the challenge of that nasty side effect called user data. And using only Apple's Profile Manager, all you can do is set profiles. With Apple's tools alone you have no means of delivering or managing those 3rd party apps. To really have a turn-key solution you need JAMF.
Also, as a fan of monolithic imaging, I agree that there are certain things thin imaging just cannot accomplish (or at least not without extensive engineering). For example, I want to publish a list of servers in the Connect to Server dialog box. Or perhaps I want to ensure that all users see scroll bars regardless of input device. These options are not available in the stock config profiles. You need to get into the User Template and customize (see the sketch below). For the controlled environment of a school, monolithic imaging still has a vital place (this is clearly my opinion based on experience). For example, last summer I had the opportunity to guide a college with about 400 Macs through an upgrade. The units were all running 10.7.? and there was no plan to purchase new hardware (budget issues). The objective was to upgrade everyone to 10.10 (because 10.11 was not out yet). So how to do it? I looked at the layout (campuses in two locations, multiple buildings, student-use and faculty systems, no MDM, limited resources). My recommendation was to build a monolithic image, purchase a handful of USB thumb drives, and use student assistants to carry out the reimage. I broke the firmware out of the installer and wrote a post-install script to ensure all EFI and Thunderbolt firmware was updated. With minimal training, the student assistants were able to image machines in about 20 minutes apiece (some faster, all depending on the speed of USB). In the end, this was the best possible solution because it was pre-existing gear, they wanted it done quickly, they did not want a ton of post-deployment steps, and they wanted to keep costs down without building new architecture. My solution was a hit and a clear success. The monolithic image was just what the doctor ordered and saved them over an hour per machine compared to performing in-place upgrades. Obviously, user data (except with faculty) was not a concern.
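To make the scroll bar example concrete, here is a minimal sketch of the kind of User Template customization I mean. It assumes a pre-10.11 master image (on El Capitan, SIP restricts writes under /System) and the English.lproj template path; the path and key are the pieces to adapt for your own build.

```python
#!/usr/bin/env python3
# Sketch: bake "always show scroll bars" into the User Template so every NEW
# local account inherits it. Run as root while building the master image.
# Assumes a pre-10.11 image; on 10.11+ SIP blocks writes under /System.
import subprocess

TEMPLATE_PREFS = ("/System/Library/User Template/English.lproj"
                  "/Library/Preferences/.GlobalPreferences")

# Equivalent to: defaults write <template> AppleShowScrollBars -string Always
subprocess.run(
    ["defaults", "write", TEMPLATE_PREFS,
     "AppleShowScrollBars", "-string", "Always"],
    check=True,
)
print("User Template updated; new accounts will always show scroll bars.")
```

Any account created after cloning picks up the setting; existing accounts are untouched, which is exactly why this kind of customization belongs in the image rather than in a one-off script.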
This is a topic that I've put a lot of thought and effort into. I would love to see Apple address the full lifecycle of a device better. DEP definitely solves the initial deployment challenges, and I cannot express enough how great JAMF and DEP are together. JAMF also handles the 3rd party challenge; Profile Manager does not have the same level of completeness.
I've written a bit on this and figured I would share it. Here is a full section on my thoughts. I've been thinking about this a lot for some time and come to the conclusion that monolithic remains a viable solution in many circumstances. Hope this insight helps.
Preparing systems for deployment is an exercise in time management. And as we all know, time is money. So system prep is really a battle against cost.
From receipt of the device to the user’s desk, how fast can you deploy a Mac? How fast can you deploy 10? 100? Is the configuration the same for all devices? Minor differences? Major differences? How does each change impact the time to deploy?
I am sure many readers have spent the better part of a day preparing a single machine. A few weeks later when another machine arrives, away goes yet another day.
One-off system prep is a tremendous time suck. Download OS patches or updates: 1 hour. Install and patch third party applications: 2 to 3 hours. Define all settings and establish the user’s account: 1 hour. Waiting on downloads and installers, even on the newest equipment with SSDs, can waste away a good chunk of the day. This is all time lost to a task that can be automated in a number of ways.
Rapid delivery of systems to users should be a primary objective of any system administrator. While there is the time to prep and deliver the machine, there is also the cost of the device and the workflow impact to the end user. New devices sitting on shelves or in boxes are doing no one any good. Spending a day prepping one machine or derailing a user for an entire day is equally negative. How do I prepare my systems for rapid delivery to the end user?
Ah, initial system prep is not where the discussion ends. While rapid initial delivery of the device is of paramount importance, so is redeploying reclaimed devices. When planning how to prepare your workstations, you also need to consider the entire lifecycle of the machine. After all, there is a good chance many machines will come back to you before they are ready to be retired.
The Great Debate
There are two competing philosophies on how to prepare systems. One approach is an age-old one that dates back to the days of Classic, now known as monolithic imaging. Other terms include cloning and ghosting (for the Windows converts). The new trend, supported by BYOD, JAMF, and services like the Device Enrollment Program (DEP), is thin imaging.
Monolithic imaging is the pursuit of capturing a perfect moment in time. On a given date, you will put into the image all required software, apply all the updates, define all settings, populate the user template, and snap an image that can be rapidly cloned to all other devices. Monolithic imaging requires an upfront investment in time to define the master image. However, you reap the dividends as cloning to additional devices can be measured in minutes.
Updates and patches are downloaded once. Settings are defined once. Capture the master image and clone it to all the other devices. Latest iTunes? Part of the image. Print drivers? Part of the image. This approach places a lot of responsibility on the admin but ensures initial consistency and rapid device deployment.
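To illustrate how little is involved at clone time, here is a minimal sketch of the restore step using Apple's asr. The image path and target volume name are placeholders, and the command must run as root from a boot environment (NetBoot, NetInstall, or a USB installer), never from the volume being erased.

```python
#!/usr/bin/env python3
# Sketch: restore a master image onto a target volume with asr.
# The master image should already have been scanned (asr imagescan) when it
# was built. Run as root from a NetBoot/USB environment, not from the target.
import subprocess

MASTER_IMAGE = "/Images/master.dmg"       # hypothetical path to your master image
TARGET_VOLUME = "/Volumes/Macintosh HD"   # volume to be erased and cloned

subprocess.run(
    ["asr", "restore",
     "--source", MASTER_IMAGE,
     "--target", TARGET_VOLUME,
     "--erase",        # wipe the target before restoring
     "--noprompt"],    # skip the confirmation prompt (scripted use)
    check=True,
)
```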
Thin imaging takes a different approach. The thin imaging model asks: why are you replacing perfectly good bits (the factory-installed OS) with the exact same bits? Just layer what you are missing (apps, settings, patches) on top of the default OS and call it a day. Thus, thin imaging means handing a machine to an end user with nothing added and delivering to the machine the items it requires when needed. For thin imaging to work, a product like JAMF is required (otherwise you are effectively performing a one-off build). Thin imaging requires much less upfront time and places the day-long setup responsibility on the user.
Updates and patches need to be downloaded to each thin imaged device, requiring the end user to endure potentially long downloads and a number of reboots. The management system needs to be set up very well to ensure the delivery of settings, printers, etc. This approach places a behind-the-scenes responsibility on the administrator to ensure all the settings, software, and updates are available to make the device compliant, but it also makes the end user endure what can be a lengthy setup.
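For a sense of what that layering looks like on the device itself, here is a minimal sketch using Apple's installer and profiles commands. The package and profile paths are hypothetical; in a real thin imaging workflow the management agent (JAMF, for example) performs these steps for you after enrollment.

```python
#!/usr/bin/env python3
# Sketch: the kind of layering a thin imaging workflow performs after enrollment.
# Paths are placeholders; a real MDM/agent would deliver these items itself.
import subprocess

PACKAGES = [
    "/Library/MyOrg/Office.pkg",      # hypothetical third party installer
    "/Library/MyOrg/Printers.pkg",    # hypothetical print driver package
]
PROFILE = "/Library/MyOrg/settings.mobileconfig"  # hypothetical config profile

# Install each package onto the running system (run as root).
for pkg in PACKAGES:
    subprocess.run(["installer", "-pkg", pkg, "-target", "/"], check=True)

# Install a configuration profile from a file (profiles syntax of this era).
subprocess.run(["profiles", "-I", "-F", PROFILE], check=True)
```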
The real problem with this debate is that it is not black and white. Nor is it a one-or-the-other choice. There are conditions in which monolithic imaging remains the fastest method of deploying a fleet of devices. There are also occasions where thin imaging is the most efficient method of deployment. The bottom line is that you really must support both. I’ve worked long and hard and see no way around this conundrum. I’ve tried to embrace thin imaging, but monolithic remains vital to a comprehensive workflow.
Also, starting with monolithic imaging is a great way to prepare for thin imaging. Thin imaging is all about your ability to deliver packages to users as quickly and reliably as possible. There is no better way to learn how to do this than digging into the trenches of monolithic imaging. Get your hands dirty and become intimate with the OS and your apps.
Lifecycle of a Mac
As noted above, I believe you must plan for the entire lifecycle of your Mac assets. Sure, the initial deployment is what most people and tools focus on, but what about redeployment and decommissioning? These stages are part of a Mac’s life and should be considered and planned for before the first page of the Setup Assistant is completed. Let’s examine a scenario.
Your environment has embraced thin imaging. The promise of zero-touch IT has everyone excited and your initial explorations are encouraging. You have DEP and VPP set up and working. Machines are delivered and on first boot are enrolled in your management console, allowing the on-enrollment policies to deliver everything the user needs and everything required by the organization.
You have gotten so confident that devices are shipped direct to the end user. IT does not touch the devices for prep. When the user receives the machine she follows the basics to get applications installed and DEP and VPP take care of the rest. This is great and wonderful... until the person leaves the company and the computer asset is returned to you. Now what?
Let’s follow this line of thinking. Mary Smith is hired and a MacBook Pro is shipped to her. The machine was purchased through the DEP program so it is linked to your MDM. It originally shipped with Yosemite. Mary receives the laptop, unboxes it, powers it up, creates a local account (let’s say she uses hadalittlelamb as the short name), and enrolls the device into your MDM during Setup Assistant. Your MDM delivers policy and applications and everything is great. A little over a year passes and Mary decides to resign, returning the MacBook Pro to IT.
In your lab, you unbox a corporate-owned device that contains a user account named hadalittlelamb as the primary 501 account. You have no idea what Mary’s password is and the unit is filled with Mary’s data. Since you are using DEP, Mary was the initial admin of the machine and thus installed additional software.
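When a machine like this lands on your bench, the first thing I do is see what local accounts I have inherited. A quick sketch: list the local users and their UIDs, where anything at 501 or above is a real account (hadalittlelamb will show up as UID 501).

```python
#!/usr/bin/env python3
# Sketch: list local accounts with UID >= 501 to see who "owns" a returned Mac.
import subprocess

out = subprocess.run(
    ["dscl", ".", "-list", "/Users", "UniqueID"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    parts = line.split()
    if len(parts) != 2:
        continue
    name, uid = parts
    if uid.isdigit() and int(uid) >= 501:   # 501 is the first "real" local account
        print(f"{name}  (UID {uid})")
```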
You’re corporate policy has always been to nuke and pave devices that are to be repurposed. Now you are stuck with a challenge. How do you reset the machine for the next user?
This is a simplified scenario. Obviously if you are using JAMF you have access via the service account and if you are enforcing disk encryption you can retrieve the recovery key.
During Mary’s possession of the machine it was updated to El Capitan. Ah, but if you decide to restore from Internet Recovery, you will be given Yosemite. Do you restore Yosemite and call it a day? What if your corporate policy is El Capitan? Now you must restore, create an account, and update. Or, do you make that the responsibility of the next owner of the device? Do you restore Yosemite, hand the device to the new user, and let them enroll the device back into the MDM, where a policy tells the machine to upgrade to El Capitan? This sounds like a pretty rough day for a new employee. “Welcome. Here is your machine. Complete Setup Assistant and wait a few hours for your machine to finish upgrading to El Capitan.”
If your officially supported OS is El Capitan, you are now expecting the new employee to do everything that Mary did over the last year or so on his first day. Doesn’t sound like an inviting first day on the job to me.
Here is an example where monolithic imaging could make sense. For rapid turnaround of reclaimed devices, drop a basic OS and recovery partition on the machine in a few minutes. You can save the new employee a lot of time patching and installing if the image is kept up to date.
Considering an image of about 30 GB can be cloned in about 10 minutes, monolithic imaging remains fast and consistent. To save the new employee some time, you could even include some core applications in the image. Unboxing is not the only life event for a Mac.
The Quagmire of Monolithic
Monolithic imaging is not all roses and unicorns. The trap of monolithic imaging caught many Mac administrators off guard when Apple released Yosemite 10.10.0, again when Apple released 10.10.4, and most recently with the release of 10.11.2. Embedded in these installers are firmware updates. If you cloned your fleet from a master system, the clones would not get the firmware updates. And since Apple was patching security exploits in the firmware, missing these updates was a very bad thing.
To compound the problem, Apple did not reveal details about the embedded firmware updates. This left system administrators to follow old habits. Have a fleet of machines running 10.9? Want to get to 10.10 quickly and avoid the inevitable upgrade issues? Create a monolithic image. By the time systems were cloned it was too late. This left a lot of systems imaged to Yosemite running without the latest firmware. This was evident in the Gray vs Black boot screen for 2013 and newer models that appeared shortly after the release of Yosemite.
Luckily, machines cloned before 10.10.4 got the firmware update when installing the 10.10.4 or 10.10.5 update.
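If you suspect a cloned machine missed one of these embedded firmware updates, the Boot ROM and SMC versions are easy to pull and compare against Apple's published versions for that model. A minimal sketch:

```python
#!/usr/bin/env python3
# Sketch: report the Boot ROM and SMC versions so cloned machines can be
# checked against Apple's published firmware versions for their model.
import subprocess

out = subprocess.run(
    ["system_profiler", "SPHardwareDataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if "Boot ROM Version" in line or "SMC Version" in line:
        print(line.strip())
```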
Don’t count monolithic out. For education and the annual reset of devices, monolithic remains the king. Using NetRestore to rapidly reset a lab of computers remains a suitable use for the service. Building a monolithic image to rapidly prep machines remains the fastest way to get from box to desk. The big concern remains how Apple will handle future firmware releases.
Benefits and Drawbacks
New Apple Hardware – A major drawback of monolithic images is that they don’t last forever. You may take a few days and build the ultimate 10.10.3 build. It is perfect. Everything is in there and you are cutting through your deployment project. And then... Apple releases new hardware. And that hardware will only boot 10.10.4! Now what? Rebuild your entire image on the latest hardware and recapture it? Plow the 10.10.3 image onto the disk and then run a delta or combo updater on the drive before rebooting?
Anyone buying the latest 5K iMacs is all too aware of this condition; they will not boot anything older than 10.11.1.
Your once perfect image now needs workarounds and hacks or a complete reworking. Multiply this by each time Apple releases hardware tied to the latest OS, or each time an update is posted for an included app (Flash Player, anyone?), and you can begin to guess how much work it is to keep a monolithic image up to date.
Monolithic images remain effective for targeted deployments. For example, you have a customer with 30 machines and you are looking at 11 different OS versions and endless application variation. You want to unify the environment. Monolithic imaging remains one of the fastest and most effective ways to level the playing field. Back up the user data, nuke and pave with a monolithic image, restore user data, and move on to the next machine.
If you are embracing thin imaging then you don’t care what OS Apple ties to hardware (as long as your third party tools are compatible). With thin imaging you simply layer on your apps, settings, and data and off you go.
But Apple does not have an effective thin imaging solution. You will need to seek third party tools to accomplish thin imaging effectively.
Speed of Deployment – Speed is a relative consideration for many people. While the delivery of a monolithic image may take 10 minutes, its creation may take days. There is a lot of upfront investment and the hope is that the dividends will be paid over time as systems can be rapidly cloned.
However, as stated, monolithic images may need constant rework or extensive post-cloning steps to get the unit “up to date.” After all, you snap an image that is 10.11.0 but now 10.11.2 is out. So you clone, then patch. There is a post-install step that consumes time.
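Here is what that clone-then-patch step boils down to, as a minimal sketch using Apple's softwareupdate. It assumes the freshly cloned machine can reach the update servers, and the download and reboot time is exactly the cost described above.

```python
#!/usr/bin/env python3
# Sketch: the post-clone "catch up" step -- install all pending Apple updates.
# Run as root on the freshly cloned machine; downloads and reboots still cost time.
import subprocess

# List what is pending (useful for logging how far behind the image is).
subprocess.run(["softwareupdate", "-l"])

# Install all available updates.
subprocess.run(["softwareupdate", "-i", "-a"], check=True)
```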
However, building a machine via thin imaging may take an hour or more (depending on your delivery tools, where the machine is located, and how much is being layered on top of the OS). If you are using all PKG installers, then the process may iterate through each installer and patch to complete the system. Some installers take a long time. And what about the time it takes to build and validate all the packages needed for the install? While thin imaging remains more flexible, it is not necessarily quicker than monolithic to prepare.
Post Installation Steps – The scenario is simple. You create an image on March 1st. It is perfect... until March 2nd when new Firefox, Flash, and Office patches are released. To keep your fleet up to date, you need to apply these after cloning. Now you have extra work. So the 10 minutes to clone just turned into 1 hour due to the post installation procedure.
Post installation steps in a monolithic workflow are nearly unavoidable. Sooner or later there will be patches or adjustments required for every cloned device. Your goal is to limit the number of post-deployment steps as best you can; manual post-install processes are subject to error and inconsistency.
Thin imaging is effectively one long post installation process. All your efforts should be focused on making the end user’s experience as simple as possible. The greatest challenge is the time it takes to deliver the tools. If you are using self-service solutions, the user can be waiting a long time for applications to install. For many non-technical end users, this delayed gratification can cause confusion and frustration.
Device Lifecycle – Macs have a life. Often, they have more of a life than I do. That life may involve thin imaging and monolithic imaging, depending on the device’s role. Plan for the birth and death of Macs on your network, no matter the size.
User Data – Ah, the bane of our existence. As the old IT saying goes, “This job would be perfect if there were no users.” Well, there are. And it is your job to support them and their data. Monolithic imaging is destructive. Don’t expect to move a fleet from 10.10 to 10.11 with monolithic imaging without considering the user data. How do you protect it? Where do you put it? How rapidly can you move it off and back on again? Once again, time is money, and moving data consumes plenty of both. Yet it is nowhere near the cost of lost data. Be smart, think about data. Even for users who leave the company, back up their home folder. You never know when you may need to examine the data.
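As a minimal sketch of that last point, here is one way to archive a departing user's home folder before the nuke and pave. The short name and destination path are placeholders for your own environment.

```python
#!/usr/bin/env python3
# Sketch: archive a departing user's home folder before re-imaging the machine.
# The short name and destination below are placeholders.
import subprocess

SHORT_NAME = "hadalittlelamb"                        # account to preserve
ARCHIVE = f"/Volumes/Backups/{SHORT_NAME}-home.zip"  # hypothetical destination

# ditto preserves metadata and resource forks while building the zip archive.
subprocess.run(
    ["ditto", "-c", "-k", "--sequesterRsrc", "--keepParent",
     f"/Users/{SHORT_NAME}", ARCHIVE],
    check=True,
)
```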
Which Is Best For Me?
I’ve struggled with this question for what seems like years. I’ve finally come to the conclusion that the answer is likely both. Or, perhaps the best answer is whatever works for your environment and your current need. Until it does not... and you need to switch to the other method.
The bottom line is that if you are hand building every machine, you are wasting a lot of time and likely introducing a ton of variation to your deployed fleet. One-off, hand-built machines need to stop. Unless you are the smallest of small deployments, invest your time in an imaging solution to speed delivery of your image and enforce consistency across the fleet.
If you have JAMF and are enrolled in DEP and VPP, embrace thin imaging. There is nothing easier than taking a machine out of the box, enrolling it, and walking away. But have a plan for when equipment comes back for repurposing.
If you are a school that supports labs of systems that don’t really need the cost or overhead of an MDM system, monolithic imaging (and re-imaging) is likely more cost effective and time efficient.
In the end there is no one solution to solve all scenarios. You need to know both monolithic and thin imaging. For example, if you are starting a Mac deployment and you are looking at a pallet of new systems, thin imaging may work if you have DEP and an MDM in place. But what happens in 12 months when some of those machines are turned in for reuse by other users?
Do you remove the user data and hand it back out? Do you erase and start over? Well if you don’t have a monolithic image, you are reinstalling the OS from scratch.