Aperture is designed to work on small chunks of data at a time, so terabytes of data do not bother the app itself. However, moving super-large batches of data (like 1.5 TB) on consumer hardware tends to be problematic; it is that simple.
Slow drive speeds make handling large chunks of data even harder, and Drobos have historically tended to be slow.
IMO referenced Masters make far more sense than building huge managed-Masters Libraries. With referenced Masters one never has to copy a 1.5 TB Library around. Consider:
• Hard disk speed. Drives slow as they fill, so making a drive more full (which managed Masters always does) will slow down drive operation.
• Database size. Larger databases are by definition more prone to "issues" than smaller databases are.
• Vaults. A larger Library means larger Vaults, and the Vault is an incremental, repetitive backup process, so again larger Vaults are by definition more prone to "issues" than smaller Vaults are. A one-time backup of referenced Masters (each file small, unlike a huge managed-Masters database) is neither incremental nor ongoing, which by definition makes it a more stable process.
Managed-Masters Libraries can work, but they cannot avoid the basic database physics.
Note that whether managed or referenced, original images should be separately backed up prior to import into Aperture or any other image-management application. IMO, after backing up each batch of originals, importing that batch into Aperture as a new Project by reference makes by far the most sense. Building one huge managed Library, or splitting into multiple smaller Libraries, is less logical.
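As a minimal sketch of that back-up-each-batch step, here is what a one-time copy of a batch of originals might look like in Terminal before importing by reference. The paths and filenames below are purely hypothetical examples; substitute your own card-offload folder and backup volume.

```shell
# Hypothetical paths -- substitute your own locations.
SRC="$HOME/ApertureDemo/incoming_batch"   # freshly offloaded originals
DEST="$HOME/ApertureDemo/backup_batch"    # one-time backup destination

# Simulate a batch of originals (for illustration only).
mkdir -p "$SRC"
printf 'raw data' > "$SRC/IMG_0001.CR2"

# One-time archive copy: -a preserves timestamps and permissions.
mkdir -p "$DEST"
cp -a "$SRC/." "$DEST/"

# Confirm the backup before importing the batch by reference.
ls "$DEST"
```

Once the copy is verified, the batch can be imported into Aperture as a new Project by reference, leaving the backup untouched and non-incremental.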
HTH
-Allen