Picture a small corporate IT group with around 12TB of data growing at 25 percent per year. Their backup software licensing is capacity-based, and they conservatively estimate that 50 percent of their data is inactive. Given the relatively small amount of data and their limited resources, they decide to move this inactive data to the cloud. The move cuts their backup windows in half and lowers their capacity-based backup licensing charges. They can make two redundant archive copies to cloud storage for a fraction of what it would have cost to keep that data on the Primary Tier, and the change is seamless to their existing backup operations. With the right storage management software, implementing a cloud archive strategy is both easier and less costly.
Managing Budgets and Growing Capacities
When a data-driven business predicts that its data will expand at a compounded annual rate, growth mandates can clash directly with budget constraints. Say, for example, that a midsized corporate IT department currently runs 600TB on primary storage and expects its managed data to grow by roughly 40 percent per year. Adding 240TB of high-performance HDD annually doesn't fit their budget. Rather than imposing severe storage quotas on departments, they deploy storage management software and move an estimated 70 percent of their data, which is deemed inactive, to the Perpetual Tier, a combination of NAS and tape.
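The arithmetic behind this scenario can be sketched as a simple capacity model. This is purely illustrative, using the figures from the example above (600TB base, 40 percent annual growth, 70 percent inactive); the function names are hypothetical and are not part of any StorCycle API.

```python
# Hypothetical capacity model for the example scenario above.
# Figures (600 TB base, 40% annual growth, 70% inactive) come from
# the text; nothing here reflects actual StorCycle functionality.

def projected_capacity(base_tb: float, growth_rate: float, years: int) -> float:
    """Total capacity after `years` years of compound annual growth."""
    return base_tb * (1 + growth_rate) ** years

def primary_after_tiering(total_tb: float, inactive_fraction: float) -> float:
    """Data remaining on the Primary Tier once inactive data is moved off."""
    return total_tb * (1 - inactive_fraction)

total_now = 600.0  # TB currently on primary storage

# First-year growth at 40 percent: the 240 TB/year cited above.
growth = projected_capacity(total_now, 0.40, 1) - total_now
print(f"First-year growth: {growth:.0f} TB")  # prints "First-year growth: 240 TB"

# Moving 70 percent inactive data to the Perpetual Tier leaves 180 TB on primary.
primary = primary_after_tiering(total_now, 0.70)
print(f"Primary Tier after tiering: {primary:.0f} TB")  # prints "Primary Tier after tiering: 180 TB"
```

The same two functions cover the other scenarios in this piece (for instance, 500TB with 60 percent inactive leaves 200TB on primary), which is why capacity-based planning tends to reduce to a handful of percentages.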
Imagine a governmental research facility collecting large amounts of machine-generated data from sensors that detect physical phenomena. The output may not be analyzed immediately, but the data is by no means disposable and must be kept for future reference. Most of the machine-generated data they receive initially requires high-speed disk as a landing zone. With storage management software, individual researchers can designate the storage layer for data, immediately moving it to lower-cost storage after it lands and bringing it back as needed. Data that needs to be safeguarded can be moved to a tape storage tier (which is not externally accessible) as well as directed to the cloud for distribution or sharing. This type of project identification allows multiple forms of data from multiple types of data generators to be collected, archived and accessed many years into the future.
In environments such as a large university supporting multiple research projects, where different groups share the same storage infrastructure under standardized, performance-based service level agreements (SLAs), storage management software can help accelerate research by moving inactive data and whole data sets off high-speed storage. By providing access across all data, whether human-, application- or machine-generated, it lets administrators and researchers identify what is actively used, what needs to be archived and what data is orphaned.
Now imagine a midsize corporate IT department in the process of upgrading their primary storage flash array. They have roughly 500TB of data in primary storage and estimate 60 percent of it is inactive. By deploying storage management software to remove "low transaction" data from the Primary Tier, they can more affordably upgrade to high-performance SSD, NVMe flash or other cutting-edge technologies. They significantly increase the performance of their data center and work with state-of-the-art technology, while users retain seamless access to migrated data on the NAS in the Perpetual Tier.
Drawing from our 40 years of experience in the storage industry, Spectra is excited to introduce StorCycle storage management software, and a new, two-tiered paradigm for storage that enables data to reside on the appropriate level of storage. Read “Six Ways to Implement StorCycle in a Modern Storage Workflow” to learn more and find out if StorCycle can be a fit for your organization.