The growth of big data and the rising need for security are two of the greatest data center issues facing CIOs.
The pressures created by large-scale data analytics software and regulatory data retention requirements mean that data centers must store a wider variety of data types, while keeping that data readily accessible for longer periods.
Tight budgets force most data managers to focus on resource conservation: shrinking technical staff must handle inexorably growing data processing demands while avoiding time-consuming manual procedures.
Staffing data centers remains another major concern. According to a recent State of the Data Center Report, thirty-eight percent of data center operations are understaffed, and forty-three percent of IT operations report that finding qualified candidates for open positions is a major problem.
As the mass of stored and processed data grows, the continuity and security of critical infrastructure services have also become increasing concerns.
Uninterruptible power supplies, diesel back-up generators, fail-safe power generation and complex air conditioning systems are needed to support critical IT operations. The reliability of these complex, one-off engineered power and cooling systems becomes increasingly problematic as the growing data mass raises the load on critical infrastructure.
During operation, accurate ways of pre-empting faults in power or cooling systems are vitally important. Ordinary load tests performed at commissioning often fail to replicate dynamic real-world conditions, leaving data center operators unaware of the actual resilience of one of their most important assets.
Automating daily storage tasks such as virtual machine migration, defragmentation, file system re-sizing and compression makes operations more efficient. Change-management tools, cluster management tools and storage management software are now simpler, cheaper and easier to use. These automated tools can be combined with replication technologies to reduce the workflow burden of data replication, backup and recovery.
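As an illustration of how simple such daily automation can be, here is a minimal Python sketch of one routine task: compressing files that have not been modified within a set retention period. The function name and the thirty-day threshold are hypothetical choices for this example, not part of any particular vendor's toolset.

```python
import gzip
import shutil
import time
from pathlib import Path

STALE_AFTER_DAYS = 30  # hypothetical retention threshold

def compress_stale_files(directory, stale_days=STALE_AFTER_DAYS):
    """Gzip-compress files in `directory` not modified within `stale_days`.

    Returns the names of the files that were compressed. Files that are
    already gzipped are skipped.
    """
    cutoff = time.time() - stale_days * 86400
    compressed = []
    for path in sorted(Path(directory).iterdir()):
        if path.is_file() and path.suffix != ".gz" and path.stat().st_mtime < cutoff:
            # Stream the original into a .gz archive, then drop the original.
            with open(path, "rb") as src, gzip.open(str(path) + ".gz", "wb") as dst:
                shutil.copyfileobj(src, dst)
            path.unlink()
            compressed.append(path.name)
    return compressed
```

Scheduled nightly (for example via cron), a script like this quietly reclaims storage without any manual intervention, which is the kind of workflow-burden reduction the tools above provide at scale.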
Many CIOs are turning to public cloud services to relieve some of the pressures of large-scale data storage. This approach, however, raises concerns about data security and sovereignty. A recent survey of hackers at the Defcon conference found that ninety-six percent of respondents believe the cloud creates new opportunities for data theft, and eighty-six percent believe that cloud vendors are not doing enough to secure their servers. Developments such as CloudProtect software offer some assurance that work is being done to provide new layers of protection for cloud-stored data.
An alternative to public cloud storage, considered by many CIOs, is to use modular data centers to create or expand private or hybrid cloud operations.
Modular data centers (MDCs) are self-contained, scalable, end-to-end solutions typically deployed to a client site, offering a manageable, centrally controlled and economical way of extending secure data handling capacity in-house. Provisioned complete with compute and storage infrastructure and cloud management software, or simply as a physical housing for client infrastructure, an MDC deployed to a client site represents an immediate and more cost-effective cloud solution than a traditional public cloud provider.
For any questions regarding the Datapod modular data center solution, please contact Datapod. You may also like to download the Datapod White Paper.
The image above appears courtesy of http://commons.wikimedia.org/wiki/File:Cyber_Security_at_the_Ministry_of_Defence_MOD_45153617.jpg