There are a number of chief drivers shaping the data centre strategies of CIOs. These are the result of several macro-economic, geographical and developmental shifts, including location, policy and automation, and community clouds.
With globalisation accelerating the pace of business change, the number of locations in which businesses are looking to operate is growing. Many organisations are looking to enter entirely new markets - even operating on a global scale - in order to remain competitive and enhance profitability.
There’s a new breed of consumption models that’s making globalisation aspirations achievable. Cloud computing has emerged as a means to embrace globalisation without having to establish new data centres in the new geographies.
For example, a Western European university might be looking to sell its education services in Singapore. Serving students from the other side of the world might not immediately appear feasible. But thanks to cloud computing, today the institution could realistically pursue this opportunity by running this section of its business school out of a public cloud facility based in Singapore.
“Data centre professionals must factor their organisation’s globalisation plans into their data centre strategies, and consider how cloud-based models can enable expansion into new markets and territories,” says Leahy.
Risk is working its way to the top of the list of considerations regarding identification and selection of the optimal data centre location. Following recent earthquake activity around the globe, for example, it makes sense to move primary facilities away from areas with high levels of seismic activity.
Businesses eyeing geographies where energy is both affordable and renewable
Meanwhile, with the ever-increasing cost of energy, a number of businesses are eyeing geographies where energy is both affordable and renewable. Today, more businesses see the wisdom in locating their data centres in countries where power is generated in an ecologically and environmentally efficient way. “Then there’s also the notion of ‘following the sun’, and being able to automatically direct workloads to the data centres where energy is less expensive,” explains Leahy, who points out that almost everywhere in the world, energy is cheaper at night.
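The ‘follow the sun’ idea Leahy describes can be sketched as a simple scheduler that routes work to whichever site is currently on its night-time tariff. The site names, UTC offsets and tariffs below are invented for illustration; a real system would read live pricing and capacity data.

```python
from datetime import datetime, timedelta

# Hypothetical data centres with UTC offsets and energy tariffs
# (currency per kWh); day/night rates are illustrative only.
DATA_CENTRES = {
    "amsterdam": {"utc_offset": 1, "day_rate": 0.22, "night_rate": 0.14},
    "singapore": {"utc_offset": 8, "day_rate": 0.18, "night_rate": 0.12},
    "virginia":  {"utc_offset": -5, "day_rate": 0.16, "night_rate": 0.08},
}

def current_rate(site: dict, now_utc: datetime) -> float:
    """Return the tariff in force at the site's local time.

    The night rate is assumed to apply between 22:00 and 06:00 local time.
    """
    local = now_utc + timedelta(hours=site["utc_offset"])
    if local.hour >= 22 or local.hour < 6:
        return site["night_rate"]
    return site["day_rate"]

def cheapest_site(now_utc: datetime) -> str:
    """Pick the data centre with the lowest energy tariff right now."""
    return min(DATA_CENTRES, key=lambda name: current_rate(DATA_CENTRES[name], now_utc))
```

At midnight UTC the scheduler favours Amsterdam (on its night rate); six hours later the cheapest site has moved west to Virginia, which is exactly the follow-the-sun effect.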
The network is also relevant to the matter of data centre location. While access to highly available network facilities and adequate bandwidth have long been must-haves for effective data centre operations, today, networking costs, quality, latency and availability have jumped to number one or two on the list of considerations.
For example, Amsterdam is receiving significant levels of interest, as it benefits from greater access to physical space compared to many major cities. More importantly, it is home to AMS-IX, one of the largest Internet exchanges in Europe. In fact, the Netherlands is emerging as a popular data centre setting, and is seeing the highest take-up among European tier-one markets.
Leahy believes that today’s data centre location plans also need to heed the advent of ‘big data’ and the new sources from which businesses are obtaining data.
Collecting data from places and devices never before imagined
Today, organisations are collecting data from places and devices never before imagined. For example, it’s becoming common for utility providers to install smart meters in the homes of consumers, which monitor usage patterns. This has resulted in a fundamental shift in terms of questions such as: Where’s my data coming from? What data am I going to keep? What data will I share with others? The rise of big data and analytics therefore fundamentally influences decisions in relation to where a data centre should be built.
In fact, the question might be whether to build a data centre at all, or instead participate in a shared or cloud-based facility. In the past, organisations would station their data centres in close proximity to their knowledge workers. Today it could make more sense to locate them close to where the data itself is being gathered.
The notion of opex versus capex models is also an important factor to consider when charting a data centre journey. For example, is there merit in moving to a cloud-based, pay-as-you-go operating model, or does it make sense to invest in a facility that’s going to remain on your books as an asset for 20 years? If an organisation puts its data in a cloud provider’s data centre, then it’s truly running as an operating expense.
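The opex-versus-capex trade-off can be made concrete with some back-of-the-envelope arithmetic. All figures below (build cost, lifetime, running costs and hourly rate) are hypothetical assumptions, not from the article; the point is simply that pay-as-you-go wins most clearly for workloads that only run part of the year.

```python
# Hypothetical comparison: a 10m facility depreciated over 20 years
# plus running costs, versus a pay-as-you-go cloud charge that
# applies only to the hours actually used.

def owned_dc_annual_cost(build_cost=10_000_000, lifetime_years=20,
                         annual_opex=600_000):
    """Straight-line depreciation plus yearly running costs."""
    return build_cost / lifetime_years + annual_opex

def cloud_annual_cost(hourly_rate=95.0, hours_used=8760):
    """Pay only for the hours actually consumed (8760 = a full year)."""
    return hourly_rate * hours_used

owned = owned_dc_annual_cost()                       # 1,100,000 per year
full_time = cloud_annual_cost()                      # running flat out all year
seasonal = cloud_annual_cost(hours_used=6 * 7 * 24)  # a six-week busy season
```

With these illustrative numbers, a cloud workload running year-round costs nearly as much as the owned facility, while the six-week seasonal workload costs under a tenth of it.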
Some businesses may be overlooking the importance of a strong focus on policy and automation. As businesses move ahead on their next-generation data centre journeys, they need to think long and hard about how technology advances can be exploited for the benefit of the business. For example, if an organisation opts to virtualise its desktops, what additional data centre space will be required? Are there workloads that could be better run in the cloud or in a third-party facility? Should we revisit our policies and automate our systems so that we can redirect capacity and resources to alternative workloads on weekends and public holidays, or to where energy is cheapest, depending on the time of day?
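A calendar-driven capacity policy of the kind described might look like the following sketch. The workload names, split percentages and holiday dates are illustrative assumptions; a production system would pull holidays from a calendar service and enforce the split through its orchestration layer.

```python
from datetime import date

# Hypothetical public holidays for illustration; a real system
# would source these from a calendar service per region.
PUBLIC_HOLIDAYS = {date(2024, 12, 25), date(2025, 1, 1)}

def capacity_policy(today: date) -> dict:
    """Decide how to split capacity (in percent) between business
    workloads and batch/analytics jobs, based on the calendar.

    On weekends and public holidays, capacity is redirected to
    batch work; the percentages are illustrative assumptions.
    """
    off_peak = today.weekday() >= 5 or today in PUBLIC_HOLIDAYS
    if off_peak:
        return {"business_apps": 20, "batch_analytics": 80}
    return {"business_apps": 70, "batch_analytics": 30}
```

The same pattern extends naturally to the energy-cost dimension: the decision function could also take the current tariff at each site (as in a follow-the-sun scheduler) and weight the placement accordingly.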
Tax authority saves thousands by placing its customer online self-help environment in a public cloud
Originally, the organisation assumed that, as a financial services operator subject to stringent governance and privacy regulations, using public cloud facilities was not an option. It was then pointed out that its online customer self-help activity, which represented 40 percent of its overall workload during the busy tax season, didn’t involve the sharing of confidential data, and could indeed be operated in a less expensive public cloud environment, for which the organisation would pay only during a few weeks of the year.
While the concept of community clouds is not new, Leahy believes that the true value organisations can extract from community cloud constructs remains untapped. Community clouds involve more than just sharing infrastructure and computing power among affiliated organisations in a public cloud environment. They hold immense potential to facilitate the sharing of intellectual property and best practice, and to enhance collaboration and productivity. Consider, for example, how much more quickly and effectively a complex medical diagnosis can be made if a single radiology image can be shared in real time among a geographically dispersed team of experts, as opposed to it residing in a single, on-premise hospital data centre.
The rise in popularity of community clouds is influencing many organisations’ thinking regarding their data centres. If an organisation decides they’re going to participate in a community cloud, they must decide whether all participants will build a single facility and benefit from the associated economies of scale or alternatively make use of a co-location environment. If, on the other hand, they elect to use their own data centre, they’ll need to ensure that it has access to adequate bandwidth so that all community members can participate and communicate with one another effectively.
* Kevin Leahy is Dimension Data’s Chief Architect for Virtualisation and Cloud.