Google Transfer Service Mitigates Cloud Migration Migraines
Google Cloud announced beta availability of its Transfer Service for on-premises data this week as a free service. The cloud provider claims the new platform removes the technical complexities inherent in large-scale, online data transfers, further expediting the move of customers' on-premises data centers into Google Cloud.
For many organizations, it’s not the destination that has them wary of moving on-premises data centers into Google Cloud; it’s the transportation. That challenge continues to plague cloud adoption, as the risk of damaging or even losing data in transit can outweigh the potential reward.
Transfer Service is Google Cloud’s attempt to claim the hearts and hardware of organizations still operating in on-premises environments that want to move to the cloud but lack confidence in the transfer itself.
The new offering is a managed service meant to streamline the process of transferring large-scale data. After installing the on-premises software, which uses Docker-based software agents to move data, the user selects which files to upload from a network file system (NFS) and initiates the transfer to Cloud Storage, the company explained in an article announcing the new service.
The service provides a high-speed connection to Google Cloud for an efficient way to move billions of files occupying petabytes of data. Users can scale up to tens of Gb/s of bandwidth and deploy multiple agents to handle the transfer in parallel. Up to 100 TB of data can be moved in a single transfer, Google says.
Built-in safeguards provide assurance that in-progress data transfers can be restarted, rather than redone from scratch, if problems arise.
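Google hasn't published the internals of these safeguards, but the general technique — checkpointing progress so an interrupted transfer resumes where it stopped instead of starting over — can be sketched in a few lines. The function below is an illustrative simplification, not Transfer Service's actual mechanism; all names are hypothetical.

```python
import os

CHUNK = 64 * 1024  # copy in 64 KiB chunks


def resumable_copy(src, dst, checkpoint):
    """Copy src to dst, persisting progress to a checkpoint file so an
    interrupted transfer restarts where it left off, not from zero.
    (Illustrative sketch only -- not Transfer Service's implementation.)"""
    # Read the last confirmed offset if a prior attempt was interrupted.
    offset = 0
    if os.path.exists(checkpoint):
        with open(checkpoint) as f:
            offset = int(f.read() or 0)

    mode = "r+b" if os.path.exists(dst) else "wb"
    with open(src, "rb") as sf, open(dst, mode) as df:
        sf.seek(offset)
        df.seek(offset)
        while True:
            chunk = sf.read(CHUNK)
            if not chunk:
                break
            df.write(chunk)
            df.flush()
            offset += len(chunk)
            # Persist progress after each chunk: a crash here costs at
            # most one chunk of rework on the next attempt.
            with open(checkpoint, "w") as f:
                f.write(str(offset))

    os.remove(checkpoint)  # transfer complete; discard the checkpoint
    return offset
```

A production service would also checksum each chunk to detect corruption; this sketch shows only the restart-from-offset idea.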
The use of cloud platforms for storage continues to grow as consumers, companies, and connected devices generate ever-increasing volumes of data. And collecting, processing, and retrieving massive data sets requires the right applications in the right places if enterprises are to create value from the data. Gartner said this trend shows cloud service provider infrastructures, and the services that run on them, are becoming the new data management platform.
Earlier this year at the annual Google Cloud Next conference, the cloud provider announced its move into on-premises data centers and across clouds — including competitors Amazon Web Services (AWS) and Microsoft Azure — with Anthos, its hybrid-cloud platform.
According to Google Cloud CEO Thomas Kurian, Anthos is the result of listening to customers’ needs.
“First they want the ability to have a technology stack that they can run in their data center next to enterprise workloads that they couldn’t yet move to the cloud,” he said. “Second: a single programming model that gives them the choice and flexibility to move workloads to both Google Cloud as well as to other cloud providers without any change. And third, a platform that allows them to operate this infrastructure without complexity and to secure and manage across multiple clouds in a single and consistent way.”
Recent initiatives like Anthos and Transfer Service for on-premises data are pathways paved and promises delivered for cloud migration that doesn’t require a full architectural rebuild to transition over to Google Cloud.
“I see enterprises default to making their own custom solutions, which is a slippery slope as they can’t anticipate the costs and long-term resourcing,” said Scott Sinclair, senior analyst at ESG, in a blog post. “With Transfer Service for on-premises data (beta), enterprises can optimize for TCO and reduce the friction that often comes with data transfers. This solution is a great fit for enterprises moving data for business-critical use cases like archive and disaster recovery, lift and shift, and analytics and machine learning.”