
Data sovereignty

With Hyperforce, Salesforce lets you move your data to any public cloud

For much of its existence, Salesforce was a cloud service running on its own infrastructure, but as the company and cloud computing in general have evolved, Salesforce has moved some of its workloads to other clouds such as AWS, Azure and Google. Now, it wants to let customers do the same.

To help facilitate that, the company announced Hyperforce today at its Dreamforce customer conference: a new architecture designed from the ground up to help customers deliver workloads to the public cloud of their choice.

The idea behind Hyperforce is to let customers take all of the data in what Salesforce calls Customer 360 — the company’s detailed view of the customer across channels, Salesforce products and even systems outside the Salesforce family — and store it in whichever public cloud they want, in whatever region they happen to operate. For now, Hyperforce is available in India and Germany, but there are plans to add support for 10 additional countries over the next year.

Company president and CTO Bret Taylor introduced the new approach. “We call this new capability Hyperforce. Simply put, we’ve been working to enable us to deliver Salesforce on public cloud infrastructure all around the world,” Taylor said.

Holger Mueller, an analyst at Constellation Research, says the underlying architecture running the Salesforce system, now more than 20 years old, is long overdue for an overhaul. But Mueller says it’s about more than modernizing. “The pandemic requires SaaS vendors to move their offerings from their own data centers to [public cloud] data centers, so they can offer both architectural and commercial elasticity to their customers,” he said.

Mueller added that bringing Salesforce data into the public cloud not only solves obvious data sovereignty issues, it also brings all of the advantages of using public cloud resources.

“Salesforce can now offer both architectural and commercial elasticity to their customers. Commercial elasticity matters a lot to CIOs and CTOs these days because when your business slows down, you pay less, and when your business accelerates, then you can afford to pay more,” he said. Salesforce, he says, is pulling an early-generation SaaS product into the modern age, something that is imperative at this point in the company’s evolution.

But while moving forward, Taylor was careful to point out that the company rebuilt the system to be fully backwards compatible, so customers don’t have to throw out the applications and investment they’ve made over the years, something most companies couldn’t afford to do. “For you developers out there, this is the most remarkable thing. It is 100% backwards compatible. Your apps will work with no changes, and you can benefit from all of this automatically,” he said.

The company will be rolling out Hyperforce over the next year and beyond as it opens in more regions.

India may become next restricted market for U.S. cloud providers

Data sovereignty is on the rise across the world. Laws and regulations increasingly require that citizen data be stored in local data centers, and often restrict the movement of that data outside a country’s borders. The European Union’s GDPR policy is one example, although it’s relatively porous. China’s newer cloud computing law is much stricter; it forced Apple to turn over its Chinese citizens’ iCloud data to local providers and Amazon to sell off data center assets in the country.

Now, it appears that India will join this policy movement. According to Aditya Kalra in Reuters, an influential cloud policy panel has recommended that India mandate data localization in the country, for investigative and national security reasons, in a draft report set to be released later this year. That panel is headed by well-known local entrepreneur Kris Gopalakrishnan, who co-founded IT giant Infosys.

That report would match other policy statements from the Indian political establishment in recent months. The government’s draft National Digital Communications Policy this year said that data sovereignty is a top mission for the country, and called for the government, by 2022, to “Establish a comprehensive data protection regime for digital communications that safeguards the privacy, autonomy and choice of individuals and facilitates India’s effective participation in the global digital economy.”

It’s that last line that is increasingly the objective of governments around the world. While privacy and security are certainly top priorities, governments now recognize that the economics of data are going to be crucial for future innovation and growth. Maintaining local control of data — through whatever means necessary — ensures that cloud providers and other services have to spend locally, even in a global digital economy.

India is both a crucial and an ironic manifestation of this pattern. It is crucial because of the size of its economy: public cloud revenues in the country are expected to hit $2.5 billion this year, according to Gartner’s estimates, an annual growth rate of 37.5%. It is ironic because much of the historical success of India’s IT industry has come from its ability to offer offshore IT and data services across borders.

Indian Prime Minister Narendra Modi has made development and rapid economic growth a top priority of his government. (Krisztian Bocsi/Bloomberg via Getty Images)

India is certainly no stranger to localization demands. In areas as diverse as education and ecommerce, the country maintains strict rules around local ownership and investment. While those rules have been opening up slowly since the 1990s, the explosion of interest in cloud computing has made the gap in regulations around cloud much more apparent.

If the draft report and its various recommendations become law in India, it would have significant effects on public cloud providers like Microsoft, Google, Amazon and Alibaba, all of which have cloud operations in the country. To comply with the regulations, they would almost certainly have to expend significant resources building additional data centers locally, and also enforce data governance mechanisms to ensure that data doesn’t flow from a domestic to a foreign data center, whether accidentally or programmatically.
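To illustrate the kind of guardrail that implies, here is a hypothetical sketch of a residency check a provider might run before any cross-region transfer. Everything here (the rule table, the function name, the country codes) is invented for illustration; it is not any provider’s actual tooling.

```python
# Hypothetical residency guard: before replicating or migrating a record,
# verify that the destination data center's country is permitted for that
# record. The rule table below is illustrative, not a real provider's policy.
RESIDENCY_RULES = {
    "IN": {"IN"},  # e.g., Indian records may only reside in Indian regions
}

def transfer_allowed(record_country: str, destination_country: str) -> bool:
    """Return True only if the move keeps the record in a permitted country."""
    allowed = RESIDENCY_RULES.get(record_country)
    if allowed is None:
        return True  # no localization rule applies to this record
    return destination_country in allowed

# A replication pipeline would consult this guard on every candidate move:
assert transfer_allowed("IN", "IN")
assert not transfer_allowed("IN", "US")  # blocked: data would leave the country
```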

I’ve written before that these data sovereignty regulations ultimately benefit the largest service providers, since they’re the only ones with the scale to be able to competently handle the thicket of constantly changing regulations that govern this space.

In the India case, though, the expense may well be warranted. Given the phenomenal growth of the Indian cloud IT sector, it’s highly likely that the major cloud providers are already planning a massive expansion to handle the increasing storage and computing loads required by local customers. Depending on how simply the regulations are written, complying with them may carry limited cost.

One question will involve what level of foreign ownership will be allowed for public cloud providers. Given that several foreign companies already operate in the marketplace, it might be hard to eliminate them entirely in favor of local competitors. Still, the large providers will have their work cut out for them to ensure the market stays open to all.

The real costs, though, would be borne by other companies, such as startups that rely on customer datasets to power artificial intelligence. Can Indian datasets be used to train an AI model that is used globally? Will the economics be required to stay local, or will the regulations be robust enough to handle global startup innovation? It would be a shame if the very law designed to encourage growth in the IT sector was the one that put a dampener on it.

India’s chief objective is to ensure that Indian data benefits Indian citizens. That’s a laudable goal on the surface, but deeply complicated when it comes time to write these sorts of regulations. Ultimately, consumers should have the right to park their data wherever they want — with a local provider or a foreign one. Data portability should be key to data sovereignty, since it is consumers who will drive innovation through their demand for best-in-class services.

MongoDB launches Global Clusters to put geographic data control within reach of anyone

MongoDB’s Atlas service has been giving companies a managed database service in the cloud for some time, with Mongo handling all the heavy lifting behind the scenes instead of leaving developers to build and run it themselves. Today the company announced it is taking that a step further with a new feature called Global Clusters, which gives customers granular control over where their data lives.

This allows companies to choose a cloud provider, then move data from a MongoDB database running in Atlas to any location in the world. As MongoDB CTO and co-founder Eliot Horowitz explained, it doesn’t matter who the cloud provider is: you can set a data location policy, select a cloud vendor and data center location, and see what the results will look like on a graphical representation of the world map. When you give the OK, Mongo moves the data automatically in the background, without shutting anything down to do it.

Global Clusters interface. Screenshot: MongoDB
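On the application side, that policy reduces to tagging each document with a location. Below is a minimal Python sketch, assuming a zone-sharded Atlas collection whose shard key leads with a location field holding an ISO 3166-1 country code (the routing mechanism Global Clusters expose); the connection string, database and field values are placeholders, not a real deployment.

```python
# Minimal sketch: writing location-tagged documents to an Atlas Global
# Cluster. The cluster's zone configuration (set up in the Atlas interface
# shown above) maps country codes to cloud regions; the application simply
# writes documents, and Atlas routes each one by its `location` field.
from pymongo import MongoClient

# Placeholder connection string for a hypothetical Atlas cluster.
client = MongoClient("mongodb+srv://user:password@example-global.mongodb.net")
profiles = client["app"]["customer_profiles"]

# Stored in whichever region the cluster's European zone is pinned to.
profiles.insert_one({"location": "DE", "customer_id": "c-1001", "name": "Example GmbH"})

# Stored in the region mapped to the Indian zone.
profiles.insert_one({"location": "IN", "customer_id": "c-1002", "name": "Example Pvt Ltd"})
```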

Lots of countries are requiring proof of data sovereignty, including the EU’s GDPR rules that went into effect last month. It has been challenging for many businesses to comply with these kinds of rules on their own. Horowitz said he created geographic partitions before Atlas and it required a tremendous amount of engineering effort. By providing it as a service in this fashion, the company is putting this kind of data migration in the hands of the smallest business owners, giving them that geographic granularity from the start.

“I think what you’re going to see is a lot of [small businesses] who feel they can now compete with some of the larger websites and actually have that high level of service on day one, rather than having to wait to hire a team of engineers,” Horowitz said.

The beauty of this approach, from Mongo’s perspective, is that it doesn’t even have to build a worldwide data center presence of its own. Instead, it simply piggybacks on the global locations of the main public cloud providers: AWS, Microsoft and Google.

“The cool thing is that those data centers come from any of the cloud providers. So you can actually do this on any cloud provider that you want in any region [they support],” he said.

This feature will be available starting today for all Atlas users.