Archives

Google Cloud opens its Seoul region

Google Cloud today announced that its new Seoul region, its first in Korea, is now open for business. The region, which the company first talked about last April, will feature three availability zones and support for virtually all of Google Cloud’s standard services, ranging from Compute Engine to BigQuery, Bigtable and Cloud Spanner.

With this, Google Cloud now has a presence in 16 countries and offers 21 regions with a total of 64 zones. The Seoul region (with the memorable name of asia-northeast3) will complement Google’s other regions in the area, including two in Japan, as well as regions in Hong Kong and Taiwan, but the obvious focus here is on serving Korean companies with low-latency access to its cloud services.

“As South Korea’s largest gaming company, we’re partnering with Google Cloud for game development, infrastructure management, and to infuse our operations with business intelligence,” said Chang-Whan Sul, the CTO of Netmarble. “Google Cloud’s region in Seoul reinforces its commitment to the region and we welcome the opportunities this initiative offers our business.”

Over the course of this year, Google Cloud also plans to open more zones and regions in Salt Lake City, Las Vegas and Jakarta, Indonesia.

RealityEngines launches its autonomous AI service

RealityEngines.AI, an AI and machine learning startup founded by a number of former Google executives and engineers, is coming out of stealth today and announcing its first set of products.

When the company first announced its $5.25 million seed round last year, CEO Bindu Reddy wasn’t quite ready to disclose RealityEngines’ mission beyond saying that it planned to make machine learning easier for enterprises. With today’s launch, the team is putting this into practice with a set of tools that specifically tackle a number of standard enterprise use cases for ML, including user churn prediction, fraud detection, sales lead forecasting, security threat detection and cloud spend optimization. For use cases that don’t fit neatly into these buckets, the service also offers a more general predictive modeling service.

Before co-founding RealityEngines, Reddy was the head of product for Google Apps and general manager for AI verticals at AWS. Her co-founders are Arvind Sundararajan (formerly at Google and Uber) and Siddartha Naidu (who founded BigQuery at Google). Investors in the company include Eric Schmidt, Ram Shriram, Khosla Ventures and Paul Buchheit.

As Reddy noted, the idea behind this first set of products from RealityEngines is to give businesses an easy entry into machine learning, even if they don’t have data scientists on staff.

Besides talent, another issue that businesses often face is that they don’t always have massive amounts of data to train their networks effectively. That has long been a roadblock for many companies that want to see what AI can do for them but don’t have the right resources to do so. RealityEngines overcomes this by creating realistic synthetic data that it can then use to augment a company’s existing data. In its tests, this creates models that are up to 15 percent more accurate than models that were trained without the synthetic data.

“The most prominent use of generative adversarial networks — GANs — has been to create deepfakes,” said Reddy. “Deepfakes have captured the public’s imagination by highlighting how easy it is to spread misinformation with these doctored videos and images. However, GANs can also be put to productive and good use. They can be used to create synthetic datasets, which can then be combined with the original data to produce robust AI models even when a business doesn’t have much training data.”
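RealityEngines hasn’t published implementation details, so the sketch below only illustrates the general workflow Reddy describes: fit a generative model on a small “real” training set, sample synthetic rows per class, and train on the combined data. Here scikit-learn’s GaussianMixture stands in for the GANs the company actually uses, and all dataset sizes are arbitrary.

```python
# Illustrative only -- not RealityEngines' pipeline. A GaussianMixture
# stands in for a GAN generator; the idea is the same: augment a small
# labeled dataset with synthetic samples and retrain.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=200, random_state=0)  # deliberately small training set

# Fit one generator per class so synthetic rows keep their labels.
synth_X, synth_y = [], []
for label in np.unique(y_train):
    gm = GaussianMixture(n_components=3, random_state=0)
    gm.fit(X_train[y_train == label])
    samples, _ = gm.sample(500)
    synth_X.append(samples)
    synth_y.append(np.full(len(samples), label))

X_aug = np.vstack([X_train, *synth_X])
y_aug = np.concatenate([y_train, *synth_y])

baseline = RandomForestClassifier(random_state=0).fit(X_train, y_train)
augmented = RandomForestClassifier(random_state=0).fit(X_aug, y_aug)
print("real data only: ", baseline.score(X_test, y_test))
print("real + synthetic:", augmented.score(X_test, y_test))
```

Whether the augmented model actually wins depends on how faithful the generator is; that fidelity is where a GAN-based approach like the one Reddy describes would earn its keep.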

RealityEngines currently has about 20 employees, most of whom have a deep background in ML/AI, both as researchers and practitioners.

Google makes moving data to its cloud easier

Google Cloud today announced Transfer Service, a new service for enterprises that want to move their data from on-premises systems to the cloud. This new managed service is meant for large-scale transfers on the scale of billions of files and petabytes of data. It complements similar services from Google that allow you to ship data to its data centers via a hardware appliance and FedEx, or to automate data transfers from SaaS applications to Google’s BigQuery service.

Transfer Service handles all of the hard work of validating your data’s integrity as it moves to the cloud. The agent automatically handles failures and uses as much available bandwidth as it can to reduce transfer times.

To do this, all you have to do is install an agent on your on-premises servers, select the directories you want to copy and let the service do its job. You can then monitor and manage your transfer jobs from the Google Cloud console.
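Google didn’t publish code alongside the announcement; the snippet below is a minimal sketch of that flow using the current google-cloud-storage-transfer Python client. The project, agent pool, bucket and directory names are placeholders, and it assumes the transfer agents have already been installed and registered in an agent pool.

```python
# Sketch only: create and run an on-premises-to-GCS transfer job.
# All names below are placeholders; agents must already be running
# in the named agent pool.
from google.cloud import storage_transfer

client = storage_transfer.StorageTransferServiceClient()

transfer_job = storage_transfer.TransferJob(
    project_id="my-project",                              # placeholder
    description="archive on-prem exports to GCS",
    status=storage_transfer.TransferJob.Status.ENABLED,
    transfer_spec=storage_transfer.TransferSpec(
        source_agent_pool_name=(
            "projects/my-project/agentPools/my-pool"),    # placeholder
        posix_data_source=storage_transfer.PosixFilesystem(
            root_directory="/mnt/data/exports"),          # directory to copy
        gcs_data_sink=storage_transfer.GcsData(
            bucket_name="my-archive-bucket"),             # placeholder
    ),
)

job = client.create_transfer_job(
    storage_transfer.CreateTransferJobRequest(transfer_job=transfer_job))

# Kick off a run immediately; progress then shows up in the Cloud console.
client.run_transfer_job(
    storage_transfer.RunTransferJobRequest(
        job_name=job.name, project_id="my-project"))
print("created:", job.name)
```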

The obvious use cases for this are archiving and disaster recovery. But Google is also targeting this at companies that are looking to lift and shift workloads (and their attached data), as well as analytics and machine learning use cases.

As with most of Google Cloud’s recent product launches, the focus here is squarely on enterprise customers. Google wants to make it easier for them to move their workloads to its cloud, and for most of those workloads, that means moving lots of data as well.