Edge Computing and Local AI: Google Coral's Hardware Ambitions Remain Modest Compared to Intel's
AI enables machines to perform tasks that used to belong solely to humans. On a factory production line, for example, AI-driven cameras can spot product defects and flag items that need quality control. In medical imaging, machine learning can identify potential tumors in a scan and highlight them for the doctor.
Such applications are only useful, however, if they are fast and secure. A factory has little use for an AI camera that takes minutes to process each image, and few patients are willing to risk exposing their medical data by sending it to the cloud for analysis.
These are the problems Google is trying to solve through a project called Coral. "Traditionally, data from AI devices is sent to large compute instances in centralized data centers where machine learning models can run quickly," Coral product manager Vikram Tank explained to The Verge via email. "Coral is Google's platform of hardware and software components that helps you build devices with local AI; that is, it provides hardware acceleration for neural networks on the edge device itself."
You may never have heard of Coral (it "graduated" from beta last October), but it is part of the rapidly growing field of edge AI. Market analysts predict that more than 750 million edge AI chips and computers will be sold in 2020, rising to 1.5 billion by 2024. Although most of these will go into consumer devices such as phones, a large share is destined for enterprise customers in industries such as automotive and healthcare.
To meet those needs, Coral offers two main types of products: accelerators and development boards for prototyping new ideas, and modules designed to power the AI brains of production devices such as smart cameras and sensors. In both cases, the core of the hardware is Google's Edge TPU, an ASIC chip optimized to run lightweight machine learning models, a much smaller cousin of the water-cooled TPUs used in Google's cloud servers. Tank said that while individual engineers can use the hardware for fun projects (Coral publishes guides on building an AI marshmallow sorter and a smart bird feeder, for example), the long-term focus is on enterprise customers in healthcare and other industries. A minimal sketch of what running a model on this hardware looks like follows below.
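To make the on-device workflow concrete, here is a minimal sketch of how an Edge TPU accelerator is typically driven from Python, using the TensorFlow Lite runtime with the Edge TPU delegate. The model file name and the dummy input are placeholders, and the delegate path assumes a Linux host with Google's libedgetpu runtime installed.

```python
# Minimal sketch: run a model already compiled for the Edge TPU
# ("model_edgetpu.tflite" is a placeholder file name).
import numpy as np
import tflite_runtime.interpreter as tflite

# Load the compiled model and hand execution off to the Edge TPU delegate
# (libedgetpu.so.1 is the Linux runtime; other platforms use a different name).
interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Feed a dummy uint8 image of the expected shape; real code would use a camera frame.
frame = np.zeros(input_details["shape"], dtype=np.uint8)
interpreter.set_tensor(input_details["index"], frame)
interpreter.invoke()

scores = interpreter.get_tensor(output_details["index"])
print("top class:", int(np.argmax(scores)))
```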
As an example of the kind of problem Coral is designed to solve, Tank describes a self-driving car that uses machine vision to identify objects on the street.
He said: "Cars traveling at 65 mph will cross nearly 10 feet in 100 milliseconds. Therefore, for example, any "processing delay" caused by a slow mobile connection will "increase the risk of critical use cases." "Coral can analyze on the device without having to wait for a slow connection to determine whether the stop sign or the street light in front is on. This is much safer.
Tank said there are similar benefits for privacy. "Consider a medical device manufacturer that wants to use image recognition to perform real-time analysis of ultrasound images," he said. Sending those images to the cloud creates a potential weak link for hackers to target, but analyzing them on the device lets patients and doctors be "confident that data processed on the device doesn't leave their control."
Google's Edge TPU, a tiny processing chip optimized for AI, is at the core of most Coral products.
Although Coral now targets the business world, Tank said the project actually grew out of Google's "AIY" line of DIY machine learning kits. Launched in 2017 and powered by Raspberry Pi computers, the AIY kits let anyone build their own smart speakers and smart cameras, and they were a big hit in the STEM and maker markets. Tank said the AIY team quickly noticed that while some customers just wanted to follow the instructions and build a toy, others wanted to use the hardware to prototype devices of their own. Coral was created for those customers.
The problem for Google is that dozens of companies offer something similar to Coral. They range from startups such as Seattle-based Xnor, which makes AI cameras efficient enough to run on solar power, to heavyweights such as Intel, which launched one of the first enterprise USB accelerators in 2017 and which, last December, acquired chipmaker Habana Labs for $2 billion to strengthen its edge AI products (among other things).
Given the number of competitors, the Coral team says it can differentiate itself by tightly integrating its hardware with Google's ecosystem of AI services.
That ecosystem of chips, cloud training, and development tools has long been a mainstay of Google's AI offerings. For Coral, it means a library of AI models compiled specifically for its hardware, as well as AI services on Google Cloud that integrate directly with individual Coral modules, such as its environmental sensors.
In fact, Coral is so tightly integrated with Google's AI ecosystem that Edge TPU-powered hardware works only with Google's machine learning framework, TensorFlow. Competitors in the edge AI market who spoke to The Verge said this could be a limiting factor. "Coral's edge products work only with their own platform, while our products support all the major AI frameworks and models on the market," a spokesperson for edge AI company Kneron told The Verge. (Kneron said its assessment was not meant negatively, and that it welcomes Google's entry into the market because it "validates and drives innovation in this area.")
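In practice, that TensorFlow dependency shows up in the deployment workflow: a model has to be converted to a fully integer-quantized TensorFlow Lite file before Google's edgetpu_compiler tool will map it onto the Edge TPU. The sketch below illustrates the conversion step under those assumptions; `trained_model` and `sample_images` are placeholder names, not part of any Coral API.

```python
# Rough sketch of the TensorFlow-only workflow the Edge TPU requires:
# quantize a trained Keras model to 8-bit TensorFlow Lite, then compile it
# separately with Google's edgetpu_compiler command-line tool.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # A small set of real input samples lets the converter calibrate the
    # 8-bit quantization ranges; `sample_images` is a placeholder here.
    for image in sample_images[:100]:
        yield [np.expand_dims(image, 0).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(trained_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# The Edge TPU only executes fully integer-quantized operations.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("model.tflite", "wb") as f:
    f.write(converter.convert())

# Then, on the command line: edgetpu_compiler model.tflite
```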
It is also unclear how much business Coral is currently doing. Google certainly isn't pushing it as hard as its Cloud AI services, and the company has yet to disclose any sales figures or targets. A source familiar with the matter did tell The Verge, however, that most of Coral's orders are for a single device, such as an AI accelerator or a development board, and that only a handful of enterprise customers have placed orders on the scale of 10,000 devices.
For Google, the appeal of Coral may not be revenue so much as the chance to learn more about how AI is being applied in important fields. In today's world of practical machine learning, all roads increasingly lead to AI at the edge.
AI hardware and software built around Google's Edge TPU edge computing chip
Model Play and Tiorb AIX, both launched by Gravitylink, also fully support the Edge TPU. Tiorb AIX is an AI hardware device that combines two core functions, computer vision and voice interaction, and is equipped with a professional edge AI computing chip and a variety of sensors. Model Play is an AI model resource platform for developers worldwide: it offers a range of built-in AI models, is compatible with Tiorb AIX, and supports Google's Edge TPU chip to accelerate professional development. In addition, Model Play provides complete, easy-to-use transfer learning tools and a rich set of model examples that can be combined with Tiorb AIX to rapidly build AI applications. Its transfer learning feature is built on Google's open-source neural network architectures and algorithms, so users can train an AI model without writing any code, simply by selecting images and defining the model and category names.