The Coral USB Accelerator brings powerful machine learning (ML) inferencing capabilities to existing Linux systems. Featuring the Edge TPU, a small ASIC designed and built by Google, the USB Accelerator provides high-performance ML inferencing at low power over a USB 3.0 interface. For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at 100+ fps. This brings fast ML inferencing to embedded AI devices in a power-efficient and privacy-preserving way.
Models are developed in TensorFlow, converted to TensorFlow Lite, and then compiled with the Edge TPU Compiler to run on the USB Accelerator.
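A minimal sketch of that workflow on a Linux host, using the PyCoral Python library, might look like the following. The model and image file names are placeholders, and the snippet assumes the Edge TPU runtime, the pycoral package, and Pillow are installed.

    # Minimal image-classification sketch using the PyCoral API.
    # Assumes the Edge TPU runtime, pycoral, and Pillow are installed;
    # the file names below are placeholders for your own files.
    from PIL import Image
    from pycoral.adapters import classify, common
    from pycoral.utils.edgetpu import make_interpreter

    MODEL = 'mobilenet_v2_1.0_224_quant_edgetpu.tflite'  # placeholder path
    IMAGE = 'test_image.jpg'                              # placeholder path

    # Load the compiled model and attach the Edge TPU delegate.
    interpreter = make_interpreter(MODEL)
    interpreter.allocate_tensors()

    # Resize the image to the model's expected input size and copy it in.
    size = common.input_size(interpreter)
    image = Image.open(IMAGE).convert('RGB').resize(size, Image.LANCZOS)
    common.set_input(interpreter, image)

    # Run inference on the Edge TPU and print the top prediction.
    interpreter.invoke()
    for c in classify.get_classes(interpreter, top_k=1):
        print(f'class id {c.id}: score {c.score:.5f}')

Note that the Edge TPU only accelerates models that have been fully integer-quantized and then compiled with the edgetpu_compiler tool; operations the compiler cannot map to the Edge TPU fall back to the host CPU.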
Edge TPU key benefits:
High-speed TensorFlow Lite inferencing
Low power
Small footprint
Features
Google Edge TPU ML accelerator coprocessor
USB 3.0 Type-C socket
Supports Debian Linux on host CPU
Models are built using TensorFlow. MobileNet and Inception architectures are fully supported, and custom architectures are also possible
Compatible with Google Cloud
Specifications
Arm 32-bit Cortex-M0+ microprocessor (MCU): up to 32 MHz, 16 KB flash memory with ECC, 2 KB RAM
Connections: USB 3.1 (Gen 1) port and cable (SuperSpeed, 5 Gb/s transfer speed)
Included cable is USB Type-C to Type-A
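Once the Edge TPU runtime and the pycoral package are installed on the host, a small sketch like the one below can confirm that the accelerator is visible over the USB connection listed above.

    # List Edge TPU devices visible to the host (assumes pycoral and the
    # libedgetpu runtime are installed).
    from pycoral.utils.edgetpu import list_edge_tpus

    devices = list_edge_tpus()
    if not devices:
        print('No Edge TPU found: check the USB cable and runtime install.')
    for device in devices:
        # Each entry reports the interface type (e.g. 'usb') and device path.
        print(device)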
Coral, a division of Google, provides a platform for building intelligent devices with local AI.