AI at the Edge: GigaOm Research Byte

What AI needs to go mainstream

The future of artificial intelligence (AI) depends on advances in the underlying machine learning (ML) technology and on an effective ML architecture for computing at the edge.

Through research with organizations such as Google, Arm, and leading universities focused on data science, GigaOm reports on two important factors required to plan ML deployments, broaden AI's impact, and deliver on its promise.

Neither factor pertains to how AI systems are built and trained, but rather to where they are deployed and used. They are:

  • The decrease in cost and increase in power of high-performance chips that can perform AI and ML inference "at the edge."
  • The development of middleware that allows a broader range of intelligent AI and ML applications to run seamlessly on a wider variety of chips.

AI at the Edge

Download the full report, “AI at the Edge,” to better understand how the right ML architecture will ensure that AI meets its potential in our pockets, our cars, our houses, and a hundred other places.
