Embedded Vision Alliance: Technical Articles

Machine Learning On Edge Devices: Benchmark Report

This article was originally published at Tryolabs' website. It is reprinted here with the permission of Tryolabs.

Why edge computing?

Humans are generating and collecting more data than ever. We have devices in our pockets that facilitate the creation of huge amounts of data, such as photos, GPS coordinates, audio, and all kinds of personal information we consciously and unconsciously reveal.

Cloud and Edge Vision Processing Options for Deep Learning Inference

Should deep learning-based computer vision processing take place in the cloud, at the edge, or both? This seemingly simple question has a complicated answer: "it depends." This article provides perspectives on the various factors you should consider, and with what priorities, when making this implementation decision for your particular project's requirements.

Rapid Prototyping on NVIDIA Jetson Platforms with MATLAB

This article was originally published at NVIDIA's website. It is reprinted here with the permission of NVIDIA.

This article discusses how an application developer can prototype and deploy deep learning algorithms on hardware like the NVIDIA Jetson Nano Developer Kit with MATLAB. In previous posts, we explored how you can design and train deep learning networks in MATLAB and how you can generate optimized CUDA code from your deep learning algorithms.
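
As a rough illustration of the latter, generating CUDA code from a trained network might look like the sketch below. The entry-point function, the network file mynet.mat, and the input size are hypothetical placeholders, not code from the article:

    % --- mynet_predict.m (saved as its own file) ---
    % Entry-point function for code generation: loads a pretrained
    % network once and runs inference on a single input image.
    function out = mynet_predict(in)
        persistent net;
        if isempty(net)
            net = coder.loadDeepLearningNetwork('mynet.mat');
        end
        out = net.predict(in);
    end

    % --- run from the MATLAB command line ---
    % Generate a CUDA MEX for desktop testing, using cuDNN for the layers.
    cfg = coder.gpuConfig('mex');
    cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn');
    codegen -config cfg mynet_predict -args {ones(224,224,3,'single')} -report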

In our experience working with deep learning engineers, we often see them run into challenges when prototyping on real hardware: they have to manually integrate their entire application code, such as the interfaces to the sensors on the hardware, and integrate with the toolchain needed to deploy and run the application on the hardware. If the algorithm does not behave as expected, or does not meet performance expectations, they have to go back to their workstation to debug the underlying cause.

This post shares how an application developer can deploy, validate and verify their MATLAB algorithms on real hardware like the NVIDIA Jetson platform by:

  • Using live data from the Jetson board to improve algorithm robustness
  • Using hardware-in-the-loop simulation for verification and performance profiling
  • Deploying standalone applications on the Jetson board

Using NVIDIA Jetson with MATLAB

MATLAB makes it easier to prototype and deploy to NVIDIA hardware through the NVIDIA hardware support package. It provides simple APIs for both interactive use and standalone execution, and enables you to:

  1. Connect directly to the hardware from MATLAB and test your application on sensor data from the hardware
  2. Debug any issues on the hardware before building a standalone application
  3. Deploy the standalone application to the Jetson board

These workflow steps are illustrated in the figure below:


Figure 1: Illustrating the three steps of the workflow
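
In practice, step 1 of this workflow might look like the following minimal sketch; the board address, credentials, and camera name below are placeholders for your own setup:

    % Connect to the Jetson board over the network and grab a live camera frame.
    hwobj = jetson('192.168.1.15','ubuntu','ubuntu');    % address and credentials are placeholders
    cam   = camera(hwobj,'vi-output, imx219',[640 480]); % camera name is board-specific
    img   = snapshot(cam);                               % pull one frame from the board into MATLAB
    imshow(img);                                         % inspect the live sensor data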

The support package supports the NVIDIA Jetson TK1, Jetson TX1, Jetson TX2, Jetson Xavier and Jetson Nano developer kits. It also...
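
Once the algorithm behaves as expected on live data, building and deploying the standalone application (step 3 above) might look like the following sketch, assuming a hypothetical entry-point function myDeepLearningApp and the hwobj connection from the earlier step:

    % Build a standalone CUDA executable and deploy it to the board.
    cfg = coder.gpuConfig('exe');
    cfg.Hardware = coder.hardware('NVIDIA Jetson');     % cross-compile via the support package
    cfg.Hardware.BuildDir = '~/remoteBuildDir';         % build directory on the Jetson
    cfg.GenerateExampleMain = 'GenerateCodeAndCompile'; % emit a main() so the result runs standalone
    codegen -config cfg myDeepLearningApp -report
    pid = runApplication(hwobj,'myDeepLearningApp');    % launch the deployed app from MATLAB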

The Next Phase of Deep Learning: Neural Architecture Learning Leads to Optimized Computer Vision Models

This article was originally published by Xnor.ai. It is reprinted here with the permission of Xnor.ai.

How Xnor.ai Managed to Squeeze a Deep Neural Network onto a $20 Wyze Camera

This article was originally published by Xnor.ai. It is reprinted here with the permission of Xnor.ai.

What Is Deep Learning? Three Things You Need to Know

This article was originally published at MathWorks' website. It is reprinted here with the permission of MathWorks.

Sensor Modules Help Accelerate Embedded Vision Development

This article was originally published at FRAMOS' website. It is reprinted here with the permission of FRAMOS.

Automated Optical Inspection

This article was originally published at Basler's website. It is reprinted here with the permission of Basler.

Optical measurement systems inspect objects and detect a variety of characteristics. With its large selection of area scan and line scan cameras, Basler has the right model for any inspection task.

Live and in Color: Why Color Calibration is So Important in Medical Technology

This article was originally published at Basler's website. It is reprinted here with the permission of Basler.

Event-based Sensing Enables a New Generation of Machine Vision Solutions

The full version of this article is published at Prophesee's website. This excerpt is reprinted here with the permission of Prophesee.

Event-based sensing is a new paradigm in imaging technology inspired by human biology. It promises to enable a smarter and safer world by improving the ability of machines to sense their environments and make intelligent decisions about what they see.