8M Plus Edge AI Kit Linux Yocto User Manual V1.0
Copyright Statement:
• The Edge AI Kit and its related intellectual property are owned by Avnet Manufacturing Services.
• Avnet Manufacturing Services holds the copyright of this document and reserves all rights. No part of this document may be modified, distributed or duplicated in any form or by any means without the written permission of Avnet Manufacturing Services.
Revision History

Rev.   Description       Author   Date
v1.0   Initial version   Monica   2022/12/20
Software .................................................. 21
  eIQ ..................................................... 21
    4.1.1 eIQ inference runtime overview for i.MX8M+ ....... 21
    4.1.2 TensorFlow Lite ................................... 21
Chapter 5 Appendix ......................................... 25
  Hardware Documents ....................................... 25
  Software Documents ....................................... 25
  Linux System Image and Application Development ........... 25
    5.3.1 Out of box System Image ........................... 25
    5.3.2 Yocto BSP ......................................... 25
    5.3.3 eIQ ............................................... 25
  Contact Information
Chapter 1 Introduction

1.1 Target Board
The Edge AI Kit is a development board developed by Avnet, based on the i.MX 8M Plus processor from NXP.

1.2 Introduction
This document provides a guide to preparing the Edge AI Kit to boot with the Verified Linux Package and introduces how to use the supported functions.
2.1.2 Hardware Preparation
You'll need a small Phillips screwdriver to complete the following steps:
1. Plug the Avnet i.MX 8M Plus SMARC SOM into the EP5 carrier board at an angle, then push down so it rests flat on the standoffs.
2. Use the provided screws to attach the heat sink to the Avnet i.MX 8M Plus SMARC SOM.
3. Connect the 12 V, 3 A power connector to the board, but do not plug the supply into mains power yet.
2.1.4 Software Tools Preparation
Install Tera Term terminal software
• For Windows-based command-line debug output and command entry, the Tera Term terminal software is recommended.
• Download and install teraterm-***.exe and configure the relevant COM port as shown below:
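The serial settings themselves are not reproduced here; as a working assumption (the usual default for the i.MX debug UART, not taken from this manual), configure:
• Baud rate: 115200
• Data: 8 bit, Parity: none, Stop bits: 1, Flow control: none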
The desired boot mode can be selected via a 4-pole DIP switch located next to the micro-USB port on the SMARC carrier. The following table describes the available boot options for the Edge AI Kit. See the i.MX 8M Plus Applications Processor Reference Manual from NXP for a complete description.
2.2.2 Boot from SD Card

2.2.2.1 Downloading operating system images
Visit the downloads page to download the out-of-box Yocto image: http://avnet.me/imx8mplus-edgeai

2.2.2.2 Flash the SD Card
You will need a 16+ GB microSD card and an adapter. Download and install the flash tool –...
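The name of the flash tool is cut off above. As a generic alternative on a Linux host, the sketch below writes the image with dd; the image file name is a placeholder (it assumes a raw/.wic image) and /dev/sdX must be replaced with your SD card's actual device node, e.g. found with lsblk:

# WARNING: dd overwrites the target device – double-check the device node first
sudo dd if=<downloaded-image>.wic of=/dev/sdX bs=1M status=progress conv=fsync
sync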
When the system boots up, Tera Term will print the following:

Avnet Embedded Strudel Distro 0.1.0-ff2970817ef8e5172d4b6593e31cd82758a1477f sm2s-imx8mp ttymxc1
sm2s-imx8mp login:

• Enter the username "root"; no password is required.
• The Linux system interface also supports a directly attached keyboard and mouse.
• ...
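Once logged in, a quick sanity check (generic Linux commands, not specific to this manual) confirms which kernel and distribution actually booted:

uname -a             # kernel version and build information
cat /etc/os-release  # distribution name and version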
3.1.1 Available Device Tree overlays
These are the overlays for the devices included in the Edge AI Kit. For overlays for additional compatible devices, see the MSC-LDK Manual from Avnet Embedded.

SOM overlay: msc-sm2s-imx8mp-24N0600I-module.dtb
Carrier overlay: overlay-baseboard-ep5.dtb...
By default, both HDMI and LVDS are selected in fdt_overlay, so HDMI should work out of the box. In U-Boot, use the printenv command to check:

u-boot=> printenv fdt_overlay
fdt_overlay=overlay-lvds0-ama-101a01.dtb overlay-hdmi.dtb overlay-cam1-ap1302-ar1335-dual.dtbo

If you wish to use only HDMI and not LVDS, set the environment like this:

setenv fdt_overlay 'overlay-hdmi.dtb overlay-cam1-ap1302.dtb'
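To make the overlay selection persist across power cycles, you would typically also save the U-Boot environment. This is a sketch using standard U-Boot commands; whether the environment is actually writable depends on how the board's environment storage is configured:

u-boot=> setenv fdt_overlay 'overlay-hdmi.dtb overlay-cam1-ap1302.dtb'
u-boot=> saveenv    # write the environment back to storage
u-boot=> boot       # continue booting with the new overlay selection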
CMOS active-pixel digital image sensors with a pixel array of 4208 (H) x 3120 (V). The modules are manufactured by Rapyrus. The AR1335 digital image sensor features 1.1 µm pixel technology that delivers superior low-light image quality through leading sensitivity, quantum efficiency and linear full well.
: set ar0144 dual config

3.3.3 Configure Dual vs Single Image Sensors
The Edge AI Kit currently supports both single and dual camera configurations. Dual or single cameras can be configured using device tree overlays from U-Boot, like so:
To configure dual AR0144, use this command and then reboot:

ap1302-cfg.sh ar0144-dual

3.3.5 Troubleshooting Cameras
You can check that both cameras boot with the status command:

root@sm2s-imx8mp:~# ap1302-cfg.sh status
U-Boot Env overlay configuration
Cannot read environment, using default
--list-devices
        /dev/v4l-subdev0
FSL Capture Media Device (platform:mxc-md):
        /dev/media0
vsi_v4l2dec (platform:vsi_v4l2dec):
        /dev/video1
vsi_v4l2enc (platform:vsi_v4l2enc):

Check the CSI interface:

root@sm2s-imx8mp:~# media-ctl -p

The last command that can be run is the status/log of the CSI interface, to check for ECC errors:
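Independent of the CSI status/log check, one more way to sanity-check a camera is to grab a single frame with GStreamer and inspect it. This is a sketch only, not taken from the manual; the device node (/dev/video2) and the caps are assumptions and may differ with your overlay configuration:

# Capture one frame from the camera and save it as a JPEG for inspection
gst-launch-1.0 v4l2src device=/dev/video2 num-buffers=1 ! \
    video/x-raw,width=1280,height=720 ! videoconvert ! jpegenc ! \
    filesink location=/tmp/capture-test.jpg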
NXP's processor by using delegates to accelerate supported operations in hardware. Read more about how delegates work in eIQ here: http://avnet.me/maaxboard-ml-delegates

The following five inference engines are currently supported in the NXP eIQ software stack: TensorFlow Lite, ONNX Runtime, PyTorch, DeepView RT, and OpenCV.
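As an illustration of delegate-based acceleration with TensorFlow Lite, the sketch below runs the image-classification example that NXP eIQ images typically install under /usr/bin/tensorflow-lite-<version>/examples through the VX external delegate. The file names and the delegate path are assumptions – adjust them to what your image actually ships:

cd /usr/bin/tensorflow-lite-2.6.0/examples
# Run the bundled classification example through the VX (NPU) external delegate
./label_image -m mobilenet_v1_1.0_224_quant.tflite -i grace_hopper.bmp -l labels.txt \
    --external_delegate_path=/usr/lib/libvx_delegate.so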
4.1.2.4 Verify whether acceleration is running on NPU or GPU
To verify whether hardware acceleration is running on the VeriSilicon NPU, you can look at the interrupt count for the galcore:3d driver before and after running your model with the following command:

root@sm2s-imx8mp:/usr/bin/tensorflow-lite-2.6.0/examples# cat /proc/interrupts | grep galcore:3d
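Building on that command, a minimal before/after workflow looks like the sketch below; the model invocation is the same assumed label_image example as above – substitute your own model run:

cd /usr/bin/tensorflow-lite-2.6.0/examples
cat /proc/interrupts | grep galcore:3d > /tmp/irq_before.txt
./label_image -m mobilenet_v1_1.0_224_quant.tflite -i grace_hopper.bmp -l labels.txt \
    --external_delegate_path=/usr/lib/libvx_delegate.so
cat /proc/interrupts | grep galcore:3d > /tmp/irq_after.txt
diff /tmp/irq_before.txt /tmp/irq_after.txt   # compare the galcore:3d interrupt counts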
Note that the video examples take a path to a video file as input (720p30 video is recommended), while the camera examples target a single MIPI camera. For example, for the Mobilenet SSD Camera example, run the following script:

gstnninferencedemo-mobilenet-ssd-camera
The BSP is currently available by request only. To access the board support package (BSP) and build your own Yocto image for the Edge AI Kit, contact support.boards@avnet.com. In your email, please clearly state that you are working with the Edge AI Kit and would like access to its BSP.