The NXP eIQ ML (edge intelligence machine learning) software environment provides tools to perform inference on embedded systems using neural network models. The software includes optimizations that leverage the hardware capabilities of the i.MX 8M Nano family for improved performance. Typical applications of neural network inference include object and pattern recognition, gesture control, voice processing, and sound monitoring.

eIQ includes support for four inference engines: TensorFlow Lite, Arm NN, ONNX Runtime, and OpenCV.

Performance numbers documented by NXP were measured on the i.MX 8M Plus, an SoC with a dedicated Neural Processing Unit (NPU). The ConnectCore 8M Nano has no NPU, so expect lower performance; differences in CPU speed and memory bus width also affect results.

Include eIQ packages in Digi Embedded Yocto

Edit your conf/local.conf file to include the eIQ package group in your Digi Embedded Yocto image:

conf/local.conf
IMAGE_INSTALL:append = " packagegroup-imx-ml"

This package group contains all of NXP’s eIQ packages compatible with the ConnectCore 8M Nano.

Including this package group increases the size of the rootfs image significantly. To minimize the increase in image size, select a subset of its packages depending on your needs. See the package group’s recipe for more information on the packages it contains.
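For example, to install only the TensorFlow Lite runtime instead of the full package group, you could add a single package to your image (the recipe name tensorflow-lite comes from NXP's meta layers; verify it against the recipes in your Digi Embedded Yocto release):

conf/local.conf
IMAGE_INSTALL:append = " tensorflow-lite"

The same pattern applies to any other package from the group: append only the packages your application needs.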

More information

See NXP’s i.MX Machine Learning User’s Guide for more information on eIQ.