I-Pi OSM i.MX93 — Compact Edge AI Powerhouse with eIQ Integration

The I-Pi OSM i.MX93 brings next-generation edge AI capability to an ultra-compact OSM form factor. Designed for developers building AI-driven products, it pairs NXP’s i.MX93 processor (dual-core Arm® Cortex-A55 plus an Ethos-U65 microNPU) with ADLINK’s production-ready Yocto BSP, now enhanced with NXP eIQ™ AI Toolkit support out of the box.

This platform gives customers a fast, low-risk path to build smart, connected, and power-efficient devices across industrial, medical, retail, and IoT markets.

Edge AI Made Practical

With integrated eIQ ML support, customers can go from model development to deployment without friction.

Built-in eIQ capabilities include:

  • TensorFlow Lite, Arm NN, and ONNX Runtime
  • Model quantization, optimization, and benchmarking tools
  • Deployment-ready ML pipelines
  • Pre-tested eIQ demo application in the ADLINK Yocto image
    (see ADLINK’s ML Demo procedure)
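
As a rough illustration of the runtime support listed above, the sketch below runs a quantized TensorFlow Lite model through tflite_runtime with the Ethos-U delegate so that supported operators execute on the Ethos-U65 NPU. This is a minimal sketch, not the demo application itself: the delegate path, model file name, and input handling are assumptions, so check your ADLINK Yocto image and the ML Demo procedure for the exact locations.

```python
# Minimal on-target inference sketch. Assumptions: the ADLINK Yocto image ships
# tflite_runtime and an Ethos-U external delegate at the path below, and a
# quantized model file is already on the board. Adjust names for your setup.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

MODEL_PATH = "mobilenet_v1_quant.tflite"          # hypothetical model file
DELEGATE_PATH = "/usr/lib/libethosu_delegate.so"  # assumed delegate location

# Supported ops run on the Ethos-U65 NPU; anything else falls back to the CPU.
interpreter = Interpreter(
    model_path=MODEL_PATH,
    experimental_delegates=[load_delegate(DELEGATE_PATH)],
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed a dummy frame shaped and typed like the model input (e.g. 1x224x224x3 uint8).
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()

scores = interpreter.get_tensor(out["index"])[0]
print("Top class index:", int(np.argmax(scores)))
```

The same interpreter code runs unmodified on the CPU if the delegate is left out, which is a convenient way to compare NPU and CPU latency on the board.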

This dramatically shortens the “prototype → PoC → product” cycle by eliminating manual integration and toolchain setup.

Key Hardware Advantages

• i.MX93 CPU + NPU:
Dual Cortex-A55 (up to 1.7 GHz) with Ethos-U65 microNPU for real-time ML acceleration.

• Ultra-compact OSM Size-S module:
Reliable, solder-down design ideal for high-volume products.

• Rich Connectivity:
USB-C OTG, Ethernet, Wi-Fi/BT, MIPI-CSI/DSI, I²C, SPI, and UART—ready for sensors, cameras, and displays (a simple camera-capture sketch follows this list).

• Industrial-grade reliability:
Robust thermal performance and long lifecycle availability.

• ADLINK-quality Yocto BSP:
Stable, production-ready, and maintained by ADLINK’s embedded Linux team.
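
To give a feel for the camera path mentioned above, here is a minimal capture sketch. It assumes OpenCV is available in the image and that the MIPI-CSI camera enumerates as V4L2 device 0; the device index, resolution, and OpenCV availability all depend on your camera module and BSP configuration.

```python
# Minimal camera-capture sketch. Assumptions: OpenCV (cv2) is installed in the
# image and the MIPI-CSI camera shows up as V4L2 device 0; both depend on your
# camera module and BSP configuration.
import cv2

cap = cv2.VideoCapture(0)                  # hypothetical V4L2 device index
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

ok, frame = cap.read()                     # frame is a BGR NumPy array
cap.release()

if ok:
    cv2.imwrite("capture.jpg", frame)      # quick sanity check of the camera path
    print("Captured frame:", frame.shape)
else:
    print("No frame captured; check the camera and device index.")
```

A frame captured this way can be resized and handed straight to the TFLite interpreter shown earlier.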

Fast ML Deployment Workflow

Video: NXP eIQ Software Development Environment Introduction (poster image: https://www.nxp.com//videos/poster/TIP-EIQ-SOFTWARE-DEVELOPMENT-ENVIROMENT-INTRO.jpg)

  1. Develop & Train ML Models
    Build classification, detection, anomaly-detection, or semantic-segmentation models with the eIQ Toolkit (a training-and-quantization sketch follows this list).
  2. Optimize & Quantize
    Convert models to low-footprint formats suitable for the NPU.
  3. Package for Yocto
    Use ADLINK’s pre-configured Yocto layer with eIQ already enabled.
  4. Deploy to I-Pi OSM i.MX93
    Run your optimized ML models directly on the NPU for low-latency performance, as in the inference sketch under “Edge AI Made Practical” above.
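
As a concrete illustration of steps 1 and 2, the sketch below trains a tiny Keras image classifier and converts it to a full-integer TensorFlow Lite model on a host PC. It is a minimal sketch under stated assumptions: the model architecture, placeholder training data, and file names are illustrative only, and the eIQ Toolkit provides equivalent GUI and command-line flows for the same steps.

```python
# Minimal sketch of steps 1-2, assuming TensorFlow/Keras on a host PC. The tiny
# model, random placeholder data, and file names are illustrative only.
import numpy as np
import tensorflow as tf

# Step 1: build and train a small image classifier (placeholder data).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(96, 96, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation="softmax"),    # 4 example classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
x = np.random.rand(64, 96, 96, 3).astype(np.float32)   # replace with real data
y = np.random.randint(0, 4, size=(64,))
model.fit(x, y, epochs=1, verbose=0)

# Step 2: full-integer post-training quantization for the Ethos-U65 NPU.
def representative_data():
    for sample in x[:32]:
        yield [sample[np.newaxis, ...]]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("classifier_int8.tflite", "wb") as f:
    f.write(converter.convert())
print("Wrote classifier_int8.tflite, ready for deployment on the NPU.")
```

The resulting .tflite file is what step 3 packages into the Yocto image and step 4 runs on the NPU through the Ethos-U delegate.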

Try the eIQ® Toolkit on the ADLINK OSM-i.MX93 Now

If you are curious about what ADLINK’s NXP-based platforms have to offer, you can try the demos on our development kit right now by visiting our ADLINK NXP repository wiki page.