Taiwanese electronics giant BIOSTAR has become the latest to adopt NVIDIA’s Jetson ecosystem for machine learning and artificial intelligence (ML and AI) at the edge, launching its own carrier board for the Jetson Orin Nano or Jetson Orin NX Super system-on-modules: the AI-NONXS Developer Kit.
“Designed to support NVIDIA Jetson Orin NX and Orin Nano series modules, the AI-NONXS Developer Kit empowers users to efficiently build and deploy AI-powered solutions across a broad range of real-world applications,” BIOSTAR claims of its creation, which stands as a more feature-packed alternative to NVIDIA’s own carrier board design. “From smart manufacture, smart retail, and automated warehouses to smart cities, transportation, and smart agriculture, the AI-NONXS provides the computing power, flexibility, and reliability required for advanced edge systems.”
BIOSTAR has launched a new carrier board for the NVIDIA Jetson Orin Nano and Orin NX, with a wealth of features. (📷: BIOSTAR)
The board itself, brought to our attention by CNX Software, accepts the SODIMM-form-factor modules in NVIDIA's Jetson line, with the precise specifications depending on whether you install the Jetson Orin Nano or Jetson Orin NX Super. The carrier then breaks out the module's functionality to a number of ports, including one gigabit and one 2.5-gigabit Ethernet port, four USB 3.2 Gen 2 ports, a micro-USB port, analog audio in and out, two MIPI Camera Serial Interface (CSI) ports, one each of M-key, E-key, and B-key M.2 slots, an HDMI video output (2.1 if fitted with the Orin NX, 1.4 if using the Orin Nano), and RS232, RS422, RS485, and CAN bus connectivity.
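Because some of the carrier's capabilities shift with the installed module, anyone scripting around it may want to capture those differences in data rather than prose. The sketch below is a minimal, hypothetical helper (the dictionary and function names are our own, not BIOSTAR's) encoding the one per-module difference the spec sheet calls out: the HDMI output version.

```python
# Hypothetical helper: not BIOSTAR software, just a sketch encoding the
# per-module difference noted in the AI-NONXS spec -- the HDMI output
# version depends on which Jetson module is installed on the carrier.
HDMI_VERSION_BY_MODULE = {
    "Jetson Orin Nano": "HDMI 1.4",
    "Jetson Orin NX": "HDMI 2.1",
}


def hdmi_version(module: str) -> str:
    """Return the HDMI version the carrier exposes for a given module."""
    try:
        return HDMI_VERSION_BY_MODULE[module]
    except KeyError:
        raise ValueError(f"Unknown module: {module!r}") from None


print(hdmi_version("Jetson Orin NX"))  # HDMI 2.1
```

The same table could be extended with other module-dependent parameters as they are confirmed against the official datasheet.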
For power, BIOSTAR has designed the carrier board to accept a 12–24V DC input, making it suitable for use in industrial scenarios, though its somewhat limited operating temperature range of -20–60°C (-4–140°F) may restrict exactly where it can be deployed. For software, anything that can run on either the Jetson Orin Nano or Jetson Orin NX Super modules should be compatible — including NVIDIA's own JetPack 6.2 software stack for local execution of large language models (LLMs), vision language models (VLMs), and other artificial intelligence and machine learning workloads.
More information is available on the BIOSTAR website, though at the time of writing the company had yet to announce pricing and availability.