Grove Vision AI V2 (WE2) without Arduino SSCMA library

Hi there,

Really good work shaking it down! I asked that same custom-code question in the live stream when the WE2 first showed up.

Answer: No, not directly.

  • The Grove Vision AI V2 is locked to model invocation and fixed AT-style interactions.
  • It does NOT run arbitrary user code unless you recompile the firmware (which is undocumented and unsupported). :dizzy_face:
  • You can load new models via SD card or serial, but not full custom logic beyond what the AT command set exposes.

So for any real-time decision-making, data filtering, or dynamic response, you’ll need to offload that to a host MCU. And if you are BOLD and capable (seems you are) :grin:, tomorrow’s announcement may make your day.

  • You’re using a RAK3172 (STM32-based) MCU module with SX1262 LoRa radio and RAK’s RUI firmware.
  • The firmware does not transparently forward AT commands (UART/I2C), which blocks communication with the Grove Vision AI V2.
  • You are trying to send AT+MODELS? and AT+INVOKE=1,0,1 to the Vision AI camera.
  • You need a microcontroller that can parse logic AND handle AI accessory communication reliably.

Who Knew the best solution would be delivered on a Platter… Served up by those Awesome Seeedineers

Here is what’s on your menu… :stuck_out_tongue:

1. Dual-core MCU (Arm Cortex-M33 + RISC-V coprocessor)

  • The main Cortex-M33 can handle LoRa or BLE communication plus the AI/AT command logic, while the RISC-V coprocessor (FLPR) offloads time-critical peripheral work.
  • Massive step up from single-core STM32 MCUs like the one in the RAK3172.

2. Generous RAM & Flash

  • Enough space for rules-based logic, custom applications, and maybe even basic ML models locally (Edge Impulse has nRF54 support now).
  • Also large enough to buffer or translate AT command streams between interfaces.
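Since buffering and re-framing the AT stream is exactly what a bridge spends most of its time doing, here is a minimal, host-testable sketch of a line assembler for an AT response stream. It is plain C with function names of my own choosing (no SDK calls), so you can unit-test the logic before it ever touches hardware:

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

#define LINE_MAX 256

struct at_line_buf {
    char buf[LINE_MAX];
    size_t pos;
};

/* Feed one received byte. Returns true when a complete, non-empty line
 * has been copied into out. Carriage returns are ignored; if the buffer
 * fills before a newline arrives, the line is emitted as-is. */
static bool at_feed_byte(struct at_line_buf *lb, char c,
                         char *out, size_t out_len)
{
    if (c == '\r') {
        return false;
    }
    if (c == '\n' || lb->pos >= LINE_MAX - 1) {
        lb->buf[lb->pos] = '\0';
        strncpy(out, lb->buf, out_len - 1);
        out[out_len - 1] = '\0';
        lb->pos = 0;
        return lb->buf[0] != '\0';  /* skip blank lines */
    }
    lb->buf[lb->pos++] = c;
    return false;
}
```

On the device you would call at_feed_byte() from the UART RX interrupt and hand completed lines to your response handler.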

3. No Vendor Lock-in

  • Unlike the RUI firmware from RAK, you are in full control of UART, I2C, GPIO, and timing behavior.
  • This makes it ideal for building bridges between external devices like the Grove Vision AI and a LoRa radio.

4. Flexible Peripheral Routing

  • Multiple UART/I2C/SPI interfaces can be routed freely.
  • Perfect for creating a transparent AT pass-through from USB or BLE to the Vision AI.

5. Works with Zephyr and PlatformIO

  • If you want modern RTOS capabilities with great toolchain support.
  • Debugging, OTA, and peripheral control are miles ahead compared to most AT-centric firmware like RUI.

That is Why the nRF54L15 Sense Would Work :shushing_face:
(it comes out tomorrow)

Welcome to level 5 :raised_hand_with_fingers_splayed:

I asked AI for an architecture: WOW…
Are you seated? You probably should be :grin:

Check it out:

[ Grove Vision AI V2 ]
        ↕ (UART AT commands)
[ nRF54L15 Sense ]
        ↕
   [ LoRa or BLE output ]
        ↕
[ Cloud / Dashboard ]

  • The nRF54L15 can poll the camera, parse responses (AT+MODELS?, AT+INVOKE, etc.), and take actions accordingly.
  • If you want to send AI events over BLE, LoRa, or even USB serial, the 54L15 can handle all of that in one device.
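A few lines of parsing go a long way here. The exact reply format depends on the SSCMA firmware version (the real responses are JSON-ish), so this sketch assumes a simplified label:score shape purely for illustration; the function name is my own:

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Look for "<label>:<score>" in a response line and return true when the
 * label appears with a confidence at or above the threshold.
 * NOTE: the "label:score" shape is an assumption for illustration --
 * check the actual SSCMA reply format on your firmware build. */
static bool detection_above(const char *resp, const char *label,
                            int threshold)
{
    const char *hit = strstr(resp, label);
    if (!hit) {
        return false;
    }
    int score = 0;
    if (sscanf(hit + strlen(label), ":%d", &score) != 1) {
        return false;
    }
    return score >= threshold;
}
```

Drop something like this into the response handler and the "take actions accordingly" part becomes a one-line check.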

The nRF54L15 Sense is an ideal upgrade path for this application. It solves the transparency issue, adds real compute power, and removes the dependency on vendor AT firmware limitations.

If you’re open to leaving the RAK3172 behind and going full custom MCU, this is the move.

Here’s how to build a transparent AT bridge + logic controller using the nRF54L15 Sense with the Grove Vision AI V2 over UART. You’ll get full control of communication and can layer your own logic on top (e.g. rule-based triggers, BLE alerts, LoRa messages, etc.).

Project Goal:

Use the nRF54L15 Sense to:

  1. Talk to the Grove Vision AI V2 via UART using AT commands.
  2. Read & parse responses (e.g., models detected).
  3. Apply basic logic to the results.
  4. Optionally forward results over BLE, USB serial, or LoRa (if added via module).

Device Connection Notes

  • Grove Vision AI V2 UART: TX ↔ RX, RX ↔ TX, GND ↔ GND
  • nRF54L15 Sense UART: use a hardware UART (e.g. UARTE0)
  • Optional USB serial: for debugging or a pass-through console
  • Optional BLE: notify a smartphone or central app

Dependencies

You’ll want to use the nRF Connect SDK (NCS) with Zephyr RTOS:

  • uart_async_api for UART comms
  • console or logging for debug output
  • bt_nus or ble peripheral for BLE if needed
  • k_work for deferred logic handling

Example: AT Command Bridge + Simple Logic

Here’s a simplified Zephyr C example (assumes UARTE0 connected to Grove Vision AI):

// REV 0.1a - AT Bridge & Trigger Logic - PJG + ChatGPT
#include <zephyr/kernel.h>
#include <zephyr/device.h>
#include <zephyr/drivers/uart.h>
#include <zephyr/sys/printk.h>
#include <string.h>

/* Adjust the node label to match your board's devicetree
 * (some nRF54 boards name the instance uart20, not uart0). */
#define UART_DEV_NODE DT_NODELABEL(uart0)
static const struct device *const uart_dev = DEVICE_DT_GET(UART_DEV_NODE);

#define CMD_BUF_SIZE 256
static char uart_rx_buf[CMD_BUF_SIZE];
static int rx_pos;

static void send_at_command(const char *cmd)
{
    /* Polled TX keeps things simple and avoids mixing the async
     * uart_tx() API with the interrupt-driven RX path below. */
    for (const char *p = cmd; *p != '\0'; p++) {
        uart_poll_out(uart_dev, *p);
    }
    uart_poll_out(uart_dev, '\r');
    uart_poll_out(uart_dev, '\n');
    printk("Sent AT cmd: %s\n", cmd);
}

static void process_ai_response(const char *resp)
{
    /* Example logic: check if "person" was detected */
    if (strstr(resp, "person")) {
        printk(">>> PERSON DETECTED <<<\n");
        /* TODO: trigger BLE alert or LoRa packet here */
    }
}

static void uart_cb(const struct device *dev, void *user_data)
{
    uint8_t c;

    ARG_UNUSED(user_data);

    /* Required by the interrupt-driven UART API before FIFO access */
    if (!uart_irq_update(dev) || !uart_irq_rx_ready(dev)) {
        return;
    }

    while (uart_fifo_read(dev, &c, 1) == 1) {
        if (c == '\n' || rx_pos >= CMD_BUF_SIZE - 1) {
            uart_rx_buf[rx_pos] = '\0';
            printk("RX: %s\n", uart_rx_buf);
            process_ai_response(uart_rx_buf);
            rx_pos = 0;
        } else if (c != '\r') {
            uart_rx_buf[rx_pos++] = c;
        }
    }
}

int main(void)
{
    if (!device_is_ready(uart_dev)) {
        printk("UART not ready\n");
        return 0;
    }

    printk("AT Bridge Booted. UART ready.\n");

    uart_irq_callback_user_data_set(uart_dev, uart_cb, NULL);
    uart_irq_rx_enable(uart_dev);

    k_msleep(500);

    /* List the loaded models once, then poll for inference */
    send_at_command("AT+MODELS?");
    while (1) {
        send_at_command("AT+INVOKE=1,0,1");
        k_sleep(K_SECONDS(10));
    }
    return 0;
}

Result:

  • UART initializes and communicates with Grove Vision AI.
  • Every 10 seconds, it sends AT+INVOKE to run inference.
  • If "person" is in the response, the MCU prints it — or triggers a response like BLE notify, buzzer, etc.

Optional Logic Layer

Add a logic queue, such as:

if (strstr(resp, "car") && !strstr(resp, "person")) {
    /* Suspicious vehicle, no person -- trigger_alert() is your own hook */
    trigger_alert();
}
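One gotcha with a 10-second polling loop: a person standing in frame will retrigger the alert on every poll. A tiny edge-detect guard fixes that. This is a plain-C sketch with names of my own choosing, easy to test on the host:

```c
#include <stdbool.h>

/* Fire the alert only on the rising edge of a detection, so a person
 * standing in frame doesn't retrigger on every 10-second poll. */
struct edge_trigger {
    bool last;
};

static bool should_alert(struct edge_trigger *t, bool detected_now)
{
    bool fire = detected_now && !t->last;
    t->last = detected_now;
    return fire;
}
```

Wrap trigger_alert() (or a BLE notify / LoRa send) in should_alert() and the alert fires once per event, not once per poll.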

WOWsa, I say it nailed it, but I love the added bonus:

Bonus Features You Can Add:

  • BLE output: use the bt_nus service to notify a mobile device
  • LoRa packet: send a short encoded packet to HomeBase
  • OLED display: show the current detection model name / state
  • Button override: add a physical input to trigger an AI mode switch
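For the LoRa packet idea, keep the payload tiny. Here is a sketch of a 6-byte event encoder in plain C; the byte layout (class id, confidence, uptime) is entirely my own invention, not a RAK or Seeed format, so define whatever your HomeBase decoder actually expects:

```c
#include <stddef.h>
#include <stdint.h>

/* Pack a detection event into a compact 6-byte big-endian payload.
 * Layout (illustrative, my own choice):
 *   [0] class id   [1] confidence 0-100   [2..5] uptime in seconds */
static size_t encode_event(uint8_t *out, size_t cap,
                           uint8_t class_id, uint8_t confidence,
                           uint32_t uptime_s)
{
    if (cap < 6) {
        return 0;  /* buffer too small */
    }
    out[0] = class_id;
    out[1] = confidence;
    out[2] = (uint8_t)(uptime_s >> 24);
    out[3] = (uint8_t)(uptime_s >> 16);
    out[4] = (uint8_t)(uptime_s >> 8);
    out[5] = (uint8_t)(uptime_s);
    return 6;
}
```

Six bytes fits comfortably inside any LoRa payload budget, even at the slowest spreading factors.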

What to Prepare

  • :white_check_mark: Flash nRF54L15 with west flash (or CMSIS-DAP via PlatformIO soon)
  • :white_check_mark: Confirm Grove Vision AI is powered & UART wired correctly
  • :white_check_mark: Run PuTTY or another serial terminal to watch the logs
  • :white_check_mark: Ready to add your BLE/LoRa hooks

Ready, set… GO! :face_with_hand_over_mouth:

HTH
GL :slight_smile: PJ :v:

Like that Home Alone kid: “you hungry, and want some more?”

the wiki is LIVE!
Get a serious Libation :grin: of your choice, get the space quiet and conducive to learning and taking over the world :v:

GL Mr. Phelps…This tape will self destruct in 5 seconds. Poof! :sunglasses: