
Edge AI Smart Agriculture Device

Field-Deployable Crop & Soil Monitor

Status: Ongoing
Category: Edge AI & IoT
Power Budget: Solar + 18650 pack
On-Device Inference: ~12 ms / sample
Uplink: LoRaWAN / Wi-Fi
Sensor Channels: 8

Overview

A weather-sealed device built around an ESP32-S3 + Coral micro-TPU. Capacitive soil-moisture probes, an SHT35 ambient sensor, and a small multispectral leaf camera feed a TFLite model that scores irrigation need and early disease risk on-device. Data is buffered to flash, summarized hourly, and uplinked via LoRaWAN (or Wi-Fi when available) to a self-hosted dashboard. A companion Flutter app exposes per-plot insights and lets the farmer trigger irrigation valves over MQTT.
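
As a concrete sketch of what one buffered record and the valve-control path might look like, here is an illustrative layout; the HourlySummary fields, encodings, and topic strings are assumptions of mine, not the shipping schema:

// Hypothetical hourly summary record and MQTT topic layout (illustrative).
#include <cstdint>

struct HourlySummary {
    uint32_t epoch_hour;        // UTC hour bucket the record summarizes
    int16_t  soil_moisture_q88; // mean capacitive reading, Q8.8 fixed point
    int16_t  air_temp_c_x10;    // SHT35 temperature in 0.1 degC steps
    uint8_t  air_rh_pct;        // SHT35 relative humidity, percent
    uint8_t  irrigate_p_pct;    // model irrigation-need score, 0-100
    uint8_t  disease_p_pct;     // model disease-risk score, 0-100
};

// Assumed topic layout on the shared broker (names are illustrative):
//   farm/<plot>/summary     node publishes one HourlySummary per hour
//   farm/<plot>/valve/cmd   app requests a valve-open duration in seconds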

The Problem

Farms that would benefit most from precision agriculture sit outside reliable cellular coverage and can't run cloud-dependent IoT. Existing field sensors are dumb data loggers — they push raw readings and leave the farmer to interpret them. The goal was a device that decides locally, alerts only when it matters, and keeps working offline for days.

The Approach

An ESP32-S3 runs FreeRTOS with a tight power-managed sample-infer-uplink cycle. Soil moisture (capacitive), SHT35 ambient T/RH, and a 4-band multispectral leaf camera produce a feature vector consumed by a quantized TFLite Micro model trained against agronomist-labelled data. Inference runs on a Coral micro-accelerator in ~12 ms; uplink uses LoRaWAN class A by default with opportunistic Wi-Fi. The Flutter app and self-hosted Grafana dashboard share a single MQTT broker, so valve control and visualization stay consistent.
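
A minimal sketch of the uplink-then-sleep leg of that cycle, assuming hypothetical uplink_try_wifi() / lorawan_send() wrappers and a 15-minute window; only the esp_sleep_* calls are standard ESP-IDF API:

// One plausible shape for the power-managed cycle: opportunistic Wi-Fi,
// LoRaWAN Class A fallback, then timed light sleep (RAM is retained, so
// the inference loop in the Code Highlight below resumes where it left off).
#include <cstddef>
#include <cstdint>
#include "esp_sleep.h"  // ESP-IDF sleep API

bool uplink_try_wifi(const uint8_t* buf, size_t len);  // hypothetical wrapper
void lorawan_send(const uint8_t* buf, size_t len);     // hypothetical wrapper

static constexpr uint64_t kWindowUs = 15ULL * 60 * 1000 * 1000;  // 15 min, assumed

void uplink_and_sleep(const uint8_t* payload, size_t len) {
    if (!uplink_try_wifi(payload, len)) {
        lorawan_send(payload, len);  // fire-and-forget Class A uplink
    }
    esp_sleep_enable_timer_wakeup(kWindowUs);  // RTC timer as wake source
    esp_light_sleep_start();                   // radios/CPU gated until timer fires
}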

Results

A device that runs unattended on solar + a small 18650 pack, surfaces actionable irrigation/disease alerts instead of raw telemetry, and degrades gracefully when uplink is lost. Early field trials show LoRa battery life measured in weeks, not hours.
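
The graceful degradation rests on buffering records while the uplink is down and draining them oldest-first on reconnect. A minimal in-RAM sketch of that queue (PendingRing is a hypothetical name; the real node would back it with flash so records survive sleep):

// Fixed-size ring of pending uplink records; overwrites the oldest when
// full so the freshest telemetry always survives a long outage.
#include <array>
#include <cstddef>

template <typename T, size_t N>
class PendingRing {
    std::array<T, N> slots_{};
    size_t head_ = 0, count_ = 0;
public:
    void push(const T& rec) {
        slots_[(head_ + count_) % N] = rec;
        if (count_ < N) ++count_;
        else head_ = (head_ + 1) % N;  // full: drop the oldest record
    }
    bool pop(T& out) {                 // drain oldest-first on reconnect
        if (count_ == 0) return false;
        out = slots_[head_];
        head_ = (head_ + 1) % N;
        --count_;
        return true;
    }
};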

Code Highlight

ESP32-S3 — TFLite Micro Inference Loop
// Edge AI Smart Agri Node — irrigation-need inference
// Runs on ESP32-S3 + Coral micro accelerator

#include <cstdlib>  // abort()

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "sensors.h"
#include "uplink.h"
#include "model_data.h"

// 96 KB arena holds every tensor the model needs; sized empirically.
static constexpr int kArenaBytes = 96 * 1024;
static uint8_t tensor_arena[kArenaBytes];

void run_node(void) {
    tflite::AllOpsResolver resolver;
    const tflite::Model* model = tflite::GetModel(g_model_data);
    tflite::MicroInterpreter interp(model, resolver,
                                    tensor_arena, kArenaBytes);
    if (interp.AllocateTensors() != kTfLiteOk) abort();  // arena too small

    TfLiteTensor* in  = interp.input(0);   // packed float feature vector
    TfLiteTensor* out = interp.output(0);  // [irrigate_p, disease_p]

    for (;;) {
        SensorFrame f = sensors_sample();   // soil, SHT35, NDVI bands
        feature_pack(f, in->data.f);        // normalize into the input tensor

        if (interp.Invoke() != kTfLiteOk) {
            // Skip this window on failure, but still sleep so a wedged
            // model can't spin the CPU and drain the pack.
            rtos_sleep_until_next_window();
            continue;
        }

        float irrigate_p = out->data.f[0];
        float disease_p  = out->data.f[1];

        // Act locally; only alert upstream when the score is decisive.
        if (irrigate_p > 0.7f) valve_open(IRRIG_DURATION_S);
        if (disease_p  > 0.6f) uplink_alert(ALERT_DISEASE, disease_p);

        uplink_buffer(f, irrigate_p, disease_p);  // flash-backed queue
        rtos_sleep_until_next_window();           // power-managed idle
    }
}

Like what you see?

I'm always open to collaborations on AI, robotics, edge computing, or embedded systems.