You can now talk to my front garden

A few years ago I created a project called Natural Networks. The idea was to connect environmental data with machine learning - in this case, water quality at points on the canal network.

The ML part was a recurrent neural network trained on literature, generating poetry from sensor data. The outputs were strange and abstract: grammatically clunky, occasionally beautiful, not quite making sense.

But the question underneath it - what happens when you connect environmental data to systems that can interpret and express it? - has stayed with me.

The tools available then made the answer more philosophical than practical. But fast forward a few years and the scope of the question has only increased.

At Agreena, I spent a couple of years working on products built around agricultural and environmental data, linking soil carbon, crop productivity and field practices to enable more sustainable farming. One of the recurring tensions in that work was the gap between data that existed and data that was actually useful to the people making decisions on the ground.

As Dieter Helm puts it: 'Putting in the hands of farmers the tools to see in real time what is happening to their soil, their crops, their water supplies, the run-off, and the water contaminants opens up great scope to adapt to the new farming future.'

The domain changes - agriculture, air quality, urban infrastructure - but the challenge remains constant. Data without application is just evidence. The interesting work is putting that evidence into context, designing it into the right form, and connecting it to something that can change.


Connecting my garden

The front garden project is a small attempt to close that gap in a specific, concrete way.

There's an air quality monitor outside my house feeding a conversational agent in near real time. You can ask it what the air is like right now. You can ask it why particulates tend to spike on still days. You can ask it how today compares to the same time last year. The sensor data isn't sitting in a dashboard waiting to be interpreted; it's connected to something that can reason about it, explain it, and respond to questions.
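As a rough sketch of how a setup like this can be wired together: recent sensor readings get formatted into context that a language model can reason over, alongside the user's question. Everything here - the field names, units, and prompt wording - is an illustrative assumption, not the actual implementation behind the garden agent.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Reading:
    """One sample from a hypothetical air quality sensor."""
    timestamp: datetime
    pm25: float  # fine particulates, ug/m3
    pm10: float  # coarse particulates, ug/m3


def build_prompt(readings: list[Reading], question: str) -> str:
    """Format recent readings as context for a conversational model.

    The string returned here would be sent to whichever model API
    the agent is built on; this sketch stops at prompt construction.
    """
    lines = [
        f"{r.timestamp:%Y-%m-%d %H:%M} PM2.5={r.pm25:.1f} PM10={r.pm10:.1f} ug/m3"
        for r in readings
    ]
    return (
        "You are the voice of a garden air quality monitor.\n"
        "Recent readings:\n"
        + "\n".join(lines)
        + f"\n\nQuestion: {question}\n"
        "Answer using only the readings above; say so if they can't answer it."
    )
```

The point of structuring it this way is that the sensor side and the language side stay decoupled: the sensor just produces timestamped readings, and the agent layer decides how to present them.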

A single sensor in a front garden is still just an experiment. But the same architecture applies at neighbourhood scale, something I'm exploring a few streets away by my kids' school.

The broader opportunity feels significant and still largely untapped. Hardware is getting cheaper and more capable, and products like Ambient Works show there's already a professional appetite for environmental sensing.


The missing layer isn't the sensors. It's the intelligence that sits above them: AI that can reason about what the data means, connect it to context, and feed into systems that can actually act on it.

That might mean responsive buildings that adapt to occupancy and air quality in real time. Neighbourhood monitoring networks that give communities the same environmental intelligence currently only available to researchers. Urban infrastructure that doesn't just log what's happening but responds to it. The pieces - affordable hardware, capable models, agentic frameworks - are largely there.

The design and integration work to make it useful rather than just technically possible is where the interesting problems are.

There's a thread that runs from the RNN poetry experiments years ago to a garden air monitor in Brighton today. The question was always the same; the tools we have now offer a new set of potential answers.

On Process

The sensor enclosure was modelled in Blender using Claude MCP, and printed with some help from the good people in the workshop at Plus X. The agent itself was built with the help of an easy-to-follow tutorial from Chris Downs.



Have a chat with my garden here: https://garden.jamescuddy.co.uk

Get in touch

contact@jamescuddy.co.uk


LinkedIn