Mop Bucket @mopbucket.bsky.social

So the meters already had the automated reading system installed, and you just wrote a script that calls the API and puts the data in the utility's DB?

aug 23, 2025, 8:15 pm

Replies

harphacker.bsky.social @harphacker.bsky.social

No, the utility meters were not involved. We stood up an OpenADR system that broadcast the prices and collected power consumption from home batteries and EVs. We also had a bidirectional V2G charger on the program. These were billed separately from the primary meter.

aug 23, 2025, 8:18 pm
Mop Bucket @mopbucket.bsky.social

I think we're talking about different systems... how does the data get from the meters to your local water company?

aug 23, 2025, 8:25 pm
harphacker.bsky.social @harphacker.bsky.social

Ah ok. Yes, for the water company each meter has a Raspberry Pi Pico that reads the switch closure on the meter and counts gallons. It wakes up periodically and communicates with another Pico that is on a Meshtastic LoRa network. We are installing the devices and setting up the network.

aug 23, 2025, 8:30 pm
Mop Bucket @mopbucket.bsky.social

Wait, you budgeted for an in-house hardware solution but wouldn't have bothered writing the data collection software without an LLM? Does this water co. serve more than, like, 25 homes?

aug 23, 2025, 8:38 pm
harphacker.bsky.social @harphacker.bsky.social

Less. Maybe 5-10. But yes. This is me and another person doing the development, and we want it to be as cheap as possible. I used the LLM to get the two devices talking to each other and to get the power management sorted. I don't have the coding chops to write the code from scratch. But I can debug.

aug 23, 2025, 8:42 pm
Damien Tonkin @madscitool.bsky.social

So you didn't actually need the LLM, you just didn't personally know how to code it? Every time I want to code something and I don't know how, I learn something which I can take into other projects. Because you used LLMs to do it for you, you're no better off than you were before in terms of knowledge.

aug 24, 2025, 1:15 am
harphacker.bsky.social @harphacker.bsky.social

The LLM is often wrong or writes buggy code. But stepping through it with a debugger gives me a good sense of what it is supposed to do. It still takes time and effort. However, I still think it is a "hit the ground running" kind of start rather than a self-bootstrap. It's an assistant.

aug 24, 2025, 1:53 am
harphacker.bsky.social @harphacker.bsky.social

Could I do it without it? Yes, it would just take me a lot longer. Is it worth it? I think that question will be answered ultimately by whether the externalities are built into the price. Right now they are not.

aug 24, 2025, 1:55 am
Mop Bucket @mopbucket.bsky.social

It's worth remembering that not even the immediate costs are built into the price at present. LLM companies are burning through billions of their investors' dollars to sell you access at rates far below cost.

aug 24, 2025, 3:53 am
harphacker.bsky.social @harphacker.bsky.social

I'm using the LLM to write V1 of the code, which I then debug. It's in C, which I know. What I don't know is the SDK for this system (Raspberry Pi Pico). Having to make it run is how I learn. You still have to learn it, you just learn by doing. That's the value for me.

aug 24, 2025, 1:50 am
Damien Tonkin @madscitool.bsky.social

What LLMs are automating here is teaching and community. In the past you would have had to ask someone or crowdsource a solution in a forum. That's where the information came from; it's just obfuscating the human connection. I beg you to read The Machine Stops. It's only short.

aug 24, 2025, 1:58 am
harphacker.bsky.social @harphacker.bsky.social

Still have to do that. But yes, I'm aware. I think the gaps in the training corpus are where the human interaction occurs. And the HW/SW areas I play in are all item source anyway. So yes, humans are the source and the ultimate go-to.

aug 24, 2025, 2:04 am
harphacker.bsky.social @harphacker.bsky.social

*open source

aug 24, 2025, 2:04 am