I want to build a dashboard for a solar panel inverter. The dashboard runs on a Raspberry Pi 4 (rpi4), where the `npm run build` step already takes ~15s.
If I understand correctly, I would have to rerun this step every time my SQLite database changes in order to see the new data on the dashboard - is that correct?
Ideally I would simply symlink the “live” SQLite file that is being updated regularly into the right position in the dist folder, and then be able to reload the plot(s) I care about dynamically, similar to what I could do with a fetch of a remote resource. But I fail to see how that would be possible - is it?
If it isn’t possible, then I fear my only option is to write a thin service wrapper that provides a REST API around the SQLite file, so the dashboard contents stay regularly updated - correct?
You don’t have to create a data loader; you can access a live file instead. To do this you could use the “hot loading” described in SQLite | Observable Framework (you just have to make the file accessible over HTTP):
```js
// Open the live SQLite file served over HTTP from the Pi.
const db = SQLiteDatabaseClient.open("https://rpi4.local/myfile.sqlite");
```
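Once open, you can query it like any other database client. A minimal sketch, assuming a hypothetical `readings` table with `timestamp` and `watts` columns (adjust to your schema):

```js
// Fetch the most recent readings; the table and column names below are
// placeholders for your actual schema.
const readings = await db.query(
  "SELECT timestamp, watts FROM readings ORDER BY timestamp DESC LIMIT 288"
);
```

From there, `readings` can feed straight into whatever plot you build on the page.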
Right, I think that would work as long as the database stays small. Over time I will need to find an alternative solution.
But I’m wondering how others are handling this - dashboards with live-updating data sound like a niche that observablehq could serve, but it seems like its static nature gets in the way there. Or are there any helpers that let one quickly build dynamic data loaders that serve data via an API, to get an effect similar to the live-reloading `npm run dev` setup?
Ah, I see - I thought you wanted to access the whole database. I’d recommend setting up this service as an independent API endpoint, which you can do with the usual tools (node + express, or even Apache + PHP…). Then you can read it from a built Framework project, polling over HTTP or streaming over a WebSocket if you need real-time updates.
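For what it’s worth, here is a minimal sketch of such an endpoint with node + express, using (for example) the better-sqlite3 driver; the file path, table name, and column names are placeholders for your setup:

```js
// server.mjs - a thin, read-only JSON API over the live SQLite file.
// Assumes `npm install express better-sqlite3`; names below are placeholders.
import express from "express";
import Database from "better-sqlite3";

const db = new Database("/path/to/myfile.sqlite", {readonly: true});
const app = express();

// Return the most recent readings as JSON; cap the limit defensively.
app.get("/api/readings", (req, res) => {
  const limit = Math.min(Number(req.query.limit) || 288, 10_000);
  const rows = db
    .prepare("SELECT timestamp, watts FROM readings ORDER BY timestamp DESC LIMIT ?")
    .all(limit);
  res.json(rows);
});

app.listen(3000);
```

The built dashboard can then poll this endpoint with `fetch`, and you can swap in a WebSocket later if polling turns out to be too coarse.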
Thank you for the confirmation. For now, reloading the database regularly is fine since it’s still small; that gives me time to build a more efficient backend API for this purpose next.
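In case it helps anyone else, here is a sketch of that periodic reload using `Generators.observe` to reopen the database on an interval; the URL and interval are placeholders, and the cache-busting query parameter avoids stale HTTP responses:

```js
// Reopen the live database once a minute; any cell that references `db`
// re-runs automatically whenever a fresh client is pushed.
const db = Generators.observe((notify) => {
  const reload = () =>
    SQLiteDatabaseClient.open(`https://rpi4.local/myfile.sqlite?t=${Date.now()}`)
      .then(notify);
  reload(); // initial load
  const timer = setInterval(reload, 60_000); // placeholder interval: 1 minute
  return () => clearInterval(timer); // cleanup when the page is torn down
});
```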