Is it possible to trigger a build process on publishing an Observable notebook? I’m working on a POC where I’m embedding Observable notebooks in a React frontend. Everything’s good so far, but I’ve stumbled upon a roadblock: I want to trigger a build pipeline (Bitbucket/GitLab) instead of manually adding the notebook again after each iteration. Is there a way in Observable to achieve this?
Unfortunately, while we love this use case, we don’t have a webhook or anything similar for when a notebook is published, so there is no (obvious) way to trigger the build as you describe.
Thanks @duaneatat will check if there’s an alternative way
This is something I have been looking for too, so I just made a solution that exploits Observable’s thumbnail-generating process.
What’s quite good is that you can extract the notebook id and version as part of the process.
Thank you so much! I’ll look into this.
Sorry to bump this old thread, but I want to ask about this again. I’m currently building something that polls for ETag changes on an observablehq API endpoint. I’m polling every 11 seconds at the moment to see when a notebook has changed. Would it be possible to have some kind of webhook instead? Is there anything in the TOS about how often I can poll for ETag changes?
Isn’t the caching duration for published notebooks 30 seconds?
Yes, they set the Cache-Control header to 30s. But then the fastest I could respond to a newly published notebook version is 30 seconds later. I’m calling from a backend context with no browser, so that cache header isn’t directly used. But really I’m after something that isn’t based on polling, as it feels a bit wasteful.
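For what it’s worth, the backend polling described above can be sketched roughly like this (Node 18+ with the global `fetch`). The endpoint URL, interval, and callback are placeholders, not Observable’s documented API; sending `If-None-Match` lets the server answer with a cheap 304 when nothing has changed:

```javascript
// Hedged sketch: poll an endpoint's ETag from a backend and fire a
// callback when a new notebook version appears. `fetchFn` is injectable
// so the loop can be tested without the network; `maxPolls` bounds it.
async function pollEtag(url, onChange, { intervalMs = 7000, maxPolls = Infinity, fetchFn = fetch } = {}) {
  let lastEtag = null;
  for (let n = 0; n < maxPolls; n++) {
    // If-None-Match lets the server reply 304 Not Modified cheaply.
    const headers = lastEtag ? { "If-None-Match": lastEtag } : {};
    const res = await fetchFn(url, { method: "HEAD", headers });
    const etag = res.headers.get("etag");
    if (res.status !== 304 && etag && etag !== lastEtag) {
      if (lastEtag !== null) onChange(etag); // first fetch just primes state
      lastEtag = etag;
    }
    await new Promise((r) => setTimeout(r, intervalMs));
  }
}
```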
What was the problem with the hook notebook? It is event-driven.
Maybe it is not reliable enough. I think to make a reliable webhook I might start fusing a few different signals. The onversion hook is great because it is fast and works on team notebooks, but it doesn’t seem to be totally reliable.
There is the /recent feed, which lets you bulk-fetch updates, but you might want a specific notebook rather than all updating notebooks. It only works for public notebooks, though.
So maybe write all /recent changes to a Firebase database; then we can listen in realtime to a query filtered to the notebook we are interested in. That should be super fast and give us per-notebook targeting, yet stay efficient because its data source is a bulk feed. You can augment it by writing additional data sources into it too.
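The fan-out half of that idea can be sketched without a live Firebase project: one bulk ingest of a /recent-style feed, dispatched only to subscribers of a given notebook id. (With Firebase you would instead write each entry to a collection and attach an `onSnapshot` listener to a query filtered on the notebook id; the shape of the feed entries below is an assumption.)

```javascript
// Hypothetical in-memory version of the bulk-feed -> per-notebook fan-out.
class NotebookUpdateBus {
  constructor() {
    this.listeners = new Map(); // notebookId -> array of callbacks
  }
  subscribe(notebookId, cb) {
    if (!this.listeners.has(notebookId)) this.listeners.set(notebookId, []);
    this.listeners.get(notebookId).push(cb);
  }
  // Feed in one page of the bulk feed; only interested parties are called.
  ingest(recentFeed) {
    for (const entry of recentFeed) {
      for (const cb of this.listeners.get(entry.id) ?? []) cb(entry);
    }
  }
}
```

The point of the design is that the expensive part (fetching the feed) happens once per poll, while any number of per-notebook consumers get targeted notifications.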
Bummer that it only includes public notebooks; I like using link-sharing notebooks rather than requiring “publication” of every notebook. I haven’t tried your on-version hook, so I’m not sure of its reliability, but the idea of relying on the notebook code getting executed by the thumbnail generation service also made me wonder about its performance versus just polling.
What’s driving me is the experience of editing the code in the notebook, hitting the “republish” button, and refreshing my second tab where I have the website generated from my notebook cell. I’m never sure if the refresh I just made includes the code change or not; it can be anywhere from 5–30 seconds before the notebook is actually updated. By having my backend poll tightly on when the ETag changes on the API endpoint, I know that the content change is live on that endpoint and can update the handler for it. The alternative is to load a fresh handler for every request, which drops the polling variable from the backend, but you still need to refresh the page a few times or wait a few seconds before attempting it.
I noticed in webcode you have the live-debugging feature and such, so that could be one way to drive the dev experience of knowing when a change has taken effect. Perhaps I could add something like that by adding a header like notebook-etag or notebook-updated, so the client has a signal of the notebook version that served it. The page could then poll and refresh itself via a script I choose to include in the response, or perhaps via a websocket and a separate backend API that I manually include on a specific web page served from a cell, listening for “refresh” events emitted when the notebook version changes.
For testing the functionality of a serverless “function” backed by a notebook cell, it’s a little tighter, since I can just have a generator cell in the notebook calling the function every few seconds, so the refresh “takes effect” as I’m watching it without needing to induce it specifically.
I think we’re in an “uncanny valley” of instant deployments: it’s fast enough that I expect it to be immediate, but random enough that I get antsy or anxious, because I’m spoiled by not having to sit through a build-and-deploy cycle for other services!
Now that I think about it more, maybe it would be better to just publish notebooks and use the /recent feed. That is a clean signal, and I was already thinking about using the collections API (which returns JSON info about the public notebooks in a collection) as the mechanism to discover which notebooks my backend service should run (otherwise I have to accept “any” notebook route or keep an explicit opt-in list in one of my notebooks). So yeah, maybe I’ll convert over to making everything public, as that seems like the cleanest approach. I will also probably add the extra headers, as they can be used for a variety of “dev tool” integrations.
How is the speed of the /recent feed?
Also thanks for all the cool notebooks and services!
I think the thumbnail process is pretty fast, perhaps it is on the hot path of Observable’s actual publish action?
Yeah, live coding suits Observable much better, as it’s true to Observable’s reactive cell paradigm. It’s true zero latency with no button pressing, just like any other code written in Observable.
I just measured the onversion hook latency as 7 seconds: onversion hook latency experiment - YouTube
In my case I have some server-side things exposed to the notebook cell, so live coding won’t necessarily work to route the request to the handler in the browser (if I’m understanding how live coding works). For example, I had a service that would run a Vite build against a “build” notebook with cells that return the string content for a virtual file system. In that case I really just wanted to run the build once per publication of the notebook. Another use case I imagined was a service that automatically runs PostCSS against the content of a notebook cell, serving the resulting stylesheet so it can be linked directly back into the notebook and placed in the DOM. For that, you put the style DOM element in a generator that yields values from polling on the ETag. I was thinking of enabling a Tailwind service for notebooks based on that: you define a $tailwind cell in the notebook and get an automatic obs.run/@user/notebook.tailwind.css endpoint on the internet that you can then drop into the notebook’s DOM for the style. And other riffs on that.
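The ETag-driven generator mentioned there might be sketched like this. It’s a plain async generator that re-yields the fetched body whenever the endpoint’s ETag changes; in an Observable cell you would wrap each yielded string in a `<style>` element. The stylesheet endpoint is the hypothetical one from above, and `fetchFn`/`maxPolls` are there only for testability.

```javascript
// Sketch: yield fresh content (e.g. compiled CSS) each time the
// endpoint's ETag changes, polling on a fixed interval.
async function* freshContent(url, { intervalMs = 7000, maxPolls = Infinity, fetchFn = fetch } = {}) {
  let lastEtag = null;
  for (let n = 0; n < maxPolls; n++) {
    const res = await fetchFn(url);
    const etag = res.headers.get("etag");
    if (etag !== lastEtag) {
      lastEtag = etag;
      yield await res.text(); // new version published: re-emit the body
    }
    await new Promise((r) => setTimeout(r, intervalMs));
  }
}
```

In a notebook, a cell like `for await (const css of freshContent(url)) yield html`<style>${css}</style>`` would keep the styles current.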
yeah, I use this hook to invalidate my serverside caches and to drive notebook backups.
I may be way off here, but would another solution to your use case be to load the published notebook dynamically from your React app? Or are you already doing that?
The original poster could perhaps do that; however, consider that they would lose performance in their app compared to having a bundle with the notebook content included and pushed to the same CDN (or whatever) as the rest of their app.
For my use case, I’m running observable notebooks as “serverless function handlers” and I don’t want to fetch a new notebook version on every server request, so I currently fetch a new version every few seconds and refresh the background worker process when a change is detected.
Instead of polling it would be nice to have some kind of webhook, both for triggering external build processes and all other sorts of integration hooks yet to be imagined.
Having an RSS feed on a per-user basis, accessed via API key, could also work instead of just the public one, though I’m guessing that isn’t practical to implement.
I decided to stick with polling for now, and I reduced the polling interval to 7 seconds. If the Observable team is fine with that, then so am I. I’m not running at any kind of hyper-scale, so it’s likely fine.