Bug: Error when navigating past page 100

Steps:

  1. Open https://observablehq.com/recent?page=100
  2. Click “Next”

Expected result:
See results 3001 to 3030.

Actual result:
The message “Oops, an unexpected error occurred.” appears, and the network tab shows a 400 response code.

Notes:

  • The response code suggests that this is an artificial API limit.
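
For reference, here’s a quick way to probe where the cutoff sits. This is only a sketch assuming a hypothetical JSON endpoint that accepts a page parameter; the actual request the /recent page issues (the one returning 400 in the network tab) may look different.

```ts
// Hypothetical probe: walk the page parameter until the API starts refusing.
// The endpoint is a placeholder, not a documented Observable route.
async function findPageLimit(baseUrl: string, maxProbe = 200): Promise<number | null> {
  for (let page = 1; page <= maxProbe; page++) {
    const res = await fetch(`${baseUrl}?page=${page}`);
    if (res.status === 400) return page;        // first page the API rejects
    if (!res.ok) throw new Error(`unexpected status ${res.status} on page ${page}`);
  }
  return null; // no limit reached within maxProbe pages
}

// findPageLimit("https://api.example.com/recent").then(console.log); // e.g. 101
```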

Yes, currently we limit pagination for any given query to end after 100 pages of results.
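
For illustration, a cap like this usually amounts to a simple guard in front of the query. Here’s a minimal sketch assuming an Express-style handler, with placeholder route and variable names rather than our actual code:

```ts
import express from "express";

const app = express();
const PAGE_SIZE = 30;  // /recent shows 30 results per page
const MAX_PAGES = 100; // current cap on pages for any given query

app.get("/recent", (req, res) => {
  const page = Number(req.query.page ?? 1);
  if (!Number.isInteger(page) || page < 1 || page > MAX_PAGES) {
    // Requests past the cap are rejected up front rather than computed.
    return res.status(400).json({ error: `page must be between 1 and ${MAX_PAGES}` });
  }
  const offset = (page - 1) * PAGE_SIZE;
  // ...run the underlying query with LIMIT PAGE_SIZE OFFSET offset...
  res.json({ page, offset });
});
```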

I’ll fix the Next button on page 100, and we’ll probably increase this limit soon, as there are a few folks who are already beginning to approach it.
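
For the button itself, the fix amounts to treating page 100 as the last reachable page even when more results exist. A minimal sketch, with hypothetical names and a hard-coded page size of 30:

```ts
const MAX_PAGES = 100; // server-side cap on pages per query

// Hypothetical helper: should the "Next" control be enabled?
function hasNextPage(currentPage: number, totalResults: number, pageSize = 30): boolean {
  const lastPageWithResults = Math.ceil(totalResults / pageSize);
  const lastReachablePage = Math.min(lastPageWithResults, MAX_PAGES);
  return currentPage < lastReachablePage;
}

hasNextPage(42, 10_000);  // true
hasNextPage(100, 10_000); // false: disable "Next" instead of requesting page 101
```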

Is there a technical reason for this limit?

A mild one: with our current pagination strategy, deep pages are progressively (if only slightly) more expensive for our API to compute than shallow ones.
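
To make that concrete, here’s a rough sketch against a generic Postgres-style table (placeholder table and column names, not our schema). With OFFSET, the database still has to produce and discard every row before the requested page, so page 100 does roughly 100 times the work of page 1; a keyset/cursor query can seek straight to the boundary instead.

```ts
import { Pool } from "pg";

const pool = new Pool(); // connection settings come from the environment
const PAGE_SIZE = 30;

// Offset pagination: cost grows with page depth, because the rows before
// the requested page are still scanned and thrown away.
async function byOffset(page: number) {
  return pool.query(
    "SELECT id, title, update_time FROM documents ORDER BY update_time DESC LIMIT $1 OFFSET $2",
    [PAGE_SIZE, (page - 1) * PAGE_SIZE]
  );
}

// Keyset (cursor) pagination: an index on update_time lets the database seek
// directly to the boundary, so every page costs about the same.
async function byCursor(before: string) {
  return pool.query(
    "SELECT id, title, update_time FROM documents WHERE update_time < $1 ORDER BY update_time DESC LIMIT $2",
    [before, PAGE_SIZE]
  );
}
```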

But mostly it’s a design limit: rather than having you paginate through hundreds of pages of results looking for something, we should be providing you with more advanced search operators and precise ways to winnow down what you’re looking for. GitHub similarly limits pagination to the first 100 pages of results.

But if you’re not actually trying to read through 100 pages of results, and are instead asking for personal scraping purposes (although of course, it’s still not a public API and may change at any time), our before=date method of paginating through the results continues to work as before.

For example (needs to be authenticated): https://api.observablehq.com/user/documents?before=2018-11-14T04:44:21.799Z
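
In sketch form, you can walk backwards through the results with it like this. The response field name (update_time here) and the authentication details are assumptions for illustration, since none of this is documented and it may change at any time.

```ts
// Page backwards through the documents endpoint using before=<timestamp>.
async function* listDocuments(): AsyncGenerator<{ update_time: string }> {
  let before = new Date().toISOString();
  while (true) {
    const res = await fetch(
      `https://api.observablehq.com/user/documents?before=${encodeURIComponent(before)}`,
      { credentials: "include" } // assumes the browser session; adjust for other auth
    );
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const docs: { update_time: string }[] = await res.json();
    if (docs.length === 0) return;              // reached the oldest document
    yield* docs;
    before = docs[docs.length - 1].update_time; // oldest timestamp becomes the next cursor
  }
}
```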

May I ask what storage engines (RDBMS, k/v stores etc) are powering Observable?

The reason I noticed this bug in the first place was that I got intrigued by the “Showing x of 10,000+ notebooks” message (which, btw, will need to be adjusted to the actual API limit) and wanted to find out whether that number was a hard limit or an estimate. 🙂

A while back I had actually started to document the API routes through SwaggerHub/OpenAPI, hoping to generate tooling from the spec, but had to put it on hold when I encountered too many (minor) variations in the JSON schemas. If there’s anything in that regard that you’d be able and willing to share (one-off and completely unsupported, of course), that would be highly appreciated.

I’m definitely planning to build and share a static index (as a means to experiment with custom search interfaces and create some statistics), but that’s not likely to happen soon.

Pretty much just Postgres, at this point.

I’m afraid we don’t have any internal API documentation we can share at the moment. It changes every week, as we add and remove fields here and there (and especially so with the recent redesign).
