I have been using DuckDB-Wasm for data wrangling and connecting it to a visualization pipeline. I have attached data files totalling 2.1 GB (each under 50 MB) to the notebook, and I load them into a DuckDB instance. It worked fine for weeks, but it has suddenly stopped working with this error:
catchDB = Error: Out of Memory Error: could not allocate block of 262144 bytes (3435773952/3435921408 used)
Database is launched in in-memory mode and no temporary directory is specified.
Unused blocks cannot be offloaded to disk.
Launch the database with a persistent storage back-end
Or set PRAGMA temp_directory='/path/to/tmp.tmp'
I am not clear on what I need to change to get this working again. Please help.
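For reference, this is roughly how I set up the database and load the attached files (simplified; the bundle/worker boilerplate follows the duckdb-wasm README, and the file names and the `attachedFiles` list are placeholders, not my real data):

```javascript
// Builds the SQL that loads one registered file into a table of the same name.
// (Placeholder naming; my real tables are named differently.)
function loadFileSQL(name) {
  return `CREATE TABLE "${name}" AS SELECT * FROM '${name}'`;
}

// Instantiates an in-memory DuckDB-Wasm database and loads every attached file.
// attachedFiles: array of { name: string, data: Uint8Array }
async function initDatabase(attachedFiles) {
  const duckdb = await import("@duckdb/duckdb-wasm");
  const bundle = await duckdb.selectBundle(duckdb.getJsDelivrBundles());
  const worker = new Worker(bundle.mainWorker);
  const db = new duckdb.AsyncDuckDB(new duckdb.ConsoleLogger(), worker);
  await db.instantiate(bundle.mainModule, bundle.pthreadWorker);

  // No storage path is passed anywhere, so the database runs purely in memory.
  const conn = await db.connect();
  for (const f of attachedFiles) {
    await db.registerFileBuffer(f.name, f.data);
    await conn.query(loadFileSQL(f.name));
  }
  return conn;
}
```

Nothing in this setup has changed recently; only the amount of attached data has grown over time.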