I have messy CSV files that each contain several tables. I can paste the CSV data directly into an Observable cell and then parse it with d3.csvParseRows, which gives me something much closer to the original file than d3.csvParse does: I get an array of 21 values per row, rather than objects with only 2 keys that don't seem to correspond well to the original file. So far, so good.
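To make the difference concrete, here is roughly what I'm doing now (the cell names and pasted data are just placeholders; my real file has 21 columns, and d3 is already loaded in the notebook):

```js
// Observable cells; this tiny CSV stands in for my real pasted data
raw = `a,b,c
1,2,3
4,5,6`

// d3.csvParseRows: one plain array of values per line, no header handling
rowsAsArrays = d3.csvParseRows(raw)   // [["a","b","c"], ["1","2","3"], ["4","5","6"]]

// d3.csvParse: objects keyed by the first line, which doesn't match my messy files
rowsAsObjects = d3.csvParse(raw)      // [{a: "1", b: "2", c: "3"}, {a: "4", b: "5", c: "6"}]
```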
A few of the CSV files, however, are too big to paste into Observable directly, so it seems I'll need to upload them to GitHub and reference them by URL. Loading from a URL, however, only seems to work with d3.csv, which returns the less-than-ideal object format.
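This is what the URL-based version looks like at the moment (the GitHub URL is a placeholder):

```js
// d3.csv fetches the file and parses it in one step, but it always applies the
// csvParse-style header handling, so I get objects back rather than plain row arrays
data = d3.csv("https://raw.githubusercontent.com/<user>/<repo>/main/messy.csv")
```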
How can I run d3.csvParseRows on a CSV file loaded from a URL?
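In case it helps to clarify what I'm after, I imagine the answer looks something like fetching the raw text first and then parsing it, but I haven't been able to confirm whether this is the right approach (the URL is again a placeholder):

```js
// Guessing at something like this: d3.text (from d3-fetch) returns the raw file
// contents as a string, which csvParseRows could then turn into arrays of values
rows = d3.text("https://raw.githubusercontent.com/<user>/<repo>/main/messy.csv")
  .then(text => d3.csvParseRows(text))
```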
Here’s a test notebook:
Thanks in advance for your help and guidance!!
reference: https://github.com/d3/d3-dsv