d3.csvParseRows for data via URL?

I have messy CSV files with several tables. I can drop the CSV data directly into an Observable cell and then parse it with d3.csvParseRows, which gives what appears to be a more true-to-original array than d3.csvParse does [that is, I get arrays of 21 values per row, as opposed to arrays of objects with 2 keys that don't seem to correspond well to the original file].
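
For example, on a tiny sample string (just illustrative, not my actual data):

const sample = "a,b,c\n1,2,3\n4,5,6";

d3.csvParse(sample)     // [{a: "1", b: "2", c: "3"}, {a: "4", b: "5", c: "6"}], keyed by the first (header) row
d3.csvParseRows(sample) // [["a", "b", "c"], ["1", "2", "3"], ["4", "5", "6"]], one array of strings per line, no header interpretation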

So far, so good.

A few of the CSV files, however, are too big to load directly into Observable. So it seems I'll need to upload them to GitHub and then reference the URL. Doing this, however, only seems to work with d3.csv, which returns the arrays in that less-than-ideal format.

How can I run csvParseRows on a CSV file loaded from a URL?

Here’s a test notebook:

Thanks in advance for your help and guidance!!


reference: https://github.com/d3/d3-dsv

Try this:

d3.text(url).then(text => d3.csvParseRows(text))
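
In an Observable cell that would look something like this (a sketch; the cell name and the raw GitHub URL are placeholders, and it assumes d3 is already available in the notebook):

data = d3.text("https://raw.githubusercontent.com/<user>/<repo>/main/messy.csv")
  .then(text => d3.csvParseRows(text))

Observable resolves the promise for you, so referencing data in other cells gives the parsed array of rows.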

Ah, I see it now: load the file at the URL as text (a string), then parse that text. Thank you so much!

Yep, that’s right. And another way to write it would be:

fetch(url)
  .then(response => response.text())
  .then(text => d3.csvParseRows(text))

Or using await:

{
  const response = await fetch(url);
  const text = await response.text();
  return d3.csvParseRows(text);
}
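
Or combining the two, d3.text with await (same assumption that url is defined elsewhere):

{
  const text = await d3.text(url);
  return d3.csvParseRows(text);
}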