
How to get API data with R

InfoWorld | Sep 4, 2019

No R package for an API you’d like to use? No problem! See how to write your own R code to pull data from an API using API key authentication.

Copyright © 2019 IDG Communications, Inc.

Hi. I’m Sharon Machlis at IDG Communications, here with episode 34 of Do More With R: Import Data from an API.

There are a lot of great R packages that let you import from an API with a single function. But sometimes an API doesn’t have its own package. Good news is: It’s easy to code your own.

I’ll demo this with the AccuWeather API but the process and code I’ll show will work for most other APIs that use a key for authentication.

If you want to follow along, go to the AccuWeather developer site and follow the steps to sign up, create an app, and get a free key. There are instructions in this video’s associated article at InfoWorld on how to do all that.

But here, let’s focus on the R code for the API. First, I’ll load the httr package for getting data from the API; jsonlite for parsing results; and dplyr to use pipes.
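That setup is just three library() calls:

```r
# Packages used throughout this episode:
library(httr)     # GET() and content() for making API requests
library(jsonlite) # fromJSON() for parsing JSON results
library(dplyr)    # pipes and glimpse()
```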

Next – and this is critical -- I need to know how to structure a URL in order to request the data I want from the API. Figuring out the query structure can be the hardest part of the process, depending on how well the API is documented. Fortunately, the AccuWeather API docs are pretty good.

Any API query needs a resource URL (or what I think of as the URL’s root); and then specific parts of the query. Here’s what AccuWeather says in its documentation for the forecast API:

The base URL for a forecast is mostly a constant, but it needs a location code. If you’re just looking for a forecast for 1 location, well, you can cheat and use the AccuWeather Web site to search for a forecast and then check the URL that comes back.

See the slash 571_pc at the end? That’s the location code for 01701 – the Zip code for our office in Framingham, Massachusetts.

Or, you can use an AccuWeather Locations API to pull location codes with R, which I’ll show in a bit.

Query parameters for specific data requests get tacked onto the end of a base URL. The first one you add starts with a question mark, followed by name equals value. Any additional key-value pairs are added with an ampersand followed by name equals value.

So just to add my API key, the URL would look like this. If I wanted to add a second query parameter – say, changing the default details from false to true – it would look like this.
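As a sketch in plain R string-pasting (the key here is a made-up placeholder, and the forecast path follows AccuWeather’s docs – double-check it against your own):

```r
base_url <- "http://dataservice.accuweather.com/forecasts/v1/daily/1day/571_pc"
api_key  <- "MyFakeAPIKey123"  # placeholder -- substitute your own key

# One query parameter: starts with a question mark
url_one <- paste0(base_url, "?apikey=", api_key)

# A second parameter is tacked on with an ampersand
url_two <- paste0(base_url, "?apikey=", api_key, "&details=true")
```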

We can use httr’s GET() function to make a data request with that URL. That paste0 command creating the URL just adds my API key. I’ve stored it as an R environment variable so the key doesn’t display on my screen. Instructions on how to do that are also in this video’s associated article.
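A sketch of that request – ACCUWEATHER_KEY is an assumed name for the environment variable, and the GET() call itself is commented out since it needs a valid key:

```r
library(httr)

# Key stored as an environment variable so it doesn't appear on screen:
my_key <- Sys.getenv("ACCUWEATHER_KEY")
my_url <- paste0(
  "http://dataservice.accuweather.com/forecasts/v1/daily/1day/571_pc",
  "?apikey=", my_key
)

# my_raw_result <- GET(my_url)   # live request; requires a valid key
```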

Let’s take a look at my_raw_result. This is a pretty complex list. The actual data we want is mostly in content, but if we look at that we’ll see that it’s a “raw” format that looks like binary data.

Fortunately, the httr package makes it easy to convert from raw to a usable format -- with the content() function.

The content() function gives you three conversion options: raw (which definitely isn’t helpful in this case); parsed, which usually returns some sort of list; and text. For JSON – especially nested JSON – I find text to be the easiest to work with. So let me run content() with as equals text and see the structure. It’s just JSON as a text string.

This is where the jsonlite package comes in. The fromJSON() function will turn that JSON text string into a more usable R object. I’m going to run dplyr’s glimpse() function on that to get a good look at the structure.
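You can see fromJSON()’s behavior on a small stand-in string – this toy JSON is made up for illustration and is not AccuWeather’s actual response shape:

```r
library(jsonlite)

toy_json <- '{"Headline": {"Text": "Cooler today"},
              "DailyForecasts": [{"Date": "2019-09-04",
                                  "Temperature": {"Maximum": {"Value": 74}}}]}'

parsed <- fromJSON(toy_json)
names(parsed)   # "Headline" "DailyForecasts"

# Nested JSON objects inside the array become nested data frame columns:
parsed$DailyForecasts$Temperature$Maximum$Value   # 74
```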

It’s a list with 2 items. Item 1 has some metadata and a text field we might want. The second item is a data frame with a lot of data points we definitely want for the forecast. So let’s look specifically at that data frame. This seems to have most of the info we need for a forecast.

If I run glimpse() on this, you can see that this was nested JSON, because some of the columns are actually their own data frames. But fromJSON() made it all pretty seamless.

So those are the basics of how to pull data from an API.
1. Figure out the API’s base URL and query parameters, and construct a request URL.
2. Run httr’s GET() on the URL.
3. Parse the results with content(). You can try it with as equals parsed, but if that returns a complicated list, try as equals text.
4. If necessary, run jsonlite’s fromJSON() function.
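The four steps above, strung together as one sketch – the URL is AccuWeather’s forecast endpoint from earlier, ACCUWEATHER_KEY is an assumed environment-variable name, and the network calls are commented out because they need a live key:

```r
library(httr)
library(jsonlite)

# Step 1: base URL plus query parameters
url <- paste0(
  "http://dataservice.accuweather.com/forecasts/v1/daily/1day/571_pc",
  "?apikey=", Sys.getenv("ACCUWEATHER_KEY")
)

# my_raw_result <- GET(url)                          # step 2: make the request
# my_content <- content(my_raw_result, as = "text")  # step 3: raw -> text
# my_data <- fromJSON(my_content)                    # step 4: JSON string -> R object
```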
A couple more points before I wrap up.

One: If we go back to my_raw_result – the initial object returned from GET() – you can see there’s a status code. 200 means all was OK. But a code in the 400s means something went wrong. If you’re writing a function or script, you can check whether the status code is in the 200s before additional code runs.
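One way to sketch that check – the helper function name is mine, and status_code() is httr’s accessor for the response code:

```r
library(httr)

# Stop early unless the HTTP status is in the 200s:
check_status <- function(response) {
  code <- status_code(response)
  if (code < 200 || code >= 300) {
    stop("Request failed with status ", code)
  }
  invisible(response)
}

# Usage (needs a live GET() result):
# check_status(my_raw_result)
```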

Two: If you’ve got multiple query parameters, it can get a little annoying to string them all together with a paste() command. GET() has another option.

You can create your query arguments as a named list using this format. See the structure here? The GET() function takes the base URL as the first argument, and a list of names and values as the second, query argument. Each one is name equals value, with the name not in quotation marks. The rest of the code is the same.
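Here’s that pattern as a sketch – the apikey and details parameter names match AccuWeather’s forecast docs, and the call itself is commented out since it needs a valid key:

```r
library(httr)

forecast_url <- "http://dataservice.accuweather.com/forecasts/v1/daily/1day/571_pc"

# Names unquoted, values as strings; GET() handles the ? and & for you:
my_query <- list(
  apikey  = Sys.getenv("ACCUWEATHER_KEY"),
  details = "true"
)

# my_raw_result <- GET(forecast_url, query = my_query)
```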

That works for the AccuWeather locations API as well.

Here’s what the API is looking for: base URL, API key, and q (the text you’re searching for). Let me go back to R. I first set up my base URL. Next is my GET() command with the URL as first argument, and the query string list as second argument. The query list has two items: my apikey and q, the text I’m searching for.
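A sketch of that locations search – the postal-code search path follows AccuWeather’s locations API docs, but treat it as an assumption and confirm the exact path there; the live calls are commented out:

```r
library(httr)
library(jsonlite)

location_url <- "http://dataservice.accuweather.com/locations/v1/postalcodes/search"

# my_raw_result <- GET(location_url,
#                      query = list(apikey = Sys.getenv("ACCUWEATHER_KEY"),
#                                   q = "01701"))
# my_results <- fromJSON(content(my_raw_result, as = "text"))
# my_results$Key   # the location code column
```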

I’ve got a 200 status code, so it ran OK. If I parse that with content() and then fromJSON(), the Key column holds my location code. Done!

That’s all for this episode, thanks for watching! For more R tips, head to the Do More With R page at go dot infoworld dot com slash more with R, all lowercase except for the R.
You can also find the Do More With R playlist on the YouTube IDG Tech Talk channel -- where you can subscribe so you never miss an episode.

Hope to see you next time!