httr vs got

httr: MIT license, 979 stars, latest 1.4.7 (10 months ago), first released May 06 2012, 710.2 thousand downloads/month
got: MIT license, 13,904 stars, latest 14.2.1 (30 days ago), first released Mar 27 2014, 88.1 million downloads/month

The aim of httr is to provide a wrapper for the curl package, customised to the demands of modern web APIs.

Key features:

  • Functions for the most important http verbs: GET(), HEAD(), PATCH(), PUT(), DELETE() and POST().
  • Automatic connection sharing across requests to the same website (by default, curl handles are managed automatically), cookies are maintained across requests, and an up-to-date root-level SSL certificate store is used.
  • Requests return a standard response object that captures the http status line, headers and body, along with other useful information.
  • Response content is available with content() as a raw vector (as = "raw"), a character vector (as = "text"), or parsed into an R object (as = "parsed"), currently for html, xml, json, png and jpeg.
  • You can convert http errors into R errors with stop_for_status().
  • Config functions make it easier to modify the request in common ways: set_cookies(), add_headers(), authenticate(), use_proxy(), verbose(), timeout(), content_type(), accept(), progress() (see the sketch after this list).
  • Support for OAuth 1.0 and 2.0 with oauth1.0_token() and oauth2.0_token(). The demo directory has eight OAuth demos: four for 1.0 (twitter, vimeo, withings and yahoo) and four for 2.0 (facebook, github, google, linkedin). OAuth credentials are automatically cached within a project.
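
To show a few of these pieces working together, here is a minimal sketch of a request that sets a custom header and a timeout, fails loudly on HTTP errors, and parses the JSON body (the httpbin.org endpoint and header value are placeholders):

library(httr)

resp <- GET(
  "https://httpbin.org/json",
  add_headers(`User-Agent` = "my-r-client"),  # custom request header
  timeout(10)                                 # abort if the request takes longer than 10s
)
stop_for_status(resp)                  # convert a 4xx/5xx response into an R error
data <- content(resp, as = "parsed")   # JSON body parsed into an R list
str(data)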

Got is a lightweight and powerful HTTP client for Node.js. It is built on top of the http and https modules and provides a simple, consistent API for making HTTP requests.

Got is one of the most feature-rich HTTP clients in the NodeJS ecosystem, offering HTTP/2, proxy and asynchronous support, which makes it ideal for web scraping.
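
For example, HTTP/2 can be enabled per request through the http2 option; a minimal sketch (the httpbin.org URL is only a placeholder):

import got from 'got';

// request over HTTP/2 where the server supports it (negotiated via ALPN,
// otherwise got falls back to HTTP/1.1); top-level await requires an ES module
const {body} = await got('https://httpbin.org/anything', {http2: true});
console.log(body);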

Got also supports many domain-specific integrations like AWS, as well as plugins for various public APIs like GitHub.
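
These integrations are typically built on got.extend(), which returns a preconfigured instance. Here is a minimal sketch of a reusable GitHub API client (the token value is a placeholder you would supply yourself):

import got from 'got';

// got.extend() merges defaults into a new instance; plugins such as gh-got
// are built on this mechanism
const api = got.extend({
  prefixUrl: 'https://api.github.com',
  headers: {authorization: 'token <your-token>'},
  responseType: 'json',
});

// note: paths must not start with "/" when prefixUrl is set
const repo = await api('repos/sindresorhus/got');
console.log(repo.body.description);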

Note that Got has some inconsistent behaviors when it comes to web scraping.
For example, it normalizes HTTP headers, which is undesirable in scraping and should be disabled.

Highlights


http2 · async · popular · extendible · typescript · proxy

Example Use


library(httr)

# GET requests:
resp <- GET("http://httpbin.org/get")
status_code(resp)  # status code
headers(resp)  # headers
str(content(resp))  # body

# POST requests:
url <- "http://httpbin.org/post"        # example endpoint
body <- list(a = 1, b = "some text")    # example payload

# Form encoded
resp <- POST(url, body = body, encode = "form")
# Multipart encoded
resp <- POST(url, body = body, encode = "multipart")
# JSON encoded
resp <- POST(url, body = body, encode = "json")

# setting cookies:
resp <- GET("http://httpbin.org/cookies", set_cookies("MeWant" = "cookies"))
content(resp)$cookies  # get response cookies

// got v12+ is an ES module, so it is loaded with import
// (top-level await also requires an ES module)
import got from 'got';

// GET requests are the default and can be made by calling the module directly:
const response = await got('https://api.example.com');
console.log(response.body);

// POST requests can send a JSON body:
const postResponse = await got.post('https://api.example.com', {
    json: { name: 'John Doe' },
});
console.log(postResponse.body);

// handling cookies
import {CookieJar} from 'tough-cookie';

const cookieJar = new CookieJar();

await cookieJar.setCookie('foo=bar', 'https://httpbin.org');
await got('https://httpbin.org/anything', {cookieJar});

// using proxy
import {HttpsProxyAgent} from 'hpagent';

await got('https://httpbin.org/ip', {
  agent: {
    https: new HttpsProxyAgent({
      keepAlive: true,
      keepAliveMsecs: 1000,
      maxSockets: 256,
      maxFreeSockets: 256,
      scheduling: 'lifo',
      proxy: 'https://localhost:8080'
    })
  }
});

Alternatives / Similar