got vs node-fetch
Got is a lightweight and powerful HTTP client for Node.js. It is built on top of the http and https modules and provides a simple, consistent API for making HTTP requests.
Got is one of the most feature-rich HTTP clients in the Node.js ecosystem, offering HTTP/2, proxy, and asynchronous support, which makes it well suited for web scraping.
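For instance, HTTP/2 can be enabled per request through Got's http2 option (a minimal sketch; example.com is a placeholder and the target server must support HTTP/2 for the negotiation to succeed):
import got from 'got';
// got negotiates the protocol via ALPN and falls back to HTTP/1.1
// when the server does not support HTTP/2
const response = await got('https://example.com', {http2: true});
console.log(response.httpVersion);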
Got also offers domain-specific integrations such as AWS support, as well as plugins for public APIs like GitHub.
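As an illustration, the gh-got plugin preconfigures Got for the GitHub API (a sketch following the plugin's documented usage; the user path is just an example):
import ghGot from 'gh-got';
// requests are prefixed with https://api.github.com/ and
// the JSON response body is parsed automatically
const {body} = await ghGot('users/sindresorhus');
console.log(body.login);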
Note that Got has some behaviors that are inconvenient for web scraping.
For example, it normalizes HTTP header names,
which is undesirable in scraping, as header casing can fingerprint non-browser clients, and should be disabled.
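A quick way to observe what Got actually sends is to echo the request headers back through httpbin.org (a minimal sketch; the header name is arbitrary):
import got from 'got';
const {body} = await got('https://httpbin.org/headers', {
  headers: {'X-Custom-HEADER': 'value'},  // deliberately mixed casing
  responseType: 'json',
});
// httpbin echoes the headers it received, so the casing Got
// actually sent on the wire can be inspected here
console.log(body.headers);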
node-fetch is a lightweight implementation of the Fetch API for Node.js, mostly compatible with the browser's built-in version.
Because it mirrors the fetch() function bundled with web browsers, it shares the same familiar API, making it a great starting point for people coming from a front-end environment.
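Because the API mirrors the browser's, snippets tend to be portable between the two environments, with the import being the only Node-specific step (a minimal sketch):
// in browsers fetch is a global; in Node, node-fetch provides it
const fetch = require('node-fetch');
// the call itself is identical in both environments
const data = await fetch('https://httpbin.org/json').then(res => res.json());
console.log(data);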
Example Use
import got from 'got';
// GET requests are the default and can be made by calling the module directly:
const response = await got('https://api.example.com');
console.log(response.body);
// POST requests can send a JSON payload through the json option:
const postResponse = await got.post('https://api.example.com', {
  json: { name: 'John Doe' },
});
console.log(postResponse.body);
// handling cookies
import {CookieJar} from 'tough-cookie';
const cookieJar = new CookieJar();
await cookieJar.setCookie('foo=bar', 'https://httpbin.org');
await got('https://httpbin.org/anything', {cookieJar});
// using a proxy through the hpagent package:
import {HttpsProxyAgent} from 'hpagent';
await got('https://httpbin.org/ip', {
  agent: {
    https: new HttpsProxyAgent({
      keepAlive: true,
      keepAliveMsecs: 1000,
      maxSockets: 256,
      maxFreeSockets: 256,
      scheduling: 'lifo',
      proxy: 'https://localhost:8080'
    })
  }
});
// note: node-fetch v3 is ESM-only; this CommonJS require targets node-fetch v2
const fetch = require('node-fetch');
// fetch supports both Promises and async/await
fetch('http://httpbin.org/get')
.then(res => res.text())
.then(body => console.log(body))
.catch(err => console.error(err));
// the same request with async/await:
const response = await fetch('http://httpbin.org/get');
console.log(await response.text());
// for concurrent scraping, Promise.all can be used to send requests in parallel
const results = await Promise.all([
  fetch('http://httpbin.org/html'),
  fetch('http://httpbin.org/html'),
  fetch('http://httpbin.org/html'),
]);
// POST requests with a JSON body:
await fetch('http://httpbin.org/post', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ name: 'John Doe' }),
})
// Proxy use requires a proxy-capable agent; the built-in https.Agent
// has no proxy option, so a package like hpagent is needed:
const { HttpsProxyAgent } = require('hpagent');
const agent = new HttpsProxyAgent({
  proxy: 'http://proxy.example.com:8080',
  rejectUnauthorized: false, // disable TLS verification (use with care)
});
await fetch('https://httpbin.org/ip', { agent })
// setting headers and cookies via the Headers class node-fetch exposes
const headers = new fetch.Headers();
headers.append('Cookie', 'myCookie=123');
headers.append('X-My-Header', 'myValue');
await fetch('https://httpbin.org/headers', { headers })