
axios vs requests

axios: MIT license, 105,935 stars, 240.7 million downloads/month, first released Aug 29 2014, latest version 1.7.9 (6 days ago)
requests: Apache-2.0 license, 52,271 stars, 585.1 million downloads/month, first released Feb 14 2011, latest version 2.32.3 (6 months ago)

axios is a popular JavaScript library for making HTTP requests from Node.js or the browser. It is promise-based and similar to the Fetch API, but with a more powerful feature set and better browser compatibility.

One of the main benefits of using axios is that it automatically parses JSON response data into a JavaScript object, making it easy to work with.

axios is known for its user-friendly API and support for async/await syntax, which makes it very accessible for web scraping.
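For example, a basic GET request that relies on the automatic JSON parsing described above could look like this (a minimal sketch, assuming axios has been installed with npm install axios and that the httpbin.org endpoint is reachable):

const axios = require('axios');

async function main() {
  const response = await axios.get('http://httpbin.org/json');
  // response.data is already a parsed JavaScript object - no JSON.parse() needed
  console.log(response.data.slideshow.title);
}

main();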

The requests package is a popular library for making HTTP requests in Python. It provides a simple, easy-to-use API for sending HTTP/1.1 requests and abstracts away many of the low-level details of working with HTTP. For example, you can send a GET request with a single line of code:

import requests
response = requests.get('https://webscraping.fyi/lib/requests/')
requests makes it easy to send data along with your requests, including JSON data and files. It automatically handles redirects and cookies, and it supports both basic and digest authentication. It also provides functionality for handling exceptions, managing timeouts and sessions, and decoding a wide range of well-known content encodings.

One thing to keep in mind is that requests is a synchronous library, which means that your program will block (stop execution) while waiting for a response. In some situations this may not be desirable, and you may want to use an asynchronous library like httpx or aiohttp instead.

You can install the requests package via the pip package manager:
pip install requests
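For instance, the timeout, exception-handling, and authentication features mentioned above can be combined like this (a minimal sketch; the httpbin.org endpoints are used purely for illustration):

import requests
from requests.auth import HTTPBasicAuth

try:
    response = requests.get(
        "http://httpbin.org/basic-auth/user/passwd",
        auth=HTTPBasicAuth("user", "passwd"),  # digest auth works the same way via HTTPDigestAuth
        timeout=5,  # raises requests.exceptions.Timeout if the server takes too long
    )
    response.raise_for_status()  # raises requests.exceptions.HTTPError on 4xx/5xx responses
    print(response.json())
except requests.exceptions.RequestException as exc:  # base class for all requests errors
    print(f"request failed: {exc}")

# files can be uploaded as multipart/form-data via the files= argument:
# requests.post("http://httpbin.org/post", files={"report": open("report.csv", "rb")})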
requests is a very popular library with a large and active community, which means that many third-party libraries build on top of it and that it sees a wide range of use.
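As noted above, requests itself is synchronous. For fetching many pages concurrently, an asynchronous client such as httpx can be used instead; a minimal sketch, assuming httpx has been installed with pip install httpx:

import asyncio
import httpx

async def main():
    # AsyncClient reuses connections across requests, similar to a requests Session
    async with httpx.AsyncClient() as client:
        urls = ["http://httpbin.org/html"] * 3
        # run all three requests concurrently instead of one after another
        responses = await asyncio.gather(*(client.get(url) for url in urls))
        print([response.status_code for response in responses])

asyncio.run(main())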

Highlights


sync, ease-of-use, no-http2, no-async, popular

Example Use


// axios can be installed via npm (npm install axios) and loaded with require or import:
const axios = require('axios');
const querystring = require('querystring');  // used below for URL-encoded form data

// axios can be used with promises:
axios.get('http://httpbin.org/json')
  .then(response => {
    console.log(response.data);
  })
  .catch(error => {
    console.log(error);
  });

// or with async/await syntax:
const resp = await axios.get('http://httpbin.org/json');
console.log(resp.data);

// to make requests concurrently, the Promise.all function can be used:
const results = await Promise.all([
  axios.get('http://httpbin.org/html'),
  axios.get('http://httpbin.org/html'),
  axios.get('http://httpbin.org/html'),
])

// axios also supports other types of requests, like POST, and automatically serializes JavaScript objects to JSON:
await axios.post('http://httpbin.org/post', {'query': 'hello world'});
// or URL-encoded form data (using the querystring module imported above):
const data = {name: 'John Doe', email: 'johndoe@example.com'};

await axios.post('https://jsonplaceholder.typicode.com/users',
    querystring.stringify(data), 
    {
        headers: {
            'Content-Type': 'application/x-www-form-urlencoded'
        }
    }
);

// default values like headers can be configured globally
axios.defaults.headers.common['User-Agent'] = 'webscraping.fyi';
// or for a specific axios instance:
const instance = axios.create({
  headers: {"User-Agent": "webscraping.fyi"},
});

import requests

# GET request:
response = requests.get("http://webscraping.fyi/")
response.status_code  # 200
response.text         # response body as a string, e.g. "text"
response.content      # response body as raw bytes, e.g. b"bytes"

# requests can automatically convert JSON responses to Python dictionaries:
response = requests.get("http://httpbin.org/json")
print(response.json())
# {'slideshow': {'author': 'Yours Truly', 'date': 'date of publication', 'slides': [{'title': 'Wake up to WonderWidgets!', 'type': 'all'}, {'items': ['Why <em>WonderWidgets</em> are great', 'Who <em>buys</em> WonderWidgets'], 'title': 'Overview', 'type': 'all'}], 'title': 'Sample Slide Show'}}

# for POST requests it can serialize Python dictionaries as JSON:
response = requests.post("http://httpbin.org/post", json={"query": "hello world"})
# or form data:
response = requests.post("http://httpbin.org/post", data={"query": "hello world"})

# a Session object can be used to automatically keep track of cookies and set defaults:
from requests import Session
s = Session()
s.headers = {"User-Agent": "webscraping.fyi"}
s.get('http://httpbin.org/cookies/set/foo/bar')
print(s.cookies['foo'])  # 'bar'
print(s.get('http://httpbin.org/cookies').json())  # {'cookies': {'foo': 'bar'}}
