
primp vs pydoll

primp: MIT license · ~7.1 million downloads/month · latest version 1.2.2

Primp is a Python HTTP client that impersonates real web browsers by replicating their TLS fingerprints, HTTP/2 settings, and header ordering. It is a lightweight alternative to curl-cffi for bypassing TLS and HTTP fingerprinting-based bot detection.

Key features include:

  • Browser impersonation: Can impersonate Chrome, Firefox, Safari, Edge, and OkHttp clients by replicating their exact TLS fingerprints (JA3/JA4), HTTP/2 frame settings, header ordering, and other connection-level characteristics.
  • HTTP/2 support: Full HTTP/2 support with configurable settings that match real browser behavior.
  • Lightweight: Smaller and simpler than curl-cffi while providing similar impersonation capabilities. Written in Rust for performance.
  • Familiar API: Provides a requests-like API with Session support, making it easy to adopt for developers familiar with the Python requests library.
  • Proxy support: HTTP and SOCKS5 proxy support with authentication.
  • Cookie management: Automatic cookie handling across requests within a session.

Primp fills a similar niche to curl-cffi and hrequests — HTTP clients designed to avoid TLS/HTTP fingerprinting — but takes a Rust-powered approach for better performance. It is particularly useful when you need to bypass bot detection that relies on connection-level fingerprinting without using a full browser.
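To make "TLS fingerprinting" concrete, here is a minimal sketch of how a JA3 hash is derived from ClientHello fields. The field values below are illustrative placeholders, not those of any real browser:

```python
import hashlib

# JA3 joins five ClientHello fields into a comma-separated string
# (values within each field are dash-separated), then MD5-hashes it.
tls_version = "771"            # TLS 1.2, as a decimal code point
ciphers = "4865-4866-4867"     # cipher suite IDs offered
extensions = "0-23-65281"      # extension IDs, in the order sent
curves = "29-23-24"            # supported elliptic curves
point_formats = "0"            # EC point formats

ja3_string = ",".join([tls_version, ciphers, extensions, curves, point_formats])
ja3_hash = hashlib.md5(ja3_string.encode()).hexdigest()
print(ja3_hash)
```

Because even the order of extensions feeds into the hash, a server comparing this value against known browser fingerprints can spot generic HTTP libraries; primp replays a real browser's exact ClientHello so the resulting hash matches.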

Pydoll is a Python library for browser automation that uses the Chrome DevTools Protocol (CDP) directly, designed to be undetectable by anti-bot systems. Unlike Selenium-based tools, Pydoll does not use WebDriver and avoids the common detection vectors that anti-bot systems look for.

Key features include:

  • Native CDP communication: Connects directly to Chrome/Chromium via CDP websocket without intermediary drivers, avoiding the automation flags and fingerprints that WebDriver-based tools leave behind.
  • Event-driven architecture: Built around an async event system that can listen for and react to browser events like network requests, console messages, and DOM changes.
  • Network interception: Can intercept, modify, and mock network requests and responses, useful for blocking unnecessary resources or modifying API responses during scraping.
  • Async-first design: Fully asynchronous API built on Python's asyncio for efficient concurrent automation.
  • Clean API: Provides a high-level, Pythonic API for common browser automation tasks while still allowing direct CDP command execution for advanced use cases.
  • Multi-browser support: Can manage multiple browser instances and pages concurrently.
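The event-driven architecture can be pictured as a small dispatcher that routes incoming browser events to registered async handlers by method name. This is a simplified sketch of the pattern, not Pydoll's actual API:

```python
import asyncio

# A minimal event bus like the one a CDP client keeps internally:
# events arriving from the browser websocket are dispatched to
# async callbacks registered per event method. Names are illustrative.
class EventBus:
    def __init__(self):
        self._handlers = {}

    def on(self, method, handler):
        self._handlers.setdefault(method, []).append(handler)

    async def dispatch(self, event):
        for handler in self._handlers.get(event["method"], []):
            await handler(event["params"])

async def main():
    bus = EventBus()
    seen = []

    async def log_request(params):
        seen.append(params["url"])

    # React to network activity as it happens, requests-style polling not needed
    bus.on("Network.requestWillBeSent", log_request)
    await bus.dispatch({"method": "Network.requestWillBeSent",
                        "params": {"url": "https://example.com"}})
    return seen

urls = asyncio.run(main())
print(urls)
```

The same mechanism underlies console-message and DOM-change listeners: the client subscribes once and handlers fire as the browser pushes events.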

Pydoll fills a similar niche to nodriver and camoufox — browser automation with a focus on avoiding detection — but takes a different approach by providing more granular control over CDP communication and network interception.
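Under the hood, CDP traffic is plain JSON exchanged over a websocket. A minimal sketch of the message shapes involved (the `frameId` and timestamp values are made up):

```python
import json

# A CDP command is a JSON object with an id, a method, and params.
command = {"id": 1,
           "method": "Page.navigate",
           "params": {"url": "https://example.com"}}
wire_msg = json.dumps(command)

# The browser answers with a result message carrying the same id...
reply = json.loads('{"id": 1, "result": {"frameId": "A1B2"}}')
assert reply["id"] == command["id"]

# ...and separately emits event messages, which have no id.
event = json.loads('{"method": "Page.loadEventFired", "params": {"timestamp": 1.0}}')
print(event["method"])
```

Because this is the same channel DevTools itself uses, there is no WebDriver executable in the loop injecting detectable properties into the page.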

Highlights


bypass · tls-fingerprint · http-fingerprint · http2 · fast
anti-detect · cdp · async

Example Use


```python
import primp

# Create a session that impersonates Chrome
session = primp.Session(impersonate="chrome_131")

# Make requests - TLS fingerprint matches real Chrome
response = session.get("https://example.com")
print(response.status_code)
print(response.text)

# POST with JSON data
response = session.post(
    "https://api.example.com/data",
    json={"key": "value"},
)

# With proxy
session = primp.Session(
    impersonate="firefox_133",
    proxy="http://user:pass@proxy.example.com:8080",
)
response = session.get("https://example.com")

# Different browser impersonation profiles
for browser in ["chrome_131", "firefox_133", "safari_18", "edge_131"]:
    session = primp.Session(impersonate=browser)
    resp = session.get("https://tls.peet.ws/api/all")
    print(f"{browser}: {resp.json()['ja3_hash']}")
```
```python
import asyncio

from pydoll.browser import Chrome
from pydoll.constants import By

async def main():
    async with Chrome() as browser:
        # Open a new page
        page = await browser.new_page()
        await page.go_to("https://example.com")

        # Find and interact with elements
        search_input = await page.find_element(By.CSS, "input[name='q']")
        await search_input.type_text("web scraping")
        submit_btn = await page.find_element(By.CSS, "button[type='submit']")
        await submit_btn.click()

        # Wait for results and extract content
        await page.wait_element(By.CSS, ".results")
        results = await page.find_elements(By.CSS, ".result-item")
        for result in results:
            title = await result.get_text()
            print(title)

        # Network interception example:
        # intercept and analyze API calls made by the page
        await page.enable_network_interception()

asyncio.run(main())
```

Alternatives / Similar

