Further remove info due to worrying political environment in China
lexiforest committed Aug 18, 2024
1 parent 2989ef5 commit 8e69ad7
Showing 14 changed files with 46 additions and 46 deletions.
2 changes: 1 addition & 1 deletion Makefile
@@ -11,7 +11,7 @@ $(CURL_VERSION):
mv curl-$(CURL_VERSION) $(CURL_VERSION)

curl-impersonate-$(VERSION)/chrome/patches: $(CURL_VERSION)
-curl -L "https://github.com/yifeikong/curl-impersonate/archive/refs/tags/v$(VERSION).tar.gz" \
+curl -L "https://github.com/lexiforest/curl-impersonate/archive/refs/tags/v$(VERSION).tar.gz" \
-o "curl-impersonate-$(VERSION).tar.gz"
tar -xf curl-impersonate-$(VERSION).tar.gz

10 changes: 5 additions & 5 deletions README.md
@@ -21,7 +21,7 @@ Only Python 3.8 and above are supported. Python 3.7 has reached its end of life.

------

-<a href="https://scrapfly.io/?utm_source=github&utm_medium=sponsoring&utm_campaign=curl_cffi" target="_blank"><img src="https://raw.githubusercontent.com/yifeikong/curl_cffi/main/assets/scrapfly.png" alt="Scrapfly.io" width="149"></a>
+<a href="https://scrapfly.io/?utm_source=github&utm_medium=sponsoring&utm_campaign=curl_cffi" target="_blank"><img src="https://raw.githubusercontent.com/lexiforest/curl_cffi/main/assets/scrapfly.png" alt="Scrapfly.io" width="149"></a>

[Scrapfly](https://scrapfly.io/?utm_source=github&utm_medium=sponsoring&utm_campaign=curl_cffi)
is an enterprise-grade solution providing a Web Scraping API that aims to simplify the
@@ -39,7 +39,7 @@ If you are managing TLS/HTTP fingerprints by yourself with `curl_cffi`, they also
## Features

- Supports JA3/TLS and http2 fingerprint impersonation, including recent browsers and custom fingerprints.
-- Much faster than requests/httpx, on par with aiohttp/pycurl, see [benchmarks](https://github.com/yifeikong/curl_cffi/tree/main/benchmark).
+- Much faster than requests/httpx, on par with aiohttp/pycurl, see [benchmarks](https://github.com/lexiforest/curl_cffi/tree/main/benchmark).
- Mimics requests API, no need to learn another one.
- Pre-compiled, so you don't have to compile on your machine.
- Supports `asyncio` with proxy rotation on each request.
@@ -69,7 +69,7 @@ To install beta releases:

To install unstable version from GitHub:

-git clone https://github.com/yifeikong/curl_cffi/
+git clone https://github.com/lexiforest/curl_cffi/
cd curl_cffi
make preprocess
pip install .
@@ -126,9 +126,9 @@ print(r.json())
# {'cookies': {'foo': 'bar'}}
```

-`curl_cffi` supports the same browser versions as supported by my [fork](https://github.com/yifeikong/curl-impersonate) of [curl-impersonate](https://github.com/lwthiker/curl-impersonate):
+`curl_cffi` supports the same browser versions as supported by my [fork](https://github.com/lexiforest/curl-impersonate) of [curl-impersonate](https://github.com/lwthiker/curl-impersonate):

-However, only Chrome-like browsers are supported. Firefox support is tracked in [#59](https://github.com/yifeikong/curl_cffi/issues/59).
+However, only Chrome-like browsers are supported. Firefox support is tracked in [#59](https://github.com/lexiforest/curl_cffi/issues/59).

Browser versions will be added **only** when their fingerprints change. If you see that a version,
e.g. chrome122, was skipped, you can simply impersonate it with the previous version and your own headers.
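The fallback described above (imitating a skipped version with the previous version's fingerprint plus that version's own headers) can be sketched as follows. The version names and User-Agent string are illustrative, not taken from the project:

```python
# Hypothetical sketch: imitate a skipped "chrome122" by combining the
# previous supported fingerprint with chrome122's own User-Agent header.
request_kwargs = {
    "impersonate": "chrome120",  # previous version whose fingerprint is supported
    "headers": {
        "User-Agent": (
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
            "(KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36"
        ),
    },
}
# These kwargs would then be passed to curl_cffi's requests-style API,
# e.g. requests.get(url, **request_kwargs).
```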
2 changes: 1 addition & 1 deletion curl_cffi/curl.py
@@ -77,7 +77,7 @@ def write_callback(ptr, size, nmemb, userdata):
return nmemb * size


-# Credits: @alexio777 on https://github.com/yifeikong/curl_cffi/issues/4
+# Credits: @alexio777 on https://github.com/lexiforest/curl_cffi/issues/4
def slist_to_list(head) -> List[bytes]:
"""Converts curl slist to a python list."""
result = []
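The diff truncates the body of `slist_to_list`. A pure-Python analogue of the traversal it performs is sketched below, with a `Node` dataclass standing in for the cffi slist struct; the `data`/`next` field names are assumptions about the C layout, not the library's actual types:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    """Stand-in for curl's C slist node (assumed fields: data, next)."""
    data: bytes
    next: "Optional[Node]" = None

def slist_to_list(head: "Optional[Node]") -> List[bytes]:
    """Converts a curl-style singly linked list to a Python list."""
    result = []
    ptr = head
    while ptr is not None:
        result.append(ptr.data)
        ptr = ptr.next
    return result
```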
8 changes: 4 additions & 4 deletions curl_cffi/requests/session.py
@@ -434,7 +434,7 @@ def _set_curl_options(

# remove Host header if it's unnecessary, otherwise curl may get confused.
# Host header will be automatically added by curl if it's not present.
-# https://github.com/yifeikong/curl_cffi/issues/119
+# https://github.com/lexiforest/curl_cffi/issues/119
host_header = h.get("Host")
if host_header is not None:
u = urlparse(url)
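The check above can be paraphrased in pure stdlib Python. This is a simplified sketch of the idea (drop a Host header that merely repeats the URL's own host, since curl adds it anyway), not the library's exact logic:

```python
from urllib.parse import urlparse

def drop_redundant_host(url: str, headers: dict) -> dict:
    """Remove a Host header that matches the URL's netloc.

    curl adds Host automatically, and a redundant copy can confuse it
    (see issue #119); a deliberately different Host is left alone.
    """
    host = headers.get("Host")
    if host is not None and urlparse(url).netloc == host:
        return {k: v for k, v in headers.items() if k != "Host"}
    return headers
```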
@@ -476,7 +476,7 @@ def _set_curl_options(
if files:
raise NotImplementedError(
"files is not supported, use `multipart`. See examples here: "
-"https://github.com/yifeikong/curl_cffi/blob/main/examples/upload.py"
+"https://github.com/lexiforest/curl_cffi/blob/main/examples/upload.py"
)

# multipart
@@ -508,7 +508,7 @@ def _set_curl_options(
if not stream:
c.setopt(CurlOpt.TIMEOUT_MS, int(all_timeout * 1000))
else:
-# trick from: https://github.com/yifeikong/curl_cffi/issues/156
+# trick from: https://github.com/lexiforest/curl_cffi/issues/156
c.setopt(CurlOpt.LOW_SPEED_LIMIT, 1)
c.setopt(CurlOpt.LOW_SPEED_TIME, math.ceil(all_timeout))
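The branch above maps one user-facing timeout onto two different curl mechanisms. A sketch of that mapping (the dict keys mirror the `CurlOpt` constant names; this is an illustration of the logic, not the library's code):

```python
import math

def timeout_options(all_timeout: float, stream: bool) -> dict:
    """Map a single timeout value onto curl options.

    Non-streaming requests get a hard TIMEOUT_MS deadline. Streaming
    responses instead abort only when throughput stays below 1 byte/s
    for the whole window (the issue #156 trick), so a long download is
    not killed mid-transfer.
    """
    if not stream:
        return {"TIMEOUT_MS": int(all_timeout * 1000)}
    return {"LOW_SPEED_LIMIT": 1, "LOW_SPEED_TIME": math.ceil(all_timeout)}
```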

@@ -563,7 +563,7 @@ def _set_curl_options(
warnings.warn(
"Make sure you are using https over https proxy, otherwise, "
"the proxy prefix should be 'http://' not 'https://', "
-"see: https://github.com/yifeikong/curl_cffi/issues/6",
+"see: https://github.com/lexiforest/curl_cffi/issues/6",
RuntimeWarning,
stacklevel=2,
)
14 changes: 7 additions & 7 deletions docs/faq.rst
@@ -96,10 +96,10 @@ To force curl to use http 1.1 only.
Related issues:

-- `#19 <https://github.com/yifeikong/curl_cffi/issues/19>`_,
-- `#42 <https://github.com/yifeikong/curl_cffi/issues/42>`_,
-- `#79 <https://github.com/yifeikong/curl_cffi/issues/79>`_,
-- `#165 <https://github.com/yifeikong/curl_cffi/issues/165>`_,
+- `#19 <https://github.com/lexiforest/curl_cffi/issues/19>`_,
+- `#42 <https://github.com/lexiforest/curl_cffi/issues/42>`_,
+- `#79 <https://github.com/lexiforest/curl_cffi/issues/79>`_,
+- `#165 <https://github.com/lexiforest/curl_cffi/issues/165>`_,


Packaging with PyInstaller
@@ -127,8 +127,8 @@ Add other paths:
See also:

-- `#5 <https://github.com/yifeikong/curl_cffi/issues/5>`_
-- `#48 <https://github.com/yifeikong/curl_cffi/issues/48>`_
+- `#5 <https://github.com/lexiforest/curl_cffi/issues/5>`_
+- `#48 <https://github.com/lexiforest/curl_cffi/issues/48>`_

How to set proxy?
------
@@ -145,7 +145,7 @@ You can also use the ``http_proxy``, ``https_proxy``, ``ws_proxy``, and ``wss_proxy`` environment variables, respectively.

For explanation of differences between ``http_proxy`` and ``https_proxy``, please see
-`#6 <https://github.com/yifeikong/curl_cffi/issues/6>`_.
+`#6 <https://github.com/lexiforest/curl_cffi/issues/6>`_.
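A sketch of the two proxy-configuration styles this FAQ entry describes; the proxy address is a placeholder, and note the `http://` prefix even for the `https` key (per issue #6):

```python
import os

# Option 1: pass a proxies mapping to the requests-style API,
# e.g. requests.get(url, proxies=proxies).
proxies = {
    "http": "http://127.0.0.1:3128",   # placeholder proxy address
    "https": "http://127.0.0.1:3128",  # http:// scheme on purpose, see #6
}

# Option 2: let the environment variables supply the same values.
os.environ["http_proxy"] = proxies["http"]
os.environ["https_proxy"] = proxies["https"]
```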


How to change the order of headers?
6 changes: 3 additions & 3 deletions docs/impersonate.rst
@@ -4,9 +4,9 @@ Impersonate guide
Supported browser versions
--------------------------

-``curl_cffi`` supports the same browser versions as supported by our `fork <https://github.com/yifeikong/curl-impersonate>`_ of `curl-impersonate <https://github.com/lwthiker/curl-impersonate>`_:
+``curl_cffi`` supports the same browser versions as supported by our `fork <https://github.com/lexiforest/curl-impersonate>`_ of `curl-impersonate <https://github.com/lwthiker/curl-impersonate>`_:

-However, only Chrome-like browsers are supported. Firefox support is tracked in `#59 <https://github.com/yifeikong/curl_cffi/issues/59>`_.
+However, only Chrome-like browsers are supported. Firefox support is tracked in `#59 <https://github.com/lexiforest/curl_cffi/issues/59>`_.

Browser versions will be added **only** when their fingerprints change. If you see that a version,
e.g. chrome122, was skipped, you can simply impersonate it with the previous version and your own headers.
@@ -148,7 +148,7 @@ For Akamai http2 fingerprints, you can fully customize the 3 parts:

For a complete list of options and explanation, see the `curl-impersonate README`_.

-.. _curl-impersonate README: https://github.com/yifeikong/curl-impersonate?tab=readme-ov-file#libcurl-impersonate
+.. _curl-impersonate README: https://github.com/lexiforest/curl-impersonate?tab=readme-ov-file#libcurl-impersonate


Should I randomize my fingerprints for each request?
8 changes: 4 additions & 4 deletions docs/index.rst
@@ -25,7 +25,7 @@ Welcome to curl_cffi's documentation!

curl_cffi is a Python binding for `curl-impersonate`_ via `cffi`_.

-.. _curl-impersonate: https://github.com/yifeikong/curl-impersonate
+.. _curl-impersonate: https://github.com/lexiforest/curl-impersonate
.. _cffi: https://cffi.readthedocs.io/en/latest/

Unlike other pure Python http clients like ``httpx`` or ``requests``, ``curl_cffi`` can
@@ -34,7 +34,7 @@ website for no obvious reason, you can give this package a try.

------

-.. image:: https://raw.githubusercontent.com/yifeikong/curl_cffi/main/assets/scrapfly.png
+.. image:: https://raw.githubusercontent.com/lexiforest/curl_cffi/main/assets/scrapfly.png
:width: 300
:alt: Scrapfly
:target: https://scrapfly.io/?utm_source=github&utm_medium=sponsoring&utm_campaign=curl_cffi
@@ -56,7 +56,7 @@ Features
------

- Supports JA3/TLS and http2 fingerprints impersonation.
-- Much faster than requests/httpx, on par with aiohttp/pycurl, see `benchmarks <https://github.com/yifeikong/curl_cffi/tree/main/benchmark>`_.
+- Much faster than requests/httpx, on par with aiohttp/pycurl, see `benchmarks <https://github.com/lexiforest/curl_cffi/tree/main/benchmark>`_.
- Mimics requests API, no need to learn another one.
- Pre-compiled, so you don't have to compile on your machine.
- Supports ``asyncio`` with proxy rotation on each request.
@@ -223,7 +223,7 @@ Click `here <https://buymeacoffee.com/yifei>`_ to buy me a coffee.
Bypass Cloudflare with API
~~~~~~

-.. image:: https://raw.githubusercontent.com/yifeikong/curl_cffi/main/assets/yescaptcha.png
+.. image:: https://raw.githubusercontent.com/lexiforest/curl_cffi/main/assets/yescaptcha.png
:width: 149
:alt: YesCaptcha
:target: https://yescaptcha.com/i/stfnIO
2 changes: 1 addition & 1 deletion docs/install.rst
@@ -40,7 +40,7 @@ To install the latest unstable version from GitHub:

.. code-block::
-git clone https://github.com/yifeikong/curl_cffi/
+git clone https://github.com/lexiforest/curl_cffi/
cd curl_cffi
make preprocess
pip install .
6 changes: 3 additions & 3 deletions docs/vs-requests.rst
@@ -5,6 +5,6 @@ Although we try our best to mimic the requests API, some functionality is not ea
Here is a list of known incompatibilities:

- the files API is slightly different, but more error-proof.
-- retries are not supported yet, tracked in [#24](https://github.com/yifeikong/curl_cffi/issues/24)
-- redirect history is not supported, tracked in [#82](https://github.com/yifeikong/curl_cffi/issues/82)
-- empty-domain cookies may be lost during redirects, tracked in [#55](https://github.com/yifeikong/curl_cffi/issues/55)
+- retries are not supported yet, tracked in [#24](https://github.com/lexiforest/curl_cffi/issues/24)
+- redirect history is not supported, tracked in [#82](https://github.com/lexiforest/curl_cffi/issues/82)
+- empty-domain cookies may be lost during redirects, tracked in [#55](https://github.com/lexiforest/curl_cffi/issues/55)
4 changes: 2 additions & 2 deletions pyproject.toml
@@ -1,7 +1,7 @@
[project]
name = "curl_cffi"
version = "0.7.1"
-authors = [{ name = "Yifei Kong", email = "[email protected]" }]
+authors = [{ name = "Lyonnet", email = "[email protected]" }]
description = "libcurl ffi bindings for Python, with impersonation support."
license = { file = "LICENSE" }
dependencies = [
@@ -10,7 +10,7 @@ dependencies = [
]
readme = "README.md"
requires-python = ">=3.8"
-urls = { "repository" = "https://github.com/yifeikong/curl_cffi" }
+urls = { "repository" = "https://github.com/lexiforest/curl_cffi" }
classifiers = [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
2 changes: 1 addition & 1 deletion scripts/build.py
@@ -57,7 +57,7 @@ def download_libcurl():
sysname = "linux-" + arch["libc"] if arch["system"] == "Linux" else arch["sysname"]

url = (
-f"https://github.com/yifeikong/curl-impersonate/releases/download/"
+f"https://github.com/lexiforest/curl-impersonate/releases/download/"
f"v{__version__}/libcurl-impersonate-v{__version__}"
f".{arch['so_arch']}-{sysname}.tar.gz"
)
6 changes: 3 additions & 3 deletions tests/unittest/test_async_session.py
@@ -250,7 +250,7 @@ async def test_session_cookies(server):
assert cookies["hello"] == "world"


-# https://github.com/yifeikong/curl_cffi/issues/16
+# https://github.com/lexiforest/curl_cffi/issues/16
async def test_session_with_headers(server):
async with AsyncSession() as s:
r = await s.get(str(server.url), headers={"Foo": "bar"})
@@ -267,7 +267,7 @@ async def test_session_too_many_headers(server):
assert headers["Foo"][0] == "2"


-# https://github.com/yifeikong/curl_cffi/issues/222
+# https://github.com/lexiforest/curl_cffi/issues/222
async def test_closed_session_throws_error():
async with AsyncSession() as s:
pass
@@ -297,7 +297,7 @@ async def test_closed_session_throws_error():
await s.ws_connect("wss://example.com")


-# https://github.com/yifeikong/curl_cffi/issues/39
+# https://github.com/lexiforest/curl_cffi/issues/39
async def test_post_body_cleaned(server):
async with AsyncSession() as s:
# POST with body
4 changes: 2 additions & 2 deletions tests/unittest/test_curl.py
@@ -68,7 +68,7 @@ def test_headers(server):
headers = json.loads(buffer.getvalue().decode())
assert headers["Foo"][0] == "bar"

-# https://github.com/yifeikong/curl_cffi/issues/16
+# https://github.com/lexiforest/curl_cffi/issues/16
c.setopt(CurlOpt.HTTPHEADER, [b"Foo: baz"])
buffer = BytesIO()
c.setopt(CurlOpt.WRITEDATA, buffer)
@@ -90,7 +90,7 @@ def test_proxy_headers(server):
headers = json.loads(buffer.getvalue().decode())
assert "Foo" not in headers

-# https://github.com/yifeikong/curl_cffi/issues/16
+# https://github.com/lexiforest/curl_cffi/issues/16
c.setopt(CurlOpt.PROXYHEADER, [b"Foo: baz"])
buffer = BytesIO()
c.setopt(CurlOpt.WRITEDATA, buffer)
18 changes: 9 additions & 9 deletions tests/unittest/test_requests.py
@@ -477,7 +477,7 @@ def test_cookies_with_special_chars(server):
assert r.json()["foo"] == "bar space"


-# https://github.com/yifeikong/curl_cffi/issues/119
+# https://github.com/lexiforest/curl_cffi/issues/119
def test_cookies_mislead_by_host(server):
s = requests.Session(debug=True)
s.curl.setopt(CurlOpt.RESOLVE, ["example.com:8000:127.0.0.1"])
@@ -489,7 +489,7 @@ def test_cookies_mislead_by_host(server):
assert r.json()["foo"] == "bar"


-# https://github.com/yifeikong/curl_cffi/issues/119
+# https://github.com/lexiforest/curl_cffi/issues/119
def test_cookies_redirect_to_another_domain(server):
s = requests.Session()
s.curl.setopt(CurlOpt.RESOLVE, ["google.com:8000:127.0.0.1"])
@@ -502,7 +502,7 @@ def test_cookies_redirect_to_another_domain(server):
assert cookies["foo"] == "google.com"


-# https://github.com/yifeikong/curl_cffi/issues/119
+# https://github.com/lexiforest/curl_cffi/issues/119
def test_cookies_wo_hostname_redirect_to_another_domain(server):
s = requests.Session(debug=True)
s.curl.setopt(
@@ -525,7 +525,7 @@ def test_cookies_wo_hostname_redirect_to_another_domain(server):
assert cookies["hello"] == "world"


-# https://github.com/yifeikong/curl_cffi/issues/39
+# https://github.com/lexiforest/curl_cffi/issues/39
def test_post_body_cleaned(server):
s = requests.Session()
# POST with body
@@ -537,15 +537,15 @@ def test_post_body_cleaned(server):
assert r.content == b""


-# https://github.com/yifeikong/curl_cffi/issues/16
+# https://github.com/lexiforest/curl_cffi/issues/16
def test_session_with_headers(server):
s = requests.Session()
r = s.get(str(server.url), headers={"Foo": "bar"})
r = s.get(str(server.url), headers={"Foo": "baz"})
assert r.status_code == 200


-# https://github.com/yifeikong/curl_cffi/pull/171
+# https://github.com/lexiforest/curl_cffi/pull/171
def test_session_with_hostname_proxies(server, proxy_server):
proxies = {
f"all://{server.url.host}": f"http://{proxy_server.flags.hostname}:{proxy_server.flags.port}"
@@ -556,7 +556,7 @@ def test_session_with_hostname_proxies(server, proxy_server):
assert r.text == "Hello from man in the middle"


-# https://github.com/yifeikong/curl_cffi/pull/171
+# https://github.com/lexiforest/curl_cffi/pull/171
def test_session_with_http_proxies(server, proxy_server):
proxies = {"http": f"http://{proxy_server.flags.hostname}:{proxy_server.flags.port}"}
s = requests.Session(proxies=proxies)
@@ -565,7 +565,7 @@ def test_session_with_http_proxies(server, proxy_server):
assert r.text == "Hello from man in the middle"


-# https://github.com/yifeikong/curl_cffi/pull/171
+# https://github.com/lexiforest/curl_cffi/pull/171
def test_session_with_all_proxies(server, proxy_server):
proxies = {"all": f"http://{proxy_server.flags.hostname}:{proxy_server.flags.port}"}
s = requests.Session(proxies=proxies)
@@ -574,7 +574,7 @@ def test_session_with_all_proxies(server, proxy_server):
assert r.text == "Hello from man in the middle"


-# https://github.com/yifeikong/curl_cffi/issues/222
+# https://github.com/lexiforest/curl_cffi/issues/222
def test_closed_session_throws_error():
with requests.Session() as s:
pass
