Commit ada51f3

master: setting up tests, readme
1 parent 0dfca33 commit ada51f3

13 files changed: +147 −14 lines changed

.github/workflows/code_checks.yaml

Lines changed: 20 additions & 0 deletions
@@ -0,0 +1,20 @@
+name: Code checks
+
+on: push
+
+jobs:
+  code_checks:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Set up Python
+        uses: actions/setup-python@v1
+        with:
+          python-version: 3.7
+      - name: Checkout
+        uses: actions/checkout@v2
+      - name: Init
+        run: make init
+      - name: Lint
+        run: make lint
+      - name: Tests
+        run: make test

.gitignore

Lines changed: 1 addition & 0 deletions
@@ -3,3 +3,4 @@ dist
 .DS_Store
 poetry.lock
 */__pycache__/
+*.egg-info/

Makefile

Lines changed: 5 additions & 7 deletions
@@ -1,11 +1,9 @@
 init:
-	poetry update
+	pip3 install -e .[dev]
+
 test:
-	poetry run pytest -p no:cacheprovider
+	pytest -p no:cacheprovider
 
-flake8:
-	poetry run flake8 --max-line-length=120 scrapingant_client_python tests
+lint:
+	flake8 --max-line-length=120 scrapingant_client tests
 
-publish:
-	poetry build
-	poetry publish

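The updated `make init` target runs `pip3 install -e .[dev]`, which only works if the project's packaging metadata declares a `dev` extra; that file is not part of this commit. A minimal sketch of how such an extra could be declared in a hypothetical `setup.py` (the real packaging config may differ):

```python
# Hypothetical setup.py sketch; the actual packaging config is not shown in this commit.
from setuptools import find_packages, setup

setup(
    name='scrapingant-client',
    packages=find_packages(include=['scrapingant_client*']),
    install_requires=['requests'],  # client.py imports requests
    extras_require={
        # `make init` installs the package with the [dev] extra, so the tools
        # used by `make lint` and `make test` would live here.
        'dev': ['pytest', 'flake8'],
    },
)
```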
README.md

Lines changed: 108 additions & 0 deletions
@@ -1 +1,109 @@
 # ScrapingAnt API client for Python
+`scrapingant-client` is the official library to access the [ScrapingAnt API](https://docs.scrapingant.com) from your
+Python applications. It provides useful features like parameter encoding to improve the ScrapingAnt usage experience.
+
+<!-- toc -->
+
+- [Quick Start](#quick-start)
+- [API key](#api-key)
+- [Retries with exponential backoff](#retries-with-exponential-backoff)
+- [API Reference](#api-reference)
+- [Examples](#examples)
+
+<!-- tocstop -->
+
+## Quick Start
+```python3
+from scrapingant_client.client import ScrapingAntClient
+
+client = ScrapingAntClient(token='<YOUR-SCRAPINGANT-API-TOKEN>')
+# Scrape the example.com site.
+result = client.general_request('https://example.com')
+print(result.content)
+```
+
+## API key
+In order to get an API key, you'll need to register at [ScrapingAnt Service](https://app.scrapingant.com).
+
+## API Reference
+All public classes, methods and their parameters can be inspected in this API reference.
+
+<a name="ScrapingAntClient"></a>
+
+#### [](#ScrapingAntClient) `ScrapingAntClient(token)`
+
+
+| Param | Type | Default |
+| --- | --- | --- |
+| [token] | <code>string</code> | |
+
+
+
+* * *
+
+<a name="ScrapingAntClient+scrape"></a>
+
+#### [](#ScrapingAntClient+scrape) `ScrapingAntClient.general_request(url, cookies, js_snippet, proxy_country, return_text)` ⇒ [<code>ScrapingAnt API response</code>](https://docs.scrapingant.com/request-response-format#response-structure)
+
+https://docs.scrapingant.com/request-response-format#available-parameters
+
+| Param | Type | Default |
+| --- | --- | --- |
+| url | <code>string</code> | |
+| cookies | <code>string</code> | None |
+| js_snippet | <code>string</code> | None |
+| proxy_country | <code>string</code> | None |
+| return_text | <code>boolean</code> | False |
+
+**IMPORTANT NOTE:** <code>js_snippet</code> will be encoded to Base64 automatically by the ScrapingAnt client library.
+
+* * *
+
+<a name="ScrapingAntApiError"></a>
+
+### [](#ScrapingantClientException) ScrapingantClientException
+
+`ScrapingantClientException` is the base exception class, used for all errors.
+
+* * *
+
+## Examples
+
+### Sending custom cookies
+
+```python3
+from scrapingant_client.client import ScrapingAntClient
+from scrapingant_client.cookie import Cookie
+
+client = ScrapingAntClient(token='<YOUR-SCRAPINGANT-API-TOKEN>')
+
+result = client.general_request(
+    'https://httpbin.org/cookies',
+    cookies=[
+        Cookie(name='cookieName1', value='cookieVal1'),
+        Cookie(name='cookieName2', value='cookieVal2'),
+    ]
+)
+print(result.content)
+# Response cookies are a list of Cookie objects
+# They can be reused in subsequent requests
+response_cookies = result.cookies
+```
+
+### Executing a custom JS snippet
+
+```python
+from scrapingant_client.client import ScrapingAntClient
+client = ScrapingAntClient(token='<YOUR-SCRAPINGANT-API-TOKEN>')
+
+customJsSnippet = """
+var str = 'Hello, world!';
+var htmlElement = document.getElementsByTagName('html')[0];
+htmlElement.innerHTML = str;
+"""
+result = client.general_request(
+    'https://example.com',
+    js_snippet=customJsSnippet,
+)
+print(result.content)
+```
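The README above documents the `proxy_country` and `return_text` parameters and the base `ScrapingantClientException`, but shows neither in use. Here is a minimal sketch combining them; it assumes the exception can be imported from a `scrapingant_client.errors` module (its actual location is not shown in this commit) and that `proxy_country` takes a plain country-code string, per the parameter table:

```python
from scrapingant_client.client import ScrapingAntClient
# Assumed import path; the module defining the exception is not part of this diff.
from scrapingant_client.errors import ScrapingantClientException

client = ScrapingAntClient(token='<YOUR-SCRAPINGANT-API-TOKEN>')

try:
    result = client.general_request(
        'https://example.com',
        proxy_country='US',  # assumed to be a plain country-code string
        return_text=True,    # ask the API for a text response body
    )
    print(result.content)
except ScrapingantClientException as exc:
    # All client errors derive from this base class, per the API reference above.
    print(f'Scraping failed: {exc}')
```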
File renamed without changes.

scrapingant_client_python/client.py renamed to scrapingant_client/client.py

Lines changed: 4 additions & 4 deletions
@@ -4,10 +4,10 @@
 
 import requests
 
-from src.constants import ProxyCountry
-from src.cookie import Cookie, cookies_list_to_string, cookies_list_from_string
-from src.response import Response
-from src.utils import base64_encode_string
+from scrapingant_client.constants import ProxyCountry
+from scrapingant_client.cookie import Cookie, cookies_list_to_string, cookies_list_from_string
+from scrapingant_client.response import Response
+from scrapingant_client.utils import base64_encode_string
 
 
 class ScrapingAntClient:
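`client.py` now imports `base64_encode_string` from `scrapingant_client.utils`, which lines up with the README note that `js_snippet` is Base64-encoded automatically. The utils module is not shown in this diff; a minimal sketch of what such a helper could look like (assumed behaviour, not the library's actual code):

```python
import base64


def base64_encode_string(value: str) -> str:
    """Encode a UTF-8 string as Base64 text (assumed behaviour of the helper
    imported by client.py; the real implementation is not part of this diff)."""
    return base64.b64encode(value.encode('utf-8')).decode('utf-8')
```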
File renamed without changes.
File renamed without changes.
File renamed without changes.

scrapingant_client_python/response.py renamed to scrapingant_client/response.py

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 from typing import List
 
-from src.cookie import Cookie
+from scrapingant_client.cookie import Cookie
 
 
 class Response:
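Only the `Cookie` import path changes in `response.py`; the `Response` class itself is untouched and not shown here. Judging by the README, which reads `result.content` and `result.cookies`, a minimal sketch of its assumed shape could be:

```python
from typing import List

from scrapingant_client.cookie import Cookie


class Response:
    # Assumed shape: the README accesses result.content and result.cookies,
    # so the response object presumably carries at least these two fields.
    def __init__(self, content: str, cookies: List[Cookie]):
        self.content = content
        self.cookies = cookies
```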
