Commit 37fab28

change socks package

switched the module to be able to use the script on multiple python versions

1 parent eae9d4b

File tree: 2 files changed (+265 −16 lines)

README.md (168 additions, 3 deletions)
```diff
@@ -3,7 +3,7 @@
 ## Features
 
-- **Configurable Threading**: Adjust the number of threads based on your system's capability using a `usage_level` setting from 1 to 10.
+- **Configurable Threading**: Adjust the number of threads based on your system's capability using a `usage_level` setting from 1 to 3.
 - **Scraping Proxies**: Scrape HTTP/s and SOCKS5 proxies from various sources.
 - **Checking Proxies**: Validate the scraped proxies to ensure they are working.
 - **System Monitoring**: Display CPU and RAM usage of the script in the console title.
```
```diff
@@ -28,9 +28,174 @@ python main.py
 ## Requirements
 
-- Python 3.6+
-- Packages automatically installed on start
+- Python 3.8+
+- Required packages will be automatically installed on start.
 
 ## Important Information!
 
 For educational & research purposes only!
```

## Detailed Documentation

### Configuration

The configuration file `config.json` contains the settings for the script:

- `usage_level`: An integer from 1 to 3 representing how much of the system the script is allowed to use.
- `http_links`: A list of URLs to scrape HTTP proxies from.
- `socks5_links`: A list of URLs to scrape SOCKS5 proxies from.
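A minimal sketch of how these settings might be read and validated; `load_config` and the default values are illustrative, only the key names and the 1–3 range come from the documentation above.

```python
import json

# Hypothetical defaults; only the key names are taken from the README.
DEFAULTS = {"usage_level": 2, "http_links": [], "socks5_links": []}

def load_config(text):
    """Parse config.json contents, clamping usage_level to the documented 1-3 range."""
    cfg = {**DEFAULTS, **json.loads(text)}
    cfg["usage_level"] = min(3, max(1, int(cfg["usage_level"])))
    return cfg
```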

### Functions

#### `generate_random_folder_name(length=32)`

Generates a random folder name of the specified length.

#### `remove_old_folders(base_folder=".")`

Removes old folders with 32-character names from the base folder.
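Plausible implementations of these two helpers; only the signatures come from this documentation, and the alphanumeric character set is an assumption.

```python
import os
import random
import shutil
import string

def generate_random_folder_name(length=32):
    # Random alphanumeric name; the 32-character default matches the cleanup below
    return "".join(random.choices(string.ascii_letters + string.digits, k=length))

def remove_old_folders(base_folder="."):
    # Treat any directory with a 32-character name as a leftover from a previous run
    for name in os.listdir(base_folder):
        path = os.path.join(base_folder, name)
        if os.path.isdir(path) and len(name) == 32:
            shutil.rmtree(path, ignore_errors=True)
```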

#### `get_time_rn()`

Returns the current time formatted as HH:MM:SS.

#### `get_usage_level_str(level)`

Converts the usage level integer to a string representation.

#### `update_title(http_selected, socks5_selected, usage_level)`

Updates the console title with the current CPU and RAM usage and the validation counts.
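The two helpers above might fit together like this. The level names and the title wording are assumptions; `format_title` is a hypothetical pure helper, and the real `update_title()` would additionally read CPU/RAM percentages (e.g. via `psutil`) and write the string to the console title.

```python
# The level names are assumptions; the README only says the integer is
# converted to a string representation.
LEVEL_NAMES = {1: "Low", 2: "Medium", 3: "High"}

def get_usage_level_str(level):
    return LEVEL_NAMES.get(level, "Unknown")

def format_title(http_selected, socks5_selected, usage_level, cpu, ram):
    # Pure helper: update_title() would write this string to the console title
    return (f"CPU: {cpu:.0f}% | RAM: {ram:.0f}% | "
            f"HTTP/s: {http_selected} | SOCKS5: {socks5_selected} | "
            f"Usage: {get_usage_level_str(usage_level)}")
```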

#### `center_text(text, width)`

Centers the text within the given width.

#### `ui()`

Clears the console and displays the main UI with ASCII art.

#### `scrape_proxy_links(link, proxy_type)`

Scrapes proxies from the given link, retrying up to 3 times on failure.
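A stdlib sketch of the retrying fetch described above. `parse_proxies` is a hypothetical helper, the real script may use a different HTTP library, and `proxy_type` is kept only for parity with the documented signature.

```python
import urllib.request

def scrape_proxy_links(link, proxy_type, retries=3):
    """Fetch one proxy-list URL, retrying up to `retries` times on failure."""
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(link, timeout=10) as resp:
                return parse_proxies(resp.read().decode("utf-8", errors="ignore"))
        except OSError:
            if attempt == retries - 1:
                return []

def parse_proxies(text):
    # Keep only lines that look like host:port entries
    return [line.strip() for line in text.splitlines() if ":" in line]
```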

#### `check_proxy_link(link)`

Checks whether a proxy link is accessible.

#### `clean_proxy_links()`

Cleans the proxy links by removing the ones that are not accessible.

#### `scrape_proxies(proxy_list, proxy_type, file_name)`

Scrapes proxies from the provided list of links and saves them to a file.

#### `check_proxy_http(proxy)`

Checks the validity of an HTTP/s proxy by making a request to httpbin.org.
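A sketch of this check using only the standard library; the httpbin.org endpoint comes from the description above, but the exact URL and HTTP client are assumptions.

```python
import urllib.request

def check_proxy_http(proxy, timeout=5):
    """Return True if a request routed through `proxy` reaches httpbin.org."""
    handler = urllib.request.ProxyHandler({"http": f"http://{proxy}",
                                           "https": f"http://{proxy}"})
    opener = urllib.request.build_opener(handler)
    try:
        with opener.open("http://httpbin.org/ip", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```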

#### `check_proxy_socks5(proxy)`

Checks the validity of a SOCKS5 proxy by connecting to google.com.
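The commit message says the socks package was switched, so the real script presumably relies on such a library; this self-contained sketch instead performs the SOCKS5 CONNECT handshake (RFC 1928) directly with the stdlib to show what the check amounts to.

```python
import socket
import struct

def check_proxy_socks5(proxy, timeout=5, dest=("google.com", 80)):
    """Return True if a TCP connection to dest succeeds through the SOCKS5 proxy."""
    host, port = proxy.rsplit(":", 1)
    try:
        with socket.create_connection((host, int(port)), timeout=timeout) as s:
            s.sendall(b"\x05\x01\x00")            # SOCKS5 greeting: no-auth only
            if s.recv(2) != b"\x05\x00":
                return False
            name = dest[0].encode()
            s.sendall(b"\x05\x01\x00\x03" + bytes([len(name)]) + name
                      + struct.pack(">H", dest[1]))  # CONNECT, domain-name address
            reply = s.recv(4)
            return len(reply) >= 2 and reply[1] == 0  # reply code 0 = succeeded
    except OSError:
        return False
```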

#### `check_http_proxies(proxies)`

Checks a list of HTTP/s proxies for validity.

#### `check_socks5_proxies(proxies)`

Checks a list of SOCKS5 proxies for validity.
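These batch checkers are where the configurable threading comes in. The thread counts per `usage_level` below are assumptions, and the injectable `checker` parameter is for illustration; the real functions would call `check_proxy_http` or `check_proxy_socks5` directly.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical mapping from the documented usage_level (1-3) to thread counts.
THREADS_PER_LEVEL = {1: 50, 2: 150, 3: 300}

def check_http_proxies(proxies, usage_level=2, checker=None):
    """Run a single-proxy checker over a list, keeping only the working proxies."""
    checker = checker or (lambda proxy: False)  # the real script would use check_proxy_http
    with ThreadPoolExecutor(max_workers=THREADS_PER_LEVEL[usage_level]) as pool:
        results = list(pool.map(checker, proxies))
    return [p for p, ok in zip(proxies, results) if ok]
```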

#### `signal_handler(sig, frame)`

Handles the SIGINT signal (Ctrl+C) to exit gracefully.
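A graceful-exit handler of this kind is typically a few lines; the message text here is an assumption.

```python
import signal
import sys

def signal_handler(sig, frame):
    # Exit cleanly on Ctrl+C instead of dumping a KeyboardInterrupt traceback
    print("\nExiting...")
    sys.exit(0)

signal.signal(signal.SIGINT, signal_handler)
```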

#### `set_process_priority()`

Sets the process priority to high for better performance.

#### `loading_animation()`

Displays a loading animation while the proxy links are being verified.
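A sketch of such a spinner. The `stop_check` callback is an illustrative substitute for however the real script signals completion (likely a shared flag set once link verification finishes).

```python
import itertools
import sys
import time

def loading_animation(stop_check, message="Verifying proxy links"):
    """Spin until stop_check() returns True."""
    for frame in itertools.cycle("|/-\\"):
        if stop_check():
            break
        sys.stdout.write(f"\r{message} {frame}")
        sys.stdout.flush()
        time.sleep(0.1)
    sys.stdout.write("\r" + " " * (len(message) + 2) + "\r")
```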

#### `clear_console()`

Clears the console screen.

#### `continuously_update_title()`

Continuously updates the console title with the current status.

### Example `config.json`

```json
{
  "usage_level": 2,
  "http_links": [
    "https://api.proxyscrape.com/?request=getproxies&proxytype=https&timeout=10000&country=all&ssl=all&anonymity=all",
    "https://api.proxyscrape.com/v2/?request=getproxies&protocol=http&timeout=10000&country=all&ssl=all&anonymity=all"
  ],
  "socks5_links": [
    "https://raw.githubusercontent.com/B4RC0DE-TM/proxy-list/main/SOCKS5.txt",
    "https://raw.githubusercontent.com/saschazesiger/Free-Proxies/master/proxies/socks5.txt"
  ]
}
```

By following this documentation, you should be able to set up, run, and understand the Proxy Scraper and Checker script with ease.

## Script Description

This script downloads and verifies HTTP/s and SOCKS5 proxies from public databases and files. It offers the following key features:

- **Configurable Threading**: Adjust the number of threads based on your system's capability using a `usage_level` setting from 1 to 3.
- **Scraping Proxies**: Automatically scrape HTTP/s and SOCKS5 proxies from various online sources.
- **Checking Proxies**: Validate the scraped proxies to ensure they are operational.
- **System Monitoring**: Display the script's CPU and RAM usage in the console title for real-time performance monitoring.

### Usage

1. **Installation**:
   - Clone the repository or download the .zip file.
   - Navigate to the project directory.

2. **Running the Script**:
   - Execute the script using:
     ```bash
     start.bat
     ```
     or
     ```bash
     python main.py
     ```

3. **Configuration**:
   - The script uses a `config.json` file to manage settings.
   - Adjust the `usage_level` and specify the lists of URLs for HTTP/s and SOCKS5 proxies.

4. **Educational & Research Purposes Only**:
   - This script is intended for educational and research purposes only. Use it responsibly and in accordance with applicable laws.

### Requirements

- Python 3.8+
- All necessary packages are automatically installed when the script is run.
