docs: wiki and changelog updates for 5.7.2 release #303

Merged 4 commits on Nov 21, 2024
20 changes: 20 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,26 @@ All notable changes to this project will be documented here. For more details, v
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [5.7.2] - 2024-11-20

This update introduces the following changes:

1. Add option to use cookies from any supported site
2. Apply cookies from flaresolverr when possible, even if the response is invalid
3. Add option to automatically import cookies at startup
4. Better validation of config values
5. Rework all TUI user input options
6. General logging improvements and bug fixes

#### Details

1. Users can import cookies from their browser. CDL will use these cookies to log in to websites and pass Cloudflare DDoS challenges. For more information on cookie extraction and configuration, visit: https://script-ware.gitbook.io/cyberdrop-dl/reference/configuration-options/settings#browser-cookies (a rough configuration sketch follows this list)
2. When using flaresolverr, CDL will try to apply the cookies from the response and make a new request if necessary.
3. Users can set CDL to automatically import cookies at startup. The browser and the domains to export cookies from must be specified.
4. Add validation logic for config path values.
5. Remove the integrated config edit options. Modifications to the config must be made directly in the config file.
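
As a rough illustration of the new cookie options, a config enabling automatic import at startup might look like the sketch below. The key names here are assumed placeholders, not taken from this changelog; the authoritative option names are on the wiki page linked in item 1.

```yaml
# Illustrative sketch only; key names are assumed placeholders.
# Check the settings wiki page linked above for the actual option names.
Browser_Cookies:
  auto_import: true      # import cookies automatically at startup
  browsers: [firefox]    # browser(s) to export cookies from
  sites: [simpcity]      # domains to export cookies for
```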


## [5.7.1] - 2024-11-05

⚠️**BREAKING CHANGES**
10 changes: 5 additions & 5 deletions README.md
@@ -3,11 +3,11 @@
# `cyberdrop-dl-patched`
*Bulk asynchronous downloader for multiple file hosts*

![PyPI - Version](https://img.shields.io/pypi/v/cyberdrop-dl-patched)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/cyberdrop-dl-patched)
![Docs](https://img.shields.io/badge/docs-wiki-blue?link=https%3A%2F%2Fscript-ware.gitbook.io%2Fcyberdrop-dl)
![GitHub License](https://img.shields.io/github/license/jbsparrow/CyberDropDownloader)
![PyPI - Downloads](https://img.shields.io/pypi/dm/cyberdrop-dl-patched)
[![PyPI - Version](https://img.shields.io/pypi/v/cyberdrop-dl-patched)](https://pypi.org/project/cyberdrop-dl-patched/)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/cyberdrop-dl-patched)](https://pypi.org/project/cyberdrop-dl-patched/)
[![Docs](https://img.shields.io/badge/docs-wiki-blue?link=https%3A%2F%2Fscript-ware.gitbook.io%2Fcyberdrop-dl)](https://script-ware.gitbook.io/cyberdrop-dl)
[![GitHub License](https://img.shields.io/github/license/jbsparrow/CyberDropDownloader)](https://github.com/jbsparrow/CyberDropDownloader/blob/master/LICENSE)
[![PyPI - Downloads](https://img.shields.io/pypi/dm/cyberdrop-dl-patched)](https://pypistats.org/packages/cyberdrop-dl-patched)

[![Discord](https://discordapp.com/api/guilds/1070206871564197908/widget.png?style=banner2)](https://discord.com/invite/P5nsbKErwy)

43 changes: 27 additions & 16 deletions docs/reference/configuration-options/authentication.md
@@ -20,7 +20,7 @@ Once you have put your `session` cookie into the authentication file, you can ad

In order to scrape links/content from forums, you need to provide Cyberdrop-DL with your login details so it can access the website. This section also includes cookies for the support forums.

If you use the cookie extractor to load the XF\_User\_Cookies into the program, you don't need to provide the program with credentials. If you ever log out of the forum in your browser though, you will need to use the cookie extractor again to get new cookies.
If you use the cookie extractor to load the `XF_User` cookies into the program, you don't need to provide these credentials. If you ever log out of the forum in your browser though, you will need to use the cookie extractor again to get new cookies.

It is best to leave the authentication parameter for SimpCity blank, as they have made their forum public and have asked users scraping the website not to use logged-in accounts.

@@ -30,15 +30,15 @@ In order to set specific authentication values for a config instead of the globa

* \<forum>\_xf\_user\_cookie

This is the value for the cookie I was talking about above. If you want to only use credentials, you can leave this blank.
This is the value for the `XF_User` cookie mentioned above. If you want to only use credentials, you can leave this blank.

* \<forum>\_username

This is your username for the forum. Again, if you use the cookie, you don't need to provide this.
This is your username for the forum. Again, if you use cookies, you don't need to provide this.

* \<forum>\_password

This is your password for the forum. Again, if you use the cookie, you don't need to provide this.
This is your password for the forum. Again, if you use cookies, you don't need to provide this.
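
Putting these together, a minimal sketch of how the forum entries might look in the authentication file (the exact nesting and file layout are not shown on this page, so treat this only as an illustration of the `<forum>_*` naming pattern):

```yaml
# Illustrative sketch only; "exampleforum" is a placeholder for a real forum name.
# Provide either the XF_User cookie or the username/password pair.
exampleforum_xf_user_cookie: "paste_the_xf_user_cookie_value_here"
exampleforum_username: ""   # not needed when the cookie is provided
exampleforum_password: ""
```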

</details>

@@ -66,11 +66,15 @@ In order to scrape images from Imgur, you'll need to create a client on Imgurs w

Some examples of what to enter for each field:

* Application Name: Cyberdrop-DL
* OAuth2 without a callback URL
* Website: \<really doesn't matter>
* Email: Your email
* Description: Cyberdrop-DL client
> Application Name: Cyberdrop-DL

> OAuth2 without a callback URL

> Website: <really doesn't matter>

> Email: Your email

> Description: Cyberdrop-DL client
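
Once Imgur issues the client, its ID goes into the authentication file. A minimal sketch; the actual option list is collapsed in this diff, so the `imgur_client_id` key name is an assumption:

```yaml
# Sketch only; the key name is assumed and the value is a placeholder.
imgur_client_id: "your_imgur_client_id"
```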

***

@@ -126,13 +130,6 @@ In order to scrape files from Reddit, you'll need to create an app on reddits we

[https://www.reddit.com/prefs/apps](https://www.reddit.com/prefs/apps)

Some examples of what to put in for what it asks for:

* name: Cyberdrop-DL
* script
*
*

***

* reddit\_personal\_use\_script
@@ -141,3 +138,17 @@
After generating the app, you will need to give Cyberdrop-DL these values.
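
A minimal sketch of the resulting entries; only `reddit_personal_use_script` appears on this page, so the second key is an assumption based on the same naming pattern:

```yaml
# Sketch only; values are placeholders and reddit_secret is an assumption,
# not documented on this page.
reddit_personal_use_script: "your_personal_use_script"
reddit_secret: "your_app_secret"
```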

</details>

<details>

<summary>RealDebrid</summary>

In order to download files from sites supported by Real-Debrid, you'll need to get the API token from your account.

***

* realdebrid\_api\_key

You can get your API key here (you must be logged in): [https://real-debrid.com/apitoken](https://real-debrid.com/apitoken)
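
A minimal sketch of the resulting entry in the authentication file, with a placeholder token:

```yaml
# Sketch only; the token value is a placeholder.
realdebrid_api_key: "YOUR_REAL_DEBRID_API_TOKEN"
```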

</details>
10 changes: 6 additions & 4 deletions docs/reference/configuration-options/global-settings.md
@@ -22,6 +22,8 @@ Setting this to true will allow the program to connect to websites without ssl (insecurely

The user agent is the signature of your browser; it's how your browser is represented to the websites you connect to. You can google "what is my user agent" to see what yours may be.

**Note:** if you use flaresolverr, this value must match flaresolverr's user agent for its cookies to work (a configuration sketch appears below, after the flaresolverr option).

***

* proxy
@@ -34,8 +36,6 @@ The proxy you want CDL to utilize. Ex. `https://user:pass@ip:port`

The IP for flaresolverr you want CDL to utilize. Ex. `ip:port`

CDL will fill the rest of the URL.
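
A rough sketch of how these values might sit together in the global settings file; the `General` section name and exact layout are assumptions, and only the option names and example values above come from this page:

```yaml
# Illustrative sketch; the "General" section name is an assumption.
General:
  user_agent: "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ..."  # must match the flaresolverr user agent when flaresolverr is used
  proxy: "https://user:pass@ip:port"
  flaresolverr: "ip:port"   # IP and port of the flaresolverr instance
```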

***

* max\_file\_name\_length
@@ -52,7 +52,7 @@ This is the maximum number of characters allowable in a folder name.

* required\_free\_space

This is the amount of free space in gigabytes that the program will stop initiating downloads at.
This is the amount of free space (in gigabytes) below which the program will stop initiating new downloads.

</details>

@@ -108,11 +108,12 @@ This is the maximum number of files that can be downloaded from a single domain

Some domains have internal limits set by the program, such as Bunkrr, CyberFile, etc.

***

* download\_speed\_limit

This is the maximum download rate, in KB, for all downloads combined.
Set to 0 or None to disable
Set to 0 or `null` to disable


</details>
@@ -125,6 +126,7 @@ These are options to enable/disable dupe clean-up
***

* dedupe\_already\_downloaded

Allows files that were skipped because they already exist on the filesystem to be added to the list of files to process for deduping

***