Mirror of https://github.com/yt-dlp/yt-dlp.git, synced 2024-09-21 09:51:25 +02:00

Compare commits: b4924a3802...71cc9b0477

No commits in common. "b4924a38020744f83a97174723295f0a1c94bc0f" and "71cc9b0477f5d0fa005b855a7e43dae5799a2e2a" have entirely different histories.

.github/ISSUE_TEMPLATE/1_broken_site.yml (vendored, 8 changes)

@@ -18,7 +18,7 @@ body:
       options:
         - label: I'm reporting that yt-dlp is broken on a **supported** site
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.10.13** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I'm running yt-dlp version **2023.10.07** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
@@ -64,7 +64,7 @@ body:
         [debug] Command-line config: ['-vU', 'test:youtube']
         [debug] Portable config "yt-dlp.conf": ['-i']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.10.13 [9d339c4] (win32_exe)
+        [debug] yt-dlp version 2023.10.07 [9d339c4] (win32_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
         [debug] Checking exe version: ffmpeg -bsfs
         [debug] Checking exe version: ffprobe -bsfs
@@ -72,8 +72,8 @@ body:
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
         [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.10.13, Current version: 2023.10.13
-        yt-dlp is up to date (2023.10.13)
+        Latest version: 2023.10.07, Current version: 2023.10.07
+        yt-dlp is up to date (2023.10.07)
         <more lines>
       render: shell
     validations:

.github/ISSUE_TEMPLATE/2_site_support_request.yml (vendored, 8 changes)

@@ -18,7 +18,7 @@ body:
       options:
         - label: I'm reporting a new site support request
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.10.13** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I'm running yt-dlp version **2023.10.07** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
@@ -76,7 +76,7 @@ body:
         [debug] Command-line config: ['-vU', 'test:youtube']
         [debug] Portable config "yt-dlp.conf": ['-i']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.10.13 [9d339c4] (win32_exe)
+        [debug] yt-dlp version 2023.10.07 [9d339c4] (win32_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
         [debug] Checking exe version: ffmpeg -bsfs
         [debug] Checking exe version: ffprobe -bsfs
@@ -84,8 +84,8 @@ body:
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
         [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.10.13, Current version: 2023.10.13
-        yt-dlp is up to date (2023.10.13)
+        Latest version: 2023.10.07, Current version: 2023.10.07
+        yt-dlp is up to date (2023.10.07)
         <more lines>
       render: shell
     validations:

.github/ISSUE_TEMPLATE/3_site_feature_request.yml (vendored, 8 changes)

@@ -18,7 +18,7 @@ body:
       options:
         - label: I'm requesting a site-specific feature
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.10.13** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I'm running yt-dlp version **2023.10.07** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
@@ -72,7 +72,7 @@ body:
         [debug] Command-line config: ['-vU', 'test:youtube']
         [debug] Portable config "yt-dlp.conf": ['-i']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.10.13 [9d339c4] (win32_exe)
+        [debug] yt-dlp version 2023.10.07 [9d339c4] (win32_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
         [debug] Checking exe version: ffmpeg -bsfs
         [debug] Checking exe version: ffprobe -bsfs
@@ -80,8 +80,8 @@ body:
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
         [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.10.13, Current version: 2023.10.13
-        yt-dlp is up to date (2023.10.13)
+        Latest version: 2023.10.07, Current version: 2023.10.07
+        yt-dlp is up to date (2023.10.07)
         <more lines>
       render: shell
     validations:

.github/ISSUE_TEMPLATE/4_bug_report.yml (vendored, 8 changes)

@@ -18,7 +18,7 @@ body:
       options:
         - label: I'm reporting a bug unrelated to a specific site
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.10.13** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I'm running yt-dlp version **2023.10.07** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
@@ -57,7 +57,7 @@ body:
         [debug] Command-line config: ['-vU', 'test:youtube']
         [debug] Portable config "yt-dlp.conf": ['-i']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.10.13 [9d339c4] (win32_exe)
+        [debug] yt-dlp version 2023.10.07 [9d339c4] (win32_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
         [debug] Checking exe version: ffmpeg -bsfs
         [debug] Checking exe version: ffprobe -bsfs
@@ -65,8 +65,8 @@ body:
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
         [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.10.13, Current version: 2023.10.13
-        yt-dlp is up to date (2023.10.13)
+        Latest version: 2023.10.07, Current version: 2023.10.07
+        yt-dlp is up to date (2023.10.07)
         <more lines>
       render: shell
     validations:

.github/ISSUE_TEMPLATE/5_feature_request.yml (vendored, 8 changes)

@@ -20,7 +20,7 @@ body:
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.10.13** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I'm running yt-dlp version **2023.10.07** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
           required: true
         - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
@@ -53,7 +53,7 @@ body:
         [debug] Command-line config: ['-vU', 'test:youtube']
         [debug] Portable config "yt-dlp.conf": ['-i']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.10.13 [9d339c4] (win32_exe)
+        [debug] yt-dlp version 2023.10.07 [9d339c4] (win32_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
         [debug] Checking exe version: ffmpeg -bsfs
         [debug] Checking exe version: ffprobe -bsfs
@@ -61,7 +61,7 @@ body:
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
         [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.10.13, Current version: 2023.10.13
-        yt-dlp is up to date (2023.10.13)
+        Latest version: 2023.10.07, Current version: 2023.10.07
+        yt-dlp is up to date (2023.10.07)
         <more lines>
       render: shell

.github/ISSUE_TEMPLATE/6_question.yml (vendored, 8 changes)

@@ -26,7 +26,7 @@ body:
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.10.13** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I'm running yt-dlp version **2023.10.07** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
           required: true
         - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
           required: true
@@ -59,7 +59,7 @@ body:
         [debug] Command-line config: ['-vU', 'test:youtube']
         [debug] Portable config "yt-dlp.conf": ['-i']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.10.13 [9d339c4] (win32_exe)
+        [debug] yt-dlp version 2023.10.07 [9d339c4] (win32_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
         [debug] Checking exe version: ffmpeg -bsfs
         [debug] Checking exe version: ffprobe -bsfs
@@ -67,7 +67,7 @@ body:
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
         [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.10.13, Current version: 2023.10.13
-        yt-dlp is up to date (2023.10.13)
+        Latest version: 2023.10.07, Current version: 2023.10.07
+        yt-dlp is up to date (2023.10.07)
         <more lines>
       render: shell

.github/workflows/core.yml (vendored, 4 changes)

@@ -32,8 +32,10 @@ jobs:
         uses: actions/setup-python@v4
         with:
           python-version: ${{ matrix.python-version }}
+      - name: Install pytest
+        run: pip install pytest
       - name: Install dependencies
-        run: pip install pytest -r requirements.txt
+        run: pip install -r requirements.txt
       - name: Run tests
        continue-on-error: False
        run: |
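
The `run: |` body of the test step is truncated above. For local parity with this job, the suite can be invoked through pytest's Python API; a sketch (the `-k 'not download'` deselection is an assumption about which tests a core CI job runs, not read from the truncated step):

    import sys

    import pytest

    # Run the core tests the way the workflow step might, deselecting
    # download tests (the deselection expression is an assumption).
    sys.exit(pytest.main(['-k', 'not download', 'test']))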

CONTRIBUTORS (4 changes)

@@ -509,7 +509,3 @@ handlerug
 jiru
 madewokherd
 xofe
-awalgarg
-midnightveil
-naginatana
-Riteo

Changelog.md (24 changes)

@@ -4,30 +4,6 @@
 # To create a release, dispatch the https://github.com/yt-dlp/yt-dlp/actions/workflows/release.yml workflow on master
 -->
 
-### 2023.10.13
-
-#### Core changes
-- [Ensure thumbnail output directory exists](https://github.com/yt-dlp/yt-dlp/commit/2acd1d555ef89851c73773776715d3de9a0e30b9) ([#7985](https://github.com/yt-dlp/yt-dlp/issues/7985)) by [Riteo](https://github.com/Riteo)
-- **utils**
-    - `js_to_json`: [Fix `Date` constructor parsing](https://github.com/yt-dlp/yt-dlp/commit/9d7ded6419089c1bf252496073f73ad90ed71004) ([#8295](https://github.com/yt-dlp/yt-dlp/issues/8295)) by [awalgarg](https://github.com/awalgarg), [Grub4K](https://github.com/Grub4K)
-    - `write_xattr`: [Use `os.setxattr` if available](https://github.com/yt-dlp/yt-dlp/commit/84e26038d4002e763ea51ca1bdce4f7e63c540bf) ([#8205](https://github.com/yt-dlp/yt-dlp/issues/8205)) by [bashonly](https://github.com/bashonly), [Grub4K](https://github.com/Grub4K)
-
-#### Extractor changes
-- **artetv**: [Support age-restricted content](https://github.com/yt-dlp/yt-dlp/commit/09f815ad52843219a7ee3f2a0dddf6c250c91f0c) ([#8301](https://github.com/yt-dlp/yt-dlp/issues/8301)) by [StefanLobbenmeier](https://github.com/StefanLobbenmeier)
-- **jtbc**: [Add extractors](https://github.com/yt-dlp/yt-dlp/commit/b286ec68f1f28798b3e371f888a2ed97d399cf77) ([#8314](https://github.com/yt-dlp/yt-dlp/issues/8314)) by [seproDev](https://github.com/seproDev)
-- **mbn**: [Add extractor](https://github.com/yt-dlp/yt-dlp/commit/e030b6b6fba7b2f4614ad2ab9f7649d40a2dd305) ([#8312](https://github.com/yt-dlp/yt-dlp/issues/8312)) by [seproDev](https://github.com/seproDev)
-- **nhk**: [Fix Japanese-language VOD extraction](https://github.com/yt-dlp/yt-dlp/commit/4de94b9e165bfd6421a692f5f2eabcdb08edcb71) ([#8309](https://github.com/yt-dlp/yt-dlp/issues/8309)) by [garret1317](https://github.com/garret1317)
-- **radiko**: [Fix bug with `downloader_options`](https://github.com/yt-dlp/yt-dlp/commit/b9316642313bbc9e209ac0d2276d37ba60bceb49) by [bashonly](https://github.com/bashonly)
-- **tenplay**: [Add support for seasons](https://github.com/yt-dlp/yt-dlp/commit/88a99c87b680ae59002534a517e191f46c42cbd4) ([#7939](https://github.com/yt-dlp/yt-dlp/issues/7939)) by [midnightveil](https://github.com/midnightveil)
-- **youku**: [Improve tudou.com support](https://github.com/yt-dlp/yt-dlp/commit/b7098d46b552a9322c6cea39ba80be5229f922de) ([#8160](https://github.com/yt-dlp/yt-dlp/issues/8160)) by [naginatana](https://github.com/naginatana)
-- **youtube**: [Fix bug with `--extractor-retries inf`](https://github.com/yt-dlp/yt-dlp/commit/feebf6d02fc9651331eee2af5e08e6112288163b) ([#8328](https://github.com/yt-dlp/yt-dlp/issues/8328)) by [Grub4K](https://github.com/Grub4K)
-
-#### Downloader changes
-- **fragment**: [Improve progress calculation](https://github.com/yt-dlp/yt-dlp/commit/1c51c520f7b511ebd9e4eb7322285a8c31eedbbd) ([#8241](https://github.com/yt-dlp/yt-dlp/issues/8241)) by [Grub4K](https://github.com/Grub4K)
-
-#### Misc. changes
-- **cleanup**: Miscellaneous: [b634ba7](https://github.com/yt-dlp/yt-dlp/commit/b634ba742d8f38ce9ecfa0546485728b0c6c59d1) by [bashonly](https://github.com/bashonly), [gamer191](https://github.com/gamer191)
-
 ### 2023.10.07
 
 #### Extractor changes
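
The `write_xattr` entry above describes preferring the stdlib `os.setxattr` when the platform provides it. A minimal sketch of that preference order (the error branch is illustrative; the real helper tries third-party modules and external commands instead of raising):

    import os


    def write_xattr_sketch(path: str, key: str, value: bytes) -> None:
        # os.setxattr exists on Linux; keys are namespaced, e.g. 'user.comment'.
        if hasattr(os, 'setxattr'):
            os.setxattr(path, key, value)
        else:
            # Illustrative only: yt-dlp's helper would try other backends here.
            raise NotImplementedError('no stdlib xattr support on this platform')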

README.md

@@ -89,6 +89,7 @@ yt-dlp is a [youtube-dl](https://github.com/ytdl-org/youtube-dl) fork based on t
 * Fix for [n-sig based throttling](https://github.com/ytdl-org/youtube-dl/issues/29326) **\***
 * Supports some (but not all) age-gated content without cookies
 * Download livestreams from the start using `--live-from-start` (*experimental*)
+* `255kbps` audio is extracted (if available) from YouTube Music when premium cookies are given
 * Channel URLs download all uploads of the channel, including shorts and live
 
 * **Cookies from browser**: Cookies can be automatically extracted from all major web browsers using `--cookies-from-browser BROWSER[+KEYRING][:PROFILE][::CONTAINER]`
@@ -157,7 +158,6 @@ Some of yt-dlp's default options are different from that of youtube-dl and youtu
 * yt-dlp's sanitization of invalid characters in filenames is different/smarter than in youtube-dl. You can use `--compat-options filename-sanitization` to revert to youtube-dl's behavior
 * yt-dlp tries to parse the external downloader outputs into the standard progress output if possible (Currently implemented: [~~aria2c~~](https://github.com/yt-dlp/yt-dlp/issues/5931)). You can use `--compat-options no-external-downloader-progress` to get the downloader output as-is
 * yt-dlp versions between 2021.09.01 and 2023.01.02 applies `--match-filter` to nested playlists. This was an unintentional side-effect of [8f18ac](https://github.com/yt-dlp/yt-dlp/commit/8f18aca8717bb0dd49054555af8d386e5eda3a88) and is fixed in [d7b460](https://github.com/yt-dlp/yt-dlp/commit/d7b460d0e5fc710950582baed2e3fc616ed98a80). Use `--compat-options playlist-match-filter` to revert this
-* yt-dlp uses modern http client backends such as `requests`. Use `--compat-options prefer-legacy-http-handler` to prefer the legacy http handler (`urllib`) to be used for standard http requests.
 
 For ease of use, a few more compat options are available:
 
@@ -165,7 +165,7 @@ For ease of use, a few more compat options are available:
 * `--compat-options youtube-dl`: Same as `--compat-options all,-multistreams,-playlist-match-filter`
 * `--compat-options youtube-dlc`: Same as `--compat-options all,-no-live-chat,-no-youtube-channel-redirect,-playlist-match-filter`
 * `--compat-options 2021`: Same as `--compat-options 2022,no-certifi,filename-sanitization,no-youtube-prefer-utc-upload-date`
-* `--compat-options 2022`: Same as `--compat-options playlist-match-filter,no-external-downloader-progress,prefer-legacy-http-handler`. Use this to enable all future compat options
+* `--compat-options 2022`: Same as `--compat-options playlist-match-filter,no-external-downloader-progress`. Use this to enable all future compat options
 
 
 # INSTALLATION
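
The compat options listed above are also reachable when embedding yt-dlp, via the `compat_opts` parameter of `YoutubeDL`; a minimal sketch (the example URL is yt-dlp's usual test video):

    from yt_dlp import YoutubeDL

    # Library-side equivalent of
    # `--compat-options playlist-match-filter,no-external-downloader-progress`.
    opts = {'compat_opts': ['playlist-match-filter', 'no-external-downloader-progress']}
    with YoutubeDL(opts) as ydl:
        ydl.download(['https://www.youtube.com/watch?v=BaW_jenozKc'])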
@@ -275,7 +275,6 @@ While all the other dependencies are optional, `ffmpeg` and `ffprobe` are highly
 * [**certifi**](https://github.com/certifi/python-certifi)\* - Provides Mozilla's root certificate bundle. Licensed under [MPLv2](https://github.com/certifi/python-certifi/blob/master/LICENSE)
 * [**brotli**](https://github.com/google/brotli)\* or [**brotlicffi**](https://github.com/python-hyper/brotlicffi) - [Brotli](https://en.wikipedia.org/wiki/Brotli) content encoding support. Both licensed under MIT <sup>[1](https://github.com/google/brotli/blob/master/LICENSE) [2](https://github.com/python-hyper/brotlicffi/blob/master/LICENSE) </sup>
 * [**websockets**](https://github.com/aaugustin/websockets)\* - For downloading over websocket. Licensed under [BSD-3-Clause](https://github.com/aaugustin/websockets/blob/main/LICENSE)
-* [**requests**](https://github.com/psf/requests)\* - HTTP library. For HTTPS proxy and persistent connections support. Licensed under [Apache-2.0](https://github.com/psf/requests/blob/main/LICENSE)
 
 ### Metadata
 
@@ -916,7 +915,7 @@ If you fork the project on GitHub, you can run your fork's [build workflow](.git
                                      Defaults to ~/.netrc
     --netrc-cmd NETRC_CMD            Command to execute to get the credentials
                                      for an extractor.
-    --video-password PASSWORD        Video-specific password
+    --video-password PASSWORD        Video password (vimeo, youku)
     --ap-mso MSO                     Adobe Pass multiple-system operator (TV
                                      provider) identifier, use --ap-list-mso for
                                      a list of available MSOs
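
The `--netrc-cmd` help text above sources netrc-formatted credentials from an arbitrary command. A hedged sketch of the idea (the gpg command and the `tennistv` machine name are illustrative assumptions, not fixed yt-dlp behaviour):

    import netrc
    import os
    import subprocess
    import tempfile

    # Run the user-supplied command and treat its stdout as netrc data,
    # mirroring what `--netrc-cmd 'gpg --decrypt ~/.authinfo.gpg'` implies.
    data = subprocess.run(
        ['gpg', '--decrypt', os.path.expanduser('~/.authinfo.gpg')],
        capture_output=True, text=True, check=True,
    ).stdout

    fd, path = tempfile.mkstemp()
    try:
        with os.fdopen(fd, 'w') as f:
            f.write(data)
        creds = netrc.netrc(path).authenticators('tennistv')  # (login, account, password)
    finally:
        os.remove(path)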

devscripts/make_changelog.py

@@ -56,7 +56,6 @@ class CommitGroup(enum.Enum):
             },
             cls.MISC: {
                 'build',
-                'ci',
                 'cleanup',
                 'devscripts',
                 'docs',
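
For context, `CommitGroup` maps commit-message prefixes to changelog sections; a simplified sketch of how such a lookup can work (the enum values here are assumptions for illustration, with 'ci' being the prefix removed above):

    import enum


    class CommitGroup(enum.Enum):
        # Simplified stand-in for the devscripts enum: each group owns a
        # set of commit prefixes.
        CORE = 'Core'
        MISC = 'Misc.'

        @classmethod
        def get(cls, prefix: str) -> 'CommitGroup | None':
            lookup = {
                cls.MISC: {'build', 'cleanup', 'devscripts', 'docs'},
            }
            return next(
                (group for group, prefixes in lookup.items() if prefix in prefixes),
                None)


    assert CommitGroup.get('docs') is CommitGroup.MISC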

requirements.txt

@@ -3,6 +3,4 @@ pycryptodomex
 websockets
 brotli; platform_python_implementation=='CPython'
 brotlicffi; platform_python_implementation!='CPython'
 certifi
-requests>=2.31.0,<3
-urllib3>=1.26.17,<3
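
The `platform_python_implementation` markers above make pip install exactly one brotli backend per interpreter. The evaluation can be reproduced with the `packaging` library (a sketch; `packaging` is a separate install):

    from packaging.markers import Marker

    # True on CPython, False on PyPy: only one of the two brotli
    # requirements applies to any given interpreter.
    print(Marker("platform_python_implementation == 'CPython'").evaluate())
    print(Marker("platform_python_implementation != 'CPython'").evaluate())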

setup.py (9 changes)

@@ -62,14 +62,7 @@ def py2exe_params():
         'compressed': 1,
         'optimize': 2,
         'dist_dir': './dist',
-        'excludes': [
-            # py2exe cannot import Crypto
-            'Crypto',
-            'Cryptodome',
-            # py2exe appears to confuse this with our socks library.
-            # We don't use pysocks and urllib3.contrib.socks would fail to import if tried.
-            'urllib3.contrib.socks'
-        ],
+        'excludes': ['Crypto', 'Cryptodome'],  # py2exe cannot import Crypto
         'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
         # Modules that are only imported dynamically must be added here
         'includes': ['yt_dlp.compat._legacy', 'yt_dlp.compat._deprecated',
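
The retained comment ("Modules that are only imported dynamically must be added here") refers to imports like the following, which static dependency scanners such as py2exe cannot see (sketch; requires yt-dlp to be installed):

    import importlib

    # A runtime import: nothing in the module's static import table names
    # yt_dlp.compat._legacy, so a frozen build must list it in 'includes'.
    legacy = importlib.import_module('yt_dlp.compat._legacy')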

supportedsites.md

@@ -657,8 +657,6 @@
 - **Joj**
 - **Jove**
 - **JStream**
-- **JTBC**: jtbc.co.kr
-- **JTBC:program**
 - **JWPlatform**
 - **Kakao**
 - **Kaltura**
@@ -768,7 +766,6 @@
 - **massengeschmack.tv**
 - **Masters**
 - **MatchTV**
-- **MBN**: mbn.co.kr (매일방송)
 - **MDR**: MDR.DE and KiKA
 - **MedalTV**
 - **media.ccc.de**
@@ -1471,7 +1468,6 @@
 - **Tempo**
 - **TennisTV**: [*tennistv*](## "netrc machine")
 - **TenPlay**: [*10play*](## "netrc machine")
-- **TenPlaySeason**
 - **TF1**
 - **TFO**
 - **TheHoleTv**

test/test_networking.py

@@ -28,7 +28,7 @@ from http.cookiejar import CookieJar
 
 from test.helper import FakeYDL, http_server_port
 from yt_dlp.cookies import YoutubeDLCookieJar
-from yt_dlp.dependencies import brotli, requests, urllib3
+from yt_dlp.dependencies import brotli
 from yt_dlp.networking import (
     HEADRequest,
     PUTRequest,
@@ -43,7 +43,6 @@ from yt_dlp.networking.exceptions import (
     HTTPError,
     IncompleteRead,
     NoSupportingHandlers,
-    ProxyError,
     RequestError,
     SSLError,
     TransportError,
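
`yt_dlp.dependencies` exposes optional packages as module attributes that are falsy when the package is missing; a minimal sketch of the pattern (simplified from the real module):

    # Importing never fails; callers test truthiness, which is what the
    # later `@pytest.mark.skipif(not brotli, ...)` decorator relies on.
    try:
        import brotli
    except ImportError:
        brotli = None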
@@ -310,7 +309,7 @@ class TestRequestHandlerBase:
 
 
 class TestHTTPRequestHandler(TestRequestHandlerBase):
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_verify_cert(self, handler):
         with handler() as rh:
             with pytest.raises(CertificateVerifyError):
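
The recurring `indirect=True` parametrizations below hand each handler name to a `handler` fixture rather than to the test directly. A self-contained sketch of the mechanism (the fixture body and stand-in classes are assumptions, not the test suite's real fixture):

    import pytest


    class Urllib:  # stand-ins for the real request handler classes
        pass


    class CurlCFFI:
        pass


    @pytest.fixture
    def handler(request):
        # indirect=True routes each parametrized name here as request.param,
        # so the fixture can resolve it to a handler class before the test runs.
        return {'Urllib': Urllib, 'CurlCFFI': CurlCFFI}[request.param]


    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
    def test_handler_resolution(handler):
        assert handler in (Urllib, CurlCFFI)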
@@ -321,7 +320,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
         assert r.status == 200
         r.close()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_ssl_error(self, handler):
         # HTTPS server with too old TLS version
         # XXX: is there a better way to test this than to create a new server?
@@ -339,7 +338,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
                 validate_and_send(rh, Request(f'https://127.0.0.1:{https_port}/headers'))
             assert not issubclass(exc_info.type, CertificateVerifyError)
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
    def test_percent_encode(self, handler):
         with handler() as rh:
             # Unicode characters should be encoded with uppercase percent-encoding
@@ -351,7 +350,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert res.status == 200
             res.close()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_remove_dot_segments(self, handler):
         with handler() as rh:
             # This isn't a comprehensive test,
@@ -367,14 +366,14 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             res.close()
 
     # Not supported by CurlCFFI (non-standard)
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib'], indirect=True)
     def test_unicode_path_redirection(self, handler):
         with handler() as rh:
             r = validate_and_send(rh, Request(f'http://127.0.0.1:{self.http_port}/302-non-ascii-redirect'))
             assert r.url == f'http://127.0.0.1:{self.http_port}/%E4%B8%AD%E6%96%87.html'
             r.close()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_raise_http_error(self, handler):
         with handler() as rh:
             for bad_status in (400, 500, 599, 302):
@@ -384,7 +383,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
                 # Should not raise an error
                 validate_and_send(rh, Request('http://127.0.0.1:%d/gen_200' % self.http_port)).close()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_response_url(self, handler):
         with handler() as rh:
             # Response url should be that of the last url in redirect chain
@@ -396,7 +395,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             res2.close()
 
     # Covers some basic cases we expect some level of consistency between request handlers for
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     @pytest.mark.parametrize('redirect_status,method,expected', [
         # A 303 must either use GET or HEAD for subsequent request
         (303, 'POST', ('', 'GET', False)),
@@ -438,7 +437,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert expected[1] == res.headers.get('method')
             assert expected[2] == ('content-length' in headers.decode().lower())
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_request_cookie_header(self, handler):
         # We should accept a Cookie header being passed as in normal headers and handle it appropriately.
         with handler(verbose=True) as rh:
@@ -471,19 +470,19 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert b'cookie: test=ytdlp' not in data.lower()
             assert b'cookie: test=test3' in data.lower()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_redirect_loop(self, handler):
         with handler() as rh:
             with pytest.raises(HTTPError, match='redirect loop'):
                 validate_and_send(rh, Request(f'http://127.0.0.1:{self.http_port}/redirect_loop'))
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_incompleteread(self, handler):
         with handler(timeout=2) as rh:
             with pytest.raises(IncompleteRead, match='13 bytes read, 234221 more expected'):
                 validate_and_send(rh, Request('http://127.0.0.1:%d/incompleteread' % self.http_port)).read()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_cookies(self, handler):
         cookiejar = YoutubeDLCookieJar()
         cookiejar.set_cookie(http.cookiejar.Cookie(
@@ -500,7 +499,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
                 rh, Request(f'http://127.0.0.1:{self.http_port}/headers', extensions={'cookiejar': cookiejar})).read()
             assert b'cookie: test=ytdlp' in data.lower()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_headers(self, handler):
 
         with handler(headers=HTTPHeaderDict({'test1': 'test', 'test2': 'test2'})) as rh:
@@ -516,7 +515,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert b'test2: test2' not in data
             assert b'test3: test3' in data
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_timeout(self, handler):
         with handler() as rh:
             # Default timeout is 20 seconds, so this should go through
@@ -532,7 +531,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
                 validate_and_send(
                     rh, Request(f'http://127.0.0.1:{self.http_port}/timeout_1', extensions={'timeout': 4}))
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_source_address(self, handler):
         source_address = f'127.0.0.{random.randint(5, 255)}'
         with handler(source_address=source_address) as rh:
@@ -541,13 +540,13 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert source_address == data
 
     # Not supported by CurlCFFI
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib'], indirect=True)
     def test_gzip_trailing_garbage(self, handler):
         with handler() as rh:
             data = validate_and_send(rh, Request(f'http://localhost:{self.http_port}/trailing_garbage')).read().decode()
             assert data == '<html><video src="/vid.mp4" /></html>'
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib'], indirect=True)
     @pytest.mark.skipif(not brotli, reason='brotli support is not installed')
     def test_brotli(self, handler):
         with handler() as rh:
@@ -558,7 +557,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert res.headers.get('Content-Encoding') == 'br'
             assert res.read() == b'<html><video src="/vid.mp4" /></html>'
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_deflate(self, handler):
         with handler() as rh:
             res = validate_and_send(
@@ -568,7 +567,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert res.headers.get('Content-Encoding') == 'deflate'
             assert res.read() == b'<html><video src="/vid.mp4" /></html>'
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_gzip(self, handler):
         with handler() as rh:
             res = validate_and_send(
@@ -578,7 +577,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert res.headers.get('Content-Encoding') == 'gzip'
             assert res.read() == b'<html><video src="/vid.mp4" /></html>'
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_multiple_encodings(self, handler):
         with handler() as rh:
             for pair in ('gzip,deflate', 'deflate, gzip', 'gzip, gzip', 'deflate, deflate'):
@@ -589,8 +588,8 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
                 assert res.headers.get('Content-Encoding') == pair
                 assert res.read() == b'<html><video src="/vid.mp4" /></html>'
 
-    # Not supported by curl_cffi
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests'], indirect=True)
+    # Not supported by Curl
+    @pytest.mark.parametrize('handler', ['Urllib'], indirect=True)
     def test_unsupported_encoding(self, handler):
         with handler() as rh:
             res = validate_and_send(
@@ -600,7 +599,7 @@ class TestHTTPRequestHandler(TestRequestHandlerBase):
             assert res.headers.get('Content-Encoding') == 'unsupported'
             assert res.read() == b'raw'
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['CurlCFFI'], indirect=True)
     def test_read(self, handler):
         with handler() as rh:
             res = validate_and_send(
@@ -633,7 +632,7 @@ class TestHTTPProxy(TestRequestHandlerBase):
         cls.geo_proxy_thread.daemon = True
         cls.geo_proxy_thread.start()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_http_proxy(self, handler):
         http_proxy = f'http://127.0.0.1:{self.proxy_port}'
         geo_proxy = f'http://127.0.0.1:{self.geo_port}'
@@ -659,7 +658,7 @@ class TestHTTPProxy(TestRequestHandlerBase):
         assert res != f'normal: {real_url}'
         assert 'Accept' in res
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_noproxy(self, handler):
         with handler(proxies={'proxy': f'http://127.0.0.1:{self.proxy_port}'}) as rh:
             # NO_PROXY
@@ -669,7 +668,7 @@ class TestHTTPProxy(TestRequestHandlerBase):
                 'utf-8')
             assert 'Accept' in nop_response
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_allproxy(self, handler):
         url = 'http://foo.com/bar'
         with handler() as rh:
@@ -677,7 +676,7 @@ class TestHTTPProxy(TestRequestHandlerBase):
                 'utf-8')
             assert response == f'normal: {url}'
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_http_proxy_with_idn(self, handler):
         with handler(proxies={
             'http': f'http://127.0.0.1:{self.proxy_port}',
@@ -714,27 +713,27 @@ class TestClientCertificate:
             ) as rh:
                 validate_and_send(rh, Request(f'https://127.0.0.1:{self.port}/video.html')).read().decode()
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_certificate_combined_nopass(self, handler):
         self._run_test(handler, client_cert={
             'client_certificate': os.path.join(self.certdir, 'clientwithkey.crt'),
         })
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_certificate_nocombined_nopass(self, handler):
         self._run_test(handler, client_cert={
             'client_certificate': os.path.join(self.certdir, 'client.crt'),
             'client_certificate_key': os.path.join(self.certdir, 'client.key'),
         })
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_certificate_combined_pass(self, handler):
         self._run_test(handler, client_cert={
             'client_certificate': os.path.join(self.certdir, 'clientwithencryptedkey.crt'),
             'client_certificate_password': 'foobar',
         })
 
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_certificate_nocombined_pass(self, handler):
         self._run_test(handler, client_cert={
             'client_certificate': os.path.join(self.certdir, 'client.crt'),
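
Pulling the four variants above together, the `client_cert` mapping accepts three keys; the file names are the test fixtures' own (the `certdir` path is a placeholder for `self.certdir`):

    import os.path

    certdir = '/path/to/testcerts'  # placeholder for self.certdir in the tests

    # Combined certificate+key, no passphrase:
    combined_nopass = {'client_certificate': os.path.join(certdir, 'clientwithkey.crt')}

    # Separate certificate and key files:
    nocombined_nopass = {
        'client_certificate': os.path.join(certdir, 'client.crt'),
        'client_certificate_key': os.path.join(certdir, 'client.key'),
    }

    # Encrypted key, unlocked with client_certificate_password:
    combined_pass = {
        'client_certificate': os.path.join(certdir, 'clientwithencryptedkey.crt'),
        'client_certificate_password': 'foobar',
    }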
@@ -818,75 +817,6 @@ class TestUrllibRequestHandler(TestRequestHandlerBase):
         assert not isinstance(exc_info.value, TransportError)


-class TestRequestsRequestHandler(TestRequestHandlerBase):
-    @pytest.mark.parametrize('raised,expected', [
-        (lambda: requests.exceptions.ConnectTimeout(), TransportError),
-        (lambda: requests.exceptions.ReadTimeout(), TransportError),
-        (lambda: requests.exceptions.Timeout(), TransportError),
-        (lambda: requests.exceptions.ConnectionError(), TransportError),
-        (lambda: requests.exceptions.ProxyError(), ProxyError),
-        (lambda: requests.exceptions.SSLError('12[CERTIFICATE_VERIFY_FAILED]34'), CertificateVerifyError),
-        (lambda: requests.exceptions.SSLError(), SSLError),
-        (lambda: requests.exceptions.InvalidURL(), RequestError),
-        (lambda: requests.exceptions.InvalidHeader(), RequestError),
-        # catch-all: https://github.com/psf/requests/blob/main/src/requests/adapters.py#L535
-        (lambda: urllib3.exceptions.HTTPError(), TransportError),
-        (lambda: requests.exceptions.RequestException(), RequestError)
-        # (lambda: requests.exceptions.TooManyRedirects(), HTTPError) - Needs a response object
-    ])
-    @pytest.mark.parametrize('handler', ['Requests'], indirect=True)
-    def test_request_error_mapping(self, handler, monkeypatch, raised, expected):
-        with handler() as rh:
-            def mock_get_instance(*args, **kwargs):
-                class MockSession:
-                    def request(self, *args, **kwargs):
-                        raise raised()
-                return MockSession()
-
-            monkeypatch.setattr(rh, '_get_instance', mock_get_instance)
-
-            with pytest.raises(expected) as exc_info:
-                rh.send(Request('http://fake'))
-
-            assert exc_info.type is expected
-
-    @pytest.mark.parametrize('raised,expected,match', [
-        (lambda: urllib3.exceptions.SSLError(), SSLError, None),
-        (lambda: urllib3.exceptions.TimeoutError(), TransportError, None),
-        (lambda: urllib3.exceptions.ReadTimeoutError(None, None, None), TransportError, None),
-        (lambda: urllib3.exceptions.ProtocolError(), TransportError, None),
-        (lambda: urllib3.exceptions.DecodeError(), TransportError, None),
-        (lambda: urllib3.exceptions.HTTPError(), TransportError, None),  # catch-all
-        (
-            lambda: urllib3.exceptions.ProtocolError('error', http.client.IncompleteRead(partial=b'abc', expected=4)),
-            IncompleteRead,
-            '3 bytes read, 4 more expected'
-        ),
-        (
-            lambda: urllib3.exceptions.ProtocolError('error', urllib3.exceptions.IncompleteRead(partial=3, expected=5)),
-            IncompleteRead,
-            '3 bytes read, 5 more expected'
-        ),
-    ])
-    @pytest.mark.parametrize('handler', ['Requests'], indirect=True)
-    def test_response_error_mapping(self, handler, monkeypatch, raised, expected, match):
-        from urllib3.response import HTTPResponse as Urllib3Response
-        from requests.models import Response as RequestsResponse
-        from yt_dlp.networking._requests import RequestsResponseAdapter
-        requests_res = RequestsResponse()
-        requests_res.raw = Urllib3Response(body=b'', status=200)
-        res = RequestsResponseAdapter(requests_res)
-
-        def mock_read(*args, **kwargs):
-            raise raised()
-        monkeypatch.setattr(res.fp, 'read', mock_read)
-
-        with pytest.raises(expected, match=match) as exc_info:
-            res.read()
-
-        assert exc_info.type is expected
-
-
 class TestCurlCFFIRequestHandler(TestRequestHandlerBase):

     @pytest.mark.parametrize('handler', ['CurlCFFI'], indirect=True)

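The class removed above exercises how backend exceptions get translated at the handler boundary. A minimal sketch of that mapping pattern, using the yt-dlp error classes named in the test (the wrapper function itself is hypothetical, not the actual handler code):

import requests

from yt_dlp.networking.exceptions import ProxyError, RequestError, TransportError


def send_with_mapped_errors(session, request):
    # Catch backend-specific failures and re-raise them as the framework's
    # own error hierarchy, mirroring the (raised, expected) table above.
    # ProxyError must come first: it subclasses ConnectionError in requests.
    try:
        return session.request(request.method, request.url)
    except requests.exceptions.ProxyError as e:
        raise ProxyError(cause=e) from e
    except (requests.exceptions.Timeout, requests.exceptions.ConnectionError) as e:
        raise TransportError(cause=e) from e
    except requests.exceptions.RequestException as e:  # catch-all, last resort
        raise RequestError(cause=e) from e
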
@@ -961,10 +891,6 @@ class TestRequestHandlerValidation:
             ('file', UnsupportedRequest, {}),
             ('file', False, {'enable_file_urls': True}),
         ]),
-        ('Requests', [
-            ('http', False, {}),
-            ('https', False, {}),
-        ]),
         ('CurlCFFI', [
             ('http', False, {}),
             ('https', False, {}),

@@ -984,14 +910,6 @@ class TestRequestHandlerValidation:
             ('socks5h', False),
             ('socks', UnsupportedRequest),
         ]),
-        ('Requests', [
-            ('http', False),
-            ('https', False),
-            ('socks4', False),
-            ('socks4a', False),
-            ('socks5', False),
-            ('socks5h', False),
-        ]),
         ('CurlCFFI', [
             ('http', False),
             ('https', False),

@@ -1010,10 +928,6 @@ class TestRequestHandlerValidation:
             ('all', False),
             ('unrelated', False),
         ]),
-        ('Requests', [
-            ('all', False),
-            ('unrelated', False),
-        ]),
        ('CurlCFFI', [
             ('all', False),
             ('unrelated', False),

@@ -1032,13 +946,6 @@ class TestRequestHandlerValidation:
             ({'timeout': 'notatimeout'}, AssertionError),
             ({'unsupported': 'value'}, UnsupportedRequest),
         ]),
-        ('Requests', [
-            ({'cookiejar': 'notacookiejar'}, AssertionError),
-            ({'cookiejar': YoutubeDLCookieJar()}, False),
-            ({'timeout': 1}, False),
-            ({'timeout': 'notatimeout'}, AssertionError),
-            ({'unsupported': 'value'}, UnsupportedRequest),
-        ]),
         ('CurlCFFI', [
             ({'cookiejar': 'notacookiejar'}, AssertionError),
             ({'cookiejar': YoutubeDLCookieJar()}, False),

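For context, the (value, expected) rows above are consumed by a small helper; a sketch consistent with how the tables read (simplified from the test suite):

import pytest


def run_validation(handler, error, req, **handler_kwargs):
    # Each table row either validates cleanly (error is False) or must raise
    # the listed exception class.
    with handler(**handler_kwargs) as rh:
        if error:
            with pytest.raises(error):
                rh.validate(req)
        else:
            rh.validate(req)
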
@@ -1064,7 +971,7 @@ class TestRequestHandlerValidation:
     def test_url_scheme(self, handler, scheme, fail, handler_kwargs):
         run_validation(handler, fail, Request(f'{scheme}://'), **(handler_kwargs or {}))

-    @pytest.mark.parametrize('handler,fail', [('Urllib', False), ('Requests', False), ('CurlCFFI', False)], indirect=['handler'])
+    @pytest.mark.parametrize('handler,fail', [('Urllib', False), ('CurlCFFI', False)], indirect=['handler'])
     def test_no_proxy(self, handler, fail):
         run_validation(handler, fail, Request('http://', proxies={'no': '127.0.0.1,github.com'}))
         run_validation(handler, fail, Request('http://'), proxies={'no': '127.0.0.1,github.com'})

@@ -1087,13 +994,13 @@ class TestRequestHandlerValidation:
         run_validation(handler, fail, Request('http://', proxies={'http': f'{scheme}://example.com'}))
         run_validation(handler, fail, Request('http://'), proxies={'http': f'{scheme}://example.com'})

-    @pytest.mark.parametrize('handler', ['Urllib', HTTPSupportedRH, 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', HTTPSupportedRH, 'CurlCFFI'], indirect=True)
     def test_empty_proxy(self, handler):
         run_validation(handler, False, Request('http://', proxies={'http': None}))
         run_validation(handler, False, Request('http://'), proxies={'http': None})

     @pytest.mark.parametrize('proxy_url', ['//example.com', 'example.com', '127.0.0.1', '/a/b/c'])
-    @pytest.mark.parametrize('handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
+    @pytest.mark.parametrize('handler', ['Urllib', 'CurlCFFI'], indirect=True)
     def test_invalid_proxy_url(self, handler, proxy_url):
         run_validation(handler, UnsupportedRequest, Request('http://', proxies={'http': proxy_url}))

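Illustrative inputs for the three proxy-validation cases above, assuming the same Request API the tests use: a 'no' bypass list and an empty (None) proxy value should validate, while a scheme-less proxy URL should be rejected:

from yt_dlp.networking import Request

Request('http://', proxies={'no': '127.0.0.1,github.com'})  # bypass list: must validate
Request('http://', proxies={'http': None})                  # empty proxy: must validate
Request('http://', proxies={'http': 'example.com'})         # no scheme: UnsupportedRequest expected
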
@@ -1397,13 +1304,6 @@ class TestYoutubeDLNetworking:
             rh = self.build_handler(ydl, UrllibRH)
             assert rh.enable_file_urls is True

-    def test_compat_opt_prefer_urllib(self):
-        # This assumes urllib only has a preference when this compat opt is given
-        with FakeYDL({'compat_opts': ['prefer-legacy-http-handler']}) as ydl:
-            director = ydl.build_request_director([UrllibRH])
-            assert len(director.preferences) == 1
-            assert director.preferences.pop()(UrllibRH, None)
-

 class TestRequest:

@@ -263,7 +263,7 @@ def ctx(request):


 class TestSocks4Proxy:
-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks4_no_auth(self, handler, ctx):
         with handler() as rh:
             with ctx.socks_server(Socks4ProxyHandler) as server_address:

@@ -271,7 +271,7 @@ class TestSocks4Proxy:
                     rh, proxies={'all': f'socks4://{server_address}'})
                 assert response['version'] == 4

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks4_auth(self, handler, ctx):
         with handler() as rh:
             with ctx.socks_server(Socks4ProxyHandler, user_id='user') as server_address:

@@ -281,15 +281,18 @@ class TestSocks4Proxy:
                     rh, proxies={'all': f'socks4://user:@{server_address}'})
                 assert response['version'] == 4

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks4a_ipv4_target(self, handler, ctx):
         with ctx.socks_server(Socks4ProxyHandler) as server_address:
             with handler(proxies={'all': f'socks4a://{server_address}'}) as rh:
                 response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                 assert response['version'] == 4
-                assert (response['ipv4_address'] == '127.0.0.1') != (response['domain_address'] == '127.0.0.1')
+                assert (
+                    (response['ipv4_address'] == '127.0.0.1' and response['domain_address'] is None)
+                    or (response['ipv4_address'] is None and response['domain_address'] == '127.0.0.1')
+                )

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks4a_domain_target(self, handler, ctx):
         with ctx.socks_server(Socks4ProxyHandler) as server_address:
             with handler(proxies={'all': f'socks4a://{server_address}'}) as rh:

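A note on the assert rewrite in the hunk above: on two booleans, `!=` is an exclusive-or, so it also accepts a response where the non-matching field holds some unrelated value. The expanded form is stricter because it pins the unused field to None:

resp = {'ipv4_address': '127.0.0.1', 'domain_address': 'example.com'}
# The XOR form passes even though domain_address holds junk:
assert (resp['ipv4_address'] == '127.0.0.1') != (resp['domain_address'] == '127.0.0.1')
# The explicit form rejects the same response:
assert not (
    (resp['ipv4_address'] == '127.0.0.1' and resp['domain_address'] is None)
    or (resp['ipv4_address'] is None and resp['domain_address'] == '127.0.0.1')
)
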
@@ -298,7 +301,7 @@ class TestSocks4Proxy:
                 assert response['ipv4_address'] is None
                 assert response['domain_address'] == 'localhost'

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_ipv4_client_source_address(self, handler, ctx):
         with ctx.socks_server(Socks4ProxyHandler) as server_address:
             source_address = f'127.0.0.{random.randint(5, 255)}'

@@ -308,7 +311,7 @@ class TestSocks4Proxy:
             assert response['client_address'][0] == source_address
             assert response['version'] == 4

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     @pytest.mark.parametrize('reply_code', [
         Socks4CD.REQUEST_REJECTED_OR_FAILED,
         Socks4CD.REQUEST_REJECTED_CANNOT_CONNECT_TO_IDENTD,

@@ -320,7 +323,7 @@ class TestSocks4Proxy:
             with pytest.raises(ProxyError):
                 ctx.socks_info_request(rh)

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_ipv6_socks4_proxy(self, handler, ctx):
         with ctx.socks_server(Socks4ProxyHandler, bind_ip='::1') as server_address:
             with handler(proxies={'all': f'socks4://{server_address}'}) as rh:

@@ -329,7 +332,7 @@ class TestSocks4Proxy:
                 assert response['ipv4_address'] == '127.0.0.1'
                 assert response['version'] == 4

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_timeout(self, handler, ctx):
         with ctx.socks_server(Socks4ProxyHandler, sleep=2) as server_address:
             with handler(proxies={'all': f'socks4://{server_address}'}, timeout=0.5) as rh:

@@ -339,7 +342,7 @@ class TestSocks4Proxy:

 class TestSocks5Proxy:

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks5_no_auth(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler) as server_address:
             with handler(proxies={'all': f'socks5://{server_address}'}) as rh:

@@ -347,7 +350,7 @@ class TestSocks5Proxy:
                 assert response['auth_methods'] == [0x0]
                 assert response['version'] == 5

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks5_user_pass(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler, auth=('test', 'testpass')) as server_address:
             with handler() as rh:

@@ -360,7 +363,7 @@ class TestSocks5Proxy:
                 assert response['auth_methods'] == [Socks5Auth.AUTH_NONE, Socks5Auth.AUTH_USER_PASS]
                 assert response['version'] == 5

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks5_ipv4_target(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler) as server_address:
             with handler(proxies={'all': f'socks5://{server_address}'}) as rh:

@@ -368,7 +371,7 @@ class TestSocks5Proxy:
                 assert response['ipv4_address'] == '127.0.0.1'
                 assert response['version'] == 5

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks5_domain_target(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler) as server_address:
             with handler(proxies={'all': f'socks5://{server_address}'}) as rh:

@@ -376,7 +379,7 @@ class TestSocks5Proxy:
                 assert (response['ipv4_address'] == '127.0.0.1') != (response['ipv6_address'] == '::1')
                 assert response['version'] == 5

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks5h_domain_target(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler) as server_address:
             with handler(proxies={'all': f'socks5h://{server_address}'}) as rh:

@@ -385,7 +388,7 @@ class TestSocks5Proxy:
                 assert response['domain_address'] == 'localhost'
                 assert response['version'] == 5

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks5h_ip_target(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler) as server_address:
             with handler(proxies={'all': f'socks5h://{server_address}'}) as rh:

@@ -394,7 +397,7 @@ class TestSocks5Proxy:
                 assert response['domain_address'] is None
                 assert response['version'] == 5

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_socks5_ipv6_destination(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler) as server_address:
             with handler(proxies={'all': f'socks5://{server_address}'}) as rh:

@@ -402,7 +405,7 @@ class TestSocks5Proxy:
                 assert response['ipv6_address'] == '::1'
                 assert response['version'] == 5

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_ipv6_socks5_proxy(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler, bind_ip='::1') as server_address:
             with handler(proxies={'all': f'socks5://{server_address}'}) as rh:

@@ -413,7 +416,7 @@ class TestSocks5Proxy:

     # XXX: is there any feasible way of testing IPv6 source addresses?
     # Same would go for non-proxy source_address test...
-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_ipv4_client_source_address(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler) as server_address:
             source_address = f'127.0.0.{random.randint(5, 255)}'

@@ -422,7 +425,7 @@ class TestSocks5Proxy:
             assert response['client_address'][0] == source_address
             assert response['version'] == 5

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     @pytest.mark.parametrize('reply_code', [
         Socks5Reply.GENERAL_FAILURE,
         Socks5Reply.CONNECTION_NOT_ALLOWED,

@@ -439,7 +442,7 @@ class TestSocks5Proxy:
             with pytest.raises(ProxyError):
                 ctx.socks_info_request(rh)

-    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('CurlCFFI', 'http')], indirect=True)
+    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('CurlCFFI', 'http')], indirect=True)
     def test_timeout(self, handler, ctx):
         with ctx.socks_server(Socks5ProxyHandler, sleep=2) as server_address:
             with handler(proxies={'all': f'socks5://{server_address}'}, timeout=1) as rh:

@@ -4097,12 +4097,6 @@ class YoutubeDL:
                     new_req.extensions.pop('impersonate')
                     return _urlopen(new_req)
                 raise
-            """
-            TODO
-            if 'unsupported proxy type: "https"' in ue.msg.lower():
-                raise RequestError(
-                    'To use an HTTPS proxy for this request, one of the following dependencies needs to be installed: requests')
-            """
         except SSLError as e:
             if 'UNSAFE_LEGACY_RENEGOTIATION_DISABLED' in str(e):
                 raise RequestError('UNSAFE_LEGACY_RENEGOTIATION_DISABLED: Try using --legacy-server-connect', cause=e) from e

@@ -4147,8 +4141,6 @@ class YoutubeDL:
            }),
        ))
        director.preferences.update(preferences or [])
-        if 'prefer-legacy-http-handler' in self.params['compat_opts']:
-            director.preferences.add(lambda rh, _: 500 if rh.RH_KEY == 'Urllib' else 0)
        return director

    def encode(self, s):

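The two removed lines above register a "preference" with the request director, which the removed test_compat_opt_prefer_urllib earlier in this diff also checks. A preference is just a callable scoring a handler for a given request, and handlers are ranked by their summed scores; a sketch with the director internals simplified:

def prefer_urllib(rh, request):
    # 500 outweighs the default score of 0, so the Urllib handler wins
    # whenever the compat opt is active.
    return 500 if rh.RH_KEY == 'Urllib' else 0

# director.preferences.add(prefer_urllib)
# Dispatch order is then roughly:
# sorted(handlers, key=lambda rh: sum(p(rh, request) for p in preferences), reverse=True)
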
@@ -4271,7 +4263,7 @@ class YoutubeDL:
         return ret

     def _write_thumbnails(self, label, info_dict, filename, thumb_filename_base=None):
-        ''' Write thumbnails to file and return list of (thumb_filename, final_thumb_filename); or None if error '''
+        ''' Write thumbnails to file and return list of (thumb_filename, final_thumb_filename) '''
         write_all = self.params.get('write_all_thumbnails', False)
         thumbnails, ret = [], []
         if write_all or self.params.get('writethumbnail', False):

@@ -4287,9 +4279,6 @@ class YoutubeDL:
             self.write_debug(f'Skipping writing {label} thumbnail')
             return ret

-        if not self._ensure_dir_exists(filename):
-            return None
-
         for idx, t in list(enumerate(thumbnails))[::-1]:
             thumb_ext = (f'{t["id"]}.' if multiple else '') + determine_ext(t['url'], 'jpg')
             thumb_display_id = f'{label} thumbnail {t["id"]}'

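The guard dropped in the hunk above aborts thumbnail writing when the target directory cannot be created, which is why the docstring on the other side loses its "; or None if error" clause. A stand-in with the same contract (a hypothetical helper, not the yt-dlp implementation):

import os


def ensure_dir_exists(filename):
    # Return True if the parent directory exists or could be created.
    try:
        os.makedirs(os.path.dirname(filename) or '.', exist_ok=True)
        return True
    except OSError:
        return False
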
@@ -21,9 +21,7 @@ def get_hidden_imports():
     yield from ('yt_dlp.compat._legacy', 'yt_dlp.compat._deprecated')
     yield from ('yt_dlp.utils._legacy', 'yt_dlp.utils._deprecated')
     yield pycryptodome_module()
-    # Only `websockets` is required, others are collected just in case
-    for module in ('websockets', 'requests', 'urllib3'):
-        yield from collect_submodules(module)
+    yield from collect_submodules('websockets')
     # These are auto-detected, but explicitly add them just in case
     yield from ('mutagen', 'brotli', 'certifi', 'curl_cffi')

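PyInstaller only discovers statically imported modules, so dynamically loaded backends have to be declared as hidden imports. The removed loop expands to the equivalent of:

from PyInstaller.utils.hooks import collect_submodules  # standard PyInstaller helper

hidden_imports = [
    *collect_submodules('websockets'),
    *collect_submodules('requests'),
    *collect_submodules('urllib3'),
]
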
@@ -58,16 +58,6 @@ except (ImportError, SyntaxError):
     # See https://github.com/yt-dlp/yt-dlp/issues/2633
     websockets = None

-try:
-    import urllib3
-except ImportError:
-    urllib3 = None
-
-try:
-    import requests
-except ImportError:
-    requests = None
-
 try:
     import xattr  # xattr or pyxattr
 except ImportError:

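Code elsewhere can then feature-gate on the module object being non-None; a hedged sketch of how such optional imports are typically consumed (attribute names assumed from the hunk above):

from yt_dlp import dependencies

if getattr(dependencies, 'requests', None) is not None:
    pass  # the requests-backed handler can be registered
else:
    pass  # fall back to the urllib-based handler
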
@@ -896,10 +896,6 @@ from .jeuxvideo import JeuxVideoIE
 from .jove import JoveIE
 from .joj import JojIE
 from .jstream import JStreamIE
-from .jtbc import (
-    JTBCIE,
-    JTBCProgramIE,
-)
 from .jwplatform import JWPlatformIE
 from .kakao import KakaoIE
 from .kaltura import KalturaIE

@@ -1057,7 +1053,6 @@ from .markiza import (
 from .massengeschmacktv import MassengeschmackTVIE
 from .masters import MastersIE
 from .matchtv import MatchTVIE
-from .mbn import MBNIE
 from .mdr import MDRIE
 from .medaltv import MedalTVIE
 from .mediaite import MediaiteIE

@@ -31,7 +31,7 @@ class BanByeBaseIE(InfoExtractor):


 class BanByeIE(BanByeBaseIE):
-    _VALID_URL = r'https?://(?:www\.)?banbye\.com/(?:en/)?watch/(?P<id>[\w-]+)'
+    _VALID_URL = r'https?://(?:www\.)?banbye.com/(?:en/)?watch/(?P<id>[\w-]+)'
     _TESTS = [{
         'url': 'https://banbye.com/watch/v_ytfmvkVYLE8T',
         'md5': '2f4ea15c5ca259a73d909b2cfd558eb5',

@@ -120,7 +120,7 @@ class BanByeIE(BanByeBaseIE):


 class BanByeChannelIE(BanByeBaseIE):
-    _VALID_URL = r'https?://(?:www\.)?banbye\.com/(?:en/)?channel/(?P<id>\w+)'
+    _VALID_URL = r'https?://(?:www\.)?banbye.com/(?:en/)?channel/(?P<id>\w+)'
     _TESTS = [{
         'url': 'https://banbye.com/channel/ch_wrealu24',
         'info_dict': {

@@ -2,7 +2,7 @@ from .common import InfoExtractor


 class BreitBartIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:www\.)?breitbart\.com/videos/v/(?P<id>[^/?#]+)'
+    _VALID_URL = r'https?:\/\/(?:www\.)breitbart.com/videos/v/(?P<id>[^/]+)'
     _TESTS = [{
         'url': 'https://www.breitbart.com/videos/v/5cOz1yup/?pl=Ij6NDOji',
         'md5': '0aa6d1d6e183ac5ca09207fe49f17ade',

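Most of the extractor hunks in this range differ only in whether dots inside hostnames are escaped. In a regex an unescaped '.' matches any character, so the unescaped pattern also matches look-alike hosts:

import re

assert re.match(r'https?://(?:www\.)?banbye\.com/', 'https://banbye.com/')
assert re.match(r'https?://(?:www\.)?banbye.com/', 'https://banbyeXcom/')       # '.' matches 'X'
assert not re.match(r'https?://(?:www\.)?banbye\.com/', 'https://banbyeXcom/')  # escaped form does not
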
@@ -10,7 +10,7 @@ from ..utils import (


 class CraftsyIE(InfoExtractor):
-    _VALID_URL = r'https?://www\.craftsy\.com/class/(?P<id>[\w-]+)'
+    _VALID_URL = r'https?://www.craftsy.com/class/(?P<id>[a-z0-9_-]+)/'
     _TESTS = [{
         'url': 'https://www.craftsy.com/class/the-midnight-quilt-show-season-5/',
         'info_dict': {

@@ -45,7 +45,7 @@ class CybraryBaseIE(InfoExtractor):


 class CybraryIE(CybraryBaseIE):
-    _VALID_URL = r'https?://app\.cybrary\.it/immersive/(?P<enrollment>[0-9]+)/activity/(?P<id>[0-9]+)'
+    _VALID_URL = r'https?://app.cybrary.it/immersive/(?P<enrollment>[0-9]+)/activity/(?P<id>[0-9]+)'
     _TESTS = [{
         'url': 'https://app.cybrary.it/immersive/12487950/activity/63102',
         'md5': '9ae12d37e555cb2ed554223a71a701d0',

@@ -110,7 +110,7 @@ class CybraryIE(CybraryBaseIE):


 class CybraryCourseIE(CybraryBaseIE):
-    _VALID_URL = r'https://app\.cybrary\.it/browse/course/(?P<id>[\w-]+)/?(?:$|[#?])'
+    _VALID_URL = r'https://app.cybrary.it/browse/course/(?P<id>[\w-]+)/?(?:$|[#?])'
     _TESTS = [{
         'url': 'https://app.cybrary.it/browse/course/az-500-microsoft-azure-security-technologies',
         'info_dict': {

@@ -8,7 +8,7 @@ from ..utils import (


 class FifaIE(InfoExtractor):
-    _VALID_URL = r'https?://www\.fifa\.com/fifaplus/(?P<locale>\w{2})/watch/([^#?]+/)?(?P<id>\w+)'
+    _VALID_URL = r'https?://www.fifa.com/fifaplus/(?P<locale>\w{2})/watch/([^#?]+/)?(?P<id>\w+)'
     _TESTS = [{
         'url': 'https://www.fifa.com/fifaplus/en/watch/7on10qPcnyLajDDU3ntg6y',
         'info_dict': {

@@ -3,7 +3,7 @@ from ..utils import int_or_none


 class FilmmoduIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:www\.)?filmmodu\.org/(?P<id>[^/]+-(?:turkce-dublaj-izle|altyazili-izle))'
+    _VALID_URL = r'https?://(?:www.)?filmmodu.org/(?P<id>[^/]+-(?:turkce-dublaj-izle|altyazili-izle))'
     _TESTS = [{
         'url': 'https://www.filmmodu.org/f9-altyazili-izle',
         'md5': 'aeefd955c2a508a5bdaa3bcec8eeb0d4',

@@ -31,7 +31,7 @@ class ITProTVBaseIE(InfoExtractor):


 class ITProTVIE(ITProTVBaseIE):
-    _VALID_URL = r'https://app\.itpro\.tv/course/(?P<course>[\w-]+)/(?P<id>[\w-]+)'
+    _VALID_URL = r'https://app.itpro.tv/course/(?P<course>[\w-]+)/(?P<id>[\w-]+)'
     _TESTS = [{
         'url': 'https://app.itpro.tv/course/guided-tour/introductionitprotv',
         'md5': 'bca4a28c2667fd1a63052e71a94bb88c',

@@ -102,7 +102,7 @@ class ITProTVIE(ITProTVBaseIE):


 class ITProTVCourseIE(ITProTVBaseIE):
-    _VALID_URL = r'https?://app\.itpro\.tv/course/(?P<id>[\w-]+)/?(?:$|[#?])'
+    _VALID_URL = r'https?://app.itpro.tv/course/(?P<id>[\w-]+)/?(?:$|[#?])'
     _TESTS = [
         {
             'url': 'https://app.itpro.tv/course/guided-tour',

@@ -10,7 +10,7 @@ from ..utils import (


 class JableIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:www\.)?jable\.tv/videos/(?P<id>[\w-]+)'
+    _VALID_URL = r'https?://(?:www\.)?jable.tv/videos/(?P<id>[\w-]+)'
     _TESTS = [{
         'url': 'https://jable.tv/videos/pppd-812/',
         'md5': 'f1537283a9bc073c31ff86ca35d9b2a6',

@@ -64,7 +64,7 @@ class JableIE(InfoExtractor):


 class JablePlaylistIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:www\.)?jable\.tv/(?:categories|models|tags)/(?P<id>[\w-]+)'
+    _VALID_URL = r'https?://(?:www\.)?jable.tv/(?:categories|models|tags)/(?P<id>[\w-]+)'
     _TESTS = [{
         'url': 'https://jable.tv/models/kaede-karen/',
         'info_dict': {

@@ -1,156 +0,0 @@
-import re
-
-from .common import InfoExtractor
-from ..utils import (
-    int_or_none,
-    parse_duration,
-    url_or_none,
-)
-from ..utils.traversal import traverse_obj
-
-
-class JTBCIE(InfoExtractor):
-    IE_DESC = 'jtbc.co.kr'
-    _VALID_URL = r'''(?x)
-        https?://(?:
-            vod\.jtbc\.co\.kr/player/(?:program|clip)
-            |tv\.jtbc\.co\.kr/(?:replay|trailer|clip)/pr\d+/pm\d+
-        )/(?P<id>(?:ep|vo)\d+)'''
-    _GEO_COUNTRIES = ['KR']
-
-    _TESTS = [{
-        'url': 'https://tv.jtbc.co.kr/replay/pr10011629/pm10067930/ep20216321/view',
-        'md5': 'e6ade71d8c8685bbfd6e6ce4167c6a6c',
-        'info_dict': {
-            'id': 'VO10721192',
-            'display_id': 'ep20216321',
-            'ext': 'mp4',
-            'title': '힘쎈여자 강남순 2회 다시보기',
-            'description': 'md5:043c1d9019100ce271dba09995dbd1e2',
-            'duration': 3770.0,
-            'release_date': '20231008',
-            'age_limit': 15,
-            'thumbnail': 'https://fs.jtbc.co.kr//joydata/CP00000001/prog/drama/stronggirlnamsoon/img/20231008_163541_522_1.jpg',
-            'series': '힘쎈여자 강남순',
-        },
-    }, {
-        'url': 'https://vod.jtbc.co.kr/player/program/ep20216733',
-        'md5': '217a6d190f115a75e4bda0ceaa4cd7f4',
-        'info_dict': {
-            'id': 'VO10721429',
-            'display_id': 'ep20216733',
-            'ext': 'mp4',
-            'title': '헬로 마이 닥터 친절한 진료실 149회 다시보기',
-            'description': 'md5:1d70788a982dd5de26874a92fcffddb8',
-            'duration': 2720.0,
-            'release_date': '20231009',
-            'age_limit': 15,
-            'thumbnail': 'https://fs.jtbc.co.kr//joydata/CP00000001/prog/culture/hellomydoctor/img/20231009_095002_528_1.jpg',
-            'series': '헬로 마이 닥터 친절한 진료실',
-        },
-    }, {
-        'url': 'https://vod.jtbc.co.kr/player/clip/vo10721270',
-        'md5': '05782e2dc22a9c548aebefe62ae4328a',
-        'info_dict': {
-            'id': 'VO10721270',
-            'display_id': 'vo10721270',
-            'ext': 'mp4',
-            'title': '뭉쳐야 찬다3 2회 예고편 - A매치로 향하는 마지막 관문💥',
-            'description': 'md5:d48b51a8655c84843b4ed8d0c39aae68',
-            'duration': 46.0,
-            'release_date': '20231015',
-            'age_limit': 15,
-            'thumbnail': 'https://fs.jtbc.co.kr//joydata/CP00000001/prog/enter/soccer3/img/20231008_210957_775_1.jpg',
-            'series': '뭉쳐야 찬다3',
-        },
-    }, {
-        'url': 'https://tv.jtbc.co.kr/trailer/pr10010392/pm10032526/vo10720912/view',
-        'md5': '367d480eb3ef54a9cd7a4b4d69c4b32d',
-        'info_dict': {
-            'id': 'VO10720912',
-            'display_id': 'vo10720912',
-            'ext': 'mp4',
-            'title': '아는 형님 404회 예고편 | 10월 14일(토) 저녁 8시 50분 방송!',
-            'description': 'md5:2743bb1079ceb85bb00060f2ad8f0280',
-            'duration': 148.0,
-            'release_date': '20231014',
-            'age_limit': 15,
-            'thumbnail': 'https://fs.jtbc.co.kr//joydata/CP00000001/prog/enter/jtbcbros/img/20231006_230023_802_1.jpg',
-            'series': '아는 형님',
-        },
-    }]
-
-    def _real_extract(self, url):
-        display_id = self._match_id(url)
-
-        if display_id.startswith('vo'):
-            video_id = display_id.upper()
-        else:
-            webpage = self._download_webpage(url, display_id)
-            video_id = self._search_regex(r'data-vod="(VO\d+)"', webpage, 'vod id')
-
-        playback_data = self._download_json(
-            f'https://api.jtbc.co.kr/vod/{video_id}', video_id, note='Downloading VOD playback data')
-
-        subtitles = {}
-        for sub in traverse_obj(playback_data, ('tracks', lambda _, v: v['file'])):
-            subtitles.setdefault(sub.get('label', 'und'), []).append({'url': sub['file']})
-
-        formats = []
-        for stream_url in traverse_obj(playback_data, ('sources', 'HLS', ..., 'file', {url_or_none})):
-            stream_url = re.sub(r'/playlist(?:_pd\d+)?\.m3u8', '/index.m3u8', stream_url)
-            formats.extend(self._extract_m3u8_formats(stream_url, video_id, fatal=False))
-
-        metadata = self._download_json(
-            'https://now-api.jtbc.co.kr/v1/vod/detail', video_id,
-            note='Downloading mobile details', fatal=False, query={'vodFileId': video_id})
-        return {
-            'id': video_id,
-            'display_id': display_id,
-            **traverse_obj(metadata, ('vodDetail', {
-                'title': 'vodTitleView',
-                'series': 'programTitle',
-                'age_limit': ('watchAge', {int_or_none}),
-                'release_date': ('broadcastDate', {lambda x: re.match(r'\d{8}', x.replace('.', ''))}, 0),
-                'description': 'episodeContents',
-                'thumbnail': ('imgFileUrl', {url_or_none}),
-            })),
-            'duration': parse_duration(playback_data.get('playTime')),
-            'formats': formats,
-            'subtitles': subtitles,
-        }
-
-
-class JTBCProgramIE(InfoExtractor):
-    IE_NAME = 'JTBC:program'
-    _VALID_URL = r'https?://(?:vod\.jtbc\.co\.kr/program|tv\.jtbc\.co\.kr/replay)/(?P<id>pr\d+)/(?:replay|pm\d+)/?(?:$|[?#])'
-
-    _TESTS = [{
-        'url': 'https://tv.jtbc.co.kr/replay/pr10010392/pm10032710',
-        'info_dict': {
-            '_type': 'playlist',
-            'id': 'pr10010392',
-        },
-        'playlist_count': 398,
-    }, {
-        'url': 'https://vod.jtbc.co.kr/program/pr10011491/replay',
-        'info_dict': {
-            '_type': 'playlist',
-            'id': 'pr10011491',
-        },
-        'playlist_count': 59,
-    }]
-
-    def _real_extract(self, url):
-        program_id = self._match_id(url)
-
-        vod_list = self._download_json(
-            'https://now-api.jtbc.co.kr/v1/vodClip/programHome/programReplayVodList', program_id,
-            note='Downloading program replay list', query={
-                'programId': program_id,
-                'rowCount': '10000',
-            })
-
-        entries = [self.url_result(f'https://vod.jtbc.co.kr/player/program/{video_id}', JTBCIE, video_id)
-                   for video_id in traverse_obj(vod_list, ('programReplayVodList', ..., 'episodeId'))]
-        return self.playlist_result(entries, program_id)

@@ -3,7 +3,7 @@ from ..utils import update_url


 class KommunetvIE(InfoExtractor):
-    _VALID_URL = r'https://\w+\.kommunetv\.no/archive/(?P<id>\w+)'
+    _VALID_URL = r'https://(\w+).kommunetv.no/archive/(?P<id>\w+)'
     _TEST = {
         'url': 'https://oslo.kommunetv.no/archive/921',
         'md5': '5f102be308ee759be1e12b63d5da4bbc',

@@ -13,7 +13,7 @@ from ..utils import (


 class MainStreamingIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:webtools-?)?(?P<host>[A-Za-z0-9-]*\.msvdn\.net)/(?:embed|amp_embed|content)/(?P<id>\w+)'
+    _VALID_URL = r'https?://(?:webtools-?)?(?P<host>[A-Za-z0-9-]*\.msvdn.net)/(?:embed|amp_embed|content)/(?P<id>\w+)'
     _EMBED_REGEX = [rf'<iframe[^>]+?src=["\']?(?P<url>{_VALID_URL})["\']?']
     IE_DESC = 'MainStreaming Player'

@@ -1,89 +0,0 @@
-import re
-
-from .common import InfoExtractor
-from ..utils import (
-    int_or_none,
-    unified_strdate,
-    url_or_none,
-)
-from ..utils.traversal import traverse_obj
-
-
-class MBNIE(InfoExtractor):
-    IE_DESC = 'mbn.co.kr (매일방송)'
-    _VALID_URL = r'https?://(?:www\.)?mbn\.co\.kr/vod/programContents/preview(?:list)?/\d+/\d+/(?P<id>\d+)'
-    _TESTS = [{
-        'url': 'https://mbn.co.kr/vod/programContents/previewlist/861/5433/1276155',
-        'md5': '85e1694e5b247c04d1386b7e3c90fd76',
-        'info_dict': {
-            'id': '1276155',
-            'ext': 'mp4',
-            'title': '결국 사로잡힌 권유리, 그녀를 목숨 걸고 구하려는 정일우!',
-            'duration': 3891,
-            'release_date': '20210703',
-            'thumbnail': 'http://img.vod.mbn.co.kr/mbnvod2img/861/2021/07/03/20210703230811_20_861_1276155_360_7_0.jpg',
-            'series': '보쌈 - 운명을 훔치다',
-            'episode': 'Episode 19',
-            'episode_number': 19,
-        },
-    }, {
-        'url': 'https://www.mbn.co.kr/vod/programContents/previewlist/835/5294/1084744',
-        'md5': 'fc65d3aac85e85e0b5056f4ef99cde4a',
-        'info_dict': {
-            'id': '1084744',
-            'ext': 'mp4',
-            'title': '김정은♥최원영, 제자리를 찾은 위험한 부부! "결혼은 투쟁이면서, 어려운 방식이야.."',
-            'duration': 93,
-            'release_date': '20201124',
-            'thumbnail': 'http://img.vod.mbn.co.kr/mbnvod2img/835/2020/11/25/20201125000221_21_835_1084744_360_7_0.jpg',
-            'series': '나의 위험한 아내',
-        },
-    }, {
-        'url': 'https://www.mbn.co.kr/vod/programContents/preview/952/6088/1054797?next=1',
-        'md5': 'c711103c72aeac8323a5cf1751f10097',
-        'info_dict': {
-            'id': '1054797',
-            'ext': 'mp4',
-            'title': '[2차 티저] MBN 주말 미니시리즈 <완벽한 결혼의 정석> l 그녀에게 주어진 두 번째 인생',
-            'duration': 65,
-            'release_date': '20231028',
-            'thumbnail': 'http://img.vod.mbn.co.kr/vod2/952/2023/09/11/20230911130223_22_952_1054797_1080_7.jpg',
-            'series': '완벽한 결혼의 정석',
-        },
-    }]
-
-    def _real_extract(self, url):
-        content_id = self._match_id(url)
-        webpage = self._download_webpage(url, content_id)
-
-        content_cls_cd = self._search_regex(
-            r'"\?content_cls_cd=(\d+)&', webpage, 'content cls cd', fatal=False) or '20'
-        media_info = self._download_json(
-            'https://www.mbn.co.kr/player/mbnVodPlayer_2020.mbn', content_id,
-            note='Fetching playback data', query={
-                'content_cls_cd': content_cls_cd,
-                'content_id': content_id,
-                'relay_type': '1',
-            })
-
-        formats = []
-        for stream_url in traverse_obj(media_info, ('movie_list', ..., 'url', {url_or_none})):
-            stream_url = re.sub(r'/(?:chunk|play)list(?:_pd\d+)?\.m3u8', '/manifest.m3u8', stream_url)
-            final_url = url_or_none(self._download_webpage(
-                f'https://www.mbn.co.kr/player/mbnStreamAuth_new_vod.mbn?vod_url={stream_url}',
-                content_id, note='Fetching authenticated m3u8 url'))
-
-            formats.extend(self._extract_m3u8_formats(final_url, content_id, fatal=False))
-
-        return {
-            'id': content_id,
-            **traverse_obj(media_info, {
-                'title': ('movie_title', {str}),
-                'duration': ('play_sec', {int_or_none}),
-                'release_date': ('bcast_date', {lambda x: x.replace('.', '')}, {unified_strdate}),
-                'thumbnail': ('movie_start_Img', {url_or_none}),
-                'series': ('prog_nm', {str}),
-                'episode_number': ('ad_contentnumber', {int_or_none}),
-            }),
-            'formats': formats,
-        }

@@ -2,7 +2,7 @@ from .common import InfoExtractor


 class MediaiteIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:www\.)?mediaite\.com(?!/category)(?:/[\w-]+){2}'
+    _VALID_URL = r'https?://(?:www\.)?mediaite.com(?!/category)(?:/[\w-]+){2}'
     _TESTS = [{
         'url': 'https://www.mediaite.com/sports/bill-burr-roasts-nfl-for-promoting-black-lives-matter-while-scheduling-more-games-after-all-the-sht-they-know-about-cte/',
         'info_dict': {

@@ -3,7 +3,7 @@ from ..utils import int_or_none, traverse_obj


 class MochaVideoIE(InfoExtractor):
-    _VALID_URL = r'https?://video\.mocha\.com\.vn/(?P<video_slug>[\w-]+)'
+    _VALID_URL = r'https?://video.mocha.com.vn/(?P<video_slug>[\w-]+)'
     _TESTS = [{
         'url': 'http://video.mocha.com.vn/chuyen-meo-gia-su-tu-thong-diep-cuoc-song-v18694039',
         'info_dict': {

@@ -247,7 +247,7 @@ class NFLArticleIE(NFLBaseIE):

 class NFLPlusReplayIE(NFLBaseIE):
     IE_NAME = 'nfl.com:plus:replay'
-    _VALID_URL = r'https?://(?:www\.)?nfl\.com/plus/games/(?P<slug>[\w-]+)(?:/(?P<id>\d+))?'
+    _VALID_URL = r'https?://(?:www\.)?nfl.com/plus/games/(?P<slug>[\w-]+)(?:/(?P<id>\d+))?'
     _TESTS = [{
         'url': 'https://www.nfl.com/plus/games/giants-at-vikings-2022-post-1/1572108',
         'info_dict': {

@@ -342,7 +342,7 @@ class NFLPlusReplayIE(NFLBaseIE):

 class NFLPlusEpisodeIE(NFLBaseIE):
     IE_NAME = 'nfl.com:plus:episode'
-    _VALID_URL = r'https?://(?:www\.)?nfl\.com/plus/episodes/(?P<id>[\w-]+)'
+    _VALID_URL = r'https?://(?:www\.)?nfl.com/plus/episodes/(?P<id>[\w-]+)'
     _TESTS = [{
         'note': 'Subscription required',
         'url': 'https://www.nfl.com/plus/episodes/kurt-s-qb-insider-conference-championships',

@@ -3,7 +3,7 @@ from ..utils import int_or_none, parse_duration, parse_iso8601


 class NovaPlayIE(InfoExtractor):
-    _VALID_URL = r'https://play\.nova\.bg/video/[^?#]+/(?P<id>\d+)'
+    _VALID_URL = r'https://play.nova\.bg/video/.*/(?P<id>\d+)'
     _TESTS = [
         {
             'url': 'https://play.nova.bg/video/ochakvaite/season-0/ochakvaite-2022-07-22-sybudi-se-sat/606627',

@@ -19,9 +19,7 @@ from ..utils import (
 class NubilesPornIE(InfoExtractor):
     _NETRC_MACHINE = 'nubiles-porn'
     _VALID_URL = r'''(?x)
-        https://members\.nubiles-porn\.com/video/watch/(?P<id>\d+)
+        https://members.nubiles-porn.com/video/watch/(?P<id>\d+)
         (?:/(?P<display_id>[\w\-]+-s(?P<season>\d+)e(?P<episode>\d+)))?
     '''

@@ -4,7 +4,7 @@ from ..utils import traverse_obj


 class OfTVIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:www\.)?of\.tv/video/(?P<id>\w+)'
+    _VALID_URL = r'https?://(?:www\.)?of.tv/video/(?P<id>\w+)'
     _TESTS = [{
         'url': 'https://of.tv/video/627d7d95b353db0001dadd1a',
         'md5': 'cb9cd5db3bb9ee0d32bfd7e373d6ef0a',

@@ -34,7 +34,7 @@ class OfTVIE(InfoExtractor):


 class OfTVPlaylistIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:www\.)?of\.tv/creators/(?P<id>[a-zA-Z0-9-]+)/?(?:$|[?#])'
+    _VALID_URL = r'https?://(?:www\.)?of.tv/creators/(?P<id>[a-zA-Z0-9-]+)/.?'
     _TESTS = [{
         'url': 'https://of.tv/creators/this-is-fire/',
         'playlist_count': 8,

@@ -154,7 +154,7 @@ class RadikoBaseIE(InfoExtractor):
                 sf['preference'] = -100
                 sf['format_note'] = 'not preferred'
             if not is_onair and timefree_int == 1 and time_to_skip:
-                sf['downloader_options'] = {'ffmpeg_args': ['-ss', str(time_to_skip)]}
+                sf['downloader_options'] = {'ffmpeg_args': ['-ss', time_to_skip]}
             formats.extend(subformats)

         return formats

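On the radiko change above: entries in a subprocess argv must be strings, so one side stringifies time_to_skip before handing it to ffmpeg. Minimal illustration (file names hypothetical):

import subprocess

time_to_skip = 30
args = ['ffmpeg', '-ss', str(time_to_skip), '-i', 'input.m3u8', 'out.mp4']
# subprocess.run(args) would raise TypeError if the bare int were left in argv
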
@ -11,7 +11,7 @@ from ..utils import (
|
||||||
|
|
||||||
|
|
||||||
class SinaIE(InfoExtractor):
|
class SinaIE(InfoExtractor):
|
||||||
_VALID_URL = r'''(?x)https?://(?:[^/?#]+\.)?video\.sina\.com\.cn/
|
_VALID_URL = r'''(?x)https?://(?:.*?\.)?video\.sina\.com\.cn/
|
||||||
(?:
|
(?:
|
||||||
(?:view/|.*\#)(?P<id>\d+)|
|
(?:view/|.*\#)(?P<id>\d+)|
|
||||||
.+?/(?P<pseudo_id>[^/?#]+)(?:\.s?html)|
|
.+?/(?P<pseudo_id>[^/?#]+)(?:\.s?html)|
|
||||||
|
|
|
@ -1741,7 +1741,7 @@ class TwitterSpacesIE(TwitterBaseIE):
|
||||||
|
|
||||||
class TwitterShortenerIE(TwitterBaseIE):
|
class TwitterShortenerIE(TwitterBaseIE):
|
||||||
IE_NAME = 'twitter:shortener'
|
IE_NAME = 'twitter:shortener'
|
||||||
_VALID_URL = r'https?://t\.co/(?P<id>[^?#]+)|tco:(?P<eid>[^?#]+)'
|
_VALID_URL = r'https?://t.co/(?P<id>[^?]+)|tco:(?P<eid>[^?]+)'
|
||||||
_BASE_URL = 'https://t.co/'
|
_BASE_URL = 'https://t.co/'
|
||||||
|
|
||||||
def _real_extract(self, url):
|
def _real_extract(self, url):
|
||||||
|
|
|
@ -10,7 +10,7 @@ from ..utils import (
|
||||||
|
|
||||||
|
|
||||||
class UtreonIE(InfoExtractor):
|
class UtreonIE(InfoExtractor):
|
||||||
_VALID_URL = r'https?://(?:www\.)?utreon\.com/v/(?P<id>[\w-]+)'
|
_VALID_URL = r'https?://(?:www\.)?utreon.com/v/(?P<id>[a-zA-Z0-9_-]+)'
|
||||||
_TESTS = [{
|
_TESTS = [{
|
||||||
'url': 'https://utreon.com/v/z_I7ikQbuDw',
|
'url': 'https://utreon.com/v/z_I7ikQbuDw',
|
||||||
'info_dict': {
|
'info_dict': {
|
||||||
|
|
|
@ -97,12 +97,12 @@ class VKIE(VKBaseIE):
|
||||||
(?:
|
(?:
|
||||||
(?:
|
(?:
|
||||||
(?:(?:m|new)\.)?vk\.com/video_|
|
(?:(?:m|new)\.)?vk\.com/video_|
|
||||||
(?:www\.)?daxab\.com/
|
(?:www\.)?daxab.com/
|
||||||
)
|
)
|
||||||
ext\.php\?(?P<embed_query>.*?\boid=(?P<oid>-?\d+).*?\bid=(?P<id>\d+).*)|
|
ext\.php\?(?P<embed_query>.*?\boid=(?P<oid>-?\d+).*?\bid=(?P<id>\d+).*)|
|
||||||
(?:
|
(?:
|
||||||
(?:(?:m|new)\.)?vk\.com/(?:.+?\?.*?z=)?(?:video|clip)|
|
(?:(?:m|new)\.)?vk\.com/(?:.+?\?.*?z=)?(?:video|clip)|
|
||||||
(?:www\.)?daxab\.com/embed/
|
(?:www\.)?daxab.com/embed/
|
||||||
)
|
)
|
||||||
(?P<videoid>-?\d+_\d+)(?:.*\blist=(?P<list_id>([\da-f]+)|(ln-[\da-zA-Z]+)))?
|
(?P<videoid>-?\d+_\d+)(?:.*\blist=(?P<list_id>([\da-f]+)|(ln-[\da-zA-Z]+)))?
|
||||||
)
|
)
|
||||||
|
|
|
@ -182,7 +182,7 @@ class WeverseBaseIE(InfoExtractor):
|
||||||
|
|
||||||
|
|
||||||
class WeverseIE(WeverseBaseIE):
|
class WeverseIE(WeverseBaseIE):
|
||||||
_VALID_URL = r'https?://(?:www\.|m\.)?weverse\.io/(?P<artist>[^/?#]+)/live/(?P<id>[\d-]+)'
|
_VALID_URL = r'https?://(?:www\.|m\.)?weverse.io/(?P<artist>[^/?#]+)/live/(?P<id>[\d-]+)'
|
||||||
_TESTS = [{
|
_TESTS = [{
|
||||||
'url': 'https://weverse.io/billlie/live/0-107323480',
|
'url': 'https://weverse.io/billlie/live/0-107323480',
|
||||||
'md5': '1fa849f00181eef9100d3c8254c47979',
|
'md5': '1fa849f00181eef9100d3c8254c47979',
|
||||||
|
@ -344,7 +344,7 @@ class WeverseIE(WeverseBaseIE):
|
||||||
|
|
||||||
|
|
||||||
class WeverseMediaIE(WeverseBaseIE):
|
class WeverseMediaIE(WeverseBaseIE):
|
||||||
_VALID_URL = r'https?://(?:www\.|m\.)?weverse\.io/(?P<artist>[^/?#]+)/media/(?P<id>[\d-]+)'
|
_VALID_URL = r'https?://(?:www\.|m\.)?weverse.io/(?P<artist>[^/?#]+)/media/(?P<id>[\d-]+)'
|
||||||
_TESTS = [{
|
_TESTS = [{
|
||||||
'url': 'https://weverse.io/billlie/media/4-116372884',
|
'url': 'https://weverse.io/billlie/media/4-116372884',
|
||||||
'md5': '8efc9cfd61b2f25209eb1a5326314d28',
|
'md5': '8efc9cfd61b2f25209eb1a5326314d28',
|
||||||
|
@@ -420,7 +420,7 @@ class WeverseMediaIE(WeverseBaseIE):


 class WeverseMomentIE(WeverseBaseIE):
-    _VALID_URL = r'https?://(?:www\.|m\.)?weverse\.io/(?P<artist>[^/?#]+)/moment/(?P<uid>[\da-f]+)/post/(?P<id>[\d-]+)'
+    _VALID_URL = r'https?://(?:www\.|m\.)?weverse.io/(?P<artist>[^/?#]+)/moment/(?P<uid>[\da-f]+)/post/(?P<id>[\d-]+)'
     _TESTS = [{
         'url': 'https://weverse.io/secretnumber/moment/66a07e164b56a696ee71c99315ffe27b/post/1-117229444',
         'md5': '87733ac19a54081b7dfc2442036d282b',
@@ -516,7 +516,7 @@ class WeverseTabBaseIE(WeverseBaseIE):


 class WeverseLiveTabIE(WeverseTabBaseIE):
-    _VALID_URL = r'https?://(?:www\.|m\.)?weverse\.io/(?P<id>[^/?#]+)/live/?(?:[?#]|$)'
+    _VALID_URL = r'https?://(?:www\.|m\.)?weverse.io/(?P<id>[^/?#]+)/live/?(?:[?#]|$)'
     _TESTS = [{
         'url': 'https://weverse.io/billlie/live/',
         'playlist_mincount': 55,
@@ -534,7 +534,7 @@ class WeverseLiveTabIE(WeverseTabBaseIE):


 class WeverseMediaTabIE(WeverseTabBaseIE):
-    _VALID_URL = r'https?://(?:www\.|m\.)?weverse\.io/(?P<id>[^/?#]+)/media(?:/|/all|/new)?(?:[?#]|$)'
+    _VALID_URL = r'https?://(?:www\.|m\.)?weverse.io/(?P<id>[^/?#]+)/media(?:/|/all|/new)?(?:[?#]|$)'
     _TESTS = [{
         'url': 'https://weverse.io/billlie/media/',
         'playlist_mincount': 231,
@@ -558,7 +558,7 @@ class WeverseMediaTabIE(WeverseTabBaseIE):


 class WeverseLiveIE(WeverseBaseIE):
-    _VALID_URL = r'https?://(?:www\.|m\.)?weverse\.io/(?P<id>[^/?#]+)/?(?:[?#]|$)'
+    _VALID_URL = r'https?://(?:www\.|m\.)?weverse.io/(?P<id>[^/?#]+)/?(?:[?#]|$)'
     _TESTS = [{
         'url': 'https://weverse.io/purplekiss',
         'info_dict': {
@@ -11,7 +11,7 @@ class WimTVIE(InfoExtractor):
     _player = None
     _UUID_RE = r'[\da-f]{8}-[\da-f]{4}-[\da-f]{4}-[\da-f]{4}-[\da-f]{12}'
     _VALID_URL = r'''(?x:
-        https?://platform\.wim\.tv/
+        https?://platform.wim.tv/
         (?:
             (?:embed/)?\?
             |\#/webtv/.+?/
@@ -24,7 +24,7 @@ class XHamsterIE(InfoExtractor):
     _DOMAINS = r'(?:xhamster\.(?:com|one|desi)|xhms\.pro|xhamster\d+\.com|xhday\.com|xhvid\.com)'
     _VALID_URL = r'''(?x)
                     https?://
-                    (?:[^/?#]+\.)?%s/
+                    (?:.+?\.)?%s/
                     (?:
                         movies/(?P<id>[\dA-Za-z]+)/(?P<display_id>[^/]*)\.html|
                         videos/(?P<display_id_2>[^/]*)-(?P<id_2>[\dA-Za-z]+)
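
The host portion gets a different tightening here: `(?:.+?\.)?` can lazily cross path separators, while `(?:[^/?#]+\.)?` confines the optional subdomain to a single host label. A standalone sketch (the URL is hypothetical):

    import re

    domains = r'xhamster\.com'
    old = r'https?://(?:.+?\.)?%s/' % domains
    new = r'https?://(?:[^/?#]+\.)?%s/' % domains

    url = 'https://not-a-subdomain/path.xhamster.com/'  # hypothetical URL
    print(bool(re.match(old, url)))  # True  - '.+?' lazily crosses the '/' in the path
    print(bool(re.match(new, url)))  # False - a host label cannot contain '/', '?' or '#'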
@@ -372,7 +372,7 @@ class XHamsterIE(InfoExtractor):


 class XHamsterEmbedIE(InfoExtractor):
-    _VALID_URL = r'https?://(?:[^/?#]+\.)?%s/xembed\.php\?video=(?P<id>\d+)' % XHamsterIE._DOMAINS
+    _VALID_URL = r'https?://(?:.+?\.)?%s/xembed\.php\?video=(?P<id>\d+)' % XHamsterIE._DOMAINS
     _EMBED_REGEX = [r'<iframe[^>]+?src=(["\'])(?P<url>(?:https?:)?//(?:www\.)?xhamster\.com/xembed\.php\?video=\d+)\1']
     _TEST = {
         'url': 'http://xhamster.com/xembed.php?video=3328539',
@@ -949,7 +949,7 @@ class YoutubeBaseInfoExtractor(InfoExtractor):
             main_rm = next(main_retries)
             # Manual retry loop for multiple RetryManagers
             # The proper RetryManager MUST be advanced after an error
-            # and its result MUST be checked if the manager is non fatal
+            # and it's result MUST be checked if the manager is non fatal
             while True:
                 try:
                     response = self._call_api(
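
The comment above encodes the invariant for hand-rolled retry loops: after a failure the active manager must be advanced, and when a manager is non-fatal its outcome must be inspected rather than assumed. A generic, self-contained illustration of that invariant (plain Python, not yt-dlp's actual RetryManager API):

    def retry_loop(attempts, fatal, do_request):
        last_error = None
        for attempt in range(1, attempts + 1):  # "advance" the manager on each failure
            try:
                return do_request()
            except OSError as err:
                last_error = err
        if fatal:
            raise last_error  # fatal manager: propagate the error
        return None           # non-fatal manager: the caller MUST check for None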
@@ -13,7 +13,7 @@ from ..utils import (

 class ZoomIE(InfoExtractor):
     IE_NAME = 'zoom'
-    _VALID_URL = r'(?P<base_url>https?://(?:[^.]+\.)?zoom\.us/)rec(?:ording)?/(?P<type>play|share)/(?P<id>[\w.-]+)'
+    _VALID_URL = r'(?P<base_url>https?://(?:[^.]+\.)?zoom.us/)rec(?:ording)?/(?P<type>play|share)/(?P<id>[A-Za-z0-9_.-]+)'
     _TESTS = [{
         'url': 'https://economist.zoom.us/rec/play/dUk_CNBETmZ5VA2BwEl-jjakPpJ3M1pcfVYAPRsoIbEByGsLjUZtaa4yCATQuOL3der8BlTwxQePl_j0.EImBkXzTIaPvdZO5',
         'md5': 'ab445e8c911fddc4f9adc842c2c5d434',
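
Besides the dot escaping, the two ID character classes are not quite equivalent: Python's `\w` matches Unicode word characters by default, whereas `[A-Za-z0-9_]` is ASCII-only. A standalone sketch:

    import re

    print(bool(re.fullmatch(r'[\w.-]+', 'Pl4y_Id-Ü')))          # True  - \w is Unicode-aware
    print(bool(re.fullmatch(r'[A-Za-z0-9_.-]+', 'Pl4y_Id-Ü')))  # False - ASCII class only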
@@ -13,18 +13,10 @@ from .common import (
 # isort: split
 # TODO: all request handlers should be safely imported
 from . import _urllib
-from ..utils import bug_reports_message
-
-try:
-    from . import _requests
-except ImportError:
-    pass
-except Exception as e:
-    warnings.warn(f'Failed to import "requests" request handler: {e}' + bug_reports_message())

 try:
     from . import _curlcffi  # noqa: F401
 except ImportError:
     pass
 except Exception as e:
-    warnings.warn(f'Failed to import "curl_cffi" request handler: {e}' + bug_reports_message())
+    warnings.warn(f'Failed to import curl_cffi handler: {e}')
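
The left-hand side guards each optional request-handler import the same way, so a missing dependency stays silent while a present-but-broken one only warns. A minimal standalone sketch of the pattern (the module name is hypothetical):

    import warnings

    try:
        import optional_backend  # hypothetical optional dependency
    except ImportError:
        optional_backend = None  # absent: the feature is silently unavailable
    except Exception as e:       # present but broken: warn and keep going
        optional_backend = None
        warnings.warn(f'Failed to import optional_backend: {e}')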
@@ -37,29 +37,19 @@ class CurlCFFIResponseReader(io.IOBase):
         return True

     def read(self, size=None):
-        try:
-            while not self._eof and (size is None or len(self._buffer) < size):
-                chunk = next(self._response.iter_content(), None)
-                if chunk is None:
-                    self._eof = True
-                    break
-                self._buffer += chunk
-
-            if size is None:
-                data = self._buffer
-                self._buffer = b''
-            else:
-                data = self._buffer[:size]
-                self._buffer = self._buffer[size:]
-
-            # "free" the curl instance if the response is fully read.
-            # curl_cffi doesn't do this automatically and only allows one open response per thread
-            if self._eof and len(self._buffer) == 0:
-                self.close()
-            return data
-        except Exception:
-            self.close()
-            raise
+        if self._eof:
+            return b''
+
+        while size is None or len(self._buffer) < size:
+            chunk = next(self._response.iter_content(), None)
+            if chunk is None:
+                self._eof = True
+                break
+            self._buffer += chunk
+
+        data = self._buffer[:size]
+        self._buffer = self._buffer[size:]
+        return data

     def close(self):
         self._response.close()
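
Both sides implement read(size) by topping up an internal buffer from a chunk iterator and slicing off the front; the left-hand side additionally closes the handle on error or once fully drained. A minimal standalone model of the buffer-and-slice part (not the real class):

    class BufferedChunkReader:
        """Minimal model of read(size) over an iterator of byte chunks."""
        def __init__(self, chunks):
            self._it = iter(chunks)
            self._buffer = b''
            self._eof = False

        def read(self, size=None):
            # top up the buffer until EOF or until it can satisfy the request
            while not self._eof and (size is None or len(self._buffer) < size):
                chunk = next(self._it, None)
                if chunk is None:
                    self._eof = True
                    break
                self._buffer += chunk
            data = self._buffer if size is None else self._buffer[:size]
            self._buffer = b'' if size is None else self._buffer[size:]
            return data

    r = BufferedChunkReader([b'ab', b'cd', b'ef'])
    print(r.read(3), r.read())  # b'abc' b'def'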
@@ -11,7 +11,7 @@ import urllib.request

 from .exceptions import RequestError, UnsupportedRequest
 from ..dependencies import certifi
-from ..socks import ProxyType, sockssocket
+from ..socks import ProxyType
 from ..utils import format_field, traverse_obj

 if typing.TYPE_CHECKING:
@@ -224,24 +224,6 @@ def _socket_connect(ip_addr, timeout, source_address):
         raise


-def create_socks_proxy_socket(dest_addr, proxy_args, proxy_ip_addr, timeout, source_address):
-    af, socktype, proto, canonname, sa = proxy_ip_addr
-    sock = sockssocket(af, socktype, proto)
-    try:
-        connect_proxy_args = proxy_args.copy()
-        connect_proxy_args.update({'addr': sa[0], 'port': sa[1]})
-        sock.setproxy(**connect_proxy_args)
-        if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:  # noqa: E721
-            sock.settimeout(timeout)
-        if source_address:
-            sock.bind(source_address)
-        sock.connect(dest_addr)
-        return sock
-    except socket.error:
-        sock.close()
-        raise
-
-
 def create_connection(
     address,
     timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
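
The helper removed here exists so that SOCKS socket construction can be injected into create_connection via functools.partial instead of a closure defined inside connect(). A rough standalone sketch of that factory-injection shape (names and signatures are illustrative, not yt-dlp's):

    import socket

    def create_connection(address, timeout=None, _create_socket_func=None):
        # resolve the address, then delegate socket construction to the injected factory
        err = None
        for res in socket.getaddrinfo(*address, type=socket.SOCK_STREAM):
            try:
                return _create_socket_func(res, timeout, None)
            except OSError as e:
                err = e
        raise err or OSError('getaddrinfo returned no results')

    def plain_socket(ip_addr, timeout, source_address):
        af, socktype, proto, _, sa = ip_addr
        sock = socket.socket(af, socktype, proto)
        try:
            if timeout is not None:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
            sock.connect(sa)
            return sock
        except OSError:
            sock.close()
            raise

    # a SOCKS variant is bound the same way the deleted helper was:
    # functools.partial(create_socks_proxy_socket, dest_addr, proxy_args)
    sock = create_connection(('example.com', 80), timeout=5, _create_socket_func=plain_socket)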
@@ -1,397 +0,0 @@
-import contextlib
-import functools
-import http.client
-import logging
-import re
-import socket
-import warnings
-
-from ..dependencies import brotli, requests, urllib3
-from ..utils import bug_reports_message, int_or_none, variadic
-
-if requests is None:
-    raise ImportError('requests module is not installed')
-
-if urllib3 is None:
-    raise ImportError('urllib3 module is not installed')
-
-urllib3_version = tuple(int_or_none(x, default=0) for x in urllib3.__version__.split('.'))
-
-if urllib3_version < (1, 26, 17):
-    raise ImportError('Only urllib3 >= 1.26.17 is supported')
-
-if requests.__build__ < 0x023100:
-    raise ImportError('Only requests >= 2.31.0 is supported')
-
-import requests.adapters
-import requests.utils
-import urllib3.connection
-import urllib3.exceptions
-
-from ._helper import (
-    InstanceStoreMixin,
-    add_accept_encoding_header,
-    create_connection,
-    create_socks_proxy_socket,
-    get_redirect_method,
-    make_socks_proxy_opts,
-    select_proxy,
-)
-from .common import (
-    Features,
-    RequestHandler,
-    Response,
-    register_preference,
-    register_rh,
-)
-from .exceptions import (
-    CertificateVerifyError,
-    HTTPError,
-    IncompleteRead,
-    ProxyError,
-    RequestError,
-    SSLError,
-    TransportError,
-)
-from ..socks import ProxyError as SocksProxyError
-
-SUPPORTED_ENCODINGS = [
-    'gzip', 'deflate'
-]
-
-if brotli is not None:
-    SUPPORTED_ENCODINGS.append('br')
-
-"""
-Override urllib3's behavior to not convert lower-case percent-encoded characters
-to upper-case during url normalization process.
-
-RFC3986 defines that the lower or upper case percent-encoded hexidecimal characters are equivalent
-and normalizers should convert them to uppercase for consistency [1].
-
-However, some sites may have an incorrect implementation where they provide
-a percent-encoded url that is then compared case-sensitively.[2]
-
-While this is a very rare case, since urllib does not do this normalization step, it
-is best to avoid it in requests too for compatability reasons.
-
-1: https://tools.ietf.org/html/rfc3986#section-2.1
-2: https://github.com/streamlink/streamlink/pull/4003
-"""
-
-
-class Urllib3PercentREOverride:
-    def __init__(self, r: re.Pattern):
-        self.re = r
-
-    # pass through all other attribute calls to the original re
-    def __getattr__(self, item):
-        return self.re.__getattribute__(item)
-
-    def subn(self, repl, string, *args, **kwargs):
-        return string, self.re.subn(repl, string, *args, **kwargs)[1]
-
-
-# urllib3 >= 1.25.8 uses subn:
-# https://github.com/urllib3/urllib3/commit/a2697e7c6b275f05879b60f593c5854a816489f0
-import urllib3.util.url  # noqa: E305
-
-if hasattr(urllib3.util.url, 'PERCENT_RE'):
-    urllib3.util.url.PERCENT_RE = Urllib3PercentREOverride(urllib3.util.url.PERCENT_RE)
-elif hasattr(urllib3.util.url, '_PERCENT_RE'):  # urllib3 >= 2.0.0
-    urllib3.util.url._PERCENT_RE = Urllib3PercentREOverride(urllib3.util.url._PERCENT_RE)
-else:
-    warnings.warn('Failed to patch PERCENT_RE in urllib3 (does the attribute exist?)' + bug_reports_message())
-
-"""
-Workaround for issue in urllib.util.ssl_.py: ssl_wrap_context does not pass
-server_hostname to SSLContext.wrap_socket if server_hostname is an IP,
-however this is an issue because we set check_hostname to True in our SSLContext.
-
-Monkey-patching IS_SECURETRANSPORT forces ssl_wrap_context to pass server_hostname regardless.
-
-This has been fixed in urllib3 2.0+.
-See: https://github.com/urllib3/urllib3/issues/517
-"""
-
-if urllib3_version < (2, 0, 0):
-    with contextlib.suppress():
-        urllib3.util.IS_SECURETRANSPORT = urllib3.util.ssl_.IS_SECURETRANSPORT = True
-
-
-# Requests will not automatically handle no_proxy by default
-# due to buggy no_proxy handling with proxy dict [1].
-# 1. https://github.com/psf/requests/issues/5000
-requests.adapters.select_proxy = select_proxy
-
-
-class RequestsResponseAdapter(Response):
-    def __init__(self, res: requests.models.Response):
-        super().__init__(
-            fp=res.raw, headers=res.headers, url=res.url,
-            status=res.status_code, reason=res.reason)
-
-        self._requests_response = res
-
-    def read(self, amt: int = None):
-        try:
-            # Interact with urllib3 response directly.
-            return self.fp.read(amt, decode_content=True)
-
-        # See urllib3.response.HTTPResponse.read() for exceptions raised on read
-        except urllib3.exceptions.SSLError as e:
-            raise SSLError(cause=e) from e
-
-        except urllib3.exceptions.ProtocolError as e:
-            # IncompleteRead is always contained within ProtocolError
-            # See urllib3.response.HTTPResponse._error_catcher()
-            ir_err = next(
-                (err for err in (e.__context__, e.__cause__, *variadic(e.args))
-                 if isinstance(err, http.client.IncompleteRead)), None)
-            if ir_err is not None:
-                # `urllib3.exceptions.IncompleteRead` is subclass of `http.client.IncompleteRead`
-                # but uses an `int` for its `partial` property.
-                partial = ir_err.partial if isinstance(ir_err.partial, int) else len(ir_err.partial)
-                raise IncompleteRead(partial=partial, expected=ir_err.expected) from e
-            raise TransportError(cause=e) from e
-
-        except urllib3.exceptions.HTTPError as e:
-            # catch-all for any other urllib3 response exceptions
-            raise TransportError(cause=e) from e
-
-
-class RequestsHTTPAdapter(requests.adapters.HTTPAdapter):
-    def __init__(self, ssl_context=None, proxy_ssl_context=None, source_address=None, **kwargs):
-        self._pm_args = {}
-        if ssl_context:
-            self._pm_args['ssl_context'] = ssl_context
-        if source_address:
-            self._pm_args['source_address'] = (source_address, 0)
-        self._proxy_ssl_context = proxy_ssl_context or ssl_context
-        super().__init__(**kwargs)
-
-    def init_poolmanager(self, *args, **kwargs):
-        return super().init_poolmanager(*args, **kwargs, **self._pm_args)
-
-    def proxy_manager_for(self, proxy, **proxy_kwargs):
-        extra_kwargs = {}
-        if not proxy.lower().startswith('socks') and self._proxy_ssl_context:
-            extra_kwargs['proxy_ssl_context'] = self._proxy_ssl_context
-        return super().proxy_manager_for(proxy, **proxy_kwargs, **self._pm_args, **extra_kwargs)
-
-    def cert_verify(*args, **kwargs):
-        # lean on SSLContext for cert verification
-        pass
-
-
-class RequestsSession(requests.sessions.Session):
-    """
-    Ensure unified redirect method handling with our urllib redirect handler.
-    """
-    def rebuild_method(self, prepared_request, response):
-        new_method = get_redirect_method(prepared_request.method, response.status_code)
-
-        # HACK: requests removes headers/body on redirect unless code was a 307/308.
-        if new_method == prepared_request.method:
-            response._real_status_code = response.status_code
-            response.status_code = 308
-
-        prepared_request.method = new_method
-
-    def rebuild_auth(self, prepared_request, response):
-        # HACK: undo status code change from rebuild_method, if applicable.
-        # rebuild_auth runs after requests would remove headers/body based on status code
-        if hasattr(response, '_real_status_code'):
-            response.status_code = response._real_status_code
-            del response._real_status_code
-        return super().rebuild_auth(prepared_request, response)
-
-
-class Urllib3LoggingFilter(logging.Filter):
-
-    def filter(self, record):
-        # Ignore HTTP request messages since HTTPConnection prints those
-        if record.msg == '%s://%s:%s "%s %s %s" %s %s':
-            return False
-        return True
-
-
-class Urllib3LoggingHandler(logging.Handler):
-    """Redirect urllib3 logs to our logger"""
-    def __init__(self, logger, *args, **kwargs):
-        super().__init__(*args, **kwargs)
-        self._logger = logger
-
-    def emit(self, record):
-        try:
-            msg = self.format(record)
-            if record.levelno >= logging.ERROR:
-                self._logger.error(msg)
-            else:
-                self._logger.stdout(msg)
-
-        except Exception:
-            self.handleError(record)
-
-
-@register_rh
-class RequestsRH(RequestHandler, InstanceStoreMixin):
-
-    """Requests RequestHandler
-    https://github.com/psf/requests
-    """
-    _SUPPORTED_URL_SCHEMES = ('http', 'https')
-    _SUPPORTED_ENCODINGS = tuple(SUPPORTED_ENCODINGS)
-    _SUPPORTED_PROXY_SCHEMES = ('http', 'https', 'socks4', 'socks4a', 'socks5', 'socks5h')
-    _SUPPORTED_FEATURES = (Features.NO_PROXY, Features.ALL_PROXY)
-    RH_NAME = 'requests'
-
-    def __init__(self, *args, **kwargs):
-        super().__init__(*args, **kwargs)
-
-        # Forward urllib3 debug messages to our logger
-        logger = logging.getLogger('urllib3')
-        handler = Urllib3LoggingHandler(logger=self._logger)
-        handler.setFormatter(logging.Formatter('requests: %(message)s'))
-        handler.addFilter(Urllib3LoggingFilter())
-        logger.addHandler(handler)
-        logger.setLevel(logging.WARNING)
-
-        if self.verbose:
-            # Setting this globally is not ideal, but is easier than hacking with urllib3.
-            # It could technically be problematic for scripts embedding yt-dlp.
-            # However, it is unlikely debug traffic is used in that context in a way this will cause problems.
-            urllib3.connection.HTTPConnection.debuglevel = 1
-            logger.setLevel(logging.DEBUG)
-        # this is expected if we are using --no-check-certificate
-        urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
-
-    def close(self):
-        self._clear_instances()
-
-    def _check_extensions(self, extensions):
-        super()._check_extensions(extensions)
-        extensions.pop('cookiejar', None)
-        extensions.pop('timeout', None)
-
-    def _create_instance(self, cookiejar):
-        session = RequestsSession()
-        http_adapter = RequestsHTTPAdapter(
-            ssl_context=self._make_sslcontext(),
-            source_address=self.source_address,
-            max_retries=urllib3.util.retry.Retry(False),
-        )
-        session.adapters.clear()
-        session.headers = requests.models.CaseInsensitiveDict({'Connection': 'keep-alive'})
-        session.mount('https://', http_adapter)
-        session.mount('http://', http_adapter)
-        session.cookies = cookiejar
-        session.trust_env = False  # no need, we already load proxies from env
-        return session
-
-    def _send(self, request):
-
-        headers = self._merge_headers(request.headers)
-        add_accept_encoding_header(headers, SUPPORTED_ENCODINGS)
-
-        max_redirects_exceeded = False
-
-        session = self._get_instance(
-            cookiejar=request.extensions.get('cookiejar') or self.cookiejar)
-
-        try:
-            requests_res = session.request(
-                method=request.method,
-                url=request.url,
-                data=request.data,
-                headers=headers,
-                timeout=float(request.extensions.get('timeout') or self.timeout),
-                proxies=request.proxies or self.proxies,
-                allow_redirects=True,
-                stream=True
-            )
-
-        except requests.exceptions.TooManyRedirects as e:
-            max_redirects_exceeded = True
-            requests_res = e.response
-
-        except requests.exceptions.SSLError as e:
-            if 'CERTIFICATE_VERIFY_FAILED' in str(e):
-                raise CertificateVerifyError(cause=e) from e
-            raise SSLError(cause=e) from e
-
-        except requests.exceptions.ProxyError as e:
-            raise ProxyError(cause=e) from e
-
-        except (requests.exceptions.ConnectionError, requests.exceptions.Timeout) as e:
-            raise TransportError(cause=e) from e
-
-        except urllib3.exceptions.HTTPError as e:
-            # Catch any urllib3 exceptions that may leak through
-            raise TransportError(cause=e) from e
-
-        except requests.exceptions.RequestException as e:
-            # Miscellaneous Requests exceptions. May not necessary be network related e.g. InvalidURL
-            raise RequestError(cause=e) from e
-
-        res = RequestsResponseAdapter(requests_res)
-
-        if not 200 <= res.status < 300:
-            raise HTTPError(res, redirect_loop=max_redirects_exceeded)
-
-        return res
-
-
-@register_preference(RequestsRH)
-def requests_preference(rh, request):
-    return 100
-
-
-# Use our socks proxy implementation with requests to avoid an extra dependency.
-class SocksHTTPConnection(urllib3.connection.HTTPConnection):
-    def __init__(self, _socks_options, *args, **kwargs):  # must use _socks_options to pass PoolKey checks
-        self._proxy_args = _socks_options
-        super().__init__(*args, **kwargs)
-
-    def _new_conn(self):
-        try:
-            return create_connection(
-                address=(self._proxy_args['addr'], self._proxy_args['port']),
-                timeout=self.timeout,
-                source_address=self.source_address,
-                _create_socket_func=functools.partial(
-                    create_socks_proxy_socket, (self.host, self.port), self._proxy_args))
-        except (socket.timeout, TimeoutError) as e:
-            raise urllib3.exceptions.ConnectTimeoutError(
-                self, f'Connection to {self.host} timed out. (connect timeout={self.timeout})') from e
-        except SocksProxyError as e:
-            raise urllib3.exceptions.ProxyError(str(e), e) from e
-        except (OSError, socket.error) as e:
-            raise urllib3.exceptions.NewConnectionError(
-                self, f'Failed to establish a new connection: {e}') from e
-
-
-class SocksHTTPSConnection(SocksHTTPConnection, urllib3.connection.HTTPSConnection):
-    pass
-
-
-class SocksHTTPConnectionPool(urllib3.HTTPConnectionPool):
-    ConnectionCls = SocksHTTPConnection
-
-
-class SocksHTTPSConnectionPool(urllib3.HTTPSConnectionPool):
-    ConnectionCls = SocksHTTPSConnection
-
-
-class SocksProxyManager(urllib3.PoolManager):
-
-    def __init__(self, socks_proxy, username=None, password=None, num_pools=10, headers=None, **connection_pool_kw):
-        connection_pool_kw['_socks_options'] = make_socks_proxy_opts(socks_proxy)
-        super().__init__(num_pools, headers, **connection_pool_kw)
-        self.pool_classes_by_scheme = {
-            'http': SocksHTTPConnectionPool,
-            'https': SocksHTTPSConnectionPool
-        }
-
-
-requests.adapters.SOCKSProxyManager = SocksProxyManager
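
Of the removed module, Urllib3PercentREOverride is the subtlest piece: it satisfies urllib3's percent-encoding normalizer by reporting a substitution count while leaving the string untouched. A toy standalone reproduction of the trick (names are illustrative):

    import re

    PERCENT_RE = re.compile(r'%[a-fA-F0-9]{2}')

    class PercentREOverride:
        def __init__(self, r):
            self.re = r

        def __getattr__(self, item):
            # pass everything else through to the wrapped pattern
            return getattr(self.re, item)

        def subn(self, repl, string, *args, **kwargs):
            # report the match count but return the string unmodified
            return string, self.re.subn(repl, string, *args, **kwargs)[1]

    patched = PercentREOverride(PERCENT_RE)
    s, n = patched.subn(lambda m: m.group().upper(), 'a%2fb%3Dc')
    print(s, n)  # 'a%2fb%3Dc' 2 - lower-case escapes survive, count still reported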
@@ -3,6 +3,7 @@ from __future__ import annotations
 import functools
 import http.client
 import io
+import socket
 import ssl
 import urllib.error
 import urllib.parse
@@ -23,7 +24,6 @@ from ._helper import (
     InstanceStoreMixin,
     add_accept_encoding_header,
     create_connection,
-    create_socks_proxy_socket,
     get_redirect_method,
     make_socks_proxy_opts,
     select_proxy,
@@ -40,6 +40,7 @@ from .exceptions import (
 )
 from ..dependencies import brotli
 from ..socks import ProxyError as SocksProxyError
+from ..socks import sockssocket
 from ..utils import update_url_query
 from ..utils.networking import normalize_url
@@ -189,12 +190,25 @@ def make_socks_conn_class(base_class, socks_proxy):
         _create_connection = create_connection

         def connect(self):
+            def sock_socket_connect(ip_addr, timeout, source_address):
+                af, socktype, proto, canonname, sa = ip_addr
+                sock = sockssocket(af, socktype, proto)
+                try:
+                    connect_proxy_args = proxy_args.copy()
+                    connect_proxy_args.update({'addr': sa[0], 'port': sa[1]})
+                    sock.setproxy(**connect_proxy_args)
+                    if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:  # noqa: E721
+                        sock.settimeout(timeout)
+                    if source_address:
+                        sock.bind(source_address)
+                    sock.connect((self.host, self.port))
+                    return sock
+                except socket.error:
+                    sock.close()
+                    raise
             self.sock = create_connection(
-                (proxy_args['addr'], proxy_args['port']),
-                timeout=self.timeout,
-                source_address=self.source_address,
-                _create_socket_func=functools.partial(
-                    create_socks_proxy_socket, (self.host, self.port), proxy_args))
+                (proxy_args['addr'], proxy_args['port']), timeout=self.timeout,
+                source_address=self.source_address, _create_socket_func=sock_socket_connect)
             if isinstance(self, http.client.HTTPSConnection):
                 self.sock = self._context.wrap_socket(self.sock, server_hostname=self.host)

@@ -75,7 +75,7 @@ class HTTPError(RequestError):


 class IncompleteRead(TransportError):
-    def __init__(self, partial: int, expected: int | None = None, **kwargs):
+    def __init__(self, partial: int, expected: int = None, **kwargs):
         self.partial = partial
         self.expected = expected
         msg = f'{partial} bytes read'
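
The only difference here is the annotation: `int = None` under-declares the default, while PEP 604's `int | None` states the optionality explicitly; runtime behavior is identical. For example:

    def f(expected: int = None): ...         # type checkers flag the implicit None
    def g(expected: int | None = None): ...  # explicit optional (3.10+ syntax, or
                                             # via 'from __future__ import annotations')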
@@ -471,12 +471,11 @@ def create_parser():
             'no-attach-info-json', 'embed-thumbnail-atomicparsley', 'no-external-downloader-progress',
             'embed-metadata', 'seperate-video-versions', 'no-clean-infojson', 'no-keep-subs', 'no-certifi',
             'no-youtube-channel-redirect', 'no-youtube-unavailable-videos', 'no-youtube-prefer-utc-upload-date',
-            'prefer-legacy-http-handler'
         }, 'aliases': {
             'youtube-dl': ['all', '-multistreams', '-playlist-match-filter'],
             'youtube-dlc': ['all', '-no-youtube-channel-redirect', '-no-live-chat', '-playlist-match-filter'],
             '2021': ['2022', 'no-certifi', 'filename-sanitization', 'no-youtube-prefer-utc-upload-date'],
-            '2022': ['no-external-downloader-progress', 'playlist-match-filter', 'prefer-legacy-http-handler'],
+            '2022': ['no-external-downloader-progress', 'playlist-match-filter'],
         }
     }, help=(
         'Options that can help keep compatibility with youtube-dl or youtube-dlc '
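
The right-hand side predates the `prefer-legacy-http-handler` compat option, which the left adds both standalone and to the `2022` alias. Either spelling is passed the same way on the command line, e.g.:

    yt-dlp --compat-options 2022 URL
    yt-dlp --compat-options prefer-legacy-http-handler URL   # left-hand revision only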
@@ -733,7 +732,7 @@ def create_parser():
     authentication.add_option(
         '--video-password',
         dest='videopassword', metavar='PASSWORD',
-        help='Video-specific password')
+        help='Video password (vimeo, youku)')
     authentication.add_option(
         '--ap-mso',
         dest='ap_mso', metavar='MSO',
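
Only the help string changes here; the option itself is identical on both sides and is used the same way, e.g.:

    yt-dlp --video-password PASSWORD URL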
@@ -1,8 +1,8 @@
 # Autogenerated by devscripts/update-version.py

-__version__ = '2023.10.13'
+__version__ = '2023.10.07'

-RELEASE_GIT_HEAD = 'b634ba742d8f38ce9ecfa0546485728b0c6c59d1'
+RELEASE_GIT_HEAD = '377e85a1797db9e98b78b38203ed9d4ded229991'

 VARIANT = None