Mirror of https://github.com/mitmproxy/mitmproxy.git (synced 2024-11-23 05:09:57 +00:00)
Bump the github-actions group with 3 updates (#6701)
Bumps the github-actions group with 3 updates: [install-pinned/ruff](https://github.com/install-pinned/ruff), [apple-actions/import-codesign-certs](https://github.com/apple-actions/import-codesign-certs) and [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action).

**Updates `install-pinned/ruff`** from fe472defb50a6a2c00ea3a3982534e86e69991e8 to 38b373a3a8635c2be31d92314e816a491fda910a.

Commits:
- `38b373a` update README.md (ruff 0.3.0)
- `06af3ea` update pins (ruff 0.3.0)
- `be1c354` update README.md (ruff 0.2.2)
- `c9779bb` update pins (ruff 0.2.2)
- `48831a8` update README.md (ruff 0.2.1)
- `6775b5f` update pins (ruff 0.2.1)
- `bc12a64` update README.md (ruff 0.2.0)
- `3b8ccef` update pins (ruff 0.2.0)
- See full diff in the compare view (fe472defb5...38b373a3a8)

**Updates `apple-actions/import-codesign-certs`** from 5565bb656f60c98c8fc515f3444dd8db73545dc2 to 493007ed063995cf2d4fbca064704150548f8bb5.

Commits:
- `493007e` Merge pull request [#62](https://redirect.github.com/apple-actions/import-codesign-certs/issues/62) from himself65/patch-1
- `2e5aa07` Update README.md
- See full diff in the compare view (5565bb656f...493007ed06)

**Updates `docker/setup-buildx-action`** from 3.0.0 to 3.1.0.

Release notes (sourced from [docker/setup-buildx-action's releases](https://github.com/docker/setup-buildx-action/releases)), v3.1.0:
- `cache-binary` input to enable/disable caching binary to GHA cache backend by [@crazy-max](https://github.com/crazy-max) in docker/setup-buildx-action#300
- build(deps): bump `@babel/traverse` from 7.17.3 to 7.23.2 in docker/setup-buildx-action#282
- build(deps): bump `@docker/actions-toolkit` from 0.12.0 to 0.17.0 in docker/setup-buildx-action#281, #284, #299
- build(deps): bump uuid from 9.0.0 to 9.0.1 in docker/setup-buildx-action#271
- build(deps): bump undici from 5.26.3 to 5.28.3 in docker/setup-buildx-action#297

Full Changelog: https://github.com/docker/setup-buildx-action/compare/v3.0.0...v3.1.0

Commits:
- `0d103c3` Merge pull request #300 from crazy-max/cache-binary
- `f19477a` chore: update generated content
- `a4180f8` cache-binary input to enable/disable caching binary to GHA cache backend
- `5243153` Merge pull request #299 from docker/dependabot/npm_and_yarn/docker/actions-to...
- `3679a54` chore: update generated content
- `37a22a2` build(deps): bump `@docker/actions-toolkit` from 0.14.0 to 0.17.0
- `65afe61` Merge pull request #297 from docker/dependabot/npm_and_yarn/undici-5.28.3
- `fcb8f72` chore: update generated content
- `f62b9a1` Merge pull request #298 from crazy-max/bump-gha
- `74c5b71` bump codecov/codecov-action from 3 to 4
- Additional commits viewable in the compare view (f95db51fdd...0d103c3126)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

---

Dependabot commands and options. You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will remove the specified ignore condition of that dependency

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
This commit is contained in:
parent 766b5451b7
commit a91989b7ba
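For context, Dependabot opens a single grouped PR like this one only when the repository's `.github/dependabot.yml` defines a group for the `github-actions` ecosystem. Below is a minimal sketch of such a configuration; the group name, schedule, and catch-all pattern are assumptions, and mitmproxy's actual file may differ.

```yaml
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "monthly"   # assumed cadence
    groups:
      github-actions:       # group name assumed from the PR title
        patterns:
          - "*"
```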
.github/workflows/autofix.yml (vendored, 2 changes)

@@ -15,7 +15,7 @@ jobs:
     steps:
       - uses: actions/checkout@v4

-      - uses: install-pinned/ruff@fe472defb50a6a2c00ea3a3982534e86e69991e8
+      - uses: install-pinned/ruff@38b373a3a8635c2be31d92314e816a491fda910a
       - run: ruff --fix-only .
       - run: ruff format .

.github/workflows/main.yml (vendored, 4 changes)

@@ -110,7 +110,7 @@ jobs:
       - if: startsWith(matrix.platform, 'macos') && github.repository == 'mitmproxy/mitmproxy'
           && (startsWith(github.ref, 'refs/heads/') || startsWith(github.ref, 'refs/tags/'))
         id: keychain
-        uses: apple-actions/import-codesign-certs@5565bb656f60c98c8fc515f3444dd8db73545dc2
+        uses: apple-actions/import-codesign-certs@493007ed063995cf2d4fbca064704150548f8bb5
         with:
           keychain: ${{ runner.temp }}/temp
           p12-file-base64: ${{ secrets.APPLE_CERTIFICATE }}
@@ -238,7 +238,7 @@ jobs:
           name: binaries.linux
           path: release/dist
       - uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3 # v3.0.0
-      - uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226 # v1.6.0
+      - uses: docker/setup-buildx-action@0d103c3126aa41d772a8362f6aa67afac040f80c # v1.6.0
       - run: python release/build-and-deploy-docker.py

   deploy:
The remaining files in this diff contain no functional changes: they are the code-base-wide reformat produced by the updated Ruff 0.3.0 (`ruff --fix-only` / `ruff format`, run by autofix-ci[bot] via the autofix workflow above). The per-file hunks reduce to a handful of recurring patterns:

- A blank line is inserted between a module docstring and the first import in dozens of modules: the example addons (`anatomy.py`, `options-simple.py`, `tcp-simple.py`, `custom_next_layer.py`, `jsondump.py`, the contentview, WebSocket, WSGI, DNS-over-HTTPS, sslstrip and HAR examples, among others), core modules (commands, events, flow filters, flow-file compatibility, encoding, server specs, proxy layers, server instances, mode specs, state-machine utilities, WebSocket types), console tooling, release scripts (the Docker build-and-deploy script, the Microsoft Store upload script, `selftest.py`) and the corresponding tests.
- The blank line between a class header and its docstring is removed, e.g. for `class BiDi:`, `class DisableH2C:`, `class LowDark(Palette):` and `class LowLight(Palette):`.
- Several module docstrings are re-indented in place without textual changes, e.g. the flow-filter operator reference (`~q`, `~s`, `~a`, `~h rex`, `~b rex`, `~t rex`, `~d rex`, `~m rex`, `~u rex`, `~c CODE`, and `rex` as a shortcut for `~u rex`), the user-agent header string collection, and "This module manages and invokes typed commands."
- Stub and `@overload` bodies are collapsed onto one line, e.g. `def _update(view: View) -> None: ...` instead of a trailing `...` on its own line; likewise `_signal_with_flow`, `_sig_view_remove`, `TFilter.__call__` (`... # pragma: no cover`), `ServerManager.register_connection`, `read_file`, the `decode`/`encode` and `always_bytes`/`always_str` overloads, and the console signal callbacks `_status_message`, `_status_prompt`, `_status_prompt_onekey`, `_status_prompt_command` and `_call_in`.
- Assignments whose value must wrap now parenthesize the right-hand side instead of a target, e.g. `self.flow.request.timestamp_start = self.flow.request.timestamp_end = (time.time())`, `self.headers["content-type"] = ct = (f"multipart/form-data; boundary={boundary}")`, `f.response.headers["Set-Cookie"] = ("duffer=; Expires=Thu, 01-Jan-1970 00:00:00 GMT")`, `child_layer.flow.metadata["quic_is_unidirectional"] = (stream_is_unidirectional(self._client_stream_id))`, and the `self.authenticated`, `self.jar` and `self._store` mapping annotations in `ProxyAuth`, `StickyCookie` and `View`; the long `certificate_private_key` annotation in `QuicTlsSettings` is wrapped the same way.
- Multi-context `with` statements are rewritten with parenthesized context managers, e.g. `with (open(IB_LICENSE.with_suffix(".xml.enc"), "rb") as infile, open(IB_LICENSE, "wb") as outfile):` in the InstallBuilder release step and the `mock.patch("subprocess.Popen")` / `mock.patch("shutil.which")` / `mock.patch("subprocess.run")` combinations in the browser addon tests.
- Escape sequences in string and bytes literals are normalized: hex and unicode escapes are lowercased (`b"\xFF"` → `b"\xff"`, `"\u21E7"` → `"\u21e7"`, `"chun\u212Aed"` → `"chun\u212aed"`, `b"\x03\x01\x6A\x00"` → `b"\x03\x01\x6a\x00"`, `b"\xC0\x00"` → `b"\xc0\x00"`), and named escapes are uppercased (`"\N{grapes}"` → `"\N{GRAPES}"`), across the contentview, DNS, HTTP, HAR, proxy-layer and console tests and the `SYMBOL_UP`/`SYMBOL_DOWN` console symbols.
Some files were not shown because too many files have changed in this diff.