This PR does a few related things to `dump_devinfo`:
- Store the raw discovery result in the fixture.
- Consolidate the redaction logic so it's not duplicated in `dump_devinfo` (a rough sketch follows this list).
- Update existing fixtures to:
  - Store the raw discovery result under `result`
  - Use `SCRUBBED_CHILD_DEVICE_ID` everywhere
  - Have correct values as per the consolidated redactors.
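As a rough illustration of the kind of consolidated redactor table this refers to (the key names, scrub values, and helper below are invented for the sketch and are not the actual `dump_devinfo` code):

```python
# Hypothetical key-based redactor table; real keys and replacements differ.
from collections.abc import Callable

SCRUBBED_CHILD_DEVICE_ID = "SCRUBBED_CHILD_DEVICE_ID"

# Map sensitive keys to a callable producing the scrubbed replacement;
# None means the value is simply zeroed out.
REDACTORS: dict[str, Callable[[str], str] | None] = {
    "device_id": lambda _: SCRUBBED_CHILD_DEVICE_ID,
    "mac": lambda mac: mac[:8] + ":00:00:00",
    "latitude": None,
}


def redact(info: dict) -> dict:
    """Return a copy of info with sensitive values replaced."""
    out: dict = {}
    for key, value in info.items():
        if isinstance(value, dict):
            out[key] = redact(value)
        elif key in REDACTORS:
            redactor = REDACTORS[key]
            out[key] = redactor(value) if redactor else 0
        else:
            out[key] = value
    return out
```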
This PR implements a clear-text, token-based transport protocol seen on
the RV30 Plus (#937).
- The client sends `{"username": "email@example.com", "password": md5(password)}` and gets a token back in the response (see the sketch below).
- The rest of the communication is done via POST requests to `/app?token=<token>`.
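A rough client-side sketch of this exchange; the login path, the `token` response key, and the request body below are assumptions based on the description, not the library's actual transport implementation:

```python
# Illustrative client for the clear-text, token-based protocol described
# above; endpoint paths and the "token" response key are assumed.
import hashlib

import aiohttp


async def login_and_query(host: str, username: str, password: str) -> dict:
    payload = {
        "username": username,
        # The password is sent as an md5 hex digest rather than clear text.
        "password": hashlib.md5(password.encode()).hexdigest(),
    }
    async with aiohttp.ClientSession() as session:
        # Initial login; assumed to use the same /app endpoint.
        async with session.post(f"http://{host}/app", json=payload) as resp:
            token = (await resp.json())["token"]

        # All subsequent requests carry the token as a query parameter.
        url = f"http://{host}/app?token={token}"
        request = {"method": "get_device_info"}
        async with session.post(url, json=request) as resp:
            return await resp.json()
```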
---------
Co-authored-by: Steven B. <51370195+sdb9696@users.noreply.github.com>
Python 3.11 ships with the latest Debian (Bookworm).
Based on the statistics, pypy is not that widely used with this library. It could be added back once pypy supports Python 3.11.
Mashumaro is faster and doesn't come with all the versioning problems that
pydantic does.
A basic perf test deserializing all of our discovery result fixtures shows
mashumaro being about 6 times faster than pydantic at deserializing dicts.
It's much faster at parsing from a JSON string, but that's likely because
it uses orjson under the hood, although that's not really our use case at
the moment.
```
PYDANTIC - ms
=================
json dict
-----------------
4.7665 1.3268
3.1548 1.5922
3.1130 1.8039
4.2834 2.7606
2.0669 1.3757
2.0163 1.6377
3.1667 1.3561
4.1296 2.7297
2.0132 1.3471
4.0648 1.4105
MASHUMARO - ms
=================
json dict
-----------------
0.5977 0.5543
0.5336 0.2983
0.3955 0.2549
0.6516 0.2742
0.5386 0.2706
0.6678 0.2580
0.4120 0.2511
0.3836 0.2472
0.4020 0.2465
0.4268 0.2487
```
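For reference, a comparison along these lines could be timed with something like the sketch below; the models and fixture data are placeholders, not the actual benchmark or fixtures used here:

```python
# Rough timing sketch comparing dict deserialization; models are stand-ins.
import timeit
from dataclasses import dataclass

from mashumaro import DataClassDictMixin
from pydantic import BaseModel


@dataclass
class MashuResult(DataClassDictMixin):
    device_id: str
    device_model: str
    ip: str


class PydanticResult(BaseModel):
    device_id: str
    device_model: str
    ip: str


fixtures = [
    {"device_id": "SCRUBBED", "device_model": "P110(EU)", "ip": "127.0.0.1"}
] * 1000

for name, fn in [
    ("mashumaro", lambda: [MashuResult.from_dict(d) for d in fixtures]),
    ("pydantic", lambda: [PydanticResult(**d) for d in fixtures]),
]:
    # Average per-pass time in milliseconds over 10 passes.
    ms = timeit.timeit(fn, number=10) / 10 * 1000
    print(f"{name:10s} dict: {ms:.4f} ms")
```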
I noticed after building a new Linux instance that running `git commit`
when the virtual environment is not active causes pre-commit to fail,
because the `generate_supported` hook is not explicitly configured to
run in the virtual env. This PR calls `generate_supported` via the
`run-in-env.sh` script.
- Fixes an issue running pyshark on a new thread on Windows
- Fixes a bug when the handshake is repeated during a capture
- Tries the default TP-Link hardcoded credentials, as per the library
PR with just the initial structural changes for the cli to become a package.
A subsequent PR will break out `main.py` into modules. Doing it in two
stages ensures that the commit history remains continuous from `cli.py` to
`cli/main.py`.
Ensures that all modules access their data in `_post_update_hook` in a safe manner and disable themselves if there's an error.
Also adds parameters to `get_preset_rules` and `get_on_off_gradually_info` to fix issues with recent firmware updates.
[#1033](https://github.com/python-kasa/python-kasa/issues/1033)
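A loose sketch of the defensive pattern described above; the class, method, and attribute names are illustrative and not the library's actual internals:

```python
# Illustrative "disable on error" pattern for a module's post-update hook.
import logging

_LOGGER = logging.getLogger(__name__)


class SafeModule:
    """Module that disables itself if its post-update processing fails."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.disabled = False
        self._data: dict = {}

    def _post_update_hook(self) -> None:
        """Process freshly fetched data; may raise on unexpected payloads."""
        _ = self._data["state"]  # stand-in for parsing the module's response

    def post_update(self, data: dict) -> None:
        """Store new data and disable the module if the hook fails."""
        self._data = data
        try:
            self._post_update_hook()
        except Exception as ex:  # any module-level error disables the module
            _LOGGER.warning("Disabling module %s: %s", self.name, ex)
            self.disabled = True
```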
For some time I've noticed that my IDE reports mypy errors that the
pre-commit hook does not pick up. This is because the [mypy
mirror](https://github.com/pre-commit/mirrors-mypy) runs in an isolated
pre-commit environment which does not have the project dependencies
installed, and it enables `--ignore-missing-imports` to avoid errors.
This is [advised against by
mypy](https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-library-stubs-or-py-typed-marker)
for obvious reasons:
> We recommend avoiding `--ignore-missing-imports` if possible: it’s
> equivalent to adding a `# type: ignore` to all unresolved imports in your
> codebase.
This PR configures the mypy pre-commit hook to run in the virtual
environment and addresses the additional errors identified as a result.
It also introduces a minimal mypy config into `pyproject.toml`.
[mypy errors identified without the fixes in this
PR](https://github.com/user-attachments/files/15896693/mypyerrors.txt)
Adds `username` and `password` arguments to discovery to remove the need to import `Credentials`.
Creates TypeAliases in `Device` for the connection configuration classes and `DeviceType`.
With these changes, using the API only requires importing either `Discover` or `Device`,
depending on whether `Discover.discover()` or `Device.connect()` is used to
initialize and interact with the API (a usage sketch follows).
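A hypothetical usage sketch based on this description; check the library docs for the exact signatures, but the intent is that only `Discover` or `Device` needs to be imported:

```python
# Hypothetical usage; parameter names follow the PR description above.
import asyncio

from kasa import Device, Discover


async def main() -> None:
    # Discover devices, passing credentials directly instead of building
    # a Credentials object first.
    devices = await Discover.discover(
        username="user@example.com", password="changeme"
    )
    for addr, dev in devices.items():
        await dev.update()
        print(addr, dev.alias)

    # Or connect to a known host directly (here, one that needs no credentials).
    dev = await Device.connect(host="192.0.2.10")
    await dev.update()
    print(dev.alias)


asyncio.run(main())
```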