Merge branch 'master' into asgi-refactor-attempt

This commit is contained in:
Adam Hopkins 2019-05-06 12:59:56 +03:00 committed by GitHub
commit c68523150f
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
85 changed files with 2466 additions and 808 deletions

View File

@ -2,11 +2,6 @@ version: "{branch}.{build}"
environment:
matrix:
- TOXENV: py35-no-ext
PYTHON: "C:\\Python35-x64"
PYTHON_VERSION: "3.5.x"
PYTHON_ARCH: "64"
- TOXENV: py36-no-ext
PYTHON: "C:\\Python36-x64"
PYTHON_VERSION: "3.6.x"

3
.gitignore vendored
View File

@ -15,4 +15,5 @@ docs/_build/
docs/_api/
build/*
.DS_Store
dist/*
dist/*
pip-wheel-metadata/

View File

@ -5,10 +5,6 @@ cache:
- $HOME/.cache/pip
matrix:
include:
- env: TOX_ENV=py35
python: 3.5
- env: TOX_ENV=py35-no-ext
python: 3.5
- env: TOX_ENV=py36
python: 3.6
- env: TOX_ENV=py36-no-ext
@ -33,9 +29,9 @@ after_success:
- codecov
deploy:
provider: pypi
user: channelcat
user: brewmaster
password:
secure: h7oNDjA/ObDBGK7xt55SV0INHOclMJW/byxMrNxvCZ0JxiRk7WBNtWYt0WJjyf5lO/L0/sfgiAk0GIdFon57S24njSLPAq/a4ptkWZ68s2A+TaF6ezJSZvE9V8khivjoeub90TzfX6P5aukRja1CSxXKJm+v0V8hGE4CZGyCgEDvK3JqIakpXllSDl19DhVftCS/lQZD7AXrZlg1kZnPCMtB5IbCVR4L2bfrSJVNptBi2CqqxacY2MOLu+jv5FzJ2BGVIJ2zoIJS2T+JmGJzpiamF6y8Amv0667i9lg2DXWCtI3PsQzCmwa3F/ZsI+ohUAvJC5yvzP7SyTJyXifRBdJ9O137QkNAHFoJOOY3B4GSnTo8/boajKXEqGiV4h2EgwNjBaR0WJl0pB7HHUCBMkNRWqo6ACB8eCr04tXWXPvkGIc+wPjq960hsUZea1O31MuktYc9Ot6eiFqm7OKoItdi7LxCen1eTj93ePgkiEnVZ+p/04Hh1U7CX31UJMNu5kCvZPIANnAuDsS2SK7Qkr88OAuWL0wmrBcXKOcnVkJtZ5mzx8T54bI1RrSYtFDBLFfOPb0GucSziMBtQpE76qPEauVwIXBk3RnR8N57xBR/lvTaIk758tf+haO0llEO5rVls1zLNZ+VlTzXy7hX0OZbdopIAcCFBFWqWMAdXQc=
secure: "GoawLwmbtJOgKB6AJ0ZSYUUnNwIoonseHBxaAUH3zu79TS/Afrq+yB3lsVaMSG0CbyDgN4FrfD1phT1NzbvZ1VcLIOTDtCrmpQ1kLDw+zwgF40ab8sp8fPkKVHHHfCCs1mjltHIpxQa5lZTJcAs6Bpi/lbUWWwYxFzSV8pHw4W4hY09EHUd2o+evLTSVxaploetSt725DJUYKICUr2eAtCC11IDnIW4CzBJEx6krVV3uhzfTJW0Ls17x0c6sdZ9icMnV/G9xO/eQH6RIHe4xcrWJ6cmLDNKoGAkJp+BKr1CeVVg7Jw/MzPjvZKL2/ki6Beue1y6GUIy7lOS7jPVaOEhJ23b0zQwFcLMZw+Tt+E3v6QfHk+B/WBBBnM3zUZed9UI+QyW8+lqLLt39sQX0FO0P3eaDh8qTXtUuon2jTyFMMAMTFRTNpJmpAzuBH9yeMmDeALPTh0HphI+BkoUl5q1QbWFYjjnZMH2CatApxpLybt9A7rwm//PbOG0TSI93GEKNQ4w5DYryKTfwHzRBptNSephJSuxZYEfJsmUtas5es1D7Fe0PkyjxNNSU+eO+8wsTlitLUsJO4k0jAgy+cEKdU7YJ3J0GZVXocSkrNnUfd2hQPcJ3UtEJx3hLqqr8EM7EZBAasc1yGHh36NFetclzFY24YPih0G1+XurhTys="
on:
tags: true
distributions: "sdist bdist_wheel"

View File

@ -1,3 +1,129 @@
Version 19.6
------------
19.6.0
- Changes:
  - Remove `aiohttp` dependency and create a new `SanicTestClient` based upon
[`requests-async`](https://github.com/encode/requests-async).
- Deprecation:
- Support for Python 3.5
Note: Sanic will not support Python 3.5 from version 19.6 onward. However,
version 18.12LTS will have its support period extended through December 2020,
thereby outlasting Python 3.5's official support, which is set to expire
in September 2020.
Version 19.3
-------------
19.3.1
- Changes:
- [#1497](https://github.com/huge-success/sanic/pull/1497)
Add support for zero-length and RFC 5987 encoded filename for
multipart/form-data requests.
- [#1484](https://github.com/huge-success/sanic/pull/1484)
The type of `expires` attribute of `sanic.cookies.Cookie` is now
enforced to be of type `datetime`.
- [#1482](https://github.com/huge-success/sanic/pull/1482)
Add support for the `stream` parameter of `sanic.Sanic.add_route()`
available to `sanic.Blueprint.add_route()`.
- [#1481](https://github.com/huge-success/sanic/pull/1481)
Accept negative values for route parameters with type `int` or `number`.
- [#1476](https://github.com/huge-success/sanic/pull/1476)
Deprecated the use of `sanic.request.Request.raw_args` - it has a
    fundamental flaw in that it drops repeated query string parameters.
Added `sanic.request.Request.query_args` as a replacement for the
original use-case.
- [#1472](https://github.com/huge-success/sanic/pull/1472)
Remove an unwanted `None` check in Request class `repr` implementation.
This changes the default `repr` of a Request from `<Request>` to
`<Request: None />`
- [#1470](https://github.com/huge-success/sanic/pull/1470)
Added 2 new parameters to `sanic.app.Sanic.create_server`:
- `return_asyncio_server` - whether to return an asyncio.Server.
- `asyncio_server_kwargs` - kwargs to pass to `loop.create_server` for
the event loop that sanic is using.
This is a breaking change.
- [#1499](https://github.com/huge-success/sanic/pull/1499)
Added a set of test cases that test and benchmark route resolution.
- [#1457](https://github.com/huge-success/sanic/pull/1457)
The type of the `"max-age"` value in a `sanic.cookies.Cookie` is now
enforced to be an integer. Non-integer values are replaced with `0`.
- [#1445](https://github.com/huge-success/sanic/pull/1445)
Added the `endpoint` attribute to an incoming `request`, containing the
name of the handler function.
- [#1423](https://github.com/huge-success/sanic/pull/1423)
Improved request streaming. `request.stream` is now a bounded-size buffer
instead of an unbounded queue. Callers must now call
`await request.stream.read()` instead of `await request.stream.get()`
to read each portion of the body.
This is a breaking change.
- Fixes:
- [#1502](https://github.com/huge-success/sanic/pull/1502)
Sanic was prefetching `time.time()` and updating it once per second to
avoid excessive `time.time()` calls. The implementation was observed to
cause memory leaks in some cases. The benefit of the prefetch appeared
    to be negligible, so this has been removed. Fixes
[#1500](https://github.com/huge-success/sanic/pull/1500)
- [#1501](https://github.com/huge-success/sanic/pull/1501)
Fix a bug in the auto-reloader when the process was launched as a module
i.e. `python -m init0.mod1` where the sanic server is started
in `init0/mod1.py` with `debug` enabled and imports another module in
`init0`.
- [#1376](https://github.com/huge-success/sanic/pull/1376)
Allow sanic test client to bind to a random port by specifying
`port=None` when constructing a `SanicTestClient`
- [#1399](https://github.com/huge-success/sanic/pull/1399)
Added the ability to specify middleware on a blueprint group, so that all
routes produced from the blueprints in the group have the middleware
applied.
- [#1442](https://github.com/huge-success/sanic/pull/1442)
    Allow the use of the `SANIC_ACCESS_LOG` environment variable to
enable/disable the access log when not explicitly passed to `app.run()`.
This allows the access log to be disabled for example when running via
gunicorn.
- Developer infrastructure:
- [#1529](https://github.com/huge-success/sanic/pull/1529) Update project PyPI credentials
- [#1515](https://github.com/huge-success/sanic/pull/1515) fix linter issue causing travis build failures (fix #1514)
- [#1490](https://github.com/huge-success/sanic/pull/1490) Fix python version in doc build
- [#1478](https://github.com/huge-success/sanic/pull/1478) Upgrade setuptools version and use native docutils in doc build
- [#1464](https://github.com/huge-success/sanic/pull/1464) Upgrade pytest, and fix caplog unit tests
- Typos and Documentation:
- [#1516](https://github.com/huge-success/sanic/pull/1516) Fix typo at the exception documentation
- [#1510](https://github.com/huge-success/sanic/pull/1510) fix typo in Asyncio example
- [#1486](https://github.com/huge-success/sanic/pull/1486) Documentation typo
- [#1477](https://github.com/huge-success/sanic/pull/1477) Fix grammar in README.md
- [#1489](https://github.com/huge-success/sanic/pull/1489) Added "databases" to the extensions list
- [#1483](https://github.com/huge-success/sanic/pull/1483) Add sanic-zipkin to extensions list
- [#1487](https://github.com/huge-success/sanic/pull/1487) Removed link to deleted repo, Sanic-OAuth, from the extensions list
- [#1460](https://github.com/huge-success/sanic/pull/1460) 18.12 changelog
- [#1449](https://github.com/huge-success/sanic/pull/1449) Add example of amending request object
- [#1446](https://github.com/huge-success/sanic/pull/1446) Update README
- [#1444](https://github.com/huge-success/sanic/pull/1444) Update README
- [#1443](https://github.com/huge-success/sanic/pull/1443) Update README, including new logo
  - [#1440](https://github.com/huge-success/sanic/pull/1440) fix minor typo and pip install instruction mismatch
- [#1424](https://github.com/huge-success/sanic/pull/1424) Documentation Enhancements
Note: 19.3.0 was skipped for packaging purposes and not released on PyPI
Version 18.12
-------------
18.12.0

View File

@ -18,7 +18,7 @@ So assume you have already cloned the repo and are in the working directory with
a virtual environment already set up, then run:
```bash
pip3 install -e . "[.dev]"
pip3 install -e . ".[dev]"
```
# Dependency Changes
@ -30,9 +30,9 @@ the document that explains the way `sanic` manages dependencies inside the `setu
| Dependency Type | Usage | Installation |
| ------------------------------------------| -------------------------------------------------------------------------- | --------------------------- |
| requirements | Bare minimum dependencies required for sanic to function | pip3 install -e . |
| tests_require / extras_require['test'] | Dependencies required to run the Unit Tests for `sanic` | pip3 install -e '[.test]' |
| extras_require['dev'] | Additional Development requirements to add contributing | pip3 install -e '[.dev]' |
| extras_require['docs'] | Dependencies required to enable building and enhancing sanic documentation | pip3 install -e '[.docs]' |
| tests_require / extras_require['test'] | Dependencies required to run the Unit Tests for `sanic` | pip3 install -e '.[test]' |
| extras_require['dev'] | Additional Development requirements to add contributing | pip3 install -e '.[dev]' |
| extras_require['docs'] | Dependencies required to enable building and enhancing sanic documentation | pip3 install -e '.[docs]' |
## Running tests

View File

@ -1,6 +1,6 @@
MIT License
Copyright (c) 2016-present Channel Cat
Copyright (c) 2016-present Sanic Community
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

View File

@ -47,12 +47,12 @@ ifdef include_tests
isort -rc sanic tests
else
$(info Sorting Imports)
isort -rc sanic
isort -rc sanic tests
endif
endif
black:
black --config ./pyproject.toml sanic tests
black --config ./.black.toml sanic tests
fix-import: black
isort -rc sanic
isort -rc sanic tests

View File

@ -17,6 +17,8 @@ Sanic | Build fast. Run fast.
- | |PyPI| |PyPI version| |Wheel| |Supported implementations| |Code style black|
* - Support
- | |Forums| |Join the chat at https://gitter.im/sanic-python/Lobby|
* - Stats
- | |Downloads|
.. |Forums| image:: https://img.shields.io/badge/forums-community-ff0068.svg
:target: https://community.sanicframework.org/
@ -42,10 +44,13 @@ Sanic | Build fast. Run fast.
.. |Supported implementations| image:: https://img.shields.io/pypi/implementation/sanic.svg
:alt: Supported implementations
:target: https://pypi.python.org/pypi/sanic
.. |Downloads| image:: https://pepy.tech/badge/sanic/month
:alt: Downloads
:target: https://pepy.tech/project/sanic
.. end-badges
Sanic is a Python web server and web framework that's written to go fast. It allows usage the `async` and `await` syntax added in Python 3.5, which makes your code non-blocking and speedy.
Sanic is a **Python 3.6+** web server and web framework that's written to go fast. It allows the usage of the ``async/await`` syntax added in Python 3.5, which makes your code non-blocking and speedy.
`Source code on GitHub <https://github.com/huge-success/sanic/>`_ | `Help and discussion board <https://community.sanicframework.org/>`_.
@ -64,7 +69,7 @@ Installation
$ export SANIC_NO_UVLOOP=true
$ export SANIC_NO_UJSON=true
$ pip3 install sanic
$ pip3 install --no-binary :all: sanic
Hello World Example
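
A minimal sketch of the kind of app this section introduces (handler and route names are illustrative, not taken from the repository's example):

.. code:: python

    from sanic import Sanic
    from sanic.response import json

    app = Sanic()

    @app.route("/")
    async def test(request):
        return json({"hello": "world"})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8000)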

View File

@ -33,6 +33,7 @@ Guides
sanic/changelog
sanic/contributing
sanic/api_reference
sanic/asyncio_python37
Module Documentation

View File

@ -20,6 +20,15 @@ sanic.blueprints module
:undoc-members:
:show-inheritance:
sanic.blueprint_group module
----------------------------
.. automodule:: sanic.blueprint_group
:members:
:undoc-members:
:show-inheritance:
sanic.config module
-------------------

View File

@ -0,0 +1,58 @@
Python 3.7 AsyncIO examples
###########################
With Python 3.7, AsyncIO received a major update for the following types:
- asyncio.AbstractEventLoop
- asyncio.AbstractServer
This example shows how to use Sanic with Python 3.7; to be precise, how to retrieve an asyncio server instance:
.. code:: python

    import asyncio
    import os
    import socket

    from sanic import Sanic
    from sanic.response import json

    app = Sanic(__name__)


    @app.route("/")
    async def test(request):
        return json({"hello": "world"})


    server_socket = '/tmp/sanic.sock'
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)

    # Remove a stale socket file (if any) before binding.
    try:
        os.remove(server_socket)
    except FileNotFoundError:
        pass
    sock.bind(server_socket)

    if __name__ == "__main__":
        loop = asyncio.get_event_loop()
        srv_coro = app.create_server(
            sock=sock,
            return_asyncio_server=True,
            asyncio_server_kwargs=dict(start_serving=False),
        )
        srv = loop.run_until_complete(srv_coro)
        try:
            assert srv.is_serving() is False
            loop.run_until_complete(srv.start_serving())
            assert srv.is_serving() is True
            loop.run_until_complete(srv.serve_forever())
        except KeyboardInterrupt:
            srv.close()
            loop.close()
Please note that uvloop does not support these features yet.

View File

@ -127,7 +127,7 @@ Blueprints have almost the same functionality as an application instance.
WebSocket handlers can be registered on a blueprint using the `@bp.websocket`
decorator or `bp.add_websocket_route` method.
### Middleware
### Blueprint Middleware
Using blueprints allows you to also register middleware globally.
@ -145,6 +145,36 @@ async def halt_response(request, response):
return text('I halted the response')
```
### Blueprint Group Middleware
Registering middleware on a blueprint group ensures that a common middleware is applied to all the blueprints
that form the group.
```python
bp1 = Blueprint('bp1', url_prefix='/bp1')
bp2 = Blueprint('bp2', url_prefix='/bp2')
@bp1.middleware('request')
async def bp1_only_middleware(request):
print('applied on Blueprint : bp1 Only')
@bp1.route('/')
async def bp1_route(request):
return text('bp1')
@bp2.route('/<param>')
async def bp2_route(request, param):
return text(param)
group = Blueprint.group(bp1, bp2)
@group.middleware('request')
async def group_middleware(request):
print('common middleware applied for both bp1 and bp2')
# Register Blueprint group under the app
app.blueprint(group)
```
### Exceptions
Exceptions can be applied exclusively to blueprints globally.

View File

@ -85,16 +85,19 @@ DB_USER = 'appuser'
Out of the box there are just a few predefined values which can be overwritten when creating the application.
| Variable | Default | Description |
| ------------------------- | --------- | --------------------------------------------------------- |
| REQUEST_MAX_SIZE | 100000000 | How big a request may be (bytes) |
| REQUEST_BUFFER_QUEUE_SIZE | 100 | Request streaming buffer queue size |
| REQUEST_TIMEOUT | 60 | How long a request can take to arrive (sec) |
| RESPONSE_TIMEOUT | 60 | How long a response can take to process (sec) |
| KEEP_ALIVE | True | Disables keep-alive when False |
| KEEP_ALIVE_TIMEOUT | 5 | How long to hold a TCP connection open (sec) |
| GRACEFUL_SHUTDOWN_TIMEOUT | 15.0 | How long to wait to force close non-idle connection (sec) |
| ACCESS_LOG | True | Disable or enable access log |
| Variable | Default | Description |
| ------------------------- | ----------------- | --------------------------------------------------------------------------- |
| REQUEST_MAX_SIZE | 100000000 | How big a request may be (bytes) |
| REQUEST_BUFFER_QUEUE_SIZE | 100 | Request streaming buffer queue size |
| REQUEST_TIMEOUT | 60 | How long a request can take to arrive (sec) |
| RESPONSE_TIMEOUT | 60 | How long a response can take to process (sec) |
| KEEP_ALIVE | True | Disables keep-alive when False |
| KEEP_ALIVE_TIMEOUT | 5 | How long to hold a TCP connection open (sec) |
| GRACEFUL_SHUTDOWN_TIMEOUT | 15.0 | How long to wait to force close non-idle connection (sec) |
| ACCESS_LOG | True | Disable or enable access log |
| PROXIES_COUNT | -1 | The number of proxy servers in front of the app (e.g. nginx; see below) |
| FORWARDED_FOR_HEADER | "X-Forwarded-For" | The name of "X-Forwarded-For" HTTP header that contains client and proxy ip |
| REAL_IP_HEADER            | "X-Real-IP"       | The name of the "X-Real-IP" HTTP header that contains the real client IP    |
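As a rough sketch (the values chosen here are illustrative), these settings can be overridden directly on `app.config` before the server starts:
```python
from sanic import Sanic

app = Sanic(__name__)

# Override a few of the predefined values from the table above.
app.config.REQUEST_MAX_SIZE = 50_000_000  # bytes
app.config.REQUEST_TIMEOUT = 30           # seconds
app.config.KEEP_ALIVE_TIMEOUT = 10        # seconds
```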
### The different Timeout variables:
@ -143,3 +146,17 @@ Firefox client hard keepalive limit = 115 seconds
Opera 11 client hard keepalive limit = 120 seconds
Chrome 13+ client keepalive limit > 300+ seconds
```
### About proxy servers and client ip
When you use a reverse proxy server (e.g. nginx), the value of `request.ip` will contain the IP of the proxy, typically `127.0.0.1`. To determine the real client IP, the `X-Forwarded-For` and `X-Real-IP` HTTP headers are used. But a client can fake these headers if they have not been overridden by a proxy, so Sanic has a set of options to determine the level of confidence in these headers.
* If you have a single proxy, set `PROXIES_COUNT` to `1`. Then Sanic will use `X-Real-IP` if available or the last ip from `X-Forwarded-For`.
* If you have multiple proxies, set `PROXIES_COUNT` equal to their number to allow Sanic to select the correct ip from `X-Forwarded-For`.
* If you don't use a proxy, set `PROXIES_COUNT` to `0` to ignore these headers and prevent ip falsification.
* If you don't use `X-Real-IP` (e.g. your proxy sends only `X-Forwarded-For`), set `REAL_IP_HEADER` to an empty string.
The real ip will be available in `request.remote_addr`. If HTTP headers are unavailable or untrusted, `request.remote_addr` will be an empty string; in this case use `request.ip` instead.
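A minimal sketch of the single-proxy case described above (route and handler names are illustrative):
```python
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

# Trust exactly one reverse proxy in front of the app.
app.config.PROXIES_COUNT = 1

@app.route("/ip")
async def ip_handler(request):
    # request.ip is the proxy's address; request.remote_addr is the real client IP.
    return json({"proxy_ip": request.ip, "client_ip": request.remote_addr})
```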

View File

@ -64,6 +64,26 @@ of the memory leak.
See the [Gunicorn Docs](http://docs.gunicorn.org/en/latest/settings.html#max-requests) for more information.
## Running behind a reverse proxy
Sanic can be used with a reverse proxy (e.g. nginx). Here is a simple example of an nginx configuration:
```
server {
listen 80;
server_name example.org;
location / {
proxy_pass http://127.0.0.1:8000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
}
```
If you want to get the real client IP, you should configure the `X-Real-IP` and `X-Forwarded-For` HTTP headers and set `app.config.PROXIES_COUNT` to `1`; see the configuration page for more information.
## Disable debug logging
To improve performance, add `debug=False` and `access_log=False` to the `run` arguments.
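For example, a production-style invocation might look like this (host and port are illustrative):
```python
app.run(host="0.0.0.0", port=8000, debug=False, access_log=False)
```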
@ -92,7 +112,7 @@ to run the app in general.
Here is an incomplete example (please see `run_async.py` in examples for something more practical):
```python
server = app.create_server(host="0.0.0.0", port=8000)
server = app.create_server(host="0.0.0.0", port=8000, return_asyncio_server=True)
loop = asyncio.get_event_loop()
task = asyncio.ensure_future(server)
loop.run_forever()

View File

@ -160,7 +160,7 @@ execution support provided by the ``pytest-xdist`` plugin.
Amending Request Object
~~~~~~~~~~~~~~~~~~~~~~~
The ``request`` object in ``Sanic`` is a kind of ``dict`` object, this means that ``reqeust`` object can be manipulated as a regular ``dict`` object.
The ``request`` object in ``Sanic`` is a kind of ``dict`` object, this means that ``request`` object can be manipulated as a regular ``dict`` object.
.. literalinclude:: ../../examples/amending_request_object.py
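
As a rough sketch of the idea (the attached key and handler names are illustrative; the bundled example file may differ):

.. code:: python

    from sanic import Sanic
    from sanic.response import json

    app = Sanic(__name__)

    @app.middleware("request")
    async def attach_user(request):
        # The request behaves like a dict, so arbitrary keys can be attached to it.
        request["user"] = "sanic_user"

    @app.route("/")
    async def handler(request):
        return json({"user": request["user"]})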

View File

@ -59,7 +59,7 @@ app = Sanic()
app.error_handler.add(Exception, server_error_handler)
```
In some cases, you might want want to add some more error handling
In some cases, you might want to add some more error handling
functionality to what is provided by default. In that case, you
can subclass Sanic's default error handler as such:
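A sketch of what such a subclass might look like (assuming the `ErrorHandler` base class exported from `sanic.handlers`):
```python
from sanic import Sanic
from sanic.handlers import ErrorHandler

class CustomErrorHandler(ErrorHandler):
    def default(self, request, exception):
        # Handle exceptions that have no dedicated handler assigned,
        # then fall back to Sanic's default behaviour.
        return super().default(request, exception)

app = Sanic()
app.error_handler = CustomErrorHandler()
```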

View File

@ -17,7 +17,6 @@ A list of Sanic extensions created by the community.
- [Sanic-JWT-Extended](https://github.com/devArtoria/Sanic-JWT-Extended): Provides extended JWT support for
- [UserAgent](https://github.com/lixxu/sanic-useragent): Add `user_agent` to request
- [Limiter](https://github.com/bohea/sanic-limiter): Rate limiting for sanic.
- [Sanic-OAuth](https://github.com/Sniedes722/Sanic-OAuth): OAuth Library for connecting to & creating your own token providers.
- [sanic-oauth](https://gitlab.com/SirEdvin/sanic-oauth): OAuth Library with many provider and OAuth1/OAuth2 support.
- [Sanic-Auth](https://github.com/pyx/sanic-auth): A minimal backend agnostic session-based user authentication mechanism for Sanic.
- [Sanic-CookieSession](https://github.com/pyx/sanic-cookiesession): A client-side only, cookie-based session, similar to the built-in session in Flask.
@ -34,6 +33,7 @@ A list of Sanic extensions created by the community.
- [Sanic CRUD](https://github.com/Typhon66/sanic_crud): CRUD REST API generation with peewee models.
- [sanic-graphql](https://github.com/graphql-python/sanic-graphql): GraphQL integration with Sanic
- [GINO](https://github.com/fantix/gino): An asyncio ORM on top of SQLAlchemy core, delivered with a Sanic extension. ([Documentation](https://python-gino.readthedocs.io/))
- [Databases](https://github.com/encode/databases): Async database access for SQLAlchemy core, with support for PostgreSQL, MySQL, and SQLite.
## Unit Testing
@ -67,6 +67,7 @@ A list of Sanic extensions created by the community.
## Monitoring and Reporting
- [sanic-prometheus](https://github.com/dkruchinin/sanic-prometheus): Prometheus metrics for Sanic
- [sanic-zipkin](https://github.com/kevinqqnj/sanic-zipkin): Easily report request/function/RPC traces to zipkin/jaeger, through aiozipkin.
## Sample Applications

View File

@ -19,6 +19,8 @@ The following variables are accessible as properties on `Request` objects:
URL that resembles `?key1=value1&key2=value2`. If that URL were to be parsed,
the `args` dictionary would look like `{'key1': ['value1'], 'key2': ['value2']}`.
The request's `query_string` variable holds the unparsed string value.
  This property provides the default parsing strategy. If you would like to change it, see the section below
(`Changing the default parsing rules of the queryset`).
```python
from sanic.response import json
@ -28,9 +30,54 @@ The following variables are accessible as properties on `Request` objects:
return json({ "parsed": True, "args": request.args, "url": request.url, "query_string": request.query_string })
```
- `raw_args` (dict) - On many cases you would need to access the url arguments in
a less packed dictionary. For same previous URL `?key1=value1&key2=value2`, the
`raw_args` dictionary would look like `{'key1': 'value1', 'key2': 'value2'}`.
- `query_args` (list) - In many cases you will need to access the URL arguments in
a less packed form. `query_args` is the list of `(key, value)` tuples.
  This property provides the default parsing strategy. If you would like to change it, see the section below
(`Changing the default parsing rules of the queryset`).
For the same previous URL queryset `?key1=value1&key2=value2`, the
`query_args` list would look like `[('key1', 'value1'), ('key2', 'value2')]`.
  In the case of multiple params with the same key, like `?key1=value1&key2=value2&key1=value3`,
the `query_args` list would look like `[('key1', 'value1'), ('key2', 'value2'), ('key1', 'value3')]`.
  The difference between `Request.args` and `Request.query_args` for the queryset `?key1=value1&key2=value2&key1=value3`:
```python
from sanic import Sanic
from sanic.response import json
app = Sanic(__name__)
@app.route("/test_request_args")
async def test_request_args(request):
return json({
"parsed": True,
"url": request.url,
"query_string": request.query_string,
"args": request.args,
"raw_args": request.raw_args,
"query_args": request.query_args,
})
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)
```
Output
```
{
"parsed":true,
"url":"http:\/\/0.0.0.0:8000\/test_request_args?key1=value1&key2=value2&key1=value3",
"query_string":"key1=value1&key2=value2&key1=value3",
"args":{"key1":["value1","value3"],"key2":["value2"]},
"raw_args":{"key1":"value1","key2":"value2"},
"query_args":[["key1","value1"],["key2","value2"],["key1","value3"]]
}
```
  `raw_args` contains only the first entry of `key1`. It will be deprecated in future versions.
- `files` (dictionary of `File` objects) - List of files that have a name, body, and type
@ -106,6 +153,51 @@ The following variables are accessible as properties on `Request` objects:
- `token`: The value of Authorization header: `Basic YWRtaW46YWRtaW4=`
## Changing the default parsing rules of the queryset
The default parameters used internally by the `args` and `query_args` properties to parse the queryset are:
- `keep_blank_values` (bool): `False` - flag indicating whether blank values in
percent-encoded queries should be treated as blank strings.
A true value indicates that blanks should be retained as blank
strings. The default false value indicates that blank values
are to be ignored and treated as if they were not included.
- `strict_parsing` (bool): `False` - flag indicating what to do with parsing errors. If
false (the default), errors are silently ignored. If true,
errors raise a ValueError exception.
- `encoding` and `errors` (str): 'utf-8' and 'replace' - specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
If you would like to change those default parameters, you can call the `get_args` and `get_query_args` methods
with new values.
For the queryset `/?test1=value1&test2=&test3=value3`:
```python
from sanic.response import json
@app.route("/query_string")
def query_string(request):
args_with_blank_values = request.get_args(keep_blank_values=True)
return json({
"parsed": True,
"url": request.url,
"args_with_blank_values": args_with_blank_values,
"query_string": request.query_string
})
```
The output will be:
```
{
"parsed": true,
"url": "http:\/\/0.0.0.0:8000\/query_string?test1=value1&test2=&test3=value3",
"args_with_blank_values": {"test1": ["value1""], "test2": "", "test3": ["value3"]},
"query_string": "test1=value1&test2=&test3=value3"
}
```
## Accessing values using `get` and `getlist`
The request properties which return a dictionary actually return a subclass of `dict` called `RequestParameters`.
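A short sketch of the difference between the two accessors (constructing a `RequestParameters` directly for illustration):
```python
from sanic.request import RequestParameters

args = RequestParameters()
args["titles"] = ["Post 1", "Post 2"]

args.get("titles")      # => 'Post 1'   (first value only)
args.getlist("titles")  # => ['Post 1', 'Post 2']
```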

View File

@ -55,11 +55,13 @@ from sanic import response
@app.route("/streaming")
async def index(request):
async def streaming_fn(response):
response.write('foo')
response.write('bar')
await response.write('foo')
await response.write('bar')
return response.stream(streaming_fn, content_type='text/plain')
```
See [Streaming](streaming.md) for more information.
## File Streaming
For large files, a combination of File and Streaming above
```python

View File

@ -48,7 +48,7 @@ app.run(host="0.0.0.0", port=8000)
## Virtual Host
The `app.static()` method also support **virtual host**. You can serve your static files with spefic **virtual host** with `host` argument. For example:
The `app.static()` method also support **virtual host**. You can serve your static files with specific **virtual host** with `host` argument. For example:
```python
from sanic import Sanic

View File

@ -42,7 +42,7 @@ async def handler(request):
@bp.put('/bp_stream', stream=True)
async def bp_handler(request):
async def bp_put_handler(request):
result = ''
while True:
body = await request.stream.read()
@ -52,6 +52,19 @@ async def bp_handler(request):
return text(result)
# You can also use `bp.add_route()` with stream argument
async def bp_post_handler(request):
result = ''
while True:
body = await request.stream.read()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
bp.add_route(bp_post_handler, '/bp_stream', methods=['POST'], stream=True)
async def post_handler(request):
result = ''
while True:
@ -104,3 +117,27 @@ async def index(request):
return stream(stream_from_db)
```
If a client supports HTTP/1.1, Sanic will use [chunked transfer encoding](https://en.wikipedia.org/wiki/Chunked_transfer_encoding); you can explicitly enable or disable it using `chunked` option of the `stream` function.
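For example, a sketch of explicitly disabling chunked encoding for a streaming response (route and handler names are illustrative):
```python
from sanic import response

@app.route("/no-chunked")
async def no_chunked(request):
    async def streaming_fn(res):
        await res.write("foo")
        await res.write("bar")

    # Force a plain (non-chunked) streamed body.
    return response.stream(streaming_fn, content_type="text/plain", chunked=False)
```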
## File Streaming
Sanic provides the `sanic.response.file_stream` function, which is useful when you want to send a large file. It returns a `StreamingHTTPResponse` object and will use chunked transfer encoding by default; for this reason, Sanic doesn't add a `Content-Length` HTTP header to the response. If you want to use this header, you can disable chunked transfer encoding and add it manually:
```python
from aiofiles import os as async_os
from sanic.response import file_stream
@app.route("/")
async def index(request):
file_path = "/srv/www/whatever.png"
file_stat = await async_os.stat(file_path)
headers = {"Content-Length": str(file_stat.st_size)}
return await file_stream(
file_path,
headers=headers,
chunked=False,
)
```

View File

@ -1,8 +1,8 @@
# Testing
Sanic endpoints can be tested locally using the `test_client` object, which
depends on the additional [aiohttp](https://aiohttp.readthedocs.io/en/stable/)
library.
depends on the additional [`requests-async`](https://github.com/encode/requests-async)
library, which implements an API that mirrors the `requests` library.
The `test_client` exposes `get`, `post`, `put`, `delete`, `patch`, `head` and `options` methods
for you to run against your application. A simple example (using pytest) is like follows:
@ -21,7 +21,7 @@ def test_index_put_not_allowed():
```
Internally, each time you call one of the `test_client` methods, the Sanic app is run at `127.0.0.1:42101` and
your test request is executed against your application, using `aiohttp`.
your test request is executed against your application, using `requests-async`.
The `test_client` methods accept the following arguments and keyword arguments:
@ -33,7 +33,7 @@ The `test_client` methods accept the following arguments and keyword arguments:
- `server_kwargs` *(default `{}`)* a dict of additional arguments to pass into `app.run` before the test request is run.
- `debug` *(default `False`)* A boolean which determines whether to run the server in debug mode.
The function further takes the `*request_args` and `**request_kwargs`, which are passed directly to the aiohttp ClientSession request.
The function further takes the `*request_args` and `**request_kwargs`, which are passed directly to the request.
For example, to supply data to a GET request, you would do the following:
@ -55,8 +55,25 @@ def test_post_json_request_includes_data():
More information about
the available arguments to aiohttp can be found
[in the documentation for ClientSession](https://aiohttp.readthedocs.io/en/stable/client_reference.html#client-session).
the available arguments to `requests-async` can be found
[in the documentation for `requests`](https://2.python-requests.org/en/master/).
## Using a random port
If you need to test using a free unprivileged port chosen by the kernel
instead of the default with `SanicTestClient`, you can do so by specifying
`port=None`. On most systems the port will be in the range 1024 to 65535.
```python
# Import the Sanic app, usually created with Sanic(__name__)
from external_server import app
from sanic.testing import SanicTestClient
def test_index_returns_200():
request, response = SanicTestClient(app, port=None).get('/')
assert response.status == 200
```
## pytest-sanic

View File

@ -1,21 +1,18 @@
name: py35
name: py36
dependencies:
- openssl=1.0.2g=0
- pip=8.1.1=py35_0
- python=3.5.1=0
- readline=6.2=2
- setuptools=20.3=py35_0
- sqlite=3.9.2=0
- tk=8.5.18=0
- wheel=0.29.0=py35_0
- xz=5.0.5=1
- zlib=1.2.8=0
- pip=18.1=py36_0
- python=3.6=0
- setuptools=40.4.3=py36_0
- pip:
- uvloop>=0.5.3
- httptools>=0.0.10
- uvloop>=0.5.3
- ujson>=1.35
- aiofiles>=0.3.0
- websockets>=6.0
- sphinxcontrib-asyncio>=0.2.0
- websockets>=6.0,<7.0
- multidict>=4.0,<5.0
- https://github.com/channelcat/docutils-fork/zipball/master
- sphinx==1.8.3
- sphinx_rtd_theme==0.4.2
- recommonmark==0.5.0
- sphinxcontrib-asyncio>=0.2.0
- docutils==0.14
- pygments==2.3.1

View File

@ -76,7 +76,7 @@ async def test(request):
if __name__ == '__main__':
asyncio.set_event_loop(uvloop.new_event_loop())
server = app.create_server(host="0.0.0.0", port=8000)
server = app.create_server(host="0.0.0.0", port=8000, return_asyncio_server=True)
loop = asyncio.get_event_loop()
loop.set_task_factory(context.task_factory)
task = asyncio.ensure_future(server)

View File

@ -12,7 +12,7 @@ async def test(request):
return response.json({"answer": "42"})
asyncio.set_event_loop(uvloop.new_event_loop())
server = app.create_server(host="0.0.0.0", port=8000)
server = app.create_server(host="0.0.0.0", port=8000, return_asyncio_server=True)
loop = asyncio.get_event_loop()
task = asyncio.ensure_future(server)
signal(SIGINT, lambda s, f: loop.stop())

View File

@ -2,6 +2,6 @@ from sanic.app import Sanic
from sanic.blueprints import Blueprint
__version__ = "18.12.0"
__version__ = "19.03.1"
__all__ = ["Sanic", "Blueprint"]

View File

@ -16,6 +16,7 @@ from typing import Any, Optional, Type, Union
from urllib.parse import urlencode, urlunparse
from sanic import reloader_helpers
from sanic.blueprint_group import BlueprintGroup
from sanic.config import BASE_LOGO, Config
from sanic.constants import HTTP_METHODS
from sanic.exceptions import SanicException, ServerError, URLBuildError
@ -181,27 +182,28 @@ class Sanic:
strict_slashes = self.strict_slashes
def response(handler):
args = [key for key in signature(handler).parameters.keys()]
if args:
if stream:
handler.is_stream = stream
args = list(signature(handler).parameters.keys())
self.router.add(
uri=uri,
methods=methods,
handler=handler,
host=host,
strict_slashes=strict_slashes,
version=version,
name=name,
)
return handler
else:
if not args:
raise ValueError(
"Required parameter `request` missing "
"in the {0}() route?".format(handler.__name__)
)
if stream:
handler.is_stream = stream
self.router.add(
uri=uri,
methods=methods,
handler=handler,
host=host,
strict_slashes=strict_slashes,
version=version,
name=name,
)
return handler
return response
# Shorthand method decorators
@ -333,7 +335,7 @@ class Sanic:
name=None,
):
"""
Add an API URL under the **DELETE** *HTTP* method
Add an API URL under the **PATCH** *HTTP* method
:param uri: URL to be tagged to **PATCH** method of *HTTP*
:param host: Host IP or FQDN for the service to use
@ -599,9 +601,11 @@ class Sanic:
:return: decorated method
"""
if attach_to == "request":
self.request_middleware.append(middleware)
if middleware not in self.request_middleware:
self.request_middleware.append(middleware)
if attach_to == "response":
self.response_middleware.appendleft(middleware)
if middleware not in self.response_middleware:
self.response_middleware.appendleft(middleware)
return middleware
# Decorator
@ -683,7 +687,7 @@ class Sanic:
:param options: option dictionary with blueprint defaults
:return: Nothing
"""
if isinstance(blueprint, (list, tuple)):
if isinstance(blueprint, (list, tuple, BlueprintGroup)):
for item in blueprint:
self.blueprint(item, **options)
return
@ -879,8 +883,6 @@ class Sanic:
# -------------------------------------------- #
# Request Middleware
# -------------------------------------------- #
request.app = self
response = await self._run_request_middleware(request)
# No middleware results
if not response:
@ -1122,6 +1124,8 @@ class Sanic:
backlog: int = 100,
stop_event: Any = None,
access_log: Optional[bool] = None,
return_asyncio_server=False,
asyncio_server_kwargs=None,
) -> None:
"""
Asynchronous version of :func:`run`.
@ -1155,6 +1159,13 @@ class Sanic:
:type stop_event: None
:param access_log: Enables writing access logs (slows server)
:type access_log: bool
:param return_asyncio_server: flag that defines whether there's a need
to return asyncio.Server or
start it serving right away
:type return_asyncio_server: bool
:param asyncio_server_kwargs: key-value arguments for
asyncio/uvloop create_server method
:type asyncio_server_kwargs: dict
:return: Nothing
"""
@ -1185,7 +1196,7 @@ class Sanic:
loop=get_event_loop(),
protocol=protocol,
backlog=backlog,
run_async=True,
run_async=return_asyncio_server,
)
# Trigger before_start events
@ -1194,7 +1205,9 @@ class Sanic:
server_settings.get("loop"),
)
return await serve(**server_settings)
return await serve(
asyncio_server_kwargs=asyncio_server_kwargs, **server_settings
)
async def trigger_events(self, events, loop):
"""Trigger events (functions or async)
@ -1274,6 +1287,7 @@ class Sanic:
"port": port,
"sock": sock,
"ssl": ssl,
"app": self,
"signal": Signal(),
"debug": debug,
"request_handler": self.handle_request,

120
sanic/blueprint_group.py Normal file
View File

@ -0,0 +1,120 @@
from collections import MutableSequence
class BlueprintGroup(MutableSequence):
"""
This class provides a mechanism to implement a Blueprint Group
using the `Blueprint.group` method. To avoid having to re-write
some of the existing implementation, this class provides a custom
iterator implementation that will let you use the object of this
class as a list/tuple inside the existing implementation.
"""
__slots__ = ("_blueprints", "_url_prefix")
def __init__(self, url_prefix=None):
"""
Create a new Blueprint Group
:param url_prefix: URL to be prefixed before all the Blueprint prefixes
"""
self._blueprints = []
self._url_prefix = url_prefix
@property
def url_prefix(self):
"""
Retrieve the URL prefix being used for the Current Blueprint Group
:return: string with url prefix
"""
return self._url_prefix
@property
def blueprints(self):
"""
Retrieve a list of all the available blueprints under this group.
:return: List of Blueprint instance
"""
return self._blueprints
def __iter__(self):
"""Tun the class Blueprint Group into an Iterable item"""
return iter(self._blueprints)
def __getitem__(self, item):
"""
This method returns a blueprint inside the group specified by
an index value. This will enable indexing, splice and slicing
of the blueprint group like we can do with regular list/tuple.
This method is provided to ensure backward compatibility with
any of the pre-existing usage that might break.
:param item: Index of the Blueprint item in the group
:return: Blueprint object
"""
return self._blueprints[item]
def __setitem__(self, index: int, item: object) -> None:
"""
Abstract method implemented to turn the `BlueprintGroup` class
into a list like object to support all the existing behavior.
This method is used to perform the list's indexed setter operation.
:param index: Index to use for inserting a new Blueprint item
:param item: New `Blueprint` object.
:return: None
"""
self._blueprints[index] = item
def __delitem__(self, index: int) -> None:
"""
Abstract method implemented to turn the `BlueprintGroup` class
into a list like object to support all the existing behavior.
This method is used to delete an item from the list of blueprint
groups like it can be done on a regular list with index.
:param index: Index to use for removing a new Blueprint item
:return: None
"""
del self._blueprints[index]
def __len__(self) -> int:
"""
Get the Length of the blueprint group object.
:return: Length of Blueprint group object
"""
return len(self._blueprints)
def insert(self, index: int, item: object) -> None:
"""
The Abstract class `MutableSequence` leverages this insert method to
perform the `BlueprintGroup.append` operation.
:param index: Index to use for inserting a new Blueprint item
:param item: New `Blueprint` object.
:return: None
"""
self._blueprints.insert(index, item)
def middleware(self, *args, **kwargs):
"""
A decorator that can be used to implement a Middleware plugin to
all of the Blueprints that belong to this specific Blueprint Group.
In case of nested Blueprint Groups, the same middleware is applied
across each of the Blueprints recursively.
:param args: Optional positional parameters to be used by the middleware
:param kwargs: Optional keyword arguments to use with the middleware
:return: Partial function to apply the middleware
"""
kwargs["bp_group"] = True
def register_middleware_for_blueprints(fn):
for blueprint in self.blueprints:
blueprint.middleware(fn, *args, **kwargs)
return register_middleware_for_blueprints

View File

@ -1,5 +1,6 @@
from collections import defaultdict, namedtuple
from sanic.blueprint_group import BlueprintGroup
from sanic.constants import HTTP_METHODS
from sanic.views import CompositionView
@ -78,10 +79,12 @@ class Blueprint:
for i in nested:
if isinstance(i, (list, tuple)):
yield from chain(i)
elif isinstance(i, BlueprintGroup):
yield from i.blueprints
else:
yield i
bps = []
bps = BlueprintGroup(url_prefix=url_prefix)
for bp in chain(blueprints):
if bp.url_prefix is None:
bp.url_prefix = ""
@ -212,6 +215,7 @@ class Blueprint:
strict_slashes=None,
version=None,
name=None,
stream=False,
):
"""Create a blueprint route from a function.
@ -224,6 +228,7 @@ class Blueprint:
training */*
:param version: Blueprint Version
:param name: user defined route name for url_for
:param stream: boolean specifying if the handler is a stream handler
:return: function or class instance
"""
# Handle HTTPMethodView differently
@ -246,6 +251,7 @@ class Blueprint:
methods=methods,
host=host,
strict_slashes=strict_slashes,
stream=stream,
version=version,
name=name,
)(handler)
@ -324,7 +330,13 @@ class Blueprint:
args = []
return register_middleware(middleware)
else:
return register_middleware
if kwargs.get("bp_group") and callable(args[0]):
middleware = args[0]
args = args[1:]
kwargs.pop("bp_group")
return register_middleware(middleware)
else:
return register_middleware
def exception(self, *args, **kwargs):
"""

View File

@ -1,8 +1,6 @@
import os
import types
from distutils.util import strtobool
from sanic.exceptions import PyFileError
@ -27,6 +25,9 @@ DEFAULT_CONFIG = {
"WEBSOCKET_WRITE_LIMIT": 2 ** 16,
"GRACEFUL_SHUTDOWN_TIMEOUT": 15.0, # 15 sec
"ACCESS_LOG": True,
"PROXIES_COUNT": -1,
"FORWARDED_FOR_HEADER": "X-Forwarded-For",
"REAL_IP_HEADER": "X-Real-IP",
}
@ -127,6 +128,23 @@ class Config(dict):
self[config_key] = float(v)
except ValueError:
try:
self[config_key] = bool(strtobool(v))
self[config_key] = strtobool(v)
except ValueError:
self[config_key] = v
def strtobool(val):
"""
This function was borrowed from distutils.util. While distutils
is part of stdlib, it feels odd to use distutils in main application code.
The function was modified to walk its talk and actually return bool
and not int.
"""
val = val.lower()
if val in ("y", "yes", "t", "true", "on", "1"):
return True
elif val in ("n", "no", "f", "false", "off", "0"):
return False
else:
raise ValueError("invalid truth value %r" % (val,))

View File

@ -1,6 +1,10 @@
import re
import string
from datetime import datetime
DEFAULT_MAX_AGE = 0
# ------------------------------------------------------------ #
# SimpleCookie
@ -103,6 +107,14 @@ class Cookie(dict):
if key not in self._keys:
raise KeyError("Unknown cookie property")
if value is not False:
if key.lower() == "max-age":
if not str(value).isdigit():
value = DEFAULT_MAX_AGE
elif key.lower() == "expires":
if not isinstance(value, datetime):
raise TypeError(
"Cookie 'expires' property must be a datetime"
)
return super().__setitem__(key, value)
def encode(self, encoding):
@ -126,16 +138,10 @@ class Cookie(dict):
except TypeError:
output.append("%s=%s" % (self._keys[key], value))
elif key == "expires":
try:
output.append(
"%s=%s"
% (
self._keys[key],
value.strftime("%a, %d-%b-%Y %T GMT"),
)
)
except AttributeError:
output.append("%s=%s" % (self._keys[key], value))
output.append(
"%s=%s"
% (self._keys[key], value.strftime("%a, %d-%b-%Y %T GMT"))
)
elif key in self._flags and self[key]:
output.append(self._keys[key])
else:

View File

@ -36,7 +36,15 @@ def _iter_module_files():
def _get_args_for_reloading():
"""Returns the executable."""
rv = [sys.executable]
rv.extend(sys.argv)
main_module = sys.modules["__main__"]
mod_spec = getattr(main_module, "__spec__", None)
if mod_spec:
# Parent exe was launched as a module rather than a script
rv.extend(["-m", mod_spec.name])
if len(sys.argv) > 1:
rv.extend(sys.argv[1:])
else:
rv.extend(sys.argv)
return rv
@ -44,6 +52,7 @@ def restart_with_reloader():
"""Create a new process and a subprocess in it with the same arguments as
this one.
"""
cwd = os.getcwd()
args = _get_args_for_reloading()
new_environ = os.environ.copy()
new_environ["SANIC_SERVER_RUNNING"] = "true"
@ -51,7 +60,7 @@ def restart_with_reloader():
worker_process = Process(
target=subprocess.call,
args=(cmd,),
kwargs=dict(shell=True, env=new_environ),
kwargs={"cwd": cwd, "shell": True, "env": new_environ},
)
worker_process.start()
return worker_process

View File

@ -1,11 +1,13 @@
import asyncio
import email.utils
import json
import sys
import warnings
from cgi import parse_header
from collections import namedtuple
from collections import defaultdict, namedtuple
from http.cookies import SimpleCookie
from urllib.parse import parse_qs, urlunparse
from urllib.parse import parse_qs, parse_qsl, unquote, urlunparse
from httptools import parse_url
@ -82,6 +84,7 @@ class Request(dict):
"headers",
"method",
"parsed_args",
"parsed_not_grouped_args",
"parsed_files",
"parsed_form",
"parsed_json",
@ -92,11 +95,11 @@ class Request(dict):
"version",
)
def __init__(self, url_bytes, headers, version, method, transport):
def __init__(self, url_bytes, headers, version, method, transport, app):
self.raw_url = url_bytes
# TODO: Content-Encoding detection
self._parsed_url = parse_url(url_bytes)
self.app = None
self.app = app
self.headers = headers
self.version = version
@ -108,15 +111,14 @@ class Request(dict):
self.parsed_json = None
self.parsed_form = None
self.parsed_files = None
self.parsed_args = None
self.parsed_args = defaultdict(RequestParameters)
self.parsed_not_grouped_args = defaultdict(list)
self.uri_template = None
self._cookies = None
self.stream = None
self.endpoint = None
def __repr__(self):
if self.method is None or not self.path:
return "<{0}>".format(self.__class__.__name__)
return "<{0}: {1} {2}>".format(
self.__class__.__name__, self.method, self.path
)
@ -200,21 +202,117 @@ class Request(dict):
return self.parsed_files
@property
def args(self):
if self.parsed_args is None:
def get_args(
self,
keep_blank_values: bool = False,
strict_parsing: bool = False,
encoding: str = "utf-8",
errors: str = "replace",
) -> RequestParameters:
"""
Method to parse `query_string` using `urllib.parse.parse_qs`.
This method is used by the `args` property.
Can be used directly if you need to change default parameters.
:param keep_blank_values: flag indicating whether blank values in
percent-encoded queries should be treated as blank strings.
A true value indicates that blanks should be retained as blank
strings. The default false value indicates that blank values
are to be ignored and treated as if they were not included.
:type keep_blank_values: bool
:param strict_parsing: flag indicating what to do with parsing errors.
If false (the default), errors are silently ignored. If true,
errors raise a ValueError exception.
:type strict_parsing: bool
:param encoding: specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
:type encoding: str
:param errors: specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
:type errors: str
:return: RequestParameters
"""
if not self.parsed_args[
(keep_blank_values, strict_parsing, encoding, errors)
]:
if self.query_string:
self.parsed_args = RequestParameters(
parse_qs(self.query_string)
self.parsed_args[
(keep_blank_values, strict_parsing, encoding, errors)
] = RequestParameters(
parse_qs(
qs=self.query_string,
keep_blank_values=keep_blank_values,
strict_parsing=strict_parsing,
encoding=encoding,
errors=errors,
)
)
else:
self.parsed_args = RequestParameters()
return self.parsed_args
return self.parsed_args[
(keep_blank_values, strict_parsing, encoding, errors)
]
args = property(get_args)
@property
def raw_args(self):
def raw_args(self) -> dict:
if self.app.debug: # pragma: no cover
warnings.simplefilter("default")
warnings.warn(
"Use of raw_args will be deprecated in "
"the future versions. Please use args or query_args "
"properties instead",
DeprecationWarning,
)
return {k: v[0] for k, v in self.args.items()}
def get_query_args(
self,
keep_blank_values: bool = False,
strict_parsing: bool = False,
encoding: str = "utf-8",
errors: str = "replace",
) -> list:
"""
Method to parse `query_string` using `urllib.parse.parse_qsl`.
This method is used by the `query_args` property.
Can be used directly if you need to change default parameters.
:param keep_blank_values: flag indicating whether blank values in
percent-encoded queries should be treated as blank strings.
A true value indicates that blanks should be retained as blank
strings. The default false value indicates that blank values
are to be ignored and treated as if they were not included.
:type keep_blank_values: bool
:param strict_parsing: flag indicating what to do with parsing errors.
If false (the default), errors are silently ignored. If true,
errors raise a ValueError exception.
:type strict_parsing: bool
:param encoding: specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
:type encoding: str
:param errors: specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
:type errors: str
:return: list
"""
if not self.parsed_not_grouped_args[
(keep_blank_values, strict_parsing, encoding, errors)
]:
if self.query_string:
self.parsed_not_grouped_args[
(keep_blank_values, strict_parsing, encoding, errors)
] = parse_qsl(
qs=self.query_string,
keep_blank_values=keep_blank_values,
strict_parsing=strict_parsing,
encoding=encoding,
errors=errors,
)
return self.parsed_not_grouped_args[
(keep_blank_values, strict_parsing, encoding, errors)
]
query_args = property(get_query_args)
@property
def cookies(self):
if self._cookies is None:
@ -257,19 +355,38 @@ class Request(dict):
@property
def remote_addr(self):
"""Attempt to return the original client ip based on X-Forwarded-For.
"""Attempt to return the original client ip based on X-Forwarded-For
or X-Real-IP. If HTTP headers are unavailable or untrusted, returns
an empty string.
:return: original client ip.
"""
if not hasattr(self, "_remote_addr"):
forwarded_for = self.headers.get("X-Forwarded-For", "").split(",")
remote_addrs = [
addr
for addr in [addr.strip() for addr in forwarded_for]
if addr
]
if len(remote_addrs) > 0:
self._remote_addr = remote_addrs[0]
if self.app.config.PROXIES_COUNT == 0:
self._remote_addr = ""
elif self.app.config.REAL_IP_HEADER and self.headers.get(
self.app.config.REAL_IP_HEADER
):
self._remote_addr = self.headers[
self.app.config.REAL_IP_HEADER
]
elif self.app.config.FORWARDED_FOR_HEADER:
forwarded_for = self.headers.get(
self.app.config.FORWARDED_FOR_HEADER, ""
).split(",")
remote_addrs = [
addr
for addr in [addr.strip() for addr in forwarded_for]
if addr
]
if self.app.config.PROXIES_COUNT == -1:
self._remote_addr = remote_addrs[0]
elif len(remote_addrs) >= self.app.config.PROXIES_COUNT:
self._remote_addr = remote_addrs[
-self.app.config.PROXIES_COUNT
]
else:
self._remote_addr = ""
else:
self._remote_addr = ""
return self._remote_addr
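# --- Illustrative configuration sketch (not part of the diff above) ---
# remote_addr only trusts proxy headers when the app is configured for it;
# the config keys below are the ones read above, the values are hypothetical.
from sanic import Sanic
from sanic.response import text

proxy_app = Sanic("remote_addr_example")
proxy_app.config.PROXIES_COUNT = 1              # trust exactly one proxy hop
proxy_app.config.REAL_IP_HEADER = "X-Real-IP"   # checked before the forwarded header
proxy_app.config.FORWARDED_FOR_HEADER = "X-Forwarded-For"

@proxy_app.get("/ip")
async def ip(request):
    # With PROXIES_COUNT = 0 remote_addr is always "" regardless of headers.
    return text(request.remote_addr or "untrusted or missing proxy headers")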
@ -358,15 +475,28 @@ def parse_multipart_form(body, boundary):
)
if form_header_field == "content-disposition":
file_name = form_parameters.get("filename")
field_name = form_parameters.get("name")
file_name = form_parameters.get("filename")
# non-ASCII filenames in RFC2231, "filename*" format
if file_name is None and form_parameters.get("filename*"):
encoding, _, value = email.utils.decode_rfc2231(
form_parameters["filename*"]
)
file_name = unquote(value, encoding=encoding)
elif form_header_field == "content-type":
content_type = form_header_value
content_charset = form_parameters.get("charset", "utf-8")
if field_name:
post_data = form_part[line_index:-4]
if file_name:
if file_name is None:
value = post_data.decode(content_charset)
if field_name in fields:
fields[field_name].append(value)
else:
fields[field_name] = [value]
else:
form_file = File(
type=content_type, name=file_name, body=post_data
)
@ -374,12 +504,6 @@ def parse_multipart_form(body, boundary):
files[field_name].append(form_file)
else:
files[field_name] = [form_file]
else:
value = post_data.decode(content_charset)
if field_name in fields:
fields[field_name].append(value)
else:
fields[field_name] = [value]
else:
logger.debug(
"Form-data field does not have a 'name' parameter "

View File

@ -59,16 +59,23 @@ class StreamingHTTPResponse(BaseHTTPResponse):
"status",
"content_type",
"headers",
"chunked",
"_cookies",
)
def __init__(
self, streaming_fn, status=200, headers=None, content_type="text/plain"
self,
streaming_fn,
status=200,
headers=None,
content_type="text/plain",
chunked=True,
):
self.content_type = content_type
self.streaming_fn = streaming_fn
self.status = status
self.headers = CIMultiDict(headers or {})
self.chunked = chunked
self._cookies = None
async def write(self, data):
@ -79,7 +86,10 @@ class StreamingHTTPResponse(BaseHTTPResponse):
if type(data) != bytes:
data = self._encode_body(data)
self.protocol.push_data(b"%x\r\n%b\r\n" % (len(data), data))
if self.chunked:
self.protocol.push_data(b"%x\r\n%b\r\n" % (len(data), data))
else:
self.protocol.push_data(data)
await self.protocol.drain()
async def stream(
@ -88,6 +98,8 @@ class StreamingHTTPResponse(BaseHTTPResponse):
"""Streams headers, runs the `streaming_fn` callback that writes
content to the response body, then finalizes the response body.
"""
if version != "1.1":
self.chunked = False
headers = self.get_headers(
version,
keep_alive=keep_alive,
@ -96,7 +108,8 @@ class StreamingHTTPResponse(BaseHTTPResponse):
self.protocol.push_data(headers)
await self.protocol.drain()
await self.streaming_fn(self)
self.protocol.push_data(b"0\r\n\r\n")
if self.chunked:
self.protocol.push_data(b"0\r\n\r\n")
# no need to await drain here after this write, because it is the
# very last thing we write and nothing needs to wait for it.
@ -109,15 +122,16 @@ class StreamingHTTPResponse(BaseHTTPResponse):
if keep_alive and keep_alive_timeout is not None:
timeout_header = b"Keep-Alive: %d\r\n" % keep_alive_timeout
self.headers["Transfer-Encoding"] = "chunked"
self.headers.pop("Content-Length", None)
if self.chunked and version == "1.1":
self.headers["Transfer-Encoding"] = "chunked"
self.headers.pop("Content-Length", None)
self.headers["Content-Type"] = self.headers.get(
"Content-Type", self.content_type
)
headers = self._parse_headers()
if self.status is 200:
if self.status == 200:
status = b"OK"
else:
status = STATUS_CODES.get(self.status)
@ -176,7 +190,7 @@ class HTTPResponse(BaseHTTPResponse):
headers = self._parse_headers()
if self.status is 200:
if self.status == 200:
status = b"OK"
else:
status = STATUS_CODES.get(self.status, b"UNKNOWN RESPONSE")
@ -327,6 +341,7 @@ async def file_stream(
mime_type=None,
headers=None,
filename=None,
chunked=True,
_range=None,
):
"""Return a streaming response object with file data.
@ -336,6 +351,7 @@ async def file_stream(
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param filename: Override filename.
:param chunked: Enable or disable chunked transfer-encoding
:param _range:
"""
headers = headers or {}
@ -383,6 +399,7 @@ async def file_stream(
status=status,
headers=headers,
content_type=mime_type,
chunked=chunked,
)
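# --- Illustrative usage sketch (not part of the diff above) ---
# Serving a file without chunked transfer-encoding via the new flag; the
# route and file path are hypothetical.
from sanic import Sanic
from sanic.response import file_stream

file_app = Sanic("file_stream_example")

@file_app.get("/download")
async def download(request):
    # chunked=False streams the body without chunk framing
    return await file_stream("/tmp/report.pdf", chunked=False)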
@ -391,6 +408,7 @@ def stream(
status=200,
headers=None,
content_type="text/plain; charset=utf-8",
chunked=True,
):
"""Accepts an coroutine `streaming_fn` which can be used to
write chunks to a streaming response. Returns a `StreamingHTTPResponse`.
@ -409,9 +427,14 @@ def stream(
writes content to that response.
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param chunked: Enable or disable chunked transfer-encoding
"""
return StreamingHTTPResponse(
streaming_fn, headers=headers, content_type=content_type, status=status
streaming_fn,
headers=headers,
content_type=content_type,
status=status,
chunked=chunked,
)
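# --- Illustrative usage sketch (not part of the diff above) ---
# The chunked flag added above can be turned off when the client (or an
# HTTP/1.0 hop) cannot handle chunked transfer-encoding. The handler and
# payload below are hypothetical.
from sanic import Sanic
from sanic.response import stream

stream_app = Sanic("stream_example")

@stream_app.get("/feed")
async def feed(request):
    async def write_body(response):
        await response.write("part 1\n")
        await response.write("part 2\n")

    # With chunked=False the body is written without chunk framing, and
    # get_headers() above also omits the Transfer-Encoding header.
    return stream(write_body, content_type="text/plain", chunked=False)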

View File

@ -17,8 +17,8 @@ Parameter = namedtuple("Parameter", ["name", "cast"])
REGEX_TYPES = {
"string": (str, r"[^/]+"),
"int": (int, r"\d+"),
"number": (float, r"[0-9\\.]+"),
"int": (int, r"-?\d+"),
"number": (float, r"-?(?:\d+(?:\.\d*)?|\.\d+)"),
"alpha": (str, r"[A-Za-z]+"),
"path": (str, r"[^/].*?"),
"uuid": (

View File

@ -34,9 +34,6 @@ except ImportError:
pass
current_time = None
class Signal:
stopped = False
@ -47,6 +44,8 @@ class HttpProtocol(asyncio.Protocol):
"""
__slots__ = (
# app
"app",
# event loop, connection
"loop",
"transport",
@ -91,6 +90,7 @@ class HttpProtocol(asyncio.Protocol):
self,
*,
loop,
app,
request_handler,
error_handler,
signal=Signal(),
@ -110,6 +110,7 @@ class HttpProtocol(asyncio.Protocol):
**kwargs
):
self.loop = loop
self.app = app
self.transport = None
self.request = None
self.parser = None
@ -118,7 +119,7 @@ class HttpProtocol(asyncio.Protocol):
self.router = router
self.signal = signal
self.access_log = access_log
self.connections = connections or set()
self.connections = connections if connections is not None else set()
self.request_handler = request_handler
self.error_handler = error_handler
self.request_timeout = request_timeout
@ -171,7 +172,7 @@ class HttpProtocol(asyncio.Protocol):
self.request_timeout, self.request_timeout_callback
)
self.transport = transport
self._last_request_time = current_time
self._last_request_time = time()
def connection_lost(self, exc):
self.connections.discard(self)
@ -197,7 +198,7 @@ class HttpProtocol(asyncio.Protocol):
# exactly what this timeout is checking for.
# Check if elapsed time since request initiated exceeds our
# configured maximum request timeout value
time_elapsed = current_time - self._last_request_time
time_elapsed = time() - self._last_request_time
if time_elapsed < self.request_timeout:
time_left = self.request_timeout - time_elapsed
self._request_timeout_handler = self.loop.call_later(
@ -213,7 +214,7 @@ class HttpProtocol(asyncio.Protocol):
def response_timeout_callback(self):
# Check if elapsed time since response was initiated exceeds our
# configured maximum request timeout value
time_elapsed = current_time - self._last_request_time
time_elapsed = time() - self._last_request_time
if time_elapsed < self.response_timeout:
time_left = self.response_timeout - time_elapsed
self._response_timeout_handler = self.loop.call_later(
@ -234,7 +235,7 @@ class HttpProtocol(asyncio.Protocol):
:return: None
"""
time_elapsed = current_time - self._last_response_time
time_elapsed = time() - self._last_response_time
if time_elapsed < self.keep_alive_timeout:
time_left = self.keep_alive_timeout - time_elapsed
self._keep_alive_timeout_handler = self.loop.call_later(
@ -306,6 +307,7 @@ class HttpProtocol(asyncio.Protocol):
version=self.parser.get_http_version(),
method=self.parser.get_method().decode(),
transport=self.transport,
app=self.app,
)
# Remove any existing KeepAlive handler here,
# It will be recreated if required on the new request.
@ -362,7 +364,7 @@ class HttpProtocol(asyncio.Protocol):
self._response_timeout_handler = self.loop.call_later(
self.response_timeout, self.response_timeout_callback
)
self._last_request_time = current_time
self._last_request_time = time()
self._request_handler_task = self.loop.create_task(
self.request_handler(
self.request, self.write_response, self.stream_response
@ -449,7 +451,7 @@ class HttpProtocol(asyncio.Protocol):
self._keep_alive_timeout_handler = self.loop.call_later(
self.keep_alive_timeout, self.keep_alive_timeout_callback
)
self._last_response_time = current_time
self._last_response_time = time()
self.cleanup()
async def drain(self):
@ -502,7 +504,7 @@ class HttpProtocol(asyncio.Protocol):
self._keep_alive_timeout_handler = self.loop.call_later(
self.keep_alive_timeout, self.keep_alive_timeout_callback
)
self._last_response_time = current_time
self._last_response_time = time()
self.cleanup()
def write_error(self, exception):
@ -552,11 +554,15 @@ class HttpProtocol(asyncio.Protocol):
:return: None
"""
if from_error or self.transport.is_closing():
if from_error or self.transport is None or self.transport.is_closing():
logger.error(
"Transport closed @ %s and exception "
"experienced during error handling",
self.transport.get_extra_info("peername"),
(
self.transport.get_extra_info("peername")
if self.transport is not None
else "N/A"
),
)
logger.debug("Exception:", exc_info=True)
else:
@ -595,18 +601,6 @@ class HttpProtocol(asyncio.Protocol):
self.transport = None
def update_current_time(loop):
"""Cache the current time, since it is needed at the end of every
keep-alive request to update the request timeout time
:param loop:
:return:
"""
global current_time
current_time = time()
loop.call_later(1, partial(update_current_time, loop))
def trigger_events(events, loop):
"""Trigger event callbacks (functions or async)
@ -622,6 +616,7 @@ def trigger_events(events, loop):
def serve(
host,
port,
app,
request_handler,
error_handler,
before_start=None,
@ -656,6 +651,7 @@ def serve(
websocket_write_limit=2 ** 16,
state=None,
graceful_shutdown_timeout=15.0,
asyncio_server_kwargs=None,
):
"""Start asynchronous HTTP Server on an individual process.
@ -700,6 +696,8 @@ def serve(
:param router: Router object
:param graceful_shutdown_timeout: How long to wait before force-closing
non-idle connections
:param asyncio_server_kwargs: key-value args for asyncio/uvloop
create_server method
:return: Nothing
"""
if not run_async:
@ -716,6 +714,7 @@ def serve(
loop=loop,
connections=connections,
signal=signal,
app=app,
request_handler=request_handler,
error_handler=error_handler,
request_timeout=request_timeout,
@ -734,7 +733,9 @@ def serve(
state=state,
debug=debug,
)
asyncio_server_kwargs = (
asyncio_server_kwargs if asyncio_server_kwargs else {}
)
server_coroutine = loop.create_server(
server,
host,
@ -743,12 +744,9 @@ def serve(
reuse_port=reuse_port,
sock=sock,
backlog=backlog,
**asyncio_server_kwargs
)
# Instead of pulling time at the end of every request,
# pull it once per minute
loop.call_soon(partial(update_current_time, loop))
if run_async:
return server_coroutine
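# --- Illustrative usage sketch (not part of the diff above) ---
# Passing asyncio_server_kwargs through to loop.create_server(); this
# mirrors the start_serving test added further below. The app name, host
# and port are hypothetical.
import asyncio
from sanic import Sanic

srv_app = Sanic("asyncio_server_kwargs_example")

async def start():
    srv = await srv_app.create_server(
        host="127.0.0.1",
        port=8000,
        return_asyncio_server=True,
        # forwarded verbatim to loop.create_server (Python 3.7+)
        asyncio_server_kwargs=dict(start_serving=False),
    )
    await srv.start_serving()

# asyncio.get_event_loop().run_until_complete(start())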

View File

@ -1,4 +1,10 @@
from json import JSONDecodeError
from socket import socket
import requests_async as requests
import websockets
import asyncio
import http
import io
@ -18,54 +24,10 @@ HOST = "127.0.0.1"
PORT = 42101
# Annotations for `Session.request()`
Cookies = typing.Union[
typing.MutableMapping[str, str], requests.cookies.RequestsCookieJar
]
Params = typing.Union[bytes, typing.MutableMapping[str, str]]
DataType = typing.Union[bytes, typing.MutableMapping[str, str], typing.IO]
TimeOut = typing.Union[float, typing.Tuple[float, float]]
FileType = typing.MutableMapping[str, typing.IO]
AuthType = typing.Union[
typing.Tuple[str, str],
requests.auth.AuthBase,
typing.Callable[[requests.Request], requests.Request],
]
class _HeaderDict(requests.packages.urllib3._collections.HTTPHeaderDict):
def get_all(self, key: str, default: str) -> str:
return self.getheaders(key)
class _MockOriginalResponse:
"""
We have to jump through some hoops to present the response as if
it was made using urllib3.
"""
def __init__(self, headers: typing.List[typing.Tuple[bytes, bytes]]) -> None:
self.msg = _HeaderDict(headers)
self.closed = False
def isclosed(self) -> bool:
return self.closed
class _Upgrade(Exception):
def __init__(self, session: "WebSocketTestSession") -> None:
self.session = session
def _get_reason_phrase(status_code: int) -> str:
try:
return http.HTTPStatus(status_code).phrase
except ValueError:
return ""
class _ASGIAdapter(requests.adapters.HTTPAdapter):
def __init__(self, app: ASGIApp, raise_server_exceptions: bool = True) -> None:
class SanicTestClient:
def __init__(self, app, port=PORT):
"""Use port=None to bind to a random port"""
self.app = app
self.raise_server_exceptions = raise_server_exceptions
@ -76,22 +38,55 @@ class _ASGIAdapter(requests.adapters.HTTPAdapter):
request.url
)
default_port = {"http": 80, "ws": 80, "https": 443, "wss": 443}[scheme]
def get_new_session(self):
return requests.Session()
if ":" in netloc:
host, port_string = netloc.split(":", 1)
port = int(port_string)
else:
host = netloc
port = default_port
async def _local_request(self, method, url, *args, **kwargs):
logger.info(url)
raw_cookies = kwargs.pop("raw_cookies", None)
# Include the 'host' header.
if "host" in request.headers:
headers = [] # type: typing.List[typing.Tuple[bytes, bytes]]
elif port == default_port:
headers = [(b"host", host.encode())]
if method == "websocket":
async with websockets.connect(url, *args, **kwargs) as websocket:
websocket.opened = websocket.open
return websocket
else:
headers = [(b"host", ("%s:%d" % (host, port)).encode())]
async with self.get_new_session() as session:
try:
response = await getattr(session, method.lower())(
url, verify=False, *args, **kwargs
)
except NameError:
raise Exception(response.status_code)
try:
response.json = response.json()
except (JSONDecodeError, UnicodeDecodeError):
response.json = None
response.body = await response.read()
response.status = response.status_code
response.content_type = response.headers.get("content-type")
if raw_cookies:
response.raw_cookies = {}
for cookie in response.cookies:
response.raw_cookies[cookie.name] = cookie
return response
def _sanic_endpoint_test(
self,
method="get",
uri="/",
gather_request=True,
debug=False,
server_kwargs={"auto_reload": False},
*request_args,
**request_kwargs
):
results = [None, None]
exceptions = []
# Include other request headers.
headers += [
@ -158,25 +153,31 @@ class _ASGIAdapter(requests.adapters.HTTPAdapter):
else:
body_bytes = body
request_complete = True
return {"type": "http.request", "body": body_bytes}
if self.port:
server_kwargs = dict(host=HOST, port=self.port, **server_kwargs)
host, port = HOST, self.port
else:
sock = socket()
sock.bind((HOST, 0))
server_kwargs = dict(sock=sock, **server_kwargs)
host, port = sock.getsockname()
async def send(message: Message) -> None:
nonlocal raw_kwargs, response_started, response_complete, template, context
if uri.startswith(
("http:", "https:", "ftp:", "ftps://", "//", "ws:", "wss:")
):
url = uri
else:
uri = uri if uri.startswith("/") else "/{uri}".format(uri=uri)
scheme = "ws" if method == "websocket" else "http"
url = "{scheme}://{host}:{port}{uri}".format(
scheme=scheme, host=host, port=port, uri=uri
)
if message["type"] == "http.response.start":
assert (
not response_started
), 'Received multiple "http.response.start" messages.'
raw_kwargs["version"] = 11
raw_kwargs["status"] = message["status"]
raw_kwargs["reason"] = _get_reason_phrase(message["status"])
raw_kwargs["headers"] = [
(key.decode(), value.decode()) for key, value in message["headers"]
]
raw_kwargs["preload_content"] = False
raw_kwargs["original_response"] = _MockOriginalResponse(
raw_kwargs["headers"]
@self.app.listener("after_server_start")
async def _collect_response(sanic, loop):
try:
response = await self._local_request(
method, url, *request_args, **request_kwargs
)
response_started = True
elif message["type"] == "http.response.body":
@ -204,11 +205,9 @@ class _ASGIAdapter(requests.adapters.HTTPAdapter):
template = None
context = None
try:
loop = asyncio.get_event_loop()
except RuntimeError:
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
self.app.run(debug=debug, **server_kwargs)
self.app.listeners["after_server_start"].pop()
self.app.is_running = True
try:
@ -350,6 +349,7 @@ class SanicTestClient(requests.Session):
return self.request("options", *args, **kwargs)
def head(self, *args, **kwargs):
if 'uri' in kwargs:
kwargs['url'] = kwargs.pop('uri')
return self.request("head", *args, **kwargs)
return self._sanic_endpoint_test("head", *args, **kwargs)
def websocket(self, *args, **kwargs):
return self._sanic_endpoint_test("websocket", *args, **kwargs)
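# --- Illustrative usage sketch (not part of the diff above) ---
# The requests-async based client keeps the same call style as before, and
# websocket() now drives a real websockets client. Route names below are
# hypothetical.
from sanic import Sanic
from sanic.response import json

client_app = Sanic("test_client_example")

@client_app.get("/ping")
async def ping(request):
    return json({"ping": "pong"})

def test_ping():
    request, response = client_app.test_client.get("/ping")
    assert response.status == 200
    assert response.json == {"ping": "pong"}
    # websocket endpoints can be exercised the same way:
    # request, ws = client_app.test_client.websocket("/ws")
    # assert ws.opened is True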

View File

@ -14,7 +14,7 @@ multi_line_output = 3
not_skip = __init__.py
[version]
current_version = 0.8.3
current_version = 19.3.1
file = sanic/__init__.py
current_version_pattern = __version__ = "{current_version}"
new_version_pattern = __version__ = "{new_version}"

View File

@ -50,12 +50,12 @@ with open_local(["README.rst"]) as rm:
setup_kwargs = {
"name": "sanic",
"version": version,
"url": "http://github.com/channelcat/sanic/",
"url": "http://github.com/huge-success/sanic/",
"license": "MIT",
"author": "Channel Cat",
"author_email": "channelcat@gmail.com",
"author": "Sanic Community",
"author_email": "admhpkns@gmail.com",
"description": (
"A microframework based on uvloop, httptools, and learnings of flask"
"A web server and web framework that's written to go fast. Build fast. Run fast."
),
"long_description": long_description,
"packages": ["sanic"],
@ -64,7 +64,6 @@ setup_kwargs = {
"Development Status :: 4 - Beta",
"Environment :: Web Environment",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
],
@ -90,12 +89,14 @@ tests_require = [
"multidict>=4.0,<5.0",
"gunicorn",
"pytest-cov",
"aiohttp>=2.3.0,<=3.2.1",
"httpcore==0.1.1",
"requests-async==0.4.0",
"beautifulsoup4",
uvloop,
ujson,
"pytest-sanic",
"pytest-sugar",
"pytest-benchmark",
]
if strtobool(os.environ.get("SANIC_NO_UJSON", "no")):
@ -118,7 +119,7 @@ extras_require = {
"recommonmark",
"sphinxcontrib-asyncio",
"docutils",
"pygments"
"pygments",
],
}

View File

@ -0,0 +1,55 @@
from random import choice, seed
from pytest import mark
import sanic.router
seed("Pack my box with five dozen liquor jugs.")
# Disable Caching for testing purpose
sanic.router.ROUTER_CACHE_SIZE = 0
class TestSanicRouteResolution:
@mark.asyncio
async def test_resolve_route_no_arg_string_path(
self, sanic_router, route_generator, benchmark
):
simple_routes = route_generator.generate_random_direct_route(
max_route_depth=4
)
router, simple_routes = sanic_router(route_details=simple_routes)
route_to_call = choice(simple_routes)
result = benchmark.pedantic(
router._get,
("/{}".format(route_to_call[-1]), route_to_call[0], "localhost"),
iterations=1000,
rounds=1000,
)
assert await result[0](None) == 1
@mark.asyncio
async def test_resolve_route_with_typed_args(
self, sanic_router, route_generator, benchmark
):
typed_routes = route_generator.add_typed_parameters(
route_generator.generate_random_direct_route(max_route_depth=4),
max_route_depth=8,
)
router, typed_routes = sanic_router(route_details=typed_routes)
route_to_call = choice(typed_routes)
url = route_generator.generate_url_for_template(
template=route_to_call[-1]
)
print("{} -> {}".format(route_to_call[-1], url))
result = benchmark.pedantic(
router._get,
("/{}".format(url), route_to_call[0], "localhost"),
iterations=1000,
rounds=1000,
)
assert await result[0](None) == 1

View File

@ -1,12 +1,131 @@
import random
import re
import string
import sys
import uuid
import pytest
from sanic import Sanic
from sanic.router import RouteExists, Router
random.seed("Pack my box with five dozen liquor jugs.")
if sys.platform in ["win32", "cygwin"]:
collect_ignore = ["test_worker.py"]
async def _handler(request):
"""
Dummy placeholder handler used when registering a new route with the
sanic router. The handler is never actually invoked by a sanic app, so
its arguments do not matter.
If you change the return value of this method, make sure to propagate the
change to any test case that leverages RouteStringGenerator.
"""
return 1
TYPE_TO_GENERATOR_MAP = {
"string": lambda: "".join(
[random.choice(string.ascii_letters + string.digits) for _ in range(4)]
),
"int": lambda: random.choice(range(1000000)),
"number": lambda: random.random(),
"alpha": lambda: "".join(
[random.choice(string.ascii_letters) for _ in range(4)]
),
"uuid": lambda: str(uuid.uuid1()),
}
class RouteStringGenerator:
ROUTE_COUNT_PER_DEPTH = 100
HTTP_METHODS = ["GET", "PUT", "POST", "PATCH", "DELETE", "OPTION"]
ROUTE_PARAM_TYPES = ["string", "int", "number", "alpha", "uuid"]
def generate_random_direct_route(self, max_route_depth=4):
routes = []
for depth in range(1, max_route_depth + 1):
for _ in range(self.ROUTE_COUNT_PER_DEPTH):
route = "/".join(
[
TYPE_TO_GENERATOR_MAP.get("string")()
for _ in range(depth)
]
)
route = route.replace(".", "", -1)
route_detail = (random.choice(self.HTTP_METHODS), route)
if route_detail not in routes:
routes.append(route_detail)
return routes
def add_typed_parameters(self, current_routes, max_route_depth=8):
routes = []
for method, route in current_routes:
current_length = len(route.split("/"))
new_route_part = "/".join(
[
"<{}:{}>".format(
TYPE_TO_GENERATOR_MAP.get("string")(),
random.choice(self.ROUTE_PARAM_TYPES),
)
for _ in range(max_route_depth - current_length)
]
)
route = "/".join([route, new_route_part])
route = route.replace(".", "", -1)
routes.append((method, route))
return routes
@staticmethod
def generate_url_for_template(template):
url = template
for pattern, param_type in re.findall(
re.compile(r"((?:<\w+:(string|int|number|alpha|uuid)>)+)"),
template,
):
value = TYPE_TO_GENERATOR_MAP.get(param_type)()
url = url.replace(pattern, str(value), -1)
return url
@pytest.fixture(scope="function")
def sanic_router():
# noinspection PyProtectedMember
def _setup(route_details: tuple) -> (Router, tuple):
router = Router()
added_router = []
for method, route in route_details:
try:
router._add(
uri="/{}".format(route),
methods=frozenset({method}),
host="localhost",
handler=_handler,
)
added_router.append((method, route))
except RouteExists:
pass
return router, added_router
return _setup
@pytest.fixture(scope="function")
def route_generator() -> RouteStringGenerator:
return RouteStringGenerator()
@pytest.fixture(scope="function")
def url_param_generator():
return TYPE_TO_GENERATOR_MAP
@pytest.fixture
def app(request):
return Sanic(request.node.name)
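# --- Illustrative usage sketch (not part of the diff above) ---
# How a test might combine the fixtures defined above; the assertions are
# a hypothetical sketch that mirrors the benchmark tests added in this change.
def test_router_resolves_generated_route(sanic_router, route_generator):
    routes = route_generator.generate_random_direct_route(max_route_depth=2)
    router, added = sanic_router(route_details=routes)
    method, path = added[0]
    found = router._get("/{}".format(path), method, "localhost")
    # found[0] is the _handler registered above
    assert found[0] is not None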

View File

@ -1,10 +1,13 @@
# Run with python3 simple_server.py PORT
from aiohttp import web
import asyncio
import sys
import uvloop
import ujson as json
import uvloop
from aiohttp import web
loop = uvloop.new_event_loop()
asyncio.set_event_loop(loop)

View File

@ -1,8 +1,9 @@
# Run with: gunicorn --workers=1 --worker-class=meinheld.gmeinheld.MeinheldWorker -b :8000 simple_server:app
import bottle
from bottle import route, run
import ujson
from bottle import route, run
@route("/")
def index():

View File

@ -1,10 +1,13 @@
# Run with: python3 -O simple_server.py
import asyncio
from kyoukai import Kyoukai, HTTPRequestContext
import logging
import ujson
import uvloop
from kyoukai import HTTPRequestContext, Kyoukai
loop = uvloop.new_event_loop()
asyncio.set_event_loop(loop)

View File

@ -1,16 +1,18 @@
import asyncpg
import sys
import os
import inspect
import os
import sys
import timeit
import asyncpg
from sanic.response import json
currentdir = os.path.dirname(
os.path.abspath(inspect.getfile(inspect.currentframe()))
)
sys.path.insert(0, currentdir + "/../../../")
import timeit
from sanic.response import json
print(json({"test": True}).output())

View File

@ -1,14 +1,16 @@
import sys
import os
import inspect
import os
import sys
from sanic import Sanic
from sanic.response import json
currentdir = os.path.dirname(
os.path.abspath(inspect.getfile(inspect.currentframe()))
)
sys.path.insert(0, currentdir + "/../../../")
from sanic import Sanic
from sanic.response import json
app = Sanic("test")

View File

@ -1,15 +1,17 @@
import sys
import os
import inspect
import os
import sys
from sanic import Sanic
from sanic.exceptions import ServerError
from sanic.response import json, text
currentdir = os.path.dirname(
os.path.abspath(inspect.getfile(inspect.currentframe()))
)
sys.path.insert(0, currentdir + "/../../../")
from sanic import Sanic
from sanic.response import json, text
from sanic.exceptions import ServerError
app = Sanic("test")
@ -56,8 +58,6 @@ def query_string(request):
)
import sys
app.run(host="0.0.0.0", port=sys.argv[1])

View File

@ -1,5 +1,6 @@
# Run with: python simple_server.py
import ujson
from tornado import ioloop, web

View File

@ -2,15 +2,16 @@
""" Minimal helloworld application.
"""
from wheezy.http import HTTPResponse
from wheezy.http import WSGIApplication
import ujson
from wheezy.http import HTTPResponse, WSGIApplication
from wheezy.http.response import json_response
from wheezy.routing import url
from wheezy.web.handlers import BaseHandler
from wheezy.web.middleware import bootstrap_defaults
from wheezy.web.middleware import path_routing_middleware_factory
import ujson
from wheezy.web.middleware import (
bootstrap_defaults,
path_routing_middleware_factory,
)
class WelcomeHandler(BaseHandler):

View File

@ -1,5 +1,8 @@
import asyncio
import logging
import sys
from inspect import isawaitable
import pytest
@ -7,6 +10,15 @@ from sanic.exceptions import SanicException
from sanic.response import text
def uvloop_installed():
try:
import uvloop # noqa
return True
except ImportError:
return False
def test_app_loop_running(app):
@app.get("/test")
async def handler(request):
@ -17,9 +29,35 @@ def test_app_loop_running(app):
assert response.text == "pass"
@pytest.mark.skipif(
sys.version_info < (3, 7), reason="requires python3.7 or higher"
)
def test_create_asyncio_server(app):
if not uvloop_installed():
loop = asyncio.get_event_loop()
asyncio_srv_coro = app.create_server(return_asyncio_server=True)
assert isawaitable(asyncio_srv_coro)
srv = loop.run_until_complete(asyncio_srv_coro)
assert srv.is_serving() is True
@pytest.mark.skipif(
sys.version_info < (3, 7), reason="requires python3.7 or higher"
)
def test_asyncio_server_start_serving(app):
if not uvloop_installed():
loop = asyncio.get_event_loop()
asyncio_srv_coro = app.create_server(
return_asyncio_server=True,
asyncio_server_kwargs=dict(start_serving=False),
)
srv = loop.run_until_complete(asyncio_srv_coro)
assert srv.is_serving() is False
def test_app_loop_not_running(app):
with pytest.raises(SanicException) as excinfo:
app.loop
_ = app.loop
assert str(excinfo.value) == (
"Loop can only be retrieved after the app has started "
@ -103,7 +141,6 @@ def test_handle_request_with_nested_exception(app, monkeypatch):
@app.get("/")
def handler(request):
raise Exception
return text("OK")
request, response = app.test_client.get("/")
assert response.status == 500
@ -125,7 +162,6 @@ def test_handle_request_with_nested_exception_debug(app, monkeypatch):
@app.get("/")
def handler(request):
raise Exception
return text("OK")
request, response = app.test_client.get("/", debug=True)
assert response.status == 500
@ -149,7 +185,6 @@ def test_handle_request_with_nested_sanic_exception(app, monkeypatch, caplog):
@app.get("/")
def handler(request):
raise Exception
return text("OK")
with caplog.at_level(logging.ERROR):
request, response = app.test_client.get("/")

View File

@ -0,0 +1,181 @@
from pytest import raises
from sanic.app import Sanic
from sanic.blueprints import Blueprint
from sanic.request import Request
from sanic.response import HTTPResponse, text
MIDDLEWARE_INVOKE_COUNTER = {"request": 0, "response": 0}
AUTH = "dGVzdDp0ZXN0Cg=="
def test_bp_group_indexing(app: Sanic):
blueprint_1 = Blueprint("blueprint_1", url_prefix="/bp1")
blueprint_2 = Blueprint("blueprint_2", url_prefix="/bp2")
group = Blueprint.group(blueprint_1, blueprint_2)
assert group[0] == blueprint_1
with raises(expected_exception=IndexError) as e:
_ = group[3]
def test_bp_group_with_additional_route_params(app: Sanic):
blueprint_1 = Blueprint("blueprint_1", url_prefix="/bp1")
blueprint_2 = Blueprint("blueprint_2", url_prefix="/bp2")
@blueprint_1.route(
"/request_path", methods=frozenset({"PUT", "POST"}), version=2
)
def blueprint_1_v2_method_with_put_and_post(request: Request):
if request.method == "PUT":
return text("PUT_OK")
elif request.method == "POST":
return text("POST_OK")
@blueprint_2.route(
"/route/<param>", methods=frozenset({"DELETE", "PATCH"}), name="test"
)
def blueprint_2_named_method(request: Request, param):
if request.method == "DELETE":
return text("DELETE_{}".format(param))
elif request.method == "PATCH":
return text("PATCH_{}".format(param))
blueprint_group = Blueprint.group(
blueprint_1, blueprint_2, url_prefix="/api"
)
@blueprint_group.middleware("request")
def authenticate_request(request: Request):
global AUTH
auth = request.headers.get("authorization")
if auth:
# Dummy auth check. We can have anything here and it's fine.
if AUTH not in auth:
return text("Unauthorized", status=401)
else:
return text("Unauthorized", status=401)
@blueprint_group.middleware("response")
def enhance_response_middleware(request: Request, response: HTTPResponse):
response.headers.add("x-test-middleware", "value")
app.blueprint(blueprint_group)
header = {"authorization": " ".join(["Basic", AUTH])}
_, response = app.test_client.put(
"/v2/api/bp1/request_path", headers=header
)
assert response.text == "PUT_OK"
assert response.headers.get("x-test-middleware") == "value"
_, response = app.test_client.post(
"/v2/api/bp1/request_path", headers=header
)
assert response.text == "POST_OK"
_, response = app.test_client.delete("/api/bp2/route/bp2", headers=header)
assert response.text == "DELETE_bp2"
_, response = app.test_client.patch("/api/bp2/route/bp2", headers=header)
assert response.text == "PATCH_bp2"
_, response = app.test_client.get("/v2/api/bp1/request_path")
assert response.status == 401
def test_bp_group(app: Sanic):
blueprint_1 = Blueprint("blueprint_1", url_prefix="/bp1")
blueprint_2 = Blueprint("blueprint_2", url_prefix="/bp2")
@blueprint_1.route("/")
def blueprint_1_default_route(request):
return text("BP1_OK")
@blueprint_2.route("/")
def blueprint_2_default_route(request):
return text("BP2_OK")
blueprint_group_1 = Blueprint.group(
blueprint_1, blueprint_2, url_prefix="/bp"
)
blueprint_3 = Blueprint("blueprint_3", url_prefix="/bp3")
@blueprint_group_1.middleware("request")
def blueprint_group_1_middleware(request):
global MIDDLEWARE_INVOKE_COUNTER
MIDDLEWARE_INVOKE_COUNTER["request"] += 1
@blueprint_3.route("/")
def blueprint_3_default_route(request):
return text("BP3_OK")
blueprint_group_2 = Blueprint.group(
blueprint_group_1, blueprint_3, url_prefix="/api"
)
@blueprint_group_2.middleware("response")
def blueprint_group_2_middleware(request, response):
global MIDDLEWARE_INVOKE_COUNTER
MIDDLEWARE_INVOKE_COUNTER["response"] += 1
app.blueprint(blueprint_group_2)
@app.route("/")
def app_default_route(request):
return text("APP_OK")
_, response = app.test_client.get("/")
assert response.text == "APP_OK"
_, response = app.test_client.get("/api/bp/bp1")
assert response.text == "BP1_OK"
_, response = app.test_client.get("/api/bp/bp2")
assert response.text == "BP2_OK"
_, response = app.test_client.get("/api/bp3")
assert response.text == "BP3_OK"
assert MIDDLEWARE_INVOKE_COUNTER["response"] == 4
assert MIDDLEWARE_INVOKE_COUNTER["request"] == 4
def test_bp_group_list_operations(app: Sanic):
blueprint_1 = Blueprint("blueprint_1", url_prefix="/bp1")
blueprint_2 = Blueprint("blueprint_2", url_prefix="/bp2")
@blueprint_1.route("/")
def blueprint_1_default_route(request):
return text("BP1_OK")
@blueprint_2.route("/")
def blueprint_2_default_route(request):
return text("BP2_OK")
blueprint_group_1 = Blueprint.group(
blueprint_1, blueprint_2, url_prefix="/bp"
)
blueprint_3 = Blueprint("blueprint_2", url_prefix="/bp3")
@blueprint_3.route("/second")
def blueprint_3_second_route(request):
return text("BP3_OK")
assert len(blueprint_group_1) == 2
blueprint_group_1.append(blueprint_3)
assert len(blueprint_group_1) == 3
del blueprint_group_1[2]
assert len(blueprint_group_1) == 2
blueprint_group_1[1] = blueprint_3
assert len(blueprint_group_1) == 2
assert blueprint_group_1.url_prefix == "/bp"

View File

@ -7,9 +7,9 @@ import pytest
from sanic.app import Sanic
from sanic.blueprints import Blueprint
from sanic.constants import HTTP_METHODS
from sanic.exceptions import NotFound, ServerError, InvalidUsage
from sanic.exceptions import InvalidUsage, NotFound, ServerError
from sanic.request import Request
from sanic.response import text, json
from sanic.response import json, text
from sanic.views import CompositionView
@ -467,16 +467,8 @@ def test_bp_shorthand(app):
request, response = app.test_client.get("/delete")
assert response.status == 405
request, response = app.test_client.get(
"/ws/",
headers={
"Upgrade": "websocket",
"Connection": "upgrade",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"Sec-WebSocket-Version": "13",
},
)
assert response.status == 101
request, response = app.test_client.websocket("/ws/")
assert response.opened is True
assert ev.is_set()
@ -595,14 +587,13 @@ def test_blueprint_middleware_with_args(app: Sanic):
"/wa", headers={"content-type": "plain/text"}
)
assert response.json.get("test") == "value"
d = {}
@pytest.mark.parametrize("file_name", ["test.file"])
def test_static_blueprint_name(app: Sanic, static_file_directory, file_name):
current_file = inspect.getfile(inspect.currentframe())
with open(current_file, "rb") as file:
current_file_contents = file.read()
file.read()
bp = Blueprint(name="static", url_prefix="/static", strict_slashes=False)
@ -662,16 +653,8 @@ def test_websocket_route(app: Sanic):
app.blueprint(bp)
_, response = app.test_client.get(
"/ws/test",
headers={
"Upgrade": "websocket",
"Connection": "upgrade",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"Sec-WebSocket-Version": "13",
},
)
assert response.status == 101
_, response = app.test_client.websocket("/ws/test")
assert response.opened is True
assert event.is_set()

View File

@ -1,12 +1,13 @@
from contextlib import contextmanager
from os import environ
from pathlib import Path
from contextlib import contextmanager
from tempfile import TemporaryDirectory
from textwrap import dedent
import pytest
from sanic import Sanic
from sanic.config import Config, DEFAULT_CONFIG
from sanic.config import DEFAULT_CONFIG, Config
from sanic.exceptions import PyFileError
@ -144,9 +145,10 @@ def test_overwrite_exisiting_config_ignore_lowercase(app):
def test_missing_config(app):
with pytest.raises(AttributeError) as e:
app.config.NON_EXISTENT
assert str(e.value) == ("Config has no 'NON_EXISTENT'")
with pytest.raises(
AttributeError, match="Config has no 'NON_EXISTENT'"
) as e:
_ = app.config.NON_EXISTENT
def test_config_defaults():
@ -166,7 +168,7 @@ def test_config_custom_defaults():
custom_defaults = {
"REQUEST_MAX_SIZE": 1,
"KEEP_ALIVE": False,
"ACCESS_LOG": False
"ACCESS_LOG": False,
}
conf = Config(defaults=custom_defaults)
for key, value in DEFAULT_CONFIG.items():
@ -182,13 +184,13 @@ def test_config_custom_defaults_with_env():
custom_defaults = {
"REQUEST_MAX_SIZE123": 1,
"KEEP_ALIVE123": False,
"ACCESS_LOG123": False
"ACCESS_LOG123": False,
}
environ_defaults = {
"SANIC_REQUEST_MAX_SIZE123": "2",
"SANIC_KEEP_ALIVE123": "True",
"SANIC_ACCESS_LOG123": "False"
"SANIC_ACCESS_LOG123": "False",
}
for key, value in environ_defaults.items():
@ -201,8 +203,8 @@ def test_config_custom_defaults_with_env():
try:
value = int(value)
except ValueError:
if value in ['True', 'False']:
value = value == 'True'
if value in ["True", "False"]:
value = value == "True"
assert getattr(conf, key) == value
@ -213,7 +215,7 @@ def test_config_custom_defaults_with_env():
def test_config_access_log_passing_in_run(app):
assert app.config.ACCESS_LOG == True
@app.listener('after_server_start')
@app.listener("after_server_start")
async def _request(sanic, loop):
app.stop()
@ -227,14 +229,18 @@ def test_config_access_log_passing_in_run(app):
async def test_config_access_log_passing_in_create_server(app):
assert app.config.ACCESS_LOG == True
@app.listener('after_server_start')
@app.listener("after_server_start")
async def _request(sanic, loop):
app.stop()
await app.create_server(port=1341, access_log=False)
await app.create_server(
port=1341, access_log=False, return_asyncio_server=True
)
assert app.config.ACCESS_LOG == False
await app.create_server(port=1342, access_log=True)
await app.create_server(
port=1342, access_log=True, return_asyncio_server=True
)
assert app.config.ACCESS_LOG == True

View File

@ -1,8 +1,11 @@
from datetime import datetime, timedelta
from http.cookies import SimpleCookie
from sanic.response import text
import pytest
from sanic.cookies import Cookie
from sanic.response import text
# ------------------------------------------------------------ #
# GET
@ -135,10 +138,10 @@ def test_cookie_set_same_key(app):
request, response = app.test_client.get("/", cookies=cookies)
assert response.status == 200
assert response.cookies["test"].value == "pass"
assert response.cookies["test"] == "pass"
@pytest.mark.parametrize("max_age", ["0", 30, "30"])
@pytest.mark.parametrize("max_age", ["0", 30, 30.0, 30.1, "30", "test"])
def test_cookie_max_age(app, max_age):
cookies = {"test": "wait"}
@ -149,18 +152,42 @@ def test_cookie_max_age(app, max_age):
response.cookies["test"]["max-age"] = max_age
return response
request, response = app.test_client.get("/", cookies=cookies)
request, response = app.test_client.get(
"/", cookies=cookies, raw_cookies=True
)
assert response.status == 200
assert response.cookies["test"].value == "pass"
assert response.cookies["test"]["max-age"] == str(max_age)
cookie = response.cookies.get("test")
if (
str(max_age).isdigit()
and int(max_age) == float(max_age)
and int(max_age) != 0
):
cookie_expires = datetime.utcfromtimestamp(
response.raw_cookies["test"].expires
).replace(microsecond=0)
# Grabbing utcnow after the response may lead to it being off slightly.
# Therefore, we 0 out the microseconds, and accept the test if there
# is a 1 second difference.
expires = datetime.utcnow().replace(microsecond=0) + timedelta(
seconds=int(max_age)
)
assert cookie == "pass"
assert (
cookie_expires == expires
or cookie_expires == expires + timedelta(seconds=-1)
)
else:
assert cookie is None
@pytest.mark.parametrize(
"expires",
[datetime.now() + timedelta(seconds=60), "Fri, 21-Dec-2018 15:30:00 GMT"],
"expires", [datetime.utcnow() + timedelta(seconds=60)]
)
def test_cookie_expires(app, expires):
expires = expires.replace(microsecond=0)
cookies = {"test": "wait"}
@app.get("/")
@ -170,12 +197,21 @@ def test_cookie_expires(app, expires):
response.cookies["test"]["expires"] = expires
return response
request, response = app.test_client.get("/", cookies=cookies)
request, response = app.test_client.get(
"/", cookies=cookies, raw_cookies=True
)
cookie_expires = datetime.utcfromtimestamp(
response.raw_cookies["test"].expires
).replace(microsecond=0)
assert response.status == 200
assert response.cookies["test"] == "pass"
assert cookie_expires == expires
assert response.cookies["test"].value == "pass"
if isinstance(expires, datetime):
expires = expires.strftime("%a, %d-%b-%Y %T GMT")
assert response.cookies["test"]["expires"] == expires
@pytest.mark.parametrize("expires", ["Fri, 21-Dec-2018 15:30:00 GMT"])
def test_cookie_expires_illegal_instance_type(expires):
c = Cookie("test_cookie", "value")
with pytest.raises(expected_exception=TypeError) as e:
c["expires"] = expires
assert e.message == "Cookie 'expires' property must be a datetime"

View File

@ -1,7 +1,9 @@
from sanic.response import text
from threading import Event
import asyncio
from queue import Queue
from threading import Event
from sanic.response import text
def test_create_task(app):

View File

@ -1,5 +1,5 @@
from sanic.server import HttpProtocol
from sanic.response import text
from sanic.server import HttpProtocol
class CustomHttpProtocol(HttpProtocol):

View File

@ -1,6 +1,7 @@
import pytest
from sanic.response import text
from sanic.router import RouteExists
import pytest
@pytest.mark.parametrize(
@ -39,5 +40,5 @@ def test_overload_dynamic_routes_exist(app):
with pytest.raises(RouteExists):
@app.route("/overload/<param>", methods=["PUT", "DELETE"])
async def handler3(request):
async def handler3(request, param):
return text("Duplicated")

View File

@ -1,10 +1,17 @@
import pytest
from bs4 import BeautifulSoup
from sanic import Sanic
from sanic.exceptions import (
Forbidden,
InvalidUsage,
NotFound,
ServerError,
Unauthorized,
abort,
)
from sanic.response import text
from sanic.exceptions import InvalidUsage, ServerError, NotFound, Unauthorized
from sanic.exceptions import Forbidden, abort
class SanicExceptionTestException(Exception):
@ -74,7 +81,7 @@ def exception_app():
@app.route("/divide_by_zero")
def handle_unhandled_exception(request):
1 / 0
_ = 1 / 0
@app.route("/error_in_error_handler_handler")
def custom_error_handler(request):
@ -82,7 +89,7 @@ def exception_app():
@app.exception(SanicExceptionTestException)
def error_in_error_handler_handler(request, exception):
1 / 0
_ = 1 / 0
return app

View File

@ -1,9 +1,11 @@
from sanic import Sanic
from sanic.response import text
from sanic.exceptions import InvalidUsage, ServerError, NotFound
from sanic.handlers import ErrorHandler
from bs4 import BeautifulSoup
from sanic import Sanic
from sanic.exceptions import InvalidUsage, NotFound, ServerError
from sanic.handlers import ErrorHandler
from sanic.response import text
exception_handler_app = Sanic("test_exception_handler")

View File

@ -1,41 +1,143 @@
from json import JSONDecodeError
from sanic import Sanic
import asyncio
import functools
import socket
from asyncio import sleep as aio_sleep
from http.client import _encode
from json import JSONDecodeError
import httpcore
import requests_async as requests
from httpcore import PoolTimeout
from sanic import Sanic, server
from sanic.response import text
from sanic import server
import aiohttp
from aiohttp import TCPConnector
from sanic.testing import SanicTestClient, HOST, PORT
from sanic.testing import HOST, PORT, SanicTestClient
CONFIG_FOR_TESTS = {
"KEEP_ALIVE_TIMEOUT": 2,
"KEEP_ALIVE": True
}
# import traceback
class ReuseableTCPConnector(TCPConnector):
def __init__(self, *args, **kwargs):
super(ReuseableTCPConnector, self).__init__(*args, **kwargs)
self.old_proto = None
async def connect(self, req, *args, **kwargs):
new_conn = await super(ReuseableTCPConnector, self).connect(
req, *args, **kwargs
)
if self.old_proto is not None:
if self.old_proto != new_conn._protocol:
CONFIG_FOR_TESTS = {"KEEP_ALIVE_TIMEOUT": 2, "KEEP_ALIVE": True}
old_conn = None
class ReusableSanicConnectionPool(httpcore.ConnectionPool):
async def acquire_connection(self, url, ssl, timeout):
global old_conn
if timeout.connect_timeout and not timeout.pool_timeout:
timeout.pool_timeout = timeout.connect_timeout
key = (url.scheme, url.hostname, url.port, ssl, timeout)
try:
connection = self._keepalive_connections[key].pop()
if not self._keepalive_connections[key]:
del self._keepalive_connections[key]
self.num_keepalive_connections -= 1
self.num_active_connections += 1
except (KeyError, IndexError):
ssl_context = await self.get_ssl_context(url, ssl)
try:
await asyncio.wait_for(
self._max_connections.acquire(), timeout.pool_timeout
)
except asyncio.TimeoutError:
raise PoolTimeout()
release = functools.partial(self.release_connection, key=key)
connection = httpcore.connections.Connection(
timeout=timeout, on_release=release
)
self.num_active_connections += 1
await connection.open(url.hostname, url.port, ssl=ssl_context)
if old_conn is not None:
if old_conn != connection:
raise RuntimeError(
"We got a new connection, wanted the same one!"
)
print(new_conn.__dict__)
self.old_proto = new_conn._protocol
return new_conn
old_conn = connection
return connection
class ReusableSanicAdapter(requests.adapters.HTTPAdapter):
def __init__(self):
self.pool = ReusableSanicConnectionPool()
async def send(
self,
request,
stream=False,
timeout=None,
verify=True,
cert=None,
proxies=None,
):
method = request.method
url = request.url
headers = [
(_encode(k), _encode(v)) for k, v in request.headers.items()
]
if not request.body:
body = b""
elif isinstance(request.body, str):
body = _encode(request.body)
else:
body = request.body
if isinstance(timeout, tuple):
timeout_kwargs = {
"connect_timeout": timeout[0],
"read_timeout": timeout[1],
}
else:
timeout_kwargs = {
"connect_timeout": timeout,
"read_timeout": timeout,
}
ssl = httpcore.SSLConfig(cert=cert, verify=verify)
timeout = httpcore.TimeoutConfig(**timeout_kwargs)
try:
response = await self.pool.request(
method,
url,
headers=headers,
body=body,
stream=stream,
ssl=ssl,
timeout=timeout,
)
except (httpcore.BadResponse, socket.error) as err:
raise ConnectionError(err, request=request)
except httpcore.ConnectTimeout as err:
raise requests.exceptions.ConnectTimeout(err, request=request)
except httpcore.ReadTimeout as err:
raise requests.exceptions.ReadTimeout(err, request=request)
return self.build_response(request, response)
class ResusableSanicSession(requests.Session):
def __init__(self, *args, **kwargs) -> None:
super().__init__(*args, **kwargs)
adapter = ReusableSanicAdapter()
self.mount("http://", adapter)
self.mount("https://", adapter)
class ReuseableSanicTestClient(SanicTestClient):
def __init__(self, app, loop=None):
super(ReuseableSanicTestClient, self).__init__(app)
super().__init__(app)
if loop is None:
loop = asyncio.get_event_loop()
self._loop = loop
@ -51,14 +153,13 @@ class ReuseableSanicTestClient(SanicTestClient):
uri="/",
gather_request=True,
debug=False,
server_kwargs={},
server_kwargs={"return_asyncio_server": True},
*request_args,
**request_kwargs
**request_kwargs,
):
loop = self._loop
results = [None, None]
exceptions = []
do_kill_server = request_kwargs.pop("end_server", False)
if gather_request:
def _collect_request(request):
@ -67,21 +168,27 @@ class ReuseableSanicTestClient(SanicTestClient):
self.app.request_middleware.appendleft(_collect_request)
if uri.startswith(
("http:", "https:", "ftp:", "ftps://", "//", "ws:", "wss:")
):
url = uri
else:
uri = uri if uri.startswith("/") else "/{uri}".format(uri=uri)
scheme = "http"
url = "{scheme}://{host}:{port}{uri}".format(
scheme=scheme, host=HOST, port=PORT, uri=uri
)
@self.app.listener("after_server_start")
async def _collect_response(loop):
try:
if do_kill_server:
request_kwargs["end_session"] = True
response = await self._local_request(
method, uri, *request_args, **request_kwargs
method, url, *request_args, **request_kwargs
)
results[-1] = response
except Exception as e2:
import traceback
traceback.print_tb(e2.__traceback__)
# traceback.print_tb(e2.__traceback__)
exceptions.append(e2)
# Don't stop here! self.app.stop()
if self._server is not None:
_server = self._server
@ -96,27 +203,14 @@ class ReuseableSanicTestClient(SanicTestClient):
try:
loop._stopping = False
http_server = loop.run_until_complete(_server_co)
_server = loop.run_until_complete(_server_co)
except Exception as e1:
import traceback
traceback.print_tb(e1.__traceback__)
# traceback.print_tb(e1.__traceback__)
raise e1
self._server = _server = http_server
self._server = _server
server.trigger_events(self.app.listeners["after_server_start"], loop)
self.app.listeners["after_server_start"].pop()
if do_kill_server:
try:
_server.close()
self._server = None
loop.run_until_complete(_server.wait_closed())
self.app.stop()
except Exception as e3:
import traceback
traceback.print_tb(e3.__traceback__)
exceptions.append(e3)
if exceptions:
raise ValueError("Exception during request: {}".format(exceptions))
@ -139,59 +233,61 @@ class ReuseableSanicTestClient(SanicTestClient):
"Request object expected, got ({})".format(results)
)
def kill_server(self):
try:
if self._server:
self._server.close()
self._loop.run_until_complete(self._server.wait_closed())
self._server = None
self.app.stop()
if self._session:
self._loop.run_until_complete(self._session.close())
self._session = None
except Exception as e3:
raise e3
# Copied from SanicTestClient, but with some changes to reuse the
# same TCPConnection and the same ClientSession more than once.
# Note, you cannot use the same session if you are in a _different_
# loop, so the changes above are required too.
async def _local_request(self, method, uri, cookies=None, *args, **kwargs):
async def _local_request(self, method, url, *args, **kwargs):
raw_cookies = kwargs.pop("raw_cookies", None)
request_keepalive = kwargs.pop(
"request_keepalive", CONFIG_FOR_TESTS['KEEP_ALIVE_TIMEOUT']
"request_keepalive", CONFIG_FOR_TESTS["KEEP_ALIVE_TIMEOUT"]
)
if uri.startswith(("http:", "https:", "ftp:", "ftps://" "//")):
url = uri
else:
url = "http://{host}:{port}{uri}".format(
host=HOST, port=self.port, uri=uri
)
do_kill_session = kwargs.pop("end_session", False)
if self._session:
session = self._session
_session = self._session
else:
if self._tcp_connector:
conn = self._tcp_connector
else:
conn = ReuseableTCPConnector(
ssl=False,
loop=self._loop,
keepalive_timeout=request_keepalive,
_session = ResusableSanicSession()
self._session = _session
async with _session as session:
try:
response = await getattr(session, method.lower())(
url,
verify=False,
timeout=request_keepalive,
*args,
**kwargs,
)
self._tcp_connector = conn
session = aiohttp.ClientSession(
cookies=cookies, connector=conn, loop=self._loop
)
self._session = session
async with getattr(session, method.lower())(
url, *args, **kwargs
) as response:
try:
response.text = await response.text()
except UnicodeDecodeError:
response.text = None
except NameError:
raise Exception(response.status_code)
try:
response.json = await response.json()
except (
JSONDecodeError,
UnicodeDecodeError,
aiohttp.ClientResponseError,
):
response.json = response.json()
except (JSONDecodeError, UnicodeDecodeError):
response.json = None
response.body = await response.read()
if do_kill_session:
await session.close()
self._session = None
response.status = response.status_code
response.content_type = response.headers.get("content-type")
if raw_cookies:
response.raw_cookies = {}
for cookie in response.cookies:
response.raw_cookies[cookie.name] = cookie
return response
@ -231,9 +327,10 @@ def test_keep_alive_timeout_reuse():
assert response.status == 200
assert response.text == "OK"
loop.run_until_complete(aio_sleep(1))
request, response = client.get("/1", end_server=True)
request, response = client.get("/1")
assert response.status == 200
assert response.text == "OK"
client.kill_server()
def test_keep_alive_client_timeout():
@ -243,20 +340,21 @@ def test_keep_alive_client_timeout():
asyncio.set_event_loop(loop)
client = ReuseableSanicTestClient(keep_alive_app_client_timeout, loop)
headers = {"Connection": "keep-alive"}
request, response = client.get("/1", headers=headers, request_keepalive=1)
assert response.status == 200
assert response.text == "OK"
loop.run_until_complete(aio_sleep(2))
exception = None
try:
request, response = client.get(
"/1", end_server=True, request_keepalive=1
"/1", headers=headers, request_keepalive=1
)
assert response.status == 200
assert response.text == "OK"
loop.run_until_complete(aio_sleep(2))
exception = None
request, response = client.get("/1", request_keepalive=1)
except ValueError as e:
exception = e
assert exception is not None
assert isinstance(exception, ValueError)
assert "got a new connection" in exception.args[0]
client.kill_server()
def test_keep_alive_server_timeout():
@ -268,15 +366,15 @@ def test_keep_alive_server_timeout():
asyncio.set_event_loop(loop)
client = ReuseableSanicTestClient(keep_alive_app_server_timeout, loop)
headers = {"Connection": "keep-alive"}
request, response = client.get("/1", headers=headers, request_keepalive=60)
assert response.status == 200
assert response.text == "OK"
loop.run_until_complete(aio_sleep(3))
exception = None
try:
request, response = client.get(
"/1", request_keepalive=60, end_server=True
"/1", headers=headers, request_keepalive=60
)
assert response.status == 200
assert response.text == "OK"
loop.run_until_complete(aio_sleep(3))
exception = None
request, response = client.get("/1", request_keepalive=60)
except ValueError as e:
exception = e
assert exception is not None
@ -285,3 +383,4 @@ def test_keep_alive_server_timeout():
"Connection reset" in exception.args[0]
or "got a new connection" in exception.args[0]
)
client.kill_server()

View File

@ -1,17 +1,17 @@
import uuid
import logging
import uuid
from io import StringIO
from importlib import reload
import pytest
from io import StringIO
from unittest.mock import Mock
import pytest
import sanic
from sanic.response import text
from sanic.log import LOGGING_CONFIG_DEFAULTS
from sanic import Sanic
from sanic.log import logger
from sanic.log import LOGGING_CONFIG_DEFAULTS, logger
from sanic.response import text
logging_format = """module: %(module)s; \

View File

@ -1,8 +1,9 @@
import logging
import asyncio
import logging
from sanic.config import BASE_LOGO
try:
import uvloop # noqa
@ -12,7 +13,7 @@ except BaseException:
def test_logo_base(app, caplog):
server = app.create_server(debug=True)
server = app.create_server(debug=True, return_asyncio_server=True)
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop._stopping = False
@ -31,7 +32,7 @@ def test_logo_base(app, caplog):
def test_logo_false(app, caplog):
app.config.LOGO = False
server = app.create_server(debug=True)
server = app.create_server(debug=True, return_asyncio_server=True)
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop._stopping = False
@ -50,7 +51,7 @@ def test_logo_false(app, caplog):
def test_logo_true(app, caplog):
app.config.LOGO = True
server = app.create_server(debug=True)
server = app.create_server(debug=True, return_asyncio_server=True)
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop._stopping = False
@ -69,7 +70,7 @@ def test_logo_true(app, caplog):
def test_logo_custom(app, caplog):
app.config.LOGO = "My Custom Logo"
server = app.create_server(debug=True)
server = app.create_server(debug=True, return_asyncio_server=True)
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop._stopping = False

View File

@ -1,10 +1,12 @@
import logging
from asyncio import CancelledError
from sanic.exceptions import NotFound
from sanic.request import Request
from sanic.response import HTTPResponse, text
# ------------------------------------------------------------ #
# GET
# ------------------------------------------------------------ #

View File

@ -1,11 +1,12 @@
import multiprocessing
import pickle
import random
import signal
import pickle
import pytest
from sanic.testing import HOST, PORT
from sanic.response import text
from sanic.testing import HOST, PORT
@pytest.mark.skipif(

View File

@ -2,12 +2,13 @@
# -*- coding: utf-8 -*-
import asyncio
import pytest
from sanic.blueprints import Blueprint
from sanic.response import text
from sanic.exceptions import URLBuildError
from sanic.constants import HTTP_METHODS
from sanic.exceptions import URLBuildError
from sanic.response import text
# ------------------------------------------------------------ #

View File

@ -1,7 +1,8 @@
import pytest
from urllib.parse import quote
from sanic.response import text, redirect
import pytest
from sanic.response import redirect, text
@pytest.fixture

View File

@ -1,7 +1,7 @@
import asyncio
import contextlib
from sanic.response import text, stream
from sanic.response import stream, text
async def test_request_cancel_when_connection_lost(loop, app, test_client):

View File

@ -2,6 +2,7 @@ import random
from sanic.response import json
try:
from ujson import loads
except ImportError:

View File

@ -1,10 +1,8 @@
import asyncio
from sanic.blueprints import Blueprint
from sanic.views import CompositionView
from sanic.views import HTTPMethodView
from sanic.views import stream as stream_decorator
from sanic.response import stream, text
from sanic.request import StreamBuffer
from sanic.response import stream, text
from sanic.views import CompositionView, HTTPMethodView
from sanic.views import stream as stream_decorator
data = "abc" * 10000000
@ -270,6 +268,21 @@ def test_request_stream_blueprint(app):
return stream(streaming)
async def post_add_route(request):
assert isinstance(request.stream, StreamBuffer)
async def streaming(response):
while True:
body = await request.stream.read()
if body is None:
break
await response.write(body.decode("utf-8"))
return stream(streaming)
bp.add_route(
post_add_route, "/post/add_route", methods=["POST"], stream=True
)
app.blueprint(bp)
assert app.is_request_stream is True
@ -314,6 +327,10 @@ def test_request_stream_blueprint(app):
assert response.status == 200
assert response.text == data
request, response = app.test_client.post("/post/add_route", data=data)
assert response.status == 200
assert response.text == data
def test_request_stream_composition_view(app):
"""for self.is_request_stream = True"""

View File

@ -1,185 +1,73 @@
from json import JSONDecodeError
import asyncio
import httpcore
import requests_async as requests
from sanic import Sanic
import asyncio
from sanic.response import text
import aiohttp
from aiohttp import TCPConnector
from sanic.testing import SanicTestClient, HOST
try:
try:
# direct use
import packaging
version = packaging.version
except (ImportError, AttributeError):
# setuptools v39.0 and above.
try:
from setuptools.extern import packaging
except ImportError:
# Before setuptools v39.0
from pkg_resources.extern import packaging
version = packaging.version
except ImportError:
raise RuntimeError("The 'packaging' library is missing.")
from sanic.testing import SanicTestClient
aiohttp_version = version.parse(aiohttp.__version__)
class DelayableSanicConnectionPool(httpcore.ConnectionPool):
def __init__(self, request_delay=None, *args, **kwargs):
self._request_delay = request_delay
super().__init__(*args, **kwargs)
async def request(
self,
method,
url,
headers=(),
body=b"",
stream=False,
ssl=None,
timeout=None,
):
if ssl is None:
ssl = self.ssl_config
if timeout is None:
timeout = self.timeout
class DelayableTCPConnector(TCPConnector):
class RequestContextManager(object):
def __new__(cls, req, delay):
cls = super(
DelayableTCPConnector.RequestContextManager, cls
).__new__(cls)
cls.req = req
cls.send_task = None
cls.resp = None
cls.orig_send = getattr(req, "send")
cls.orig_start = None
cls.delay = delay
cls._acting_as = req
return cls
def __getattr__(self, item):
acting_as = self._acting_as
return getattr(acting_as, item)
async def start(self, connection, read_until_eof=False):
if self.send_task is None:
raise RuntimeError("do a send() before you do a start()")
resp = await self.send_task
self.send_task = None
self.resp = resp
self._acting_as = self.resp
self.orig_start = getattr(resp, "start")
try:
if aiohttp_version >= version.parse("3.3.0"):
ret = await self.orig_start(connection)
else:
ret = await self.orig_start(connection, read_until_eof)
except Exception as e:
raise e
return ret
def close(self):
if self.resp is not None:
self.resp.close()
if self.send_task is not None:
self.send_task.cancel()
async def delayed_send(self, *args, **kwargs):
req = self.req
if self.delay and self.delay > 0:
# sync_sleep(self.delay)
await asyncio.sleep(self.delay)
t = req.loop.time()
print("sending at {}".format(t), flush=True)
next(iter(args)) # first arg is connection
try:
return await self.orig_send(*args, **kwargs)
except Exception as e:
if aiohttp_version < version.parse("3.1.0"):
return aiohttp.ClientResponse(req.method, req.url)
kw = dict(
writer=None,
continue100=None,
timer=None,
request_info=None,
traces=[],
loop=req.loop,
session=None,
)
if aiohttp_version < version.parse("3.3.0"):
kw["auto_decompress"] = None
return aiohttp.ClientResponse(req.method, req.url, **kw)
def _send(self, *args, **kwargs):
gen = self.delayed_send(*args, **kwargs)
task = self.req.loop.create_task(gen)
self.send_task = task
self._acting_as = task
return self
if aiohttp_version >= version.parse("3.1.0"):
# aiohttp changed the request.send method to async
async def send(self, *args, **kwargs):
return self._send(*args, **kwargs)
else:
send = _send
def __init__(self, *args, **kwargs):
_post_connect_delay = kwargs.pop("post_connect_delay", 0)
_pre_request_delay = kwargs.pop("pre_request_delay", 0)
super(DelayableTCPConnector, self).__init__(*args, **kwargs)
self._post_connect_delay = _post_connect_delay
self._pre_request_delay = _pre_request_delay
async def connect(self, req, *args, **kwargs):
d_req = DelayableTCPConnector.RequestContextManager(
req, self._pre_request_delay
parsed_url = httpcore.URL(url)
request = httpcore.Request(
method, parsed_url, headers=headers, body=body
)
conn = await super(DelayableTCPConnector, self).connect(
req, *args, **kwargs
connection = await self.acquire_connection(
parsed_url, ssl=ssl, timeout=timeout
)
if self._post_connect_delay and self._post_connect_delay > 0:
await asyncio.sleep(self._post_connect_delay, loop=self._loop)
req.send = d_req.send
t = req.loop.time()
print("Connected at {}".format(t), flush=True)
return conn
if self._request_delay:
print(f"\t>> Sleeping ({self._request_delay})")
await asyncio.sleep(self._request_delay)
response = await connection.send(request)
if not stream:
try:
await response.read()
finally:
await response.close()
return response
class DelayableSanicAdapter(requests.adapters.HTTPAdapter):
def __init__(self, request_delay=None):
self.pool = DelayableSanicConnectionPool(request_delay=request_delay)
class DelayableSanicSession(requests.Session):
def __init__(self, request_delay=None, *args, **kwargs) -> None:
super().__init__(*args, **kwargs)
adapter = DelayableSanicAdapter(request_delay=request_delay)
self.mount("http://", adapter)
self.mount("https://", adapter)
class DelayableSanicTestClient(SanicTestClient):
def __init__(self, app, loop, request_delay=1):
super(DelayableSanicTestClient, self).__init__(app)
def __init__(self, app, request_delay=None):
super().__init__(app)
self._request_delay = request_delay
self._loop = None
async def _local_request(self, method, uri, cookies=None, *args, **kwargs):
if self._loop is None:
self._loop = asyncio.get_event_loop()
if uri.startswith(("http:", "https:", "ftp:", "ftps://" "//")):
url = uri
else:
url = "http://{host}:{port}{uri}".format(
host=HOST, port=self.port, uri=uri
)
conn = DelayableTCPConnector(
pre_request_delay=self._request_delay,
ssl=False,
loop=self._loop,
)
async with aiohttp.ClientSession(
cookies=cookies, connector=conn, loop=self._loop
) as session:
# Insert a delay after creating the connection
# But before sending the request.
async with getattr(session, method.lower())(
url, *args, **kwargs
) as response:
try:
response.text = await response.text()
except UnicodeDecodeError:
response.text = None
try:
response.json = await response.json()
except (
JSONDecodeError,
UnicodeDecodeError,
aiohttp.ClientResponseError,
):
response.json = None
response.body = await response.read()
return response
def get_new_session(self):
return DelayableSanicSession(request_delay=self._request_delay)
request_timeout_default_app = Sanic("test_request_timeout_default")
@ -204,14 +92,14 @@ async def ws_handler1(request, ws):
def test_default_server_error_request_timeout():
client = DelayableSanicTestClient(request_timeout_default_app, None, 2)
client = DelayableSanicTestClient(request_timeout_default_app, 2)
request, response = client.get("/1")
assert response.status == 408
assert response.text == "Error: Request Timeout"
def test_default_server_error_request_dont_timeout():
client = DelayableSanicTestClient(request_no_timeout_app, None, 0.2)
client = DelayableSanicTestClient(request_no_timeout_app, 0.2)
request, response = client.get("/1")
assert response.status == 200
assert response.text == "OK"
@ -226,7 +114,7 @@ def test_default_server_error_websocket_request_timeout():
"Sec-WebSocket-Version": "13",
}
client = DelayableSanicTestClient(request_timeout_default_app, None, 2)
client = DelayableSanicTestClient(request_timeout_default_app, 2)
request, response = client.get("/ws1", headers=headers)
assert response.status == 408

View File

@ -1,19 +1,20 @@
import logging
import os
import ssl
from json import dumps as json_dumps
from json import loads as json_loads
from urllib.parse import urlparse
import pytest
from sanic import Sanic
from sanic import Blueprint
from sanic import Blueprint, Sanic
from sanic.exceptions import ServerError
from sanic.request import DEFAULT_HTTP_CONTENT_TYPE
from sanic.request import DEFAULT_HTTP_CONTENT_TYPE, RequestParameters
from sanic.response import json, text
from sanic.testing import HOST, PORT
# ------------------------------------------------------------ #
# GET
# ------------------------------------------------------------ #
@ -29,7 +30,7 @@ def test_sync(app):
assert response.text == "Hello"
def test_remote_address(app):
def test_ip(app):
@app.route("/")
def handler(request):
return text("{}".format(request.ip))
@ -130,11 +131,14 @@ def test_query_string(app):
assert request.args.get("test1") == "1"
assert request.args.get("test2") == "false"
assert request.args.getlist("test2") == ["false", "true"]
assert request.args.getlist("test1") == ["1"]
assert request.args.get("test3", default="My value") == "My value"
def test_uri_template(app):
@app.route("/foo/<id:int>/bar/<name:[A-z]+>")
async def handler(request):
async def handler(request, id, name):
return text("OK")
request, response = app.test_client.get("/foo/123/bar/baz")
@ -200,11 +204,23 @@ def test_content_type(app):
assert response.text == "application/json"
def test_remote_addr(app):
def test_remote_addr_with_two_proxies(app):
app.config.PROXIES_COUNT = 2
@app.route("/")
async def handler(request):
return text(request.remote_addr)
headers = {"X-Real-IP": "127.0.0.2", "X-Forwarded-For": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == "127.0.0.2"
assert response.text == "127.0.0.2"
headers = {"X-Forwarded-For": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == ""
assert response.text == ""
headers = {"X-Forwarded-For": "127.0.0.1, 127.0.1.2"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == "127.0.0.1"
@ -219,6 +235,86 @@ def test_remote_addr(app):
assert request.remote_addr == "127.0.0.1"
assert response.text == "127.0.0.1"
headers = {
"X-Forwarded-For": ", 127.0.2.2, , ,127.0.0.1, , ,,127.0.1.2"
}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == "127.0.0.1"
assert response.text == "127.0.0.1"
def test_remote_addr_with_infinite_number_of_proxies(app):
app.config.PROXIES_COUNT = -1
@app.route("/")
async def handler(request):
return text(request.remote_addr)
headers = {"X-Real-IP": "127.0.0.2", "X-Forwarded-For": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == "127.0.0.2"
assert response.text == "127.0.0.2"
headers = {"X-Forwarded-For": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == "127.0.1.1"
assert response.text == "127.0.1.1"
headers = {
"X-Forwarded-For": "127.0.0.5, 127.0.0.4, 127.0.0.3, 127.0.0.2, 127.0.0.1"
}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == "127.0.0.5"
assert response.text == "127.0.0.5"
def test_remote_addr_without_proxy(app):
app.config.PROXIES_COUNT = 0
@app.route("/")
async def handler(request):
return text(request.remote_addr)
headers = {"X-Real-IP": "127.0.0.2", "X-Forwarded-For": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == ""
assert response.text == ""
headers = {"X-Forwarded-For": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == ""
assert response.text == ""
headers = {"X-Forwarded-For": "127.0.0.1, 127.0.1.2"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == ""
assert response.text == ""
def test_remote_addr_custom_headers(app):
app.config.PROXIES_COUNT = 1
app.config.REAL_IP_HEADER = "Client-IP"
app.config.FORWARDED_FOR_HEADER = "Forwarded"
@app.route("/")
async def handler(request):
return text(request.remote_addr)
headers = {"X-Real-IP": "127.0.0.2", "Forwarded": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == "127.0.1.1"
assert response.text == "127.0.1.1"
headers = {"X-Forwarded-For": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == ""
assert response.text == ""
headers = {"Client-IP": "127.0.0.2", "Forwarded": "127.0.1.1"}
request, response = app.test_client.get("/", headers=headers)
assert request.remote_addr == "127.0.0.2"
assert response.text == "127.0.0.2"
def test_match_info(app):
@app.route("/api/v1/user/<user_id>/")
@ -432,21 +528,59 @@ def test_request_string_representation(app):
@pytest.mark.parametrize(
"payload",
"payload,filename",
[
"------sanic\r\n"
'Content-Disposition: form-data; filename="filename"; name="test"\r\n'
"\r\n"
"OK\r\n"
"------sanic--\r\n",
"------sanic\r\n"
'content-disposition: form-data; filename="filename"; name="test"\r\n'
"\r\n"
'content-type: application/json; {"field": "value"}\r\n'
"------sanic--\r\n",
(
"------sanic\r\n"
'Content-Disposition: form-data; filename="filename"; name="test"\r\n'
"\r\n"
"OK\r\n"
"------sanic--\r\n",
"filename",
),
(
"------sanic\r\n"
'content-disposition: form-data; filename="filename"; name="test"\r\n'
"\r\n"
'content-type: application/json; {"field": "value"}\r\n'
"------sanic--\r\n",
"filename",
),
(
"------sanic\r\n"
'Content-Disposition: form-data; filename=""; name="test"\r\n'
"\r\n"
"OK\r\n"
"------sanic--\r\n",
"",
),
(
"------sanic\r\n"
'content-disposition: form-data; filename=""; name="test"\r\n'
"\r\n"
'content-type: application/json; {"field": "value"}\r\n'
"------sanic--\r\n",
"",
),
(
"------sanic\r\n"
'Content-Disposition: form-data; filename*="utf-8\'\'filename_%C2%A0_test"; name="test"\r\n'
"\r\n"
"OK\r\n"
"------sanic--\r\n",
"filename_\u00A0_test",
),
(
"------sanic\r\n"
'content-disposition: form-data; filename*="utf-8\'\'filename_%C2%A0_test"; name="test"\r\n'
"\r\n"
'content-type: application/json; {"field": "value"}\r\n'
"------sanic--\r\n",
"filename_\u00A0_test",
),
],
)
def test_request_multipart_files(app, payload):
def test_request_multipart_files(app, payload, filename):
@app.route("/", methods=["POST"])
async def post(request):
return text("OK")
@ -454,7 +588,7 @@ def test_request_multipart_files(app, payload):
headers = {"content-type": "multipart/form-data; boundary=----sanic"}
request, _ = app.test_client.post(data=payload, headers=headers)
assert request.files.get("test").name == "filename"
assert request.files.get("test").name == filename
def test_request_multipart_file_with_json_content_type(app):
@ -566,7 +700,7 @@ def test_request_repr(app):
assert repr(request) == "<Request: GET />"
request.method = None
assert repr(request) == "<Request>"
assert repr(request) == "<Request: None />"
def test_request_bool(app):
@ -626,6 +760,75 @@ def test_request_raw_args(app):
assert request.raw_args == params
def test_request_query_args(app):
# test multiple params with the same key
params = [("test", "value1"), ("test", "value2")]
@app.get("/")
def handler(request):
return text("pass")
request, response = app.test_client.get("/", params=params)
assert request.query_args == params
# test cached value
assert (
request.parsed_not_grouped_args[(False, False, "utf-8", "replace")]
== request.query_args
)
# test params directly in the url
request, response = app.test_client.get("/?test=value1&test=value2")
assert request.query_args == params
# test unique params
params = [("test1", "value1"), ("test2", "value2")]
request, response = app.test_client.get("/", params=params)
assert request.query_args == params
# test no params
request, response = app.test_client.get("/")
assert not request.query_args
def test_request_query_args_custom_parsing(app):
@app.get("/")
def handler(request):
return text("pass")
request, response = app.test_client.get(
"/?test1=value1&test2=&test3=value3"
)
assert request.get_query_args(keep_blank_values=True) == [
("test1", "value1"),
("test2", ""),
("test3", "value3"),
]
assert request.query_args == [("test1", "value1"), ("test3", "value3")]
assert request.get_query_args(keep_blank_values=False) == [
("test1", "value1"),
("test3", "value3"),
]
assert request.get_args(keep_blank_values=True) == RequestParameters(
{"test1": ["value1"], "test2": [""], "test3": ["value3"]}
)
assert request.args == RequestParameters(
{"test1": ["value1"], "test3": ["value3"]}
)
assert request.get_args(keep_blank_values=False) == RequestParameters(
{"test1": ["value1"], "test3": ["value3"]}
)
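# Hedged aside (not part of the diff): the cache key tuple above appears to
# mirror the query parsing options (keep_blank_values, strict_parsing,
# encoding, errors). The blank-value behaviour itself comes from the standard
# library parser, illustrated here:
from urllib.parse import parse_qsl

assert parse_qsl("test1=value1&test2=&test3=value3", keep_blank_values=True) == [
    ("test1", "value1"),
    ("test2", ""),
    ("test3", "value3"),
]
assert parse_qsl("test1=value1&test2=&test3=value3") == [
    ("test1", "value1"),
    ("test3", "value3"),
]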
def test_request_cookies(app):
cookies = {"test": "OK"}

View File

@ -1,6 +1,7 @@
import asyncio
import inspect
import os
from collections import namedtuple
from mimetypes import guess_type
from random import choice
@ -8,6 +9,7 @@ from unittest.mock import MagicMock
from urllib.parse import unquote
import pytest
from aiofiles import os as async_os
from sanic.response import (
@ -18,11 +20,11 @@ from sanic.response import (
json,
raw,
stream,
text,
)
from sanic.server import HttpProtocol
from sanic.testing import HOST, PORT
JSON_DATA = {"ok": True}
@ -77,10 +79,10 @@ def test_response_header(app):
request, response = app.test_client.get("/")
assert dict(response.headers) == {
"Connection": "keep-alive",
"Keep-Alive": str(app.config.KEEP_ALIVE_TIMEOUT),
"Content-Length": "11",
"Content-Type": "application/json",
"connection": "keep-alive",
"keep-alive": str(app.config.KEEP_ALIVE_TIMEOUT),
"content-length": "11",
"content-type": "application/json",
}
@ -192,22 +194,59 @@ def test_no_content(json_app):
def streaming_app(app):
@app.route("/")
async def test(request):
return stream(sample_streaming_fn, content_type="text/csv")
return stream(
sample_streaming_fn,
headers={"Content-Length": "7"},
content_type="text/csv",
)
return app
def test_streaming_adds_correct_headers(streaming_app):
@pytest.fixture
def non_chunked_streaming_app(app):
@app.route("/")
async def test(request):
return stream(
sample_streaming_fn,
headers={"Content-Length": "7"},
content_type="text/csv",
chunked=False,
)
return app
def test_chunked_streaming_adds_correct_headers(streaming_app):
request, response = streaming_app.test_client.get("/")
assert response.headers["Transfer-Encoding"] == "chunked"
assert response.headers["Content-Type"] == "text/csv"
# Content-Length is not allowed by HTTP/1.1 specification
# when "Transfer-Encoding: chunked" is used
assert "Content-Length" not in response.headers
def test_streaming_returns_correct_content(streaming_app):
def test_chunked_streaming_returns_correct_content(streaming_app):
request, response = streaming_app.test_client.get("/")
assert response.text == "foo,bar"
def test_non_chunked_streaming_adds_correct_headers(
non_chunked_streaming_app
):
request, response = non_chunked_streaming_app.test_client.get("/")
assert "Transfer-Encoding" not in response.headers
assert response.headers["Content-Type"] == "text/csv"
assert response.headers["Content-Length"] == "7"
def test_non_chunked_streaming_returns_correct_content(
non_chunked_streaming_app
):
request, response = non_chunked_streaming_app.test_client.get("/")
assert response.text == "foo,bar"
@pytest.mark.parametrize("status", [200, 201, 400, 401])
def test_stream_response_status_returns_correct_headers(status):
response = StreamingHTTPResponse(sample_streaming_fn, status=status)
@ -227,13 +266,27 @@ def test_stream_response_keep_alive_returns_correct_headers(
assert b"Keep-Alive: %s\r\n" % str(keep_alive_timeout).encode() in headers
def test_stream_response_includes_chunked_header():
def test_stream_response_includes_chunked_header_http11():
response = StreamingHTTPResponse(sample_streaming_fn)
headers = response.get_headers()
headers = response.get_headers(version="1.1")
assert b"Transfer-Encoding: chunked\r\n" in headers
def test_stream_response_writes_correct_content_to_transport(streaming_app):
def test_stream_response_does_not_include_chunked_header_http10():
response = StreamingHTTPResponse(sample_streaming_fn)
headers = response.get_headers(version="1.0")
assert b"Transfer-Encoding: chunked\r\n" not in headers
def test_stream_response_does_not_include_chunked_header_if_disabled():
response = StreamingHTTPResponse(sample_streaming_fn, chunked=False)
headers = response.get_headers(version="1.1")
assert b"Transfer-Encoding: chunked\r\n" not in headers
def test_stream_response_writes_correct_content_to_transport_when_chunked(
streaming_app
):
response = StreamingHTTPResponse(sample_streaming_fn)
response.protocol = MagicMock(HttpProtocol)
response.protocol.transport = MagicMock(asyncio.Transport)
@ -262,6 +315,42 @@ def test_stream_response_writes_correct_content_to_transport(streaming_app):
b"0\r\n\r\n"
)
assert len(response.protocol.transport.write.call_args_list) == 4
app.stop()
streaming_app.run(host=HOST, port=PORT)
def test_stream_response_writes_correct_content_to_transport_when_not_chunked(
streaming_app,
):
response = StreamingHTTPResponse(sample_streaming_fn)
response.protocol = MagicMock(HttpProtocol)
response.protocol.transport = MagicMock(asyncio.Transport)
async def mock_drain():
pass
def mock_push_data(data):
response.protocol.transport.write(data)
response.protocol.push_data = mock_push_data
response.protocol.drain = mock_drain
@streaming_app.listener("after_server_start")
async def run_stream(app, loop):
await response.stream(version="1.0")
assert response.protocol.transport.write.call_args_list[1][0][0] == (
b"foo,"
)
assert response.protocol.transport.write.call_args_list[2][0][0] == (
b"bar"
)
assert len(response.protocol.transport.write.call_args_list) == 3
app.stop()
streaming_app.run(host=HOST, port=PORT)
@ -276,7 +365,7 @@ def test_stream_response_with_cookies(app):
return response
request, response = app.test_client.get("/")
assert response.cookies["test"].value == "pass"
assert response.cookies["test"] == "pass"
def test_stream_response_without_cookies(app):

View File

@ -1,7 +1,9 @@
from sanic import Sanic
import asyncio
from sanic.response import text
from sanic import Sanic
from sanic.exceptions import ServiceUnavailable
from sanic.response import text
response_timeout_app = Sanic("test_response_timeout")
response_timeout_default_app = Sanic("test_response_timeout_default")

View File

@ -7,6 +7,7 @@ from sanic.constants import HTTP_METHODS
from sanic.response import json, text
from sanic.router import ParameterNameConflicts, RouteDoesNotExist, RouteExists
# ------------------------------------------------------------ #
# UTF-8
# ------------------------------------------------------------ #
@ -365,9 +366,18 @@ def test_dynamic_route_number(app):
request, response = app.test_client.get("/weight/1234.56")
assert response.status == 200
request, response = app.test_client.get("/weight/.12")
assert response.status == 200
request, response = app.test_client.get("/weight/12.")
assert response.status == 200
request, response = app.test_client.get("/weight/1234-56")
assert response.status == 404
request, response = app.test_client.get("/weight/12.34.56")
assert response.status == 404
def test_dynamic_route_regex(app):
@app.route("/folder/<folder_id:[A-Za-z0-9]{0,4}>")
@ -459,16 +469,8 @@ def test_websocket_route(app, url):
assert ws.subprotocol is None
ev.set()
request, response = app.test_client.get(
"/ws",
headers={
"Upgrade": "websocket",
"Connection": "upgrade",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"Sec-WebSocket-Version": "13",
},
)
assert response.status == 101
request, response = app.test_client.websocket(url)
assert response.opened is True
assert ev.is_set()
@ -478,54 +480,24 @@ def test_websocket_route_with_subprotocols(app):
@app.websocket("/ws", subprotocols=["foo", "bar"])
async def handler(request, ws):
results.append(ws.subprotocol)
assert ws.subprotocol is not None
request, response = app.test_client.get(
"/ws",
headers={
"Upgrade": "websocket",
"Connection": "upgrade",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"Sec-WebSocket-Version": "13",
"Sec-WebSocket-Protocol": "bar",
},
request, response = app.test_client.websocket("/ws", subprotocols=["bar"])
assert response.opened is True
assert results == ["bar"]
request, response = app.test_client.websocket(
"/ws", subprotocols=["bar", "foo"]
)
assert response.status == 101
assert response.opened is True
assert results == ["bar", "bar"]
request, response = app.test_client.get(
"/ws",
headers={
"Upgrade": "websocket",
"Connection": "upgrade",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"Sec-WebSocket-Version": "13",
"Sec-WebSocket-Protocol": "bar, foo",
},
)
assert response.status == 101
request, response = app.test_client.get(
"/ws",
headers={
"Upgrade": "websocket",
"Connection": "upgrade",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"Sec-WebSocket-Version": "13",
"Sec-WebSocket-Protocol": "baz",
},
)
assert response.status == 101
request, response = app.test_client.get(
"/ws",
headers={
"Upgrade": "websocket",
"Connection": "upgrade",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"Sec-WebSocket-Version": "13",
},
)
assert response.status == 101
request, response = app.test_client.websocket("/ws", subprotocols=["baz"])
assert response.opened is True
assert results == ["bar", "bar", None]
request, response = app.test_client.websocket("/ws")
assert response.opened is True
assert results == ["bar", "bar", None, None]
@ -538,16 +510,8 @@ def test_add_webscoket_route(app, strict_slashes):
ev.set()
app.add_websocket_route(handler, "/ws", strict_slashes=strict_slashes)
request, response = app.test_client.get(
"/ws",
headers={
"Upgrade": "websocket",
"Connection": "upgrade",
"Sec-WebSocket-Key": "dGhlIHNhbXBsZSBub25jZQ==",
"Sec-WebSocket-Version": "13",
},
)
assert response.status == 101
request, response = app.test_client.websocket("/ws")
assert response.opened is True
assert ev.is_set()
@ -672,9 +636,18 @@ def test_dynamic_add_route_number(app):
request, response = app.test_client.get("/weight/1234.56")
assert response.status == 200
request, response = app.test_client.get("/weight/.12")
assert response.status == 200
request, response = app.test_client.get("/weight/12.")
assert response.status == 200
request, response = app.test_client.get("/weight/1234-56")
assert response.status == 404
request, response = app.test_client.get("/weight/12.34.56")
assert response.status == 404
def test_dynamic_add_route_regex(app):
async def handler(request, folder_id):

View File

@ -4,6 +4,7 @@ import pytest
from sanic.testing import HOST, PORT
AVAILABLE_LISTENERS = [
"before_server_start",
"after_server_start",
@ -83,7 +84,7 @@ async def test_trigger_before_events_create_server(app):
async def init_db(app, loop):
app.db = MySanicDb()
await app.create_server()
await app.create_server(debug=True, return_asyncio_server=True)
assert hasattr(app, "db")
assert isinstance(app.db, MySanicDb)

View File

@ -1,8 +1,10 @@
import asyncio
from queue import Queue
from unittest.mock import MagicMock
from sanic.response import HTTPResponse
from sanic.testing import HOST, PORT
from unittest.mock import MagicMock
import asyncio
from queue import Queue
async def stop(app, loop):

View File

@ -1,5 +1,6 @@
import inspect
import os
from time import gmtime, strftime
import pytest

View File

@ -0,0 +1,35 @@
import socket
from sanic.response import json, text
from sanic.testing import PORT, SanicTestClient
# ------------------------------------------------------------ #
# UTF-8
# ------------------------------------------------------------ #
def test_test_client_port_none(app):
@app.get("/get")
def handler(request):
return text("OK")
test_client = SanicTestClient(app, port=None)
request, response = test_client.get("/get")
assert response.text == "OK"
request, response = test_client.post("/get")
assert response.status == 405
def test_test_client_port_default(app):
@app.get("/get")
def handler(request):
return json(request.transport.get_extra_info("sockname")[1])
test_client = SanicTestClient(app)
assert test_client.port == PORT
request, response = test_client.get("/get")
assert response.json == PORT

View File

@ -1,14 +1,17 @@
import pytest as pytest
from urllib.parse import urlsplit, parse_qsl
from sanic.response import text
from sanic.views import HTTPMethodView
from sanic.blueprints import Blueprint
from sanic.testing import PORT as test_port, HOST as test_host
from sanic.exceptions import URLBuildError
import string
from urllib.parse import parse_qsl, urlsplit
import pytest as pytest
from sanic.blueprints import Blueprint
from sanic.exceptions import URLBuildError
from sanic.response import text
from sanic.testing import HOST as test_host
from sanic.testing import PORT as test_port
from sanic.views import HTTPMethodView
URL_FOR_ARGS1 = dict(arg1=["v1", "v2"])
URL_FOR_VALUE1 = "/myurl?arg1=v1&arg1=v2"
URL_FOR_ARGS2 = dict(arg1=["v1", "v2"], _anchor="anchor")
@ -169,12 +172,28 @@ def test_fails_with_int_message(app):
app.url_for("fail", **failing_kwargs)
expected_error = (
'Value "not_int" for parameter `foo` '
"does not match pattern for type `int`: \d+"
r'Value "not_int" for parameter `foo` '
r"does not match pattern for type `int`: -?\d+"
)
assert str(e.value) == expected_error
def test_passes_with_negative_int_message(app):
@app.route("path/<possibly_neg:int>/another-word")
def good(request, possibly_neg):
assert isinstance(possibly_neg, int)
return text("this should pass with `{}`".format(possibly_neg))
u_plus_3 = app.url_for("good", possibly_neg=3)
assert u_plus_3 == "/path/3/another-word", u_plus_3
request, response = app.test_client.get(u_plus_3)
assert response.text == "this should pass with `3`"
u_neg_3 = app.url_for("good", possibly_neg=-3)
assert u_neg_3 == "/path/-3/another-word", u_neg_3
request, response = app.test_client.get(u_neg_3)
assert response.text == "this should pass with `-3`"
def test_fails_with_two_letter_string_message(app):
@app.route(COMPLEX_PARAM_URL)
def fail(request):
@ -207,12 +226,26 @@ def test_fails_with_number_message(app):
expected_error = (
'Value "foo" for parameter `some_number` '
"does not match pattern for type `float`: [0-9\\\\.]+"
r"does not match pattern for type `float`: -?(?:\d+(?:\.\d*)?|\.\d+)"
)
assert str(e.value) == expected_error
@pytest.mark.parametrize("number", [3, -3, 13.123, -13.123])
def test_passes_with_negative_number_message(app, number):
@app.route("path/<possibly_neg:number>/another-word")
def good(request, possibly_neg):
assert isinstance(possibly_neg, (int, float))
return text("this should pass with `{}`".format(possibly_neg))
u = app.url_for("good", possibly_neg=number)
assert u == "/path/{}/another-word".format(number), u
request, response = app.test_client.get(u)
# For ``number``, it has been cast to a float - so a ``3`` becomes a ``3.0``
assert response.text == "this should pass with `{}`".format(float(number))
def test_adds_other_supplied_values_as_query_string(app):
@app.route(COMPLEX_PARAM_URL)
def passes(request):

View File

@ -1,4 +1,5 @@
from json import dumps as json_dumps
from sanic.response import text

View File

@ -1,11 +1,11 @@
import pytest as pytest
from sanic.exceptions import InvalidUsage
from sanic.response import text, HTTPResponse
from sanic.views import HTTPMethodView, CompositionView
from sanic.blueprints import Blueprint
from sanic.request import Request
from sanic.constants import HTTP_METHODS
from sanic.exceptions import InvalidUsage
from sanic.request import Request
from sanic.response import HTTPResponse, text
from sanic.views import CompositionView, HTTPMethodView
@pytest.mark.parametrize("method", HTTP_METHODS)

View File

@ -1,14 +1,17 @@
import time
import asyncio
import json
import shlex
import subprocess
import time
import urllib.request
from unittest import mock
from sanic.worker import GunicornWorker
from sanic.app import Sanic
import asyncio
import pytest
from sanic.app import Sanic
from sanic.worker import GunicornWorker
@pytest.fixture(scope="module")
def gunicorn_worker():
@ -24,28 +27,28 @@ def gunicorn_worker():
worker.kill()
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def gunicorn_worker_with_access_logs():
command = (
'gunicorn '
'--bind 127.0.0.1:1338 '
'--worker-class sanic.worker.GunicornWorker '
'examples.simple_server:app'
"gunicorn "
"--bind 127.0.0.1:1338 "
"--worker-class sanic.worker.GunicornWorker "
"examples.simple_server:app"
)
worker = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE)
time.sleep(2)
return worker
@pytest.fixture(scope='module')
@pytest.fixture(scope="module")
def gunicorn_worker_with_env_var():
command = (
'env SANIC_ACCESS_LOG="False" '
'gunicorn '
'--bind 127.0.0.1:1339 '
'--worker-class sanic.worker.GunicornWorker '
'--log-level info '
'examples.simple_server:app'
"gunicorn "
"--bind 127.0.0.1:1339 "
"--worker-class sanic.worker.GunicornWorker "
"--log-level info "
"examples.simple_server:app"
)
worker = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE)
time.sleep(2)
@ -62,7 +65,7 @@ def test_gunicorn_worker_no_logs(gunicorn_worker_with_env_var):
"""
if SANIC_ACCESS_LOG was set to False do not show access logs
"""
with urllib.request.urlopen('http://localhost:1339/') as _:
with urllib.request.urlopen("http://localhost:1339/") as _:
gunicorn_worker_with_env_var.kill()
assert not gunicorn_worker_with_env_var.stdout.read()
@ -71,9 +74,12 @@ def test_gunicorn_worker_with_logs(gunicorn_worker_with_access_logs):
"""
default - show access logs
"""
with urllib.request.urlopen('http://localhost:1338/') as _:
with urllib.request.urlopen("http://localhost:1338/") as _:
gunicorn_worker_with_access_logs.kill()
assert b"(sanic.access)[INFO][127.0.0.1" in gunicorn_worker_with_access_logs.stdout.read()
assert (
b"(sanic.access)[INFO][127.0.0.1"
in gunicorn_worker_with_access_logs.stdout.read()
)
class GunicornTestWorker(GunicornWorker):

tox.ini
View File

@ -1,23 +1,25 @@
[tox]
envlist = py35, py36, py37, {py35,py36,py37}-no-ext, lint, check
envlist = py36, py37, {py36,py37}-no-ext, lint, check
[testenv]
usedevelop = True
setenv =
{py35,py36,py37}-no-ext: SANIC_NO_UJSON=1
{py35,py36,py37}-no-ext: SANIC_NO_UVLOOP=1
{py36,py37}-no-ext: SANIC_NO_UJSON=1
{py36,py37}-no-ext: SANIC_NO_UVLOOP=1
deps =
coverage
pytest==4.1.0
pytest-cov
pytest-sanic
pytest-sugar
aiohttp>=2.3,<=3.2.1
httpcore==0.1.1
requests-async==0.4.0
chardet<=2.3.0
beautifulsoup4
gunicorn
pytest-benchmark
commands =
pytest tests --cov sanic --cov-report= {posargs}
pytest {posargs:tests --cov sanic}
- coverage combine --append
coverage report -m
coverage html -i
@ -30,7 +32,7 @@ deps =
commands =
flake8 sanic
black --check --verbose sanic
black --config ./.black.toml --check --verbose sanic
isort --check-only --recursive sanic
[testenv:check]
@ -39,3 +41,7 @@ deps =
pygments
commands =
python setup.py check -r -s
[pytest]
filterwarnings =
ignore:.*async with lock.* instead:DeprecationWarning