Compare commits

...

18 Commits

Author SHA1 Message Date
7
be0d539746 19.9.0 release (#1699) 2019-10-12 09:54:47 -05:00
Lagicrus
4f9739ed2c Update helpers.py (#1693) 2019-10-08 16:29:03 -07:00
Lagicrus
0df37fa653 Update websocket.rst (#1694) 2019-10-08 16:28:09 -07:00
Eli Uriegas
3e932505b0 Bump up pytest version for fixing ci build (#1689)
Bump up pytest version for fixing ci build
2019-10-08 14:32:38 -07:00
Yun Xu
01be691936 misc: bump up pytest version for fixing ci build 2019-10-07 11:41:44 -07:00
Simon
134c414fe5 Support websockets 8.x as well as 7.x (#1687)
Sanic currently requires websockets 7.x, but it's straightforward to
also support the more recent 8.x.
2019-10-01 23:03:09 -07:00
L. Kärkkäinen
c54a8b10bb Implement dict-like API on request objects for custom data. (#1666)
* Implement dict-like API on request objects for custom data.
* Updated docs about custom_context.
2019-09-26 14:11:31 -07:00
Vinícius Dantas
6fc3381229 Add a type checking pipeline (#1682)
* Integrate with mypy
2019-09-22 13:55:36 -07:00
Ashley Sommer
927c0e082e Fix tests for multiprocessing pickle app and pickle blueprint (#1680)
The old tests were not quite checking for the right thing.
Fixing the test does not change Sanic code, expose any bugs, or fix any bugs.
2019-09-18 10:22:24 -07:00
Ashley Sommer
7674e917e4 Fixes "after_server_start" when using return_asyncio_server. (#1676)
* Fixes ability to trigger "after_server_start", "before_server_stop", "after_server_stop" server events when using app.create_server to start your own asyncio_server
See example file run_async_advanced for a full example

* Fix a missing method on AsyncServer that some tests need
Add a tiny bit more documentation in-code
Change name of AsyncServerCoro to AsyncioServer
2019-09-16 10:59:16 -07:00
ku-mu
e13f42c17b Fix docstring style in Sanic.register_listener (#1678)
* Fix docstring style: google -> reST
2019-09-16 10:27:22 -07:00
Lagicrus
b7d4121586 Update static_files.md (#1672) 2019-09-11 14:37:14 -07:00
L. Kärkkäinen
fbcd4b9767 Sanic does not support Python 3.5 and won't need this code. (#1670)
* import code cleanup as we are dropping Python 3.5 support
2019-09-08 14:08:34 -07:00
7
17c5e28727 Merge pull request #1665 from snguyenthanh/fix-changelog-link
doc: fix the link to CHANGELOG in README.rst
2019-09-06 11:07:41 -07:00
Son
e62b29ca44 Fix link to CHANGELOG in README.rst 2019-09-02 21:51:45 +08:00
L. Kärkkäinen
1e4b1c4d1a Forwarded headers and otherwise improved proxy handling (#1638)
* Added support for HTTP Forwarded header and combined parsing of other proxy headers.

- Accessible via request.forwarded that tries parse_forwarded and then parse_xforwarded
- parse_forwarded uses the Forwarded header, if config.FORWARDED_SECRET is provided and a matching header field is found
- parse_xforwarded uses X-Real-IP and X-Forwarded-* much alike the existing implementation
- This commit does not change existing request properties that still use the old code and won't make use of Forwarded headers.

* Use req.forwarded in req properties server_name, server_port, scheme and remote_addr.

X-Scheme handling moved to parse_xforwarded.

* Cleanup and fix req.server_port; no longer reports socket port if any forwards headers are used.

* Update docstrings to indicate that forwarded header is used first.

* Remove testing function.

* Fix tests and linting.

- One test removed due to change of semantics - no socket port will be used if any forwarded headers are in effect.
- Other tests augmented with X-Forwarded-For, to allow the header being tested take effect (shouldn't affect old implementation).

* Try to workaround buggy tools complaining about incorrect ordering of imports.

* Cleanup forwarded processing, add comments. secret is now also returned.

* Added tests, fixed quoted string handling, cleanup.

* Further tests for full coverage.

* Try'n make linter happy.

* Add support for multiple Forwarded headers. Unify parse_forwarded parameters with parse_xforwarded.

* Implement multiple headers support for X-Forwarded-For.

- Previously only the first header was used, so this BUGFIX may affect functionality.

* Bugfix for request.server_name: strip port and other parts.

- request.server_name docs claim that it returns the hostname only (no port).
- config.SERVER_NAME may be full URL, so strip scheme, port and path
- HTTP Host and consequently forwarded Host may include port number, so
  strip that also for forwarded hosts (previously only done for HTTP Host).
- Possible performance benefit of limiting to one split.

* Fallback to app.url_for and let it handle SERVER_NAME if defined (until a proper solution is implemented).

* Revise previous commit. Only fallback for full URL SERVER_NAMEs; allows host to be defined and proxied information still being used.

* Appease the linter.

* Modify testcase not to use underscores in URLs. Use hyphens which the spec allows for.

* Forwarded and Host header parsing improved.

- request.forwarded lowercases hosts, separates host:port into their own fields and lowercases addresses
- forwarded.parse_host helper function added and used for parsing all host-style headers (IPv6 cannot be simply split(":")).
- more tests fixed not to use underscores in hosts as those are no longer accepted and lead to the field being rejected

* Fixed typo in docstring.

* Added IPv6 address tests for Host header.

* Fix regex.

* Further tests and stricter forwarded handling.

* Fix merge commit

* Linter

* Linter

* Linter

* Add  to avoid re-using the  variable. Make a few raw strings non-raw.

* Remove unnecessary or

* Updated docs (work in progress).

* Enable REAL_IP_HEADER parsing regardless of PROXIES_COUNT setting.

- Also cleanup and added comments

* New defaults for PROXIES_COUNT and REAL_IP_HEADER, updated tests.

* Remove support for PROXIES_COUNT=-1.

* Linter errors.

- This is getting ridiculous: cannot fit an URL on one line, linter requires
  splitting the string literal!

* Add support for by=_proxySecret, updated docs, updated tests.

* Forwarded headers' semantics tuning.

- Forwarded host is now preserved in original format
- request.host now returns a forwarded host if available, else the Host header
- Forwarded options are preserved in original order, and later keys override earlier ones
- Forwarded path is automatically URL-unquoted
- Forwarded 'by' and 'for' are omitted if their value is unknown
- Tests modified accordingly
- Cleanup and improved documentation

* Add ASGI test.

* Linter

* Linter #2
2019-09-02 08:50:56 -05:00
Subham Roy
ae91852cd5 check for already set asyncio event loop policy (#1637)
* check for already set asyncio event loop policy

* fix linting warning
2019-08-28 11:30:23 -05:00
L. Kärkkäinen
2011f3a0b2 PEP 594 has cgi module scheduled for deprecation in Python 3.8 (#1649)
* PEP 594 has cgi module scheduled for deprecation in Python 3.8. Reimplement
cgi.parse_header in Sanic. The new implementation is much faster than either
cgi.parse_header or equivalent werkzeug.parse_options_header, and unlike the
two, handles also quoted values with semicolons or \" in them.

* Fix string escape.

* Useless linter complaints.

* More linter issues

* Add return type hint.

* Do not support quoted-pair escapes.

- Improved documentation and renamed the function more aptly as it only seems
  to apply to content-type and content-disposition headers.

* Unquote filenames also in normal mode.

* Add tests for headers. Adapted from CPython parse_header tests with changes on the final test.

* Linter

* Revert "Unquote filenames also in normal mode."

This reverts commit bf0d502bcd.

* Improved parse_content_header and added tests with Firefox and Chrome.

- Unescaping of quotes moved to parse_content_header because it affects all fields,
  not just filenames.
- It is impossible to handle all cases correctly but the current heuristics should
  suffice well for typical cases and beyond.
- Added comparisons with cgi.parse_header and werkzeug.parse_options_header.

* Updated comments as well.
2019-08-27 08:30:23 -05:00
34 changed files with 1079 additions and 258 deletions

.gitignore vendored

@@ -10,6 +10,7 @@ coverage
settings.py
.idea/*
.cache/*
.mypy_cache/
.python-version
docs/_build/
docs/_api/


@@ -21,6 +21,12 @@ matrix:
dist: xenial
sudo: true
name: "Python 3.7 without Extensions"
- env: TOX_ENV=type-checking
python: 3.6
name: "Python 3.6 Type checks"
- env: TOX_ENV=type-checking
python: 3.7
name: "Python 3.7 Type checks"
- env: TOX_ENV=lint
python: 3.6
name: "Python 3.6 Linter checks"


@@ -131,7 +131,7 @@ Documentation
Changelog
---------
`Release Changelogs <https://github.com/huge-success/sanic/blob/master/CHANGELOG.md>`_.
`Release Changelogs <https://github.com/huge-success/sanic/blob/master/CHANGELOG.rst>`_.
Questions and Discussion


@@ -110,37 +110,37 @@ Out of the box there are just a few predefined values which can be overwritten w
#### `REQUEST_TIMEOUT`
A request timeout measures the duration of time between the instant when a new open TCP connection is passed to the
Sanic backend server, and the instant when the whole HTTP request is received. If the time taken exceeds the
`REQUEST_TIMEOUT` value (in seconds), this is considered a Client Error so Sanic generates an `HTTP 408` response
and sends that to the client. Set this parameter's value higher if your clients routinely pass very large request payloads
A request timeout measures the duration of time between the instant when a new open TCP connection is passed to the
Sanic backend server, and the instant when the whole HTTP request is received. If the time taken exceeds the
`REQUEST_TIMEOUT` value (in seconds), this is considered a Client Error so Sanic generates an `HTTP 408` response
and sends that to the client. Set this parameter's value higher if your clients routinely pass very large request payloads
or upload requests very slowly.
#### `RESPONSE_TIMEOUT`
A response timeout measures the duration of time between the instant the Sanic server passes the HTTP request to the
Sanic App, and the instant a HTTP response is sent to the client. If the time taken exceeds the `RESPONSE_TIMEOUT`
value (in seconds), this is considered a Server Error so Sanic generates an `HTTP 503` response and sends that to the
client. Set this parameter's value higher if your application is likely to have long-running process that delay the
A response timeout measures the duration of time between the instant the Sanic server passes the HTTP request to the
Sanic App, and the instant a HTTP response is sent to the client. If the time taken exceeds the `RESPONSE_TIMEOUT`
value (in seconds), this is considered a Server Error so Sanic generates an `HTTP 503` response and sends that to the
client. Set this parameter's value higher if your application is likely to have long-running process that delay the
generation of a response.
#### `KEEP_ALIVE_TIMEOUT`
##### What is Keep Alive? And what does the Keep Alive Timeout value do?
`Keep-Alive` is a HTTP feature introduced in `HTTP 1.1`. When sending a HTTP request, the client (usually a web browser application)
can set a `Keep-Alive` header to indicate the http server (Sanic) to not close the TCP connection after it has send the response.
This allows the client to reuse the existing TCP connection to send subsequent HTTP requests, and ensures more efficient
`Keep-Alive` is a HTTP feature introduced in `HTTP 1.1`. When sending a HTTP request, the client (usually a web browser application)
can set a `Keep-Alive` header to indicate the http server (Sanic) to not close the TCP connection after it has send the response.
This allows the client to reuse the existing TCP connection to send subsequent HTTP requests, and ensures more efficient
network traffic for both the client and the server.
The `KEEP_ALIVE` config variable is set to `True` in Sanic by default. If you don't need this feature in your application,
set it to `False` to cause all client connections to close immediately after a response is sent, regardless of
The `KEEP_ALIVE` config variable is set to `True` in Sanic by default. If you don't need this feature in your application,
set it to `False` to cause all client connections to close immediately after a response is sent, regardless of
the `Keep-Alive` header on the request.
The amount of time the server holds the TCP connection open is decided by the server itself.
In Sanic, that value is configured using the `KEEP_ALIVE_TIMEOUT` value. By default, it is set to 5 seconds.
This is the same default setting as the Apache HTTP server and is a good balance between allowing enough time for
the client to send a new request, and not holding open too many connections at once. Do not exceed 75 seconds unless
The amount of time the server holds the TCP connection open is decided by the server itself.
In Sanic, that value is configured using the `KEEP_ALIVE_TIMEOUT` value. By default, it is set to 5 seconds.
This is the same default setting as the Apache HTTP server and is a good balance between allowing enough time for
the client to send a new request, and not holding open too many connections at once. Do not exceed 75 seconds unless
you know your clients are using a browser which supports TCP connections held open for that long.
For reference:
@@ -154,16 +154,58 @@ Opera 11 client hard keepalive limit = 120 seconds
Chrome 13+ client keepalive limit > 300+ seconds
```
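For a concrete picture of the settings described above, here is a minimal sketch of overriding them on a Sanic app (the values are illustrative, not recommendations):

```python
from sanic import Sanic

app = Sanic(__name__)

# Illustrative overrides of the timeout/keep-alive settings documented above.
app.config.REQUEST_TIMEOUT = 60       # seconds allowed to receive the full request
app.config.RESPONSE_TIMEOUT = 60      # seconds allowed for the app to respond
app.config.KEEP_ALIVE = True          # reuse TCP connections (the default)
app.config.KEEP_ALIVE_TIMEOUT = 5     # seconds an idle keep-alive connection is held
```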
### About proxy servers and client ip
### Proxy configuration
When you use a reverse proxy server (e.g. nginx), the value of `request.ip` will contain ip of a proxy, typically `127.0.0.1`. To determine the real client ip, `X-Forwarded-For` and `X-Real-IP` HTTP headers are used. But client can fake these headers if they have not been overridden by a proxy. Sanic has a set of options to determine the level of confidence in these headers.
When you use a reverse proxy server (e.g. nginx), the value of `request.ip` will contain ip of a proxy, typically `127.0.0.1`. Sanic may be configured to use proxy headers for determining the true client IP, available as `request.remote_addr`. The full external URL is also constructed from header fields if available.
* If you have a single proxy, set `PROXIES_COUNT` to `1`. Then Sanic will use `X-Real-IP` if available or the last ip from `X-Forwarded-For`.
Without proper precautions, a malicious client may use proxy headers to spoof its own IP. To avoid such issues, Sanic does not use any proxy headers unless explicitly enabled.
* If you have multiple proxies, set `PROXIES_COUNT` equal to their number to allow Sanic to select the correct ip from `X-Forwarded-For`.
Services behind reverse proxies must configure `FORWARDED_SECRET`, `REAL_IP_HEADER` and/or `PROXIES_COUNT`.
* If you don't use a proxy, set `PROXIES_COUNT` to `0` to ignore these headers and prevent ip falsification.
#### Forwarded header
* If you don't use `X-Real-IP` (e.g. your proxy sends only `X-Forwarded-For`), set `REAL_IP_HEADER` to an empty string.
```
Forwarded: for="1.2.3.4"; proto="https"; host="yoursite.com"; secret="Pr0xy",
for="10.0.0.1"; proto="http"; host="proxy.internal"; by="_1234proxy"
```
The real ip will be available in `request.remote_addr`. If HTTP headers are unavailable or untrusted, `request.remote_addr` will be an empty string; in this case use `request.ip` instead.
* Set `FORWARDED_SECRET` to an identifier used by the proxy of interest.
The secret is used to securely identify a specific proxy server. Given the above header, secret `Pr0xy` would use the information on the first line and secret `_1234proxy` would use the second line. The secret must exactly match the value of `secret` or `by`. A secret in `by` must begin with an underscore and use only characters specified in [RFC 7239 section 6.3](https://tools.ietf.org/html/rfc7239#section-6.3), while `secret` has no such restrictions.
Sanic ignores any elements without the secret key, and will not even parse the header if no secret is set.
All other proxy headers are ignored once a trusted forwarded element is found, as it already carries complete information about the client.
#### Traditional proxy headers
```
X-Real-IP: 1.2.3.4
X-Forwarded-For: 1.2.3.4, 10.0.0.1
X-Forwarded-Proto: https
X-Forwarded-Host: yoursite.com
```
* Set `REAL_IP_HEADER` to `x-real-ip`, `true-client-ip`, `cf-connecting-ip` or other name of such header.
* Set `PROXIES_COUNT` to the number of entries expected in `x-forwarded-for` (name configurable via `FORWARDED_FOR_HEADER`).
If client IP is found by one of these methods, Sanic uses the following headers for URL parts:
* `x-forwarded-proto`, `x-forwarded-host`, `x-forwarded-port`, `x-forwarded-path` and if necessary, `x-scheme`.
#### Proxy config if using ...
* a proxy that supports `forwarded`: set `FORWARDED_SECRET` to the value that the proxy inserts in the header
* Apache Traffic Server: `CONFIG proxy.config.http.insert_forwarded STRING for|proto|host|by=_secret`
* NGHTTPX: `nghttpx --add-forwarded=for,proto,host,by --forwarded-for=ip --forwarded-by=_secret`
* NGINX: after [the official instructions](https://www.nginx.com/resources/wiki/start/topics/examples/forwarded/), add anywhere in your config:
proxy_set_header Forwarded "$proxy_add_forwarded;by=\"_$server_name\";proto=$scheme;host=\"$http_host\";path=\"$request_uri\";secret=_secret";
* a custom header with client IP: set `REAL_IP_HEADER` to the name of that header
* `x-forwarded-for`: set `PROXIES_COUNT` to `1` for a single proxy, or a greater number to allow Sanic to select the correct IP
* no proxies: no configuration required!
#### Changes in Sanic 19.9
Earlier Sanic versions had unsafe default settings. From 19.9 onwards proxy settings must be set manually, and support for negative PROXIES_COUNT has been removed.
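Putting the above together, a minimal sketch of enabling proxy support (the secret and header name are placeholders; pick the variant that matches your proxy):

```python
from sanic import Sanic

app = Sanic(__name__)

# Behind a proxy that emits an RFC 7239 Forwarded header with secret=Pr0xy:
app.config.FORWARDED_SECRET = "Pr0xy"

# Or, behind a single proxy that only sends traditional headers:
app.config.REAL_IP_HEADER = "x-real-ip"   # header carrying the client IP
app.config.PROXIES_COUNT = 1              # entries expected in X-Forwarded-For

# request.remote_addr then yields the client IP; request.ip stays the proxy's.
```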


@@ -157,4 +157,35 @@ server = app.create_server(host="0.0.0.0", port=8000, return_asyncio_server=True
loop = asyncio.get_event_loop()
task = asyncio.ensure_future(server)
loop.run_forever()
```
Caveat: using this method, calling `app.create_server()` will trigger "before_server_start" server events, but not
"after_server_start", "before_server_stop", or "after_server_stop" server events.
For more advanced use-cases, you can trigger these events using the AsyncioServer object, returned by awaiting
the server task.
Here is an incomplete example (please see `run_async_advanced.py` in examples for something more complete):
```python
serv_coro = app.create_server(host="0.0.0.0", port=8000, return_asyncio_server=True)
loop = asyncio.get_event_loop()
serv_task = asyncio.ensure_future(serv_coro, loop=loop)
server = loop.run_until_complete(serv_task)
server.after_start()
try:
loop.run_forever()
except KeyboardInterrupt as e:
loop.stop()
finally:
server.before_stop()
# Wait for server to close
close_task = server.close()
loop.run_until_complete(close_task)
# Complete all tasks on the loop
for connection in server.connections:
connection.close_if_idle()
server.after_stop()
```


@@ -39,8 +39,8 @@ app = Sanic(__name__)
@app.middleware('request')
async def add_key(request):
# Add a key to request object like dict object
request['foo'] = 'bar'
# Arbitrary data may be stored in request context:
request.ctx.foo = 'bar'
@app.middleware('response')
@@ -53,16 +53,21 @@ async def prevent_xss(request, response):
response.headers["x-xss-protection"] = "1; mode=block"
@app.get("/")
async def index(request):
return sanic.response.text(request.ctx.foo)
app.run(host="0.0.0.0", port=8000)
```
The above code will apply the three middleware in order. The first middleware
**add_key** will add a new key `foo` into `request` object. This worked because
`request` object can be manipulated like `dict` object. Then, the second middleware
**custom_banner** will change the HTTP response header *Server* to
*Fake-Server*, and the last middleware **prevent_xss** will add the HTTP
header for preventing Cross-Site-Scripting (XSS) attacks. These two functions
are invoked *after* a user function returns a response.
The three middlewares are executed in order:
1. The first request middleware **add_key** adds a new key `foo` into request context.
2. Request is routed to handler **index**, which gets the key from context and returns a text response.
3. The first response middleware **custom_banner** changes the HTTP response header *Server* to
say *Fake-Server*
4. The second response middleware **prevent_xss** adds the HTTP header for preventing Cross-Site-Scripting (XSS) attacks.
## Responding early
@@ -81,6 +86,16 @@ async def halt_response(request, response):
return text('I halted the response')
```
## Custom context
Arbitrary data may be stored in `request.ctx`. A typical use case
would be to store the user object acquired from database in an authentication
middleware. Keys added are accessible to all later middleware as well as
the handler over the duration of the request.
Custom context is reserved for applications and extensions. Sanic itself makes
no use of it.
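A short sketch of the authentication use case mentioned above (the lookup is simulated so the example stays self-contained):

```python
from sanic import Sanic, response

app = Sanic(__name__)

@app.middleware("request")
async def authenticate(request):
    # Stand-in for a real user lookup; stash the result on the request context.
    request.ctx.user = request.headers.get("authorization", "anonymous")

@app.route("/me")
async def me(request):
    # Later middleware and handlers see whatever was stored earlier.
    return response.json({"user": request.ctx.user})
```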
## Listeners
If you want to execute startup/teardown code as your server starts or closes, you can use the following listeners:


@@ -34,6 +34,10 @@ app.url_for('static', name='another', filename='any') == '/another.png'
bp = Blueprint('bp', url_prefix='/bp')
bp.static('/static', './static')
# specify a different content_type for your files
# such as adding 'charset'
app.static('/', '/public/index.html', content_type="text/html; charset=utf-8")
# servers the file directly
bp.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
app.blueprint(bp)


@@ -1,7 +1,10 @@
WebSocket
=========
Sanic provides an easy to use abstraction on top of `websockets`. To setup a WebSocket:
Sanic provides an easy to use abstraction on top of `websockets`.
Sanic Supports websocket versions 7 and 8.
To setup a WebSocket:
.. code:: python
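For orientation, a minimal handler in the spirit of the Sanic docs (the `/feed` route and echo behavior are illustrative):

```python
from sanic import Sanic

app = Sanic(__name__)

@app.websocket("/feed")
async def feed(request, ws):
    # Echo every message back to the client until the connection closes.
    while True:
        data = await ws.recv()
        await ws.send(data)

app.run(host="0.0.0.0", port=8000)
```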


@@ -0,0 +1,38 @@
from sanic import Sanic
from sanic import response
from signal import signal, SIGINT
import asyncio
import uvloop
app = Sanic(__name__)
@app.listener('after_server_start')
async def after_start_test(app, loop):
print("Async Server Started!")
@app.route("/")
async def test(request):
return response.json({"answer": "42"})
asyncio.set_event_loop(uvloop.new_event_loop())
serv_coro = app.create_server(host="0.0.0.0", port=8000, return_asyncio_server=True)
loop = asyncio.get_event_loop()
serv_task = asyncio.ensure_future(serv_coro, loop=loop)
signal(SIGINT, lambda s, f: loop.stop())
server = loop.run_until_complete(serv_task)
server.after_start()
try:
loop.run_forever()
except KeyboardInterrupt as e:
loop.stop()
finally:
server.before_stop()
# Wait for server to close
close_task = server.close()
loop.run_until_complete(close_task)
# Complete all tasks on the loop
for connection in server.connections:
connection.close_if_idle()
server.after_stop()


@@ -1,5 +1,6 @@
from argparse import ArgumentParser
from importlib import import_module
from typing import Any, Dict, Optional
from sanic.app import Sanic
from sanic.log import logger
@@ -35,7 +36,10 @@ if __name__ == "__main__":
)
)
if args.cert is not None or args.key is not None:
ssl = {"cert": args.cert, "key": args.key}
ssl = {
"cert": args.cert,
"key": args.key,
} # type: Optional[Dict[str, Any]]
else:
ssl = None


@@ -1 +1 @@
__version__ = "19.6.3"
__version__ = "19.9.0"


@@ -11,7 +11,7 @@ from inspect import getmodulename, isawaitable, signature, stack
from socket import socket
from ssl import Purpose, SSLContext, create_default_context
from traceback import format_exc
from typing import Any, Optional, Type, Union
from typing import Any, Dict, Optional, Type, Union
from urllib.parse import urlencode, urlunparse
from sanic import reloader_helpers
@@ -138,11 +138,9 @@ class Sanic:
"""
Register the listener for a given event.
Args:
listener: callable i.e. setup_db(app, loop)
event: when to register listener i.e. 'before_server_start'
Returns: listener
:param listener: callable i.e. setup_db(app, loop)
:param event: when to register listener i.e. 'before_server_start'
:return: listener
"""
return self.listener(event)(listener)
@@ -770,7 +768,7 @@ class Sanic:
URLBuildError
"""
# find the route by the supplied view name
kw = {}
kw: Dict[str, str] = {}
# special static files url_for
if view_name == "static":
kw.update(name=kwargs.pop("name", "static"))
@@ -1309,6 +1307,12 @@ class Sanic:
"stop_event will be removed from future versions.",
DeprecationWarning,
)
if self.config.PROXIES_COUNT and self.config.PROXIES_COUNT < 0:
raise ValueError(
"PROXIES_COUNT cannot be negative. "
"https://sanic.readthedocs.io/en/latest/sanic/config.html"
"#proxy-configuration"
)
self.error_handler.debug = debug
self.debug = debug
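To illustrate what the new check above does (assuming, as the surrounding context suggests, that it runs when the server is being started):

```python
from sanic import Sanic

app = Sanic(__name__)
app.config.PROXIES_COUNT = -1  # negative counts are no longer supported in 19.9

try:
    app.run(host="127.0.0.1", port=8000)
except ValueError as exc:
    # Fails fast with a pointer to the proxy-configuration docs instead of
    # silently trusting forwarded headers.
    print(exc)
```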


@@ -2,9 +2,23 @@ import asyncio
import warnings
from inspect import isawaitable
from typing import Any, Awaitable, Callable, MutableMapping, Union
from typing import (
Any,
Awaitable,
Callable,
Dict,
List,
MutableMapping,
Optional,
Tuple,
Union,
)
from urllib.parse import quote
from requests_async import ASGISession # type: ignore
import sanic.app # noqa
from sanic.compat import Header
from sanic.exceptions import InvalidUsage, ServerError
from sanic.log import logger
@@ -54,6 +68,8 @@ class MockProtocol:
class MockTransport:
_protocol: Optional[MockProtocol]
def __init__(
self, scope: ASGIScope, receive: ASGIReceive, send: ASGISend
) -> None:
@@ -68,11 +84,12 @@ class MockTransport:
self._protocol = MockProtocol(self, self.loop)
return self._protocol
def get_extra_info(self, info: str) -> Union[str, bool]:
def get_extra_info(self, info: str) -> Union[str, bool, None]:
if info == "peername":
return self.scope.get("server")
elif info == "sslcontext":
return self.scope.get("scheme") in ["https", "wss"]
return None
def get_websocket_connection(self) -> WebSocketConnection:
try:
@@ -172,6 +189,13 @@ class Lifespan:
class ASGIApp:
sanic_app: Union[ASGISession, "sanic.app.Sanic"]
request: Request
transport: MockTransport
do_stream: bool
lifespan: Lifespan
ws: Optional[WebSocketConnection]
def __init__(self) -> None:
self.ws = None
@@ -182,8 +206,8 @@ class ASGIApp:
instance = cls()
instance.sanic_app = sanic_app
instance.transport = MockTransport(scope, receive, send)
instance.transport.add_task = sanic_app.loop.create_task
instance.transport.loop = sanic_app.loop
setattr(instance.transport, "add_task", sanic_app.loop.create_task)
headers = Header(
[
@@ -286,8 +310,8 @@ class ASGIApp:
"""
Write the response.
"""
headers = []
cookies = {}
headers: List[Tuple[bytes, bytes]] = []
cookies: Dict[str, str] = {}
try:
cookies = {
v.key: v


@@ -56,7 +56,7 @@ class BlueprintGroup(MutableSequence):
"""
return self._blueprints[item]
def __setitem__(self, index: int, item: object) -> None:
def __setitem__(self, index, item) -> None:
"""
Abstract method implemented to turn the `BlueprintGroup` class
into a list like object to support all the existing behavior.
@@ -69,7 +69,7 @@ class BlueprintGroup(MutableSequence):
"""
self._blueprints[index] = item
def __delitem__(self, index: int) -> None:
def __delitem__(self, index) -> None:
"""
Abstract method implemented to turn the `BlueprintGroup` class
into a list like object to support all the existing behavior.


@@ -1,4 +1,4 @@
from multidict import CIMultiDict
from multidict import CIMultiDict # type: ignore
class Header(CIMultiDict):


@@ -26,9 +26,10 @@ DEFAULT_CONFIG = {
"WEBSOCKET_WRITE_LIMIT": 2 ** 16,
"GRACEFUL_SHUTDOWN_TIMEOUT": 15.0, # 15 sec
"ACCESS_LOG": True,
"PROXIES_COUNT": -1,
"FORWARDED_SECRET": None,
"REAL_IP_HEADER": None,
"PROXIES_COUNT": None,
"FORWARDED_FOR_HEADER": "X-Forwarded-For",
"REAL_IP_HEADER": "X-Real-IP",
}
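To make the new defaults concrete, a small sketch (attribute access on `app.config` is standard Sanic):

```python
from sanic import Sanic

app = Sanic(__name__)

# With the 19.9 defaults, no proxy header is trusted until explicitly configured.
assert app.config.FORWARDED_SECRET is None        # Forwarded header ignored
assert app.config.REAL_IP_HEADER is None          # X-Real-IP ignored
assert app.config.PROXIES_COUNT is None           # X-Forwarded-For ignored
assert app.config.FORWARDED_FOR_HEADER == "X-Forwarded-For"
```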

sanic/headers.py Normal file

@@ -0,0 +1,172 @@
import re
from typing import Dict, Iterable, List, Optional, Tuple, Union
from urllib.parse import unquote
Options = Dict[str, Union[int, str]] # key=value fields in various headers
OptionsIterable = Iterable[Tuple[str, str]] # May contain duplicate keys
_token, _quoted = r"([\w!#$%&'*+\-.^_`|~]+)", r'"([^"]*)"'
_param = re.compile(fr";\s*{_token}=(?:{_token}|{_quoted})", re.ASCII)
_firefox_quote_escape = re.compile(r'\\"(?!; |\s*$)')
_ipv6 = "(?:[0-9A-Fa-f]{0,4}:){2,7}[0-9A-Fa-f]{0,4}"
_ipv6_re = re.compile(_ipv6)
_host_re = re.compile(
r"((?:\[" + _ipv6 + r"\])|[a-zA-Z0-9.\-]{1,253})(?::(\d{1,5}))?"
)
# RFC's quoted-pair escapes are mostly ignored by browsers. Chrome, Firefox and
# curl all have different escaping, that we try to handle as well as possible,
# even though no client espaces in a way that would allow perfect handling.
# For more information, consult ../tests/test_requests.py
def parse_content_header(value: str) -> Tuple[str, Options]:
"""Parse content-type and content-disposition header values.
E.g. 'form-data; name=upload; filename=\"file.txt\"' to
('form-data', {'name': 'upload', 'filename': 'file.txt'})
Mostly identical to cgi.parse_header and werkzeug.parse_options_header
but runs faster and handles special characters better. Unescapes quotes.
"""
value = _firefox_quote_escape.sub("%22", value)
pos = value.find(";")
if pos == -1:
options: Dict[str, Union[int, str]] = {}
else:
options = {
m.group(1).lower(): m.group(2) or m.group(3).replace("%22", '"')
for m in _param.finditer(value[pos:])
}
value = value[:pos]
return value.strip().lower(), options
# https://tools.ietf.org/html/rfc7230#section-3.2.6 and
# https://tools.ietf.org/html/rfc7239#section-4
# This regex is for *reversed* strings because that works much faster for
# right-to-left matching than the other way around. Be wary that all things are
# a bit backwards! _rparam matches forwarded pairs alike ";key=value"
_rparam = re.compile(f"(?:{_token}|{_quoted})={_token}\\s*($|[;,])", re.ASCII)
def parse_forwarded(headers, config) -> Optional[Options]:
"""Parse RFC 7239 Forwarded headers.
The value of `by` or `secret` must match `config.FORWARDED_SECRET`
:return: dict with keys and values, or None if nothing matched
"""
header = headers.getall("forwarded", None)
secret = config.FORWARDED_SECRET
if header is None or not secret:
return None
header = ",".join(header) # Join multiple header lines
if secret not in header:
return None
# Loop over <separator><key>=<value> elements from right to left
sep = pos = None
options: List[Tuple[str, str]] = []
found = False
for m in _rparam.finditer(header[::-1]):
# Start of new element? (on parser skips and non-semicolon right sep)
if m.start() != pos or sep != ";":
# Was the previous element (from right) what we wanted?
if found:
break
# Clear values and parse as new element
del options[:]
pos = m.end()
val_token, val_quoted, key, sep = m.groups()
key = key.lower()[::-1]
val = (val_token or val_quoted.replace('"\\', '"'))[::-1]
options.append((key, val))
if key in ("secret", "by") and val == secret:
found = True
# Check if we would return on next round, to avoid useless parse
if found and sep != ";":
break
# If secret was found, return the matching options in left-to-right order
return fwd_normalize(reversed(options)) if found else None
def parse_xforwarded(headers, config) -> Optional[Options]:
"""Parse traditional proxy headers."""
real_ip_header = config.REAL_IP_HEADER
proxies_count = config.PROXIES_COUNT
addr = real_ip_header and headers.get(real_ip_header)
if not addr and proxies_count:
assert proxies_count > 0
try:
# Combine, split and filter multiple headers' entries
forwarded_for = headers.getall(config.FORWARDED_FOR_HEADER)
proxies = [
p
for p in (
p.strip() for h in forwarded_for for p in h.split(",")
)
if p
]
addr = proxies[-proxies_count]
except (KeyError, IndexError):
pass
# No processing of other headers if no address is found
if not addr:
return None
def options():
yield "for", addr
for key, header in (
("proto", "x-scheme"),
("proto", "x-forwarded-proto"), # Overrides X-Scheme if present
("host", "x-forwarded-host"),
("port", "x-forwarded-port"),
("path", "x-forwarded-path"),
):
yield key, headers.get(header)
return fwd_normalize(options())
def fwd_normalize(fwd: OptionsIterable) -> Options:
"""Normalize and convert values extracted from forwarded headers."""
ret: Dict[str, Union[int, str]] = {}
for key, val in fwd:
if val is not None:
try:
if key in ("by", "for"):
ret[key] = fwd_normalize_address(val)
elif key in ("host", "proto"):
ret[key] = val.lower()
elif key == "port":
ret[key] = int(val)
elif key == "path":
ret[key] = unquote(val)
else:
ret[key] = val
except ValueError:
pass
return ret
def fwd_normalize_address(addr: str) -> str:
"""Normalize address fields of proxy headers."""
if addr == "unknown":
raise ValueError() # omit unknown value identifiers
if addr.startswith("_"):
return addr # do not lower-case obfuscated strings
if _ipv6_re.fullmatch(addr):
addr = f"[{addr}]" # bracket IPv6
return addr.lower()
def parse_host(host: str) -> Tuple[Optional[str], Optional[int]]:
"""Split host:port into hostname and port.
:return: None in place of missing elements
"""
m = _host_re.fullmatch(host)
if not m:
return None, None
host, port = m.groups()
return host.lower(), int(port) if port is not None else None
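A quick sanity sketch of the new helpers, with expected results taken from the docstrings and the tests later in this diff:

```python
from sanic.headers import parse_content_header, parse_host

# Content-Type / Content-Disposition style values, quotes unescaped:
assert parse_content_header('text/plain; charset="us-ascii"') == (
    "text/plain",
    {"charset": "us-ascii"},
)

# Host-style values, including bracketed IPv6; missing parts come back as None:
assert parse_host("example.com:8080") == ("example.com", 8080)
assert parse_host("[::1]") == ("[::1]", None)
```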


@@ -8,6 +8,7 @@ STATUS_CODES = {
100: b"Continue",
101: b"Switching Protocols",
102: b"Processing",
103: b"Early Hints",
200: b"OK",
201: b"Created",
202: b"Accepted",


@@ -1,31 +1,28 @@
import asyncio
import email.utils
import json
import sys
import warnings
from cgi import parse_header
from collections import defaultdict, namedtuple
from http.cookies import SimpleCookie
from types import SimpleNamespace
from urllib.parse import parse_qs, parse_qsl, unquote, urlunparse
from httptools import parse_url
from httptools import parse_url # type: ignore
from sanic.exceptions import InvalidUsage
from sanic.headers import (
parse_content_header,
parse_forwarded,
parse_host,
parse_xforwarded,
)
from sanic.log import error_logger, logger
try:
from ujson import loads as json_loads
from ujson import loads as json_loads # type: ignore
except ImportError:
if sys.version_info[:2] == (3, 5):
def json_loads(data):
# on Python 3.5 json.loads only supports str not bytes
return json.loads(data.decode())
else:
json_loads = json.loads
from json import loads as json_loads # type: ignore
DEFAULT_HTTP_CONTENT_TYPE = "application/octet-stream"
EXPECT_HEADER = "EXPECT"
@@ -66,7 +63,7 @@ class StreamBuffer:
return self._queue.full()
class Request(dict):
class Request:
"""Properties of an HTTP request such as URL, headers, etc."""
__slots__ = (
@@ -79,6 +76,7 @@ class Request(dict):
"_socket",
"app",
"body",
"ctx",
"endpoint",
"headers",
"method",
@@ -87,6 +85,7 @@ class Request(dict):
"parsed_files",
"parsed_form",
"parsed_json",
"parsed_forwarded",
"raw_url",
"stream",
"transport",
@@ -107,6 +106,8 @@ class Request(dict):
# Init but do not inhale
self.body_init()
self.ctx = SimpleNamespace()
self.parsed_forwarded = None
self.parsed_json = None
self.parsed_form = None
self.parsed_files = None
@@ -122,10 +123,30 @@ class Request(dict):
self.__class__.__name__, self.method, self.path
)
def __bool__(self):
if self.transport:
return True
return False
def get(self, key, default=None):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
return self.ctx.__dict__.get(key, default)
def __contains__(self, key):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
return key in self.ctx.__dict__
def __getitem__(self, key):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
return self.ctx.__dict__[key]
def __delitem__(self, key):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
del self.ctx.__dict__[key]
def __setitem__(self, key, value):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
setattr(self.ctx, key, value)
def body_init(self):
self.body = []
@@ -177,7 +198,7 @@ class Request(dict):
content_type = self.headers.get(
"Content-Type", DEFAULT_HTTP_CONTENT_TYPE
)
content_type, parameters = parse_header(content_type)
content_type, parameters = parse_content_header(content_type)
try:
if content_type == "application/x-www-form-urlencoded":
self.parsed_form = RequestParameters(
@@ -371,72 +392,58 @@ class Request(dict):
@property
def server_name(self):
"""
Attempt to get the server's hostname in this order:
`config.SERVER_NAME`, `x-forwarded-host` header, :func:`Request.host`
Attempt to get the server's external hostname in this order:
`config.SERVER_NAME`, proxied or direct Host headers
:func:`Request.host`
:return: the server name without port number
:rtype: str
"""
return (
self.app.config.get("SERVER_NAME")
or self.headers.get("x-forwarded-host")
or self.host.split(":")[0]
)
server_name = self.app.config.get("SERVER_NAME")
if server_name:
host = server_name.split("//", 1)[-1].split("/", 1)[0]
return parse_host(host)[0]
return parse_host(self.host)[0]
@property
def forwarded(self):
if self.parsed_forwarded is None:
self.parsed_forwarded = (
parse_forwarded(self.headers, self.app.config)
or parse_xforwarded(self.headers, self.app.config)
or {}
)
return self.parsed_forwarded
@property
def server_port(self):
"""
Attempt to get the server's port in this order:
`x-forwarded-port` header, :func:`Request.host`, actual port used by
the transport layer socket.
Attempt to get the server's external port number in this order:
`config.SERVER_NAME`, proxied or direct Host headers
:func:`Request.host`,
actual port used by the transport layer socket.
:return: server port
:rtype: int
"""
forwarded_port = self.headers.get("x-forwarded-port") or (
self.host.split(":")[1] if ":" in self.host else None
if self.forwarded:
return self.forwarded.get("port") or (
80 if self.scheme in ("http", "ws") else 443
)
return (
parse_host(self.host)[1]
or self.transport.get_extra_info("sockname")[1]
)
if forwarded_port:
return int(forwarded_port)
else:
port = self.transport.get_extra_info("sockname")[1]
return port
@property
def remote_addr(self):
"""Attempt to return the original client ip based on X-Forwarded-For
or X-Real-IP. If HTTP headers are unavailable or untrusted, returns
an empty string.
"""Attempt to return the original client ip based on `forwarded`,
`x-forwarded-for` or `x-real-ip`. If HTTP headers are unavailable or
untrusted, returns an empty string.
:return: original client ip.
"""
if not hasattr(self, "_remote_addr"):
if self.app.config.PROXIES_COUNT == 0:
self._remote_addr = ""
elif self.app.config.REAL_IP_HEADER and self.headers.get(
self.app.config.REAL_IP_HEADER
):
self._remote_addr = self.headers[
self.app.config.REAL_IP_HEADER
]
elif self.app.config.FORWARDED_FOR_HEADER:
forwarded_for = self.headers.get(
self.app.config.FORWARDED_FOR_HEADER, ""
).split(",")
remote_addrs = [
addr
for addr in [addr.strip() for addr in forwarded_for]
if addr
]
if self.app.config.PROXIES_COUNT == -1:
self._remote_addr = remote_addrs[0]
elif len(remote_addrs) >= self.app.config.PROXIES_COUNT:
self._remote_addr = remote_addrs[
-self.app.config.PROXIES_COUNT
]
else:
self._remote_addr = ""
else:
self._remote_addr = ""
self._remote_addr = self.forwarded.get("for", "")
return self._remote_addr
@property
@@ -444,14 +451,13 @@ class Request(dict):
"""
Attempt to get the request scheme.
Seeking the value in this order:
`x-forwarded-proto` header, `x-scheme` header, the sanic app itself.
`forwarded` header, `x-forwarded-proto` header,
`x-scheme` header, the sanic app itself.
:return: http|https|ws|wss or arbitrary value given by the headers.
:rtype: str
"""
forwarded_proto = self.headers.get(
"x-forwarded-proto"
) or self.headers.get("x-scheme")
forwarded_proto = self.forwarded.get("proto")
if forwarded_proto:
return forwarded_proto
@@ -471,12 +477,10 @@ class Request(dict):
@property
def host(self):
"""
:return: the Host specified in the header, may contains port number.
:return: proxied or direct Host header. Hostname and port number may be
separated by sanic.headers.parse_host(request.host).
"""
# it appears that httptools doesn't return the host
# so pull it from the headers
return self.headers.get("Host", "")
return self.forwarded.get("host", self.headers.get("Host", ""))
@property
def content_type(self):
@@ -514,6 +518,10 @@ class Request(dict):
:return: an absolute url to the given view
:rtype: str
"""
# Full URL SERVER_NAME can only be handled in app.url_for
if "//" in self.app.config.SERVER_NAME:
return self.app.url_for(view_name, _external=True, **kwargs)
scheme = self.scheme
host = self.server_name
port = self.server_port
@@ -561,7 +569,7 @@ def parse_multipart_form(body, boundary):
colon_index = form_line.index(":")
form_header_field = form_line[0:colon_index].lower()
form_header_value, form_parameters = parse_header(
form_header_value, form_parameters = parse_content_header(
form_line[colon_index + 2 :]
)
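To tie the request changes in this file together, a hedged usage sketch inside a handler (the route is illustrative):

```python
from sanic import Sanic, response

app = Sanic(__name__)

@app.route("/whoami")
async def whoami(request):
    # New in 19.9: per-request storage lives on request.ctx; the old
    # dict-style access (request["key"]) still works but is deprecated.
    request.ctx.note = "visible to later middleware and handlers"

    # Proxy-aware values are resolved through request.forwarded:
    return response.json({
        "client": request.remote_addr or request.ip,
        "host": request.host,
        "scheme": request.scheme,
        "port": request.server_port,
    })
```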


@@ -3,7 +3,7 @@ from mimetypes import guess_type
from os import path
from urllib.parse import quote_plus
from aiofiles import open as open_async
from aiofiles import open as open_async # type: ignore
from sanic.compat import Header
from sanic.cookies import CookieJar
@@ -12,7 +12,7 @@ from sanic.helpers import STATUS_CODES, has_message_body, remove_entity_headers
try:
from ujson import dumps as json_dumps
except BaseException:
except ImportError:
from json import dumps
# This is done in order to ensure that the JSON response is


@@ -10,8 +10,8 @@ from signal import signal as signal_func
from socket import SO_REUSEADDR, SOL_SOCKET, socket
from time import time
from httptools import HttpRequestParser
from httptools.parser.errors import HttpParserError
from httptools import HttpRequestParser # type: ignore
from httptools.parser.errors import HttpParserError # type: ignore
from sanic.compat import Header
from sanic.exceptions import (
@@ -28,9 +28,10 @@ from sanic.response import HTTPResponse
try:
import uvloop
import uvloop # type: ignore
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
if not isinstance(asyncio.get_event_loop_policy(), uvloop.EventLoopPolicy):
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
pass
@@ -633,6 +634,78 @@ def trigger_events(events, loop):
loop.run_until_complete(result)
class AsyncioServer:
"""
Wraps an asyncio server with functionality that might be useful to
a user who needs to manage the server lifecycle manually.
"""
__slots__ = (
"loop",
"serve_coro",
"_after_start",
"_before_stop",
"_after_stop",
"server",
"connections",
)
def __init__(
self,
loop,
serve_coro,
connections,
after_start,
before_stop,
after_stop,
):
# Note, Sanic already called "before_server_start" events
# before this helper was even created. So we don't need it here.
self.loop = loop
self.serve_coro = serve_coro
self._after_start = after_start
self._before_stop = before_stop
self._after_stop = after_stop
self.server = None
self.connections = connections
def after_start(self):
"""Trigger "after_server_start" events"""
trigger_events(self._after_start, self.loop)
def before_stop(self):
"""Trigger "before_server_stop" events"""
trigger_events(self._before_stop, self.loop)
def after_stop(self):
"""Trigger "after_server_stop" events"""
trigger_events(self._after_stop, self.loop)
def is_serving(self):
if self.server:
return self.server.is_serving()
return False
def wait_closed(self):
if self.server:
return self.server.wait_closed()
def close(self):
if self.server:
self.server.close()
coro = self.wait_closed()
task = asyncio.ensure_future(coro, loop=self.loop)
return task
def __await__(self):
"""Starts the asyncio server, returns AsyncServerCoro"""
task = asyncio.ensure_future(self.serve_coro)
while not task.done():
yield
self.server = task.result()
return self
def serve(
host,
port,
@@ -699,6 +772,8 @@ def serve(
:param reuse_port: `True` for multiple workers
:param loop: asyncio compatible event loop
:param protocol: subclass of asyncio protocol class
:param run_async: bool: Do not create a new event loop for the server,
and return an AsyncServer object rather than running it
:param request_class: Request class to use
:param access_log: disable/enable access log
:param websocket_max_size: enforces the maximum size for
@@ -770,7 +845,14 @@ def serve(
)
if run_async:
return server_coroutine
return AsyncioServer(
loop,
server_coroutine,
connections,
after_start,
before_stop,
after_stop,
)
trigger_events(before_start, loop)


@@ -4,7 +4,7 @@ from re import sub
from time import gmtime, strftime
from urllib.parse import unquote
from aiofiles.os import stat
from aiofiles.os import stat # type: ignore
from sanic.exceptions import (
ContentRangeError,


@@ -6,9 +6,9 @@ from json import JSONDecodeError
from socket import socket
from urllib.parse import unquote, urlsplit
import httpcore
import requests_async as requests
import websockets
import httpcore # type: ignore
import requests_async as requests # type: ignore
import websockets # type: ignore
from sanic.asgi import ASGIApp
from sanic.exceptions import MethodNotSupported
@@ -288,6 +288,14 @@ class SanicASGIAdapter(requests.asgi.ASGIAdapter): # noqa
request_complete = True
return {"type": "http.request", "body": body_bytes}
request_complete = False
response_started = False
response_complete = False
raw_kwargs = {"content": b""} # type: typing.Dict[str, typing.Any]
template = None
context = None
return_value = None
async def send(message) -> None:
nonlocal raw_kwargs, response_started, response_complete, template, context # noqa
@@ -316,14 +324,6 @@ class SanicASGIAdapter(requests.asgi.ASGIAdapter): # noqa
template = message["template"]
context = message["context"]
request_complete = False
response_started = False
response_complete = False
raw_kwargs = {"content": b""} # type: typing.Dict[str, typing.Any]
template = None
context = None
return_value = None
try:
return_value = await self.app(scope, receive, send)
except BaseException as exc:


@@ -1,3 +1,5 @@
from typing import Any, Callable, List
from sanic.constants import HTTP_METHODS
from sanic.exceptions import InvalidUsage
@@ -37,7 +39,7 @@ class HTTPMethodView:
To add any decorator you could set it into decorators variable
"""
decorators = []
decorators: List[Callable[[Callable[..., Any]], Callable[..., Any]]] = []
def dispatch_request(self, request, *args, **kwargs):
handler = getattr(self, request.method.lower(), None)


@@ -1,13 +1,27 @@
from typing import Any, Awaitable, Callable, MutableMapping, Optional, Union
from typing import (
Any,
Awaitable,
Callable,
Dict,
MutableMapping,
Optional,
Union,
)
from httptools import HttpParserUpgrade
from websockets import ConnectionClosed # noqa
from websockets import InvalidHandshake, WebSocketCommonProtocol, handshake
from httptools import HttpParserUpgrade # type: ignore
from websockets import ( # type: ignore
ConnectionClosed,
InvalidHandshake,
WebSocketCommonProtocol,
handshake,
)
from sanic.exceptions import InvalidUsage
from sanic.server import HttpProtocol
__all__ = ["ConnectionClosed", "WebSocketProtocol", "WebSocketConnection"]
ASIMessage = MutableMapping[str, Any]
@@ -105,6 +119,9 @@ class WebSocketProtocol(HttpProtocol):
read_limit=self.websocket_read_limit,
write_limit=self.websocket_write_limit,
)
# Following two lines are required for websockets 8.x
self.websocket.is_client = False
self.websocket.side = "server"
self.websocket.subprotocol = subprotocol
self.websocket.connection_made(request.transport)
self.websocket.connection_open()
@@ -125,14 +142,12 @@ class WebSocketConnection:
self._receive = receive
async def send(self, data: Union[str, bytes], *args, **kwargs) -> None:
message = {"type": "websocket.send"}
message: Dict[str, Union[str, bytes]] = {"type": "websocket.send"}
try:
data.decode()
except AttributeError:
message.update({"text": str(data)})
else:
if isinstance(data, bytes):
message.update({"bytes": data})
else:
message.update({"text": str(data)})
await self._send(message)
@@ -144,6 +159,8 @@ class WebSocketConnection:
elif message["type"] == "websocket.disconnect":
pass
return None
receive = recv
async def accept(self) -> None:


@@ -5,19 +5,19 @@ import signal
import sys
import traceback
import gunicorn.workers.base as base
import gunicorn.workers.base as base # type: ignore
from sanic.server import HttpProtocol, Signal, serve, trigger_events
from sanic.websocket import WebSocketProtocol
try:
import ssl
import ssl # type: ignore
except ImportError:
ssl = None
ssl = None # type: ignore
try:
import uvloop
import uvloop # type: ignore
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:


@@ -14,7 +14,7 @@ multi_line_output = 3
not_skip = __init__.py
[version]
current_version = 19.6.3
current_version = 19.9.0
files = sanic/__version__.py
current_version_pattern = __version__ = "{current_version}"
new_version_pattern = __version__ = "{new_version}"


@@ -80,13 +80,13 @@ requirements = [
uvloop,
ujson,
"aiofiles>=0.3.0",
"websockets>=7.0,<8.0",
"websockets>=7.0,<9.0",
"multidict>=4.0,<5.0",
"requests-async==0.5.0",
]
tests_require = [
"pytest==4.1.0",
"pytest==5.2.1",
"multidict>=4.0,<5.0",
"gunicorn",
"pytest-cov",

tests/test_headers.py Normal file

@@ -0,0 +1,57 @@
import pytest
from sanic import headers
@pytest.mark.parametrize(
"input, expected",
[
("text/plain", ("text/plain", {})),
("text/vnd.just.made.this.up ; ", ("text/vnd.just.made.this.up", {})),
("text/plain;charset=us-ascii", ("text/plain", {"charset": "us-ascii"})),
('text/plain ; charset="us-ascii"', ("text/plain", {"charset": "us-ascii"})),
(
'text/plain ; charset="us-ascii"; another=opt',
("text/plain", {"charset": "us-ascii", "another": "opt"})
),
(
'attachment; filename="silly.txt"',
("attachment", {"filename": "silly.txt"})
),
(
'attachment; filename="strange;name"',
("attachment", {"filename": "strange;name"})
),
(
'attachment; filename="strange;name";size=123;',
("attachment", {"filename": "strange;name", "size": "123"})
),
(
'form-data; name="files"; filename="fo\\"o;bar\\"',
('form-data', {'name': 'files', 'filename': 'fo"o;bar\\'})
# cgi.parse_header:
# ('form-data', {'name': 'files', 'filename': 'fo"o;bar\\'})
# werkzeug.parse_options_header:
# ('form-data', {'name': 'files', 'filename': '"fo\\"o', 'bar\\"': None})
),
# <input type=file name="foo&quot;;bar\"> with Unicode filename!
(
# Chrome:
# Content-Disposition: form-data; name="foo%22;bar\"; filename="😀"
'form-data; name="foo%22;bar\\"; filename="😀"',
('form-data', {'name': 'foo";bar\\', 'filename': '😀'})
# cgi: ('form-data', {'name': 'foo%22;bar"; filename="😀'})
# werkzeug: ('form-data', {'name': 'foo%22;bar"; filename='})
),
(
# Firefox:
# Content-Disposition: form-data; name="foo\";bar\"; filename="😀"
'form-data; name="foo\\";bar\\"; filename="😀"',
('form-data', {'name': 'foo";bar\\', 'filename': '😀'})
# cgi: ('form-data', {'name': 'foo";bar"; filename="😀'})
# werkzeug: ('form-data', {'name': 'foo";bar"; filename='})
),
]
)
def test_parse_headers(input, expected):
assert headers.parse_content_header(input) == expected


@@ -5,6 +5,7 @@ import signal
import pytest
from sanic import Blueprint
from sanic.response import text
from sanic.testing import HOST, PORT
@@ -37,8 +38,6 @@ def test_multiprocessing(app):
reason="SIGALRM is not implemented for this platform",
)
def test_multiprocessing_with_blueprint(app):
from sanic import Blueprint
# Selects a number at random so we can spot check
num_workers = random.choice(range(2, multiprocessing.cpu_count() * 2 + 1))
process_list = set()
@@ -64,27 +63,27 @@ def handler(request):
return text("Hello")
# Muliprocessing on Windows requires app to be able to be pickled
# Multiprocessing on Windows requires app to be able to be pickled
@pytest.mark.parametrize("protocol", [3, 4])
def test_pickle_app(app, protocol):
app.route("/")(handler)
p_app = pickle.dumps(app, protocol=protocol)
del app
up_p_app = pickle.loads(p_app)
assert up_p_app
request, response = app.test_client.get("/")
request, response = up_p_app.test_client.get("/")
assert response.text == "Hello"
@pytest.mark.parametrize("protocol", [3, 4])
def test_pickle_app_with_bp(app, protocol):
from sanic import Blueprint
bp = Blueprint("test_text")
bp.route("/")(handler)
app.blueprint(bp)
p_app = pickle.dumps(app, protocol=protocol)
del app
up_p_app = pickle.loads(p_app)
assert up_p_app
request, response = app.test_client.get("/")
assert app.is_request_stream is False
request, response = up_p_app.test_client.get("/")
assert up_p_app.is_request_stream is False
assert response.text == "Hello"


@@ -8,22 +8,72 @@ try:
except ImportError:
from json import loads
def test_storage(app):
def test_custom_context(app):
@app.middleware("request")
def store(request):
request.ctx.user = "sanic"
request.ctx.session = None
@app.route("/")
def handler(request):
# Accessing non-existant key should fail with AttributeError
try:
invalid = request.ctx.missing
except AttributeError as e:
invalid = str(e)
return json({
"user": request.ctx.user,
"session": request.ctx.session,
"has_user": hasattr(request.ctx, "user"),
"has_session": hasattr(request.ctx, "session"),
"has_missing": hasattr(request.ctx, "missing"),
"invalid": invalid
})
request, response = app.test_client.get("/")
assert response.json == {
"user": "sanic",
"session": None,
"has_user": True,
"has_session": True,
"has_missing": False,
"invalid": "'types.SimpleNamespace' object has no attribute 'missing'",
}
# Remove this once the deprecated API is abolished.
def test_custom_context_old(app):
@app.middleware("request")
def store(request):
try:
request["foo"]
except KeyError:
pass
request["user"] = "sanic"
request["sidekick"] = "tails"
sidekick = request.get("sidekick", "tails") # Item missing -> default
request["sidekick"] = sidekick
request["bar"] = request["sidekick"]
del request["sidekick"]
@app.route("/")
def handler(request):
return json(
{"user": request.get("user"), "sidekick": request.get("sidekick")}
{
"user": request.get("user"),
"sidekick": request.get("sidekick"),
"has_bar": "bar" in request,
"has_sidekick": "sidekick" in request,
}
)
request, response = app.test_client.get("/")
assert response.json == {
"user": "sanic",
"sidekick": None,
"has_bar": True,
"has_sidekick": False,
}
response_json = loads(response.text)
assert response_json["user"] == "sanic"
assert response_json.get("sidekick") is None


@@ -401,8 +401,232 @@ async def test_content_type_asgi(app):
    assert response.text == "application/json"


def test_standard_forwarded(app):
    @app.route("/")
    async def handler(request):
        return json(request.forwarded)

    # Without configured FORWARDED_SECRET, x-headers should be respected
    app.config.PROXIES_COUNT = 1
    app.config.REAL_IP_HEADER = "x-real-ip"
    headers = {
        "Forwarded": (
            'for=1.1.1.1, for=injected;host="'
            ', for="[::2]";proto=https;host=me.tld;path="/app/";secret=mySecret'
            ',for=broken;;secret=b0rked'
            ', for=127.0.0.3;scheme=http;port=1234'
        ),
        "X-Real-IP": "127.0.0.2",
        "X-Forwarded-For": "127.0.1.1",
        "X-Scheme": "ws",
    }
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {"for": "127.0.0.2", "proto": "ws"}
    assert request.remote_addr == "127.0.0.2"
    assert request.scheme == "ws"
    assert request.server_port == 80

    app.config.FORWARDED_SECRET = "mySecret"
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {
        "for": "[::2]",
        "proto": "https",
        "host": "me.tld",
        "path": "/app/",
        "secret": "mySecret",
    }
    assert request.remote_addr == "[::2]"
    assert request.server_name == "me.tld"
    assert request.scheme == "https"
    assert request.server_port == 443

    # Empty Forwarded header -> use X-headers
    headers["Forwarded"] = ""
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {"for": "127.0.0.2", "proto": "ws"}

    # Header present but not matching anything
    request, response = app.test_client.get("/", headers={"Forwarded": "."})
    assert response.json == {}

    # Forwarded header present but no matching secret -> use X-headers
    headers = {
        "Forwarded": 'for=1.1.1.1;secret=x, for=127.0.0.1',
        "X-Real-IP": "127.0.0.2",
    }
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {"for": "127.0.0.2"}
    assert request.remote_addr == "127.0.0.2"

    # Different formatting and hitting both ends of the header
    headers = {"Forwarded": 'Secret="mySecret";For=127.0.0.4;Port=1234'}
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {
        "for": "127.0.0.4",
        "port": 1234,
        "secret": "mySecret",
    }

    # Test escapes (modify this if you see anyone implementing quoted-pairs)
    headers = {"Forwarded": 'for=test;quoted="\\,x=x;y=\\";secret=mySecret'}
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {
        "for": "test",
        "quoted": '\\,x=x;y=\\',
        "secret": "mySecret",
    }

    # Secret insulated by malformed field #1
    headers = {"Forwarded": 'for=test;secret=mySecret;b0rked;proto=wss;'}
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {"for": "test", "secret": "mySecret"}

    # Secret insulated by malformed field #2
    headers = {"Forwarded": 'for=test;b0rked;secret=mySecret;proto=wss'}
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {"proto": "wss", "secret": "mySecret"}

    # Unexpected termination should not lose existing acceptable values
    headers = {"Forwarded": 'b0rked;secret=mySecret;proto=wss'}
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {"proto": "wss", "secret": "mySecret"}

    # Field normalization
    headers = {
        "Forwarded": 'PROTO=WSS;BY="CAFE::8000";FOR=unknown;PORT=X;HOST="A:2";'
        'PATH="/With%20Spaces%22Quoted%22/sanicApp?key=val";SECRET=mySecret'
    }
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {
        "proto": "wss",
        "by": "[cafe::8000]",
        "host": "a:2",
        "path": '/With Spaces"Quoted"/sanicApp?key=val',
        "secret": "mySecret",
    }

    # Using "by" field as secret
    app.config.FORWARDED_SECRET = "_proxySecret"
    headers = {"Forwarded": 'for=1.2.3.4; by=_proxySecret'}
    request, response = app.test_client.get("/", headers=headers)
    assert response.json == {"for": "1.2.3.4", "by": "_proxySecret"}
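The assertions above all turn on secret matching: once FORWARDED_SECRET is set, only the Forwarded element that carries that secret (or, as the last case shows, names it in by=) is honoured, and everything else falls back to the X-headers. A simplified, illustrative sketch of that selection rule follows; it is not Sanic's parser (which additionally handles quoting, escapes, normalization, and malformed fields), and pick_forwarded_element is a hypothetical helper:

def pick_forwarded_element(header, secret):
    # Return the first comma-separated element whose "secret" (or "by")
    # parameter equals the configured secret; {} when nothing matches.
    for element in header.split(","):
        params = {}
        for pair in element.split(";"):
            key, sep, value = pair.partition("=")
            if sep:
                params[key.strip().lower()] = value.strip().strip('"')
        if secret in (params.get("secret"), params.get("by")):
            return params
    return {}


assert pick_forwarded_element(
    'for=1.1.1.1;secret=x, for=127.0.0.4;secret=mySecret', "mySecret"
) == {"for": "127.0.0.4", "secret": "mySecret"}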
@pytest.mark.asyncio
async def test_standard_forwarded_asgi(app):
    @app.route("/")
    async def handler(request):
        return json(request.forwarded)

    # Without configured FORWARDED_SECRET, x-headers should be respected
    app.config.PROXIES_COUNT = 1
    app.config.REAL_IP_HEADER = "x-real-ip"
    headers = {
        "Forwarded": (
            'for=1.1.1.1, for=injected;host="'
            ', for="[::2]";proto=https;host=me.tld;path="/app/";secret=mySecret'
            ',for=broken;;secret=b0rked'
            ', for=127.0.0.3;scheme=http;port=1234'
        ),
        "X-Real-IP": "127.0.0.2",
        "X-Forwarded-For": "127.0.1.1",
        "X-Scheme": "ws",
    }
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {"for": "127.0.0.2", "proto": "ws"}
    assert request.remote_addr == "127.0.0.2"
    assert request.scheme == "ws"
    assert request.server_port == 80

    app.config.FORWARDED_SECRET = "mySecret"
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {
        "for": "[::2]",
        "proto": "https",
        "host": "me.tld",
        "path": "/app/",
        "secret": "mySecret",
    }
    assert request.remote_addr == "[::2]"
    assert request.server_name == "me.tld"
    assert request.scheme == "https"
    assert request.server_port == 443

    # Empty Forwarded header -> use X-headers
    headers["Forwarded"] = ""
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {"for": "127.0.0.2", "proto": "ws"}

    # Header present but not matching anything
    request, response = await app.asgi_client.get("/", headers={"Forwarded": "."})
    assert response.json() == {}

    # Forwarded header present but no matching secret -> use X-headers
    headers = {
        "Forwarded": 'for=1.1.1.1;secret=x, for=127.0.0.1',
        "X-Real-IP": "127.0.0.2",
    }
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {"for": "127.0.0.2"}
    assert request.remote_addr == "127.0.0.2"

    # Different formatting and hitting both ends of the header
    headers = {"Forwarded": 'Secret="mySecret";For=127.0.0.4;Port=1234'}
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {
        "for": "127.0.0.4",
        "port": 1234,
        "secret": "mySecret",
    }

    # Test escapes (modify this if you see anyone implementing quoted-pairs)
    headers = {"Forwarded": 'for=test;quoted="\\,x=x;y=\\";secret=mySecret'}
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {
        "for": "test",
        "quoted": '\\,x=x;y=\\',
        "secret": "mySecret",
    }

    # Secret insulated by malformed field #1
    headers = {"Forwarded": 'for=test;secret=mySecret;b0rked;proto=wss;'}
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {"for": "test", "secret": "mySecret"}

    # Secret insulated by malformed field #2
    headers = {"Forwarded": 'for=test;b0rked;secret=mySecret;proto=wss'}
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {"proto": "wss", "secret": "mySecret"}

    # Unexpected termination should not lose existing acceptable values
    headers = {"Forwarded": 'b0rked;secret=mySecret;proto=wss'}
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {"proto": "wss", "secret": "mySecret"}

    # Field normalization
    headers = {
        "Forwarded": 'PROTO=WSS;BY="CAFE::8000";FOR=unknown;PORT=X;HOST="A:2";'
        'PATH="/With%20Spaces%22Quoted%22/sanicApp?key=val";SECRET=mySecret'
    }
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {
        "proto": "wss",
        "by": "[cafe::8000]",
        "host": "a:2",
        "path": '/With Spaces"Quoted"/sanicApp?key=val',
        "secret": "mySecret",
    }

    # Using "by" field as secret
    app.config.FORWARDED_SECRET = "_proxySecret"
    headers = {"Forwarded": 'for=1.2.3.4; by=_proxySecret'}
    request, response = await app.asgi_client.get("/", headers=headers)
    assert response.json() == {"for": "1.2.3.4", "by": "_proxySecret"}

def test_remote_addr_with_two_proxies(app):
    app.config.PROXIES_COUNT = 2
    app.config.REAL_IP_HEADER = "x-real-ip"

    @app.route("/")
    async def handler(request):
@@ -443,6 +667,7 @@ def test_remote_addr_with_two_proxies(app):
@pytest.mark.asyncio
async def test_remote_addr_with_two_proxies_asgi(app):
    app.config.PROXIES_COUNT = 2
    app.config.REAL_IP_HEADER = "x-real-ip"

    @app.route("/")
    async def handler(request):
@@ -480,57 +705,6 @@ async def test_remote_addr_with_two_proxies_asgi(app):
    assert response.text == "127.0.0.1"


def test_remote_addr_with_infinite_number_of_proxies(app):
    app.config.PROXIES_COUNT = -1

    @app.route("/")
    async def handler(request):
        return text(request.remote_addr)

    headers = {"X-Real-IP": "127.0.0.2", "X-Forwarded-For": "127.0.1.1"}
    request, response = app.test_client.get("/", headers=headers)
    assert request.remote_addr == "127.0.0.2"
    assert response.text == "127.0.0.2"

    headers = {"X-Forwarded-For": "127.0.1.1"}
    request, response = app.test_client.get("/", headers=headers)
    assert request.remote_addr == "127.0.1.1"
    assert response.text == "127.0.1.1"

    headers = {
        "X-Forwarded-For": "127.0.0.5, 127.0.0.4, 127.0.0.3, 127.0.0.2, 127.0.0.1"
    }
    request, response = app.test_client.get("/", headers=headers)
    assert request.remote_addr == "127.0.0.5"
    assert response.text == "127.0.0.5"
@pytest.mark.asyncio
async def test_remote_addr_with_infinite_number_of_proxies_asgi(app):
    app.config.PROXIES_COUNT = -1

    @app.route("/")
    async def handler(request):
        return text(request.remote_addr)

    headers = {"X-Real-IP": "127.0.0.2", "X-Forwarded-For": "127.0.1.1"}
    request, response = await app.asgi_client.get("/", headers=headers)
    assert request.remote_addr == "127.0.0.2"
    assert response.text == "127.0.0.2"

    headers = {"X-Forwarded-For": "127.0.1.1"}
    request, response = await app.asgi_client.get("/", headers=headers)
    assert request.remote_addr == "127.0.1.1"
    assert response.text == "127.0.1.1"

    headers = {
        "X-Forwarded-For": "127.0.0.5, 127.0.0.4, 127.0.0.3, 127.0.0.2, 127.0.0.1"
    }
    request, response = await app.asgi_client.get("/", headers=headers)
    assert request.remote_addr == "127.0.0.5"
    assert response.text == "127.0.0.5"
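Taken together, the two tests above pin down the precedence when the proxy count is unlimited: a configured real-IP header wins outright, otherwise the left-most (client-supplied) X-Forwarded-For entry becomes remote_addr. A rough sketch of that precedence, for illustration only (effective_remote_addr is a hypothetical helper, not Sanic's implementation):

def effective_remote_addr(headers, real_ip_header="X-Real-IP"):
    # Mirror of the precedence exercised above, assuming PROXIES_COUNT = -1.
    real_ip = headers.get(real_ip_header, "").strip()
    if real_ip:
        return real_ip
    # Every hop is trusted, so the left-most entry is reported.
    return headers.get("X-Forwarded-For", "").split(",")[0].strip()


assert effective_remote_addr(
    {"X-Real-IP": "127.0.0.2", "X-Forwarded-For": "127.0.1.1"}
) == "127.0.0.2"
assert effective_remote_addr(
    {"X-Forwarded-For": "127.0.0.5, 127.0.0.4, 127.0.0.3, 127.0.0.2, 127.0.0.1"}
) == "127.0.0.5"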

def test_remote_addr_without_proxy(app):
    app.config.PROXIES_COUNT = 0
@@ -634,15 +808,16 @@ def test_forwarded_scheme(app):
    async def handler(request):
        return text(request.remote_addr)

    app.config.PROXIES_COUNT = 1
    request, response = app.test_client.get("/")
    assert request.scheme == "http"

    request, response = app.test_client.get(
-        "/", headers={"X-Forwarded-Proto": "https"}
+        "/", headers={"X-Forwarded-For": "127.1.2.3", "X-Forwarded-Proto": "https"}
    )
    assert request.scheme == "https"

-    request, response = app.test_client.get("/", headers={"X-Scheme": "https"})
+    request, response = app.test_client.get("/", headers={"X-Forwarded-For": "127.1.2.3", "X-Scheme": "https"})
    assert request.scheme == "https"
@@ -1324,9 +1499,6 @@ def test_request_bool(app):
    request, response = app.test_client.get("/")
    assert bool(request)
-    request.transport = False
-    assert not bool(request)


def test_request_parsing_form_failed(app, caplog):
    @app.route("/", methods=["POST"])
@@ -1688,9 +1860,19 @@ def test_request_server_name_in_host_header(app):
        return text("OK")

    request, response = app.test_client.get(
-        "/", headers={"Host": "my_server:5555"}
+        "/", headers={"Host": "my-server:5555"}
    )
-    assert request.server_name == "my_server"
+    assert request.server_name == "my-server"

    request, response = app.test_client.get(
        "/", headers={"Host": "[2a00:1450:400f:80c::200e]:5555"}
    )
    assert request.server_name == "[2a00:1450:400f:80c::200e]"

    request, response = app.test_client.get(
        "/", headers={"Host": "mal_formed"}
    )
    assert request.server_name == None  # For now (later maybe 127.0.0.1)
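The added cases show how the Host header is now split into name and port: bracketed IPv6 literals keep their brackets, while a name containing characters outside the usual hostname set (the underscore in mal_formed) yields no server name at all. A simplified sketch of such a split, under the assumption that it mirrors the behaviour asserted above (split_host and its regex are illustrative, not Sanic's code):

import re

# Hostnames: letters, digits, dots and hyphens; IPv6 literals stay bracketed.
_HOST_RE = re.compile(r"^((?:\[[0-9A-Fa-f:]+\])|[A-Za-z0-9.\-]+)(?::(\d+))?$")


def split_host(host):
    # Hypothetical helper returning (server_name, server_port or None).
    match = _HOST_RE.match(host)
    if not match:
        return None, None
    name, port = match.groups()
    return name, int(port) if port else None


assert split_host("my-server:5555") == ("my-server", 5555)
assert split_host("[2a00:1450:400f:80c::200e]:5555") == (
    "[2a00:1450:400f:80c::200e]", 5555
)
assert split_host("mal_formed") == (None, None)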
def test_request_server_name_forwarded(app):
@@ -1698,11 +1880,12 @@ def test_request_server_name_forwarded(app):
    def handler(request):
        return text("OK")

    app.config.PROXIES_COUNT = 1
    request, response = app.test_client.get(
        "/",
-        headers={"Host": "my_server:5555", "X-Forwarded-Host": "your_server"},
+        headers={"Host": "my-server:5555", "X-Forwarded-For": "127.1.2.3", "X-Forwarded-Host": "your-server"},
    )
-    assert request.server_name == "your_server"
+    assert request.server_name == "your-server"
def test_request_server_port(app):
@@ -1710,7 +1893,7 @@ def test_request_server_port(app):
    def handler(request):
        return text("OK")

-    request, response = app.test_client.get("/", headers={"Host": "my_server"})
+    request, response = app.test_client.get("/", headers={"Host": "my-server"})
    assert request.server_port == app.test_client.port
@@ -1720,18 +1903,29 @@ def test_request_server_port_in_host_header(app):
        return text("OK")

    request, response = app.test_client.get(
-        "/", headers={"Host": "my_server:5555"}
+        "/", headers={"Host": "my-server:5555"}
    )
    assert request.server_port == 5555

    request, response = app.test_client.get(
        "/", headers={"Host": "[2a00:1450:400f:80c::200e]:5555"}
    )
    assert request.server_port == 5555

    request, response = app.test_client.get(
        "/", headers={"Host": "mal_formed:5555"}
    )
    assert request.server_port == app.test_client.port


def test_request_server_port_forwarded(app):
    @app.get("/")
    def handler(request):
        return text("OK")

    app.config.PROXIES_COUNT = 1
    request, response = app.test_client.get(
-        "/", headers={"Host": "my_server:5555", "X-Forwarded-Port": "4444"}
+        "/", headers={"Host": "my-server:5555", "X-Forwarded-For": "127.1.2.3", "X-Forwarded-Port": "4444"}
    )
    assert request.server_port == 4444
@@ -1746,6 +1940,23 @@ def test_request_form_invalid_content_type(app):
    assert request.form == {}


def test_server_name_and_url_for(app):
    @app.get("/foo")
    def handler(request):
        return text("ok")

    app.config.SERVER_NAME = "my-server"
    assert app.url_for("handler", _external=True) == "http://my-server/foo"
    request, response = app.test_client.get("/foo")
    assert request.url_for("handler") == f"http://my-server:{app.test_client.port}/foo"

    app.config.SERVER_NAME = "https://my-server/path"
    request, response = app.test_client.get("/foo")
    url = f"https://my-server/path/foo"
    assert app.url_for("handler", _external=True) == url
    assert request.url_for("handler") == url


def test_url_for_with_forwarded_request(app):
    @app.get("/")
    def handler(request):
@@ -1755,32 +1966,24 @@ def test_url_for_with_forwarded_request(app):
    def view_name(request):
        return text("OK")

+    app.config.SERVER_NAME = "my-server"
    app.config.PROXIES_COUNT = 1
-    request, response = app.test_client.get(
-        "/", headers={"X-Forwarded-Proto": "https"}
-    )
-    assert app.url_for("view_name") == "/another_view"
-    assert app.url_for("view_name", _external=True) == "http:///another_view"
-    assert request.url_for(
-        "view_name"
-    ) == "https://127.0.0.1:{}/another_view".format(app.test_client.port)
-    app.config.SERVER_NAME = "my_server"
    request, response = app.test_client.get(
-        "/", headers={"X-Forwarded-Proto": "https", "X-Forwarded-Port": "6789"}
+        "/", headers={"X-Forwarded-For": "127.1.2.3", "X-Forwarded-Proto": "https", "X-Forwarded-Port": "6789"}
    )
    assert app.url_for("view_name") == "/another_view"
    assert (
        app.url_for("view_name", _external=True)
-        == "http://my_server/another_view"
+        == "http://my-server/another_view"
    )
    assert (
-        request.url_for("view_name") == "https://my_server:6789/another_view"
+        request.url_for("view_name") == "https://my-server:6789/another_view"
    )

    request, response = app.test_client.get(
-        "/", headers={"X-Forwarded-Proto": "https", "X-Forwarded-Port": "443"}
+        "/", headers={"X-Forwarded-For": "127.1.2.3", "X-Forwarded-Proto": "https", "X-Forwarded-Port": "443"}
    )
-    assert request.url_for("view_name") == "https://my_server/another_view"
+    assert request.url_for("view_name") == "https://my-server/another_view"
@pytest.mark.asyncio

View File

@@ -1,3 +1,4 @@
import asyncio
import signal
import pytest
@@ -89,3 +90,52 @@ async def test_trigger_before_events_create_server(app):
    assert hasattr(app, "db")
    assert isinstance(app.db, MySanicDb)


def test_create_server_trigger_events(app):
    """Test if create_server can trigger server events"""

    flag1 = False
    flag2 = False
    flag3 = False

    async def stop(app, loop):
        nonlocal flag1
        flag1 = True
        await asyncio.sleep(0.1)
        app.stop()

    async def before_stop(app, loop):
        nonlocal flag2
        flag2 = True

    async def after_stop(app, loop):
        nonlocal flag3
        flag3 = True

    app.listener("after_server_start")(stop)
    app.listener("before_server_stop")(before_stop)
    app.listener("after_server_stop")(after_stop)

    loop = asyncio.get_event_loop()
    serv_coro = app.create_server(return_asyncio_server=True)
    serv_task = asyncio.ensure_future(serv_coro, loop=loop)
    server = loop.run_until_complete(serv_task)
    server.after_start()
    try:
        loop.run_forever()
    except KeyboardInterrupt as e:
        loop.stop()
    finally:
        # Run the on_stop function if provided
        server.before_stop()

        # Wait for server to close
        close_task = server.close()
        loop.run_until_complete(close_task)

        # Complete all tasks on the loop
        signal.stopped = True
        for connection in server.connections:
            connection.close_if_idle()
        server.after_stop()

    assert flag1 and flag2 and flag3

View File

@@ -8,7 +8,7 @@ setenv =
    {py36,py37}-no-ext: SANIC_NO_UVLOOP=1
deps =
    coverage
-    pytest==4.1.0
+    pytest==5.2.1
    pytest-cov
    pytest-sanic
    pytest-sugar
@@ -38,6 +38,13 @@ commands =
    black --config ./.black.toml --check --verbose sanic/
    isort --check-only --recursive sanic

[testenv:type-checking]
deps =
    mypy
commands =
    mypy sanic

[testenv:check]
deps =
    docutils