Compare commits


24 Commits

Author SHA1 Message Date
Adam Hopkins
8a3fbb555f Merge branch 'master' of github.com:huge-success/sanic 2020-06-29 08:56:20 +03:00
L. Kärkkäinen
a62c84a954 Socket binding implemented properly for IPv6 and UNIX sockets. (#1641)
* Socket binding implemented properly for IPv6 and UNIX sockets.

- app.run("::1") for IPv6
- app.run("unix:/tmp/server.sock") for UNIX sockets
- app.run("localhost") retains old functionality (randomly either IPv4 or IPv6)

Do note that IPv6 and UNIX sockets are not fully supported by other Sanic facilities.
In particular, request.server_name and request.server_port are currently unreliable.
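A minimal sketch of the resulting API (based on this commit's notes and the later unix= refactor mentioned below; the socket path and port are only examples):

    from sanic import Sanic, response

    app = Sanic(__name__)

    @app.get("/")
    async def handler(request):
        return response.text("OK\n")

    # IPv6 loopback over TCP:
    # app.run("::1", port=8000)

    # UNIX socket instead of a TCP port (final form after the unix= refactor):
    app.run(unix="/tmp/server.sock")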

* Fix Windows compatibility by not referring to socket.AF_UNIX unless needed.

* Compatibility fix.

* Fix test of existing unix socket.

* Cleaner unix socket removal.

* Remove unix socket on exit also with workers=1.

* More pedantic UNIX socket implementation.

* Refactor app to take unix= argument instead of unix:-prefixed host. Goin' fast @ unix-socket fixed.

* Linter

* Proxy properties cleanup. Slight changes of semantics. SERVER_NAME now overrides everything.

* Have server fill in connection info instead of request asking the socket.

- Would be a good idea to remove request.transport entirely but I didn't dare to touch it yet.

* Linter 💣🌟💀

* Fix typing issues. request.server_name returns empty string if host header is missing.

* Fix tests

* Tests were failing, fix connection info.

* Linter nazi says you need that empty line.

* Rename a to addr, leave client empty for unix sockets.

* Add --unix support when sanic is run as module.

* Remove remove_route, deprecated in 19.6.

* Improved unix socket binding.

* More robust creating and unlinking of sockets. Show proper and not temporary name in conn_info.

* Add comprehensive tests for unix socket mode.

* Hide some imports inside functions to avoid Windows failure.

* Mention unix socket mode in deployment docs.

* Fix merge commit.

* Make test_unix_connection_multiple_workers pickleable for spawn mode multiprocessing.

Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-29 08:55:32 +03:00
Adam Hopkins
4aba74d050 V20.6 version (#1882)
* Version

* Version 20.6.1

Co-authored-by: 7 <yunxu1992@gmail.com>
2020-06-29 00:17:24 +03:00
Adam Hopkins
ab2cb88cf4 CHANGELOG for v 20.6 and documentation change for sanic command (#1881)
* CHANGELOG for v 20.6 and documentation change for sanic command

* Update CHANGELOG.rst

20.6.0 and 20.6.1 are the same release. One change from `blueprints` had accidentally been left out, hence the immediate follow-up release.
2020-06-28 11:42:12 -07:00
Adam Hopkins
e79ec7d7e0 Version 20.6.1 2020-06-28 17:21:48 +03:00
Adam Hopkins
ccdb74a9a7 Merge branch 'master' of github.com:huge-success/sanic 2020-06-28 17:21:12 +03:00
Adam Hopkins
7b96d633db Version 2020-06-28 17:19:57 +03:00
Adam Hopkins
938c49b899 Add handler names for websockets for url_for usage (#1880) 2020-06-28 14:45:52 +03:00
Ashley Sommer
761eef7d96 Fix pickle error when attempting to pickle an application which contains websocket routes. (#1853)
Moves the websocket_handler subfunction out to a class-level method, which can be more easily pickled by the built-in python Pickler.
Also includes a similar fix for the add_task deferred task scheduler subfunction.

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 11:05:06 +03:00
David Bordeynik
83511a0ba7 fix-#1851: correct step name (#1852)
* fix-#1851: correct step name

* fix-#1851: correct step name elsewhere as well

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 10:52:43 +03:00
Damian Jimenez
cf9ccdae47 Bug fix for host parameter issue with lists (#1776)
* Bug fix for host parameter issue with lists

As explained in #1772 there is an issue when using a list as an argument for the host parameter in the Blueprint.route() decorator. I've traced the issue back to this line, and the if conditional should ensure that the name attribute isn't accessed when route is None.

* Unit tests for blueprint.route host parameter set to a list.

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 09:42:18 +03:00
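A minimal sketch of the host-list usage that #1776 fixes (blueprint and host names are illustrative):

    from sanic import Blueprint, Sanic, response

    bp = Blueprint("api")

    # host may be a single hostname or a list of hostnames
    @bp.route("/", host=["example.com", "api.example.com"])
    async def index(request):
        return response.text("hello")

    app = Sanic(__name__)
    app.blueprint(bp)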
Kiril Yershov
d81096fdc0 Clarified response middleware execution order in the documentation (#1846)
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 09:29:48 +03:00
Adam Hopkins
6c8e20a859 Add version parameter to websocket routes (#1760)
* Add version parameter to websockets

* Run black and cleanup code
2020-06-28 09:17:18 +03:00
Liran Nuna
6239fa4f56 Deprecate body_bytes to merge into body (#1739)
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 08:59:23 +03:00
David Bordeynik
1b324ae981 fix-#1856: adjust websockets version to setup.py and make nightly (py39) tests pass (#1857)
* fix-#1856: adjust websockets version to setup.py and make nightly (py39) tests pass

* fix-#1856: set min websockets version to 8.1

* fix-#1856: suppress timeout for CI to pass

* fix-#1856: timeout -> close_timeout due to deprecation warning

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
Co-authored-by: 7 <yunxu1992@gmail.com>
2020-06-28 08:43:12 +03:00
Linus Groh
bedf68a9b2 Wrap run()'s "protocol" type annotation in Optional[] (#1869)
As the default is None and the function will determine a sane value
in that case, the correct annotation is "Optional[Type[Protocol]]".
2020-06-11 11:40:12 -07:00
Adam Hopkins
496e87e4ba Add sanic as an entry point command (#1866)
* Add sanic as an entry point command

* Fix linting issue in imports

Co-authored-by: 7 <yunxu1992@gmail.com>
2020-06-05 07:14:18 -07:00
Luca Fabbri
fa4f85eb32 Fixing rst format issue (#1865)
Co-authored-by: 7 <yunxu1992@gmail.com>
2020-06-04 17:08:14 -07:00
Adam Hopkins
1b1dfedc74 Add changes from version 20.3 to CHANGELOG (#1867) 2020-06-04 15:45:55 -07:00
L. Kärkkäinen
230941ff4f Fix reloader on OSX py38 and Windows (#1827)
* Fix watchdog reload worker repeatedly if there are multiple changed files

* Simplify autoreloader, don't need multiprocessing.Process. Now works on OSX py38.

* Allow autoreloader with multiple workers and run it earlier.

* This works OK on Windows too.

* I don't see how cwd could be different here.

* app.run and app.create_server argument fixup.

* Add test for auto_reload (coverage not working unfortunately).

* Reloader cleanup, don't use external kill commands and exit normally.

* Strip newlines on test output (Windows-compat).

* Report failures in test_auto_reload to avoid timeouts.

* Use different test server ports to avoid binding problems on Windows.

* Fix previous commit

* Listen on same port after reload.

* Show Goin' Fast banner on reloads.

* More robust testing, also -m sanic.

* Add a timeout to terminate process

* Try a workaround for tmpdir deletion on Windows.

* Join process also on error (context manager doesn't).

* Cleaner autoreloader termination on Windows.

* Remove unused code.

* Rename test.

* Longer timeout on test exit.

Co-authored-by: Hùng X. Lê <lexhung@gmail.com>
Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-03 16:45:07 +03:00
Adam Hopkins
4658e0f2f3 Merge pull request #1842 from ashleysommer/fix_pickle_again
Fix static _handler pickling error.
2020-06-03 15:53:17 +03:00
Ashley Sommer
7c3c532dae Merge branch 'master' into fix_pickle_again 2020-05-14 20:48:06 +10:00
Adam Hopkins
6aaccd1e8b Merge branch 'master' into fix_pickle_again 2020-05-13 15:46:37 +03:00
Ashley Sommer
aacbd022cf Fix static _handler pickling error.
Moves the subfunction _handler out to a module-level function, and parameterizes it with functools.partial().
Fixes the case when pickling a sanic app which has a registered static route handler. This is usually encountered when attempting to use multiprocessing or auto_reload on OSX or Windows.
Fixes #1774
2020-05-07 11:58:36 +10:00
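The technique behind this fix and #1853 above, shown as a generic sketch (names are illustrative, not the Sanic code itself): a nested handler function cannot be pickled, while a module-level function bound with functools.partial can.

    import pickle
    from functools import partial

    def _static_request_handler(file_or_directory, request):
        ...  # module-level function: picklable by reference

    def make_closure_handler(file_or_directory):
        async def _handler(request):  # nested closure: not picklable
            ...
        return _handler

    # pickle.dumps(make_closure_handler("./static")) raises a pickling error,
    # which is what broke spawn-based multiprocessing and auto_reload on OSX/Windows.
    handler = partial(_static_request_handler, "./static")
    assert pickle.loads(pickle.dumps(handler))  # the partial round-trips fine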
35 changed files with 1314 additions and 431 deletions

View File

@@ -71,14 +71,14 @@ matrix:
       name: "Python nightly with Extensions"
     - env: TOX_ENV=pyNightly-no-ext
       python: 'nightly'
-      name: "Python nightly Extensions"
+      name: "Python nightly without Extensions"
   allow_failures:
     - env: TOX_ENV=pyNightly
       python: 'nightly'
       name: "Python nightly with Extensions"
     - env: TOX_ENV=pyNightly-no-ext
       python: 'nightly'
-      name: "Python nightly Extensions"
+      name: "Python nightly without Extensions"
 install:
   - pip install -U tox
   - pip install codecov

View File

@@ -1,3 +1,209 @@
Version 20.6.1
===============
Features
********
*
`#1760 <https://github.com/huge-success/sanic/pull/1760>`_
Add version parameter to websocket routes
*
`#1866 <https://github.com/huge-success/sanic/pull/1866>`_
Add ``sanic`` as an entry point command
*
`#1880 <https://github.com/huge-success/sanic/pull/1880>`_
Add handler names for websockets for url_for usage
Bugfixes
********
*
`#1776 <https://github.com/huge-success/sanic/pull/1776>`_
Bug fix for host parameter issue with lists
*
`#1842 <https://github.com/huge-success/sanic/pull/1842>`_
Fix static _handler pickling error
*
`#1827 <https://github.com/huge-success/sanic/pull/1827>`_
Fix reloader on OSX py38 and Windows
*
`#1848 <https://github.com/huge-success/sanic/pull/1848>`_
Reverse named_response_middlware execution order, to match normal response middleware execution order
*
`#1853 <https://github.com/huge-success/sanic/pull/1853>`_
Fix pickle error when attempting to pickle an application which contains websocket routes
Deprecations and Removals
*************************
*
`#1739 <https://github.com/huge-success/sanic/pull/1739>`_
Deprecate body_bytes to merge into body
Developer infrastructure
************************
*
`#1852 <https://github.com/huge-success/sanic/pull/1852>`_
Fix naming of CI test env on Python nightlies
*
`#1857 <https://github.com/huge-success/sanic/pull/1857>`_
Adjust websockets version to setup.py
*
`#1869 <https://github.com/huge-success/sanic/pull/1869>`_
Wrap run()'s "protocol" type annotation in Optional[]
Improved Documentation
**********************
*
`#1846 <https://github.com/huge-success/sanic/pull/1846>`_
Update docs to clarify response middleware execution order
*
`#1865 <https://github.com/huge-success/sanic/pull/1865>`_
Fixing rst format issue that was hiding documentation
Version 20.3.0
===============
Features
********
*
`#1762 <https://github.com/huge-success/sanic/pull/1762>`_
Add ``srv.start_serving()`` and ``srv.serve_forever()`` to ``AsyncioServer``
*
`#1767 <https://github.com/huge-success/sanic/pull/1767>`_
Make Sanic usable on ``hypercorn -k trio myweb.app``
*
`#1768 <https://github.com/huge-success/sanic/pull/1768>`_
No tracebacks on normal errors and prettier error pages
*
`#1769 <https://github.com/huge-success/sanic/pull/1769>`_
Code cleanup in file responses
*
`#1793 <https://github.com/huge-success/sanic/pull/1793>`_ and
`#1819 <https://github.com/huge-success/sanic/pull/1819>`_
Upgrade ``str.format()`` to f-strings
*
`#1798 <https://github.com/huge-success/sanic/pull/1798>`_
Allow multiple workers on MacOS with Python 3.8
*
`#1820 <https://github.com/huge-success/sanic/pull/1820>`_
Do not set content-type and content-length headers in exceptions
Bugfixes
********
*
`#1748 <https://github.com/huge-success/sanic/pull/1748>`_
Remove loop argument in ``asyncio.Event`` in Python 3.8
*
`#1764 <https://github.com/huge-success/sanic/pull/1764>`_
Allow route decorators to stack up again
*
`#1789 <https://github.com/huge-success/sanic/pull/1789>`_
Fix tests using hosts yielding incorrect ``url_for``
*
`#1808 <https://github.com/huge-success/sanic/pull/1808>`_
Fix Ctrl+C and tests on Windows
Deprecations and Removals
*************************
*
`#1800 <https://github.com/huge-success/sanic/pull/1800>`_
Begin deprecation in way of first-class streaming, removal of ``body_init``, ``body_push``, and ``body_finish``
*
`#1801 <https://github.com/huge-success/sanic/pull/1801>`_
Complete deprecation from `#1666 <https://github.com/huge-success/sanic/pull/1666>`_ of dictionary context on ``request`` objects.
*
`#1807 <https://github.com/huge-success/sanic/pull/1807>`_
Remove server config args that can be read directly from app
*
`#1818 <https://github.com/huge-success/sanic/pull/1818>`_
Complete deprecation of ``app.remove_route`` and ``request.raw_args``
Dependencies
************
*
`#1794 <https://github.com/huge-success/sanic/pull/1794>`_
Bump ``httpx`` to 0.11.1
*
`#1806 <https://github.com/huge-success/sanic/pull/1806>`_
Import ``ASGIDispatch`` from top-level ``httpx`` (from third-party deprecation)
Developer infrastructure
************************
*
`#1833 <https://github.com/huge-success/sanic/pull/1833>`_
Resolve broken documentation builds
Improved Documentation
**********************
*
`#1755 <https://github.com/huge-success/sanic/pull/1755>`_
Usage of ``response.empty()``
*
`#1778 <https://github.com/huge-success/sanic/pull/1778>`_
Update README
*
`#1783 <https://github.com/huge-success/sanic/pull/1783>`_
Fix typo
*
`#1784 <https://github.com/huge-success/sanic/pull/1784>`_
Corrected changelog for docs move of MD to RST (`#1691 <https://github.com/huge-success/sanic/pull/1691>`_)
*
`#1803 <https://github.com/huge-success/sanic/pull/1803>`_
Update config docs to match DEFAULT_CONFIG
*
`#1814 <https://github.com/huge-success/sanic/pull/1814>`_
Update getting_started.rst
*
`#1821 <https://github.com/huge-success/sanic/pull/1821>`_
Update to deployment
*
`#1822 <https://github.com/huge-success/sanic/pull/1822>`_
Update docs with changes done in 20.3
*
`#1834 <https://github.com/huge-success/sanic/pull/1834>`_
Order of listeners
Version 19.12.0
===============

View File

@@ -16,6 +16,7 @@ keyword arguments:
 - `host` *(default `"127.0.0.1"`)*: Address to host the server on.
 - `port` *(default `8000`)*: Port to host the server on.
+- `unix` *(default `None`)*: Unix socket name to host the server on (instead of TCP).
 - `debug` *(default `False`)*: Enables debug output (slows server).
 - `ssl` *(default `None`)*: `SSLContext` for SSL encryption of worker(s).
 - `sock` *(default `None`)*: Socket for the server to accept connections from.
@@ -50,7 +51,15 @@ If you like using command line arguments, you can launch a Sanic webserver by
 executing the module. For example, if you initialized Sanic as `app` in a file
 named `server.py`, you could run the server like so:
 
-.. python -m sanic server.app --host=0.0.0.0 --port=1337 --workers=4
+::
+
+    sanic server.app --host=0.0.0.0 --port=1337 --workers=4
+
+It can also be called directly as a module.
+
+::
+
+    python -m sanic server.app --host=0.0.0.0 --port=1337 --workers=4
 
 With this way of running sanic, it is not necessary to invoke `app.run` in your
 Python file. If you do, make sure you wrap it so that it only executes when

View File

@@ -14,8 +14,8 @@ There are two types of middleware: request and response. Both are declared
 using the `@app.middleware` decorator, with the decorator's parameter being a
 string representing its type: `'request'` or `'response'`.
 
-* Request middleware receives only the `request` as argument.
-* Response middleware receives both the `request` and `response`.
+* Request middleware receives only the `request` as an argument and are executed in the order they were added.
+* Response middleware receives both the `request` and `response` and are executed in *reverse* order.
 
 The simplest middleware doesn't modify the request or response at all:
 
@@ -64,12 +64,12 @@ this.
     app.run(host="0.0.0.0", port=8000)
 
-The three middlewares are executed in order:
+The three middlewares are executed in the following order:
 
 1. The first request middleware **add_key** adds a new key `foo` into request context.
 2. Request is routed to handler **index**, which gets the key from context and returns a text response.
-3. The first response middleware **custom_banner** changes the HTTP response header *Server* to say *Fake-Server*
-4. The second response middleware **prevent_xss** adds the HTTP header for preventing Cross-Site-Scripting (XSS) attacks.
+3. The second response middleware **prevent_xss** adds the HTTP header for preventing Cross-Site-Scripting (XSS) attacks.
+4. The first response middleware **custom_banner** changes the HTTP response header *Server* to say *Fake-Server*
 
 Responding early
 ----------------
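For reference, a minimal sketch of the three middlewares named in the ordering list above (the handler bodies are assumed; only the names and ordering come from the docs):

    from sanic import Sanic, response

    app = Sanic(__name__)

    @app.middleware("request")
    async def add_key(request):
        # request middleware run in the order they were added
        request.ctx.foo = "bar"

    @app.middleware("response")
    async def custom_banner(request, resp):
        # response middleware run in reverse order, so this one runs last
        resp.headers["Server"] = "Fake-Server"

    @app.middleware("response")
    async def prevent_xss(request, resp):
        resp.headers["x-xss-protection"] = "1; mode=block"

    @app.get("/")
    async def index(request):
        return response.text(request.ctx.foo)

    app.run(host="0.0.0.0", port=8000)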

View File

@@ -0,0 +1,18 @@
+from asyncio import sleep
+
+from sanic import Sanic, response
+
+app = Sanic(__name__, strict_slashes=True)
+
+@app.get("/")
+async def handler(request):
+    return response.redirect("/sleep/3")
+
+@app.get("/sleep/<t:number>")
+async def handler2(request, t=0.3):
+    await sleep(t)
+    return response.text(f"Slept {t:.1f} seconds.\n")
+
+
+if __name__ == '__main__':
+    app.run(host="0.0.0.0", port=8000)

View File

@@ -1,3 +1,6 @@
+import os
+import sys
+
 from argparse import ArgumentParser
 from importlib import import_module
 from typing import Any, Dict, Optional
@@ -6,10 +9,11 @@ from sanic.app import Sanic
 from sanic.log import logger
 
-if __name__ == "__main__":
+
+def main():
     parser = ArgumentParser(prog="sanic")
     parser.add_argument("--host", dest="host", type=str, default="127.0.0.1")
     parser.add_argument("--port", dest="port", type=int, default=8000)
+    parser.add_argument("--unix", dest="unix", type=str, default="")
     parser.add_argument(
         "--cert", dest="cert", type=str, help="location of certificate for SSL"
     )
@@ -22,6 +26,10 @@ if __name__ == "__main__":
     args = parser.parse_args()
 
     try:
+        module_path = os.path.abspath(os.getcwd())
+        if module_path not in sys.path:
+            sys.path.append(module_path)
+
         module_parts = args.module.split(".")
         module_name = ".".join(module_parts[:-1])
         app_name = module_parts[-1]
@@ -46,6 +54,7 @@ if __name__ == "__main__":
         app.run(
             host=args.host,
             port=args.port,
+            unix=args.unix,
             workers=args.workers,
             debug=args.debug,
             ssl=ssl,
@@ -58,3 +67,7 @@ if __name__ == "__main__":
         )
     except ValueError:
         logger.exception("Failed to run app")
+
+
+if __name__ == "__main__":
+    main()

View File

@@ -1 +1 @@
-__version__ = "20.3.0"
+__version__ = "20.6.1"

View File

@@ -117,24 +117,12 @@ class Sanic:
:param task: future, couroutine or awaitable :param task: future, couroutine or awaitable
""" """
try: try:
if callable(task): loop = self.loop # Will raise SanicError if loop is not started
try: self._loop_add_task(task, self, loop)
self.loop.create_task(task(self))
except TypeError:
self.loop.create_task(task())
else:
self.loop.create_task(task)
except SanicException: except SanicException:
self.listener("before_server_start")(
@self.listener("before_server_start") partial(self._loop_add_task, task)
def run(app, loop): )
if callable(task):
try:
loop.create_task(task(self))
except TypeError:
loop.create_task(task())
else:
loop.create_task(task)
# Decorator # Decorator
def listener(self, event): def listener(self, event):
@@ -462,7 +450,13 @@ class Sanic:
# Decorator # Decorator
def websocket( def websocket(
self, uri, host=None, strict_slashes=None, subprotocols=None, name=None self,
uri,
host=None,
strict_slashes=None,
subprotocols=None,
version=None,
name=None,
): ):
""" """
Decorate a function to be registered as a websocket route Decorate a function to be registered as a websocket route
@@ -493,42 +487,12 @@ class Sanic:
routes, handler = handler routes, handler = handler
else: else:
routes = [] routes = []
websocket_handler = partial(
async def websocket_handler(request, *args, **kwargs): self._websocket_handler, handler, subprotocols=subprotocols
request.app = self )
if not getattr(handler, "__blueprintname__", False): websocket_handler.__name__ = (
request.endpoint = handler.__name__ "websocket_handler_" + handler.__name__
else: )
request.endpoint = (
getattr(handler, "__blueprintname__", "")
+ handler.__name__
)
pass
if self.asgi:
ws = request.transport.get_websocket_connection()
else:
protocol = request.transport.get_protocol()
protocol.app = self
ws = await protocol.websocket_handshake(
request, subprotocols
)
# schedule the application handler
# its future is kept in self.websocket_tasks in case it
# needs to be cancelled due to the server being stopped
fut = ensure_future(handler(request, ws, *args, **kwargs))
self.websocket_tasks.add(fut)
try:
await fut
except (CancelledError, ConnectionClosed):
pass
finally:
self.websocket_tasks.remove(fut)
await ws.close()
routes.extend( routes.extend(
self.router.add( self.router.add(
uri=uri, uri=uri,
@@ -536,6 +500,7 @@ class Sanic:
methods=frozenset({"GET"}), methods=frozenset({"GET"}),
host=host, host=host,
strict_slashes=strict_slashes, strict_slashes=strict_slashes,
version=version,
name=name, name=name,
) )
) )
@@ -550,6 +515,7 @@ class Sanic:
host=None, host=None,
strict_slashes=None, strict_slashes=None,
subprotocols=None, subprotocols=None,
version=None,
name=None, name=None,
): ):
""" """
@@ -577,6 +543,7 @@ class Sanic:
host=host, host=host,
strict_slashes=strict_slashes, strict_slashes=strict_slashes,
subprotocols=subprotocols, subprotocols=subprotocols,
version=version,
name=name, name=name,
)(handler) )(handler)
@@ -589,10 +556,7 @@ class Sanic:
if not self.websocket_enabled: if not self.websocket_enabled:
# if the server is stopped, we want to cancel any ongoing # if the server is stopped, we want to cancel any ongoing
# websocket tasks, to allow the server to exit promptly # websocket tasks, to allow the server to exit promptly
@self.listener("before_server_stop") self.listener("before_server_stop")(self._cancel_websocket_tasks)
def cancel_websocket_tasks(app, loop):
for task in self.websocket_tasks:
task.cancel()
self.websocket_enabled = enable self.websocket_enabled = enable
@@ -1058,16 +1022,19 @@ class Sanic:
self, self,
host: Optional[str] = None, host: Optional[str] = None,
port: Optional[int] = None, port: Optional[int] = None,
*,
debug: bool = False, debug: bool = False,
auto_reload: Optional[bool] = None,
ssl: Union[dict, SSLContext, None] = None, ssl: Union[dict, SSLContext, None] = None,
sock: Optional[socket] = None, sock: Optional[socket] = None,
workers: int = 1, workers: int = 1,
protocol: Type[Protocol] = None, protocol: Optional[Type[Protocol]] = None,
backlog: int = 100, backlog: int = 100,
stop_event: Any = None, stop_event: Any = None,
register_sys_signals: bool = True, register_sys_signals: bool = True,
access_log: Optional[bool] = None, access_log: Optional[bool] = None,
**kwargs: Any, unix: Optional[str] = None,
loop: None = None,
) -> None: ) -> None:
"""Run the HTTP Server and listen until keyboard interrupt or term """Run the HTTP Server and listen until keyboard interrupt or term
signal. On termination, drain connections before closing. signal. On termination, drain connections before closing.
@@ -1078,6 +1045,9 @@ class Sanic:
:type port: int :type port: int
:param debug: Enables debug output (slows server) :param debug: Enables debug output (slows server)
:type debug: bool :type debug: bool
:param auto_reload: Reload app whenever its source code is changed.
Enabled by default in debug mode.
:type auto_relaod: bool
:param ssl: SSLContext, or location of certificate and key :param ssl: SSLContext, or location of certificate and key
for SSL encryption of worker(s) for SSL encryption of worker(s)
:type ssl: SSLContext or dict :type ssl: SSLContext or dict
@@ -1097,9 +1067,11 @@ class Sanic:
:type register_sys_signals: bool :type register_sys_signals: bool
:param access_log: Enables writing access logs (slows server) :param access_log: Enables writing access logs (slows server)
:type access_log: bool :type access_log: bool
:param unix: Unix socket to listen on instead of TCP port
:type unix: str
:return: Nothing :return: Nothing
""" """
if "loop" in kwargs: if loop is not None:
raise TypeError( raise TypeError(
"loop is not a valid argument. To use an existing loop, " "loop is not a valid argument. To use an existing loop, "
"change to create_server().\nSee more: " "change to create_server().\nSee more: "
@@ -1107,13 +1079,9 @@ class Sanic:
"#asynchronous-support" "#asynchronous-support"
) )
# Default auto_reload to false if auto_reload or auto_reload is None and debug:
auto_reload = False if os.environ.get("SANIC_SERVER_RUNNING") != "true":
# If debug is set, default it to true (unless on windows) return reloader_helpers.watchdog(1.0)
if debug and os.name == "posix":
auto_reload = True
# Allow for overriding either of the defaults
auto_reload = kwargs.get("auto_reload", auto_reload)
if sock is None: if sock is None:
host, port = host or "127.0.0.1", port or 8000 host, port = host or "127.0.0.1", port or 8000
@@ -1139,6 +1107,7 @@ class Sanic:
debug=debug, debug=debug,
ssl=ssl, ssl=ssl,
sock=sock, sock=sock,
unix=unix,
workers=workers, workers=workers,
protocol=protocol, protocol=protocol,
backlog=backlog, backlog=backlog,
@@ -1156,18 +1125,7 @@ class Sanic:
) )
workers = 1 workers = 1
if workers == 1: if workers == 1:
if auto_reload and os.name != "posix": serve(**server_settings)
# This condition must be removed after implementing
# auto reloader for other operating systems.
raise NotImplementedError
if (
auto_reload
and os.environ.get("SANIC_SERVER_RUNNING") != "true"
):
reloader_helpers.watchdog(2)
else:
serve(**server_settings)
else: else:
serve_multiple(server_settings, workers) serve_multiple(server_settings, workers)
except BaseException: except BaseException:
@@ -1189,6 +1147,7 @@ class Sanic:
self, self,
host: Optional[str] = None, host: Optional[str] = None,
port: Optional[int] = None, port: Optional[int] = None,
*,
debug: bool = False, debug: bool = False,
ssl: Union[dict, SSLContext, None] = None, ssl: Union[dict, SSLContext, None] = None,
sock: Optional[socket] = None, sock: Optional[socket] = None,
@@ -1196,6 +1155,7 @@ class Sanic:
backlog: int = 100, backlog: int = 100,
stop_event: Any = None, stop_event: Any = None,
access_log: Optional[bool] = None, access_log: Optional[bool] = None,
unix: Optional[str] = None,
return_asyncio_server=False, return_asyncio_server=False,
asyncio_server_kwargs=None, asyncio_server_kwargs=None,
) -> Optional[AsyncioServer]: ) -> Optional[AsyncioServer]:
@@ -1265,6 +1225,7 @@ class Sanic:
debug=debug, debug=debug,
ssl=ssl, ssl=ssl,
sock=sock, sock=sock,
unix=unix,
loop=get_event_loop(), loop=get_event_loop(),
protocol=protocol, protocol=protocol,
backlog=backlog, backlog=backlog,
@@ -1330,6 +1291,7 @@ class Sanic:
debug=False, debug=False,
ssl=None, ssl=None,
sock=None, sock=None,
unix=None,
workers=1, workers=1,
loop=None, loop=None,
protocol=HttpProtocol, protocol=HttpProtocol,
@@ -1371,6 +1333,7 @@ class Sanic:
"host": host, "host": host,
"port": port, "port": port,
"sock": sock, "sock": sock,
"unix": unix,
"ssl": ssl, "ssl": ssl,
"app": self, "app": self,
"signal": Signal(), "signal": Signal(),
@@ -1413,11 +1376,14 @@ class Sanic:
server_settings["run_async"] = True server_settings["run_async"] = True
# Serve # Serve
if host and port and os.environ.get("SANIC_SERVER_RUNNING") != "true": if host and port:
proto = "http" proto = "http"
if ssl is not None: if ssl is not None:
proto = "https" proto = "https"
logger.info(f"Goin' Fast @ {proto}://{host}:{port}") if unix:
logger.info(f"Goin' Fast @ {unix} {proto}://...")
else:
logger.info(f"Goin' Fast @ {proto}://{host}:{port}")
return server_settings return server_settings
@@ -1425,6 +1391,55 @@ class Sanic:
parts = [self.name, *parts] parts = [self.name, *parts]
return ".".join(parts) return ".".join(parts)
@classmethod
def _loop_add_task(cls, task, app, loop):
if callable(task):
try:
loop.create_task(task(app))
except TypeError:
loop.create_task(task())
else:
loop.create_task(task)
@classmethod
def _cancel_websocket_tasks(cls, app, loop):
for task in app.websocket_tasks:
task.cancel()
async def _websocket_handler(
self, handler, request, *args, subprotocols=None, **kwargs
):
request.app = self
if not getattr(handler, "__blueprintname__", False):
request.endpoint = handler.__name__
else:
request.endpoint = (
getattr(handler, "__blueprintname__", "") + handler.__name__
)
pass
if self.asgi:
ws = request.transport.get_websocket_connection()
else:
protocol = request.transport.get_protocol()
protocol.app = self
ws = await protocol.websocket_handshake(request, subprotocols)
# schedule the application handler
# its future is kept in self.websocket_tasks in case it
# needs to be cancelled due to the server being stopped
fut = ensure_future(handler(request, ws, *args, **kwargs))
self.websocket_tasks.add(fut)
try:
await fut
except (CancelledError, ConnectionClosed):
pass
finally:
self.websocket_tasks.remove(fut)
await ws.close()
# -------------------------------------------------------------------- # # -------------------------------------------------------------------- #
# ASGI # ASGI
# -------------------------------------------------------------------- # # -------------------------------------------------------------------- #
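The add_task change above defers scheduling to a before_server_start listener when the loop is not yet running; a minimal usage sketch (the task body is illustrative):

    from asyncio import sleep

    from sanic import Sanic, response

    app = Sanic(__name__)

    async def notify(app):
        await sleep(1)
        print(f"{app.name} is up")

    # add_task accepts a coroutine function (called with the app), a coroutine, or a future;
    # added before the loop starts, it is scheduled via a before_server_start listener.
    app.add_task(notify)

    @app.get("/")
    async def handler(request):
        return response.text("hello\n")

    app.run(host="127.0.0.1", port=8000)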

View File

@@ -22,7 +22,7 @@ from sanic.exceptions import InvalidUsage, ServerError
 from sanic.log import logger
 from sanic.request import Request
 from sanic.response import HTTPResponse, StreamingHTTPResponse
-from sanic.server import StreamBuffer
+from sanic.server import ConnInfo, StreamBuffer
 from sanic.websocket import WebSocketConnection
@@ -255,6 +255,7 @@ class ASGIApp:
             instance.transport,
             sanic_app,
         )
+        instance.request.conn_info = ConnInfo(instance.transport)
 
         if sanic_app.is_request_stream:
             is_stream_handler = sanic_app.router.is_stream_handler(

View File

@@ -143,7 +143,7 @@ class Blueprint:
             if _routes:
                 routes += _routes
 
-        route_names = [route.name for route in routes]
+        route_names = [route.name for route in routes if route]
 
         # Middleware
         for future in self.middlewares:
             if future.args or future.kwargs:
@@ -283,6 +283,13 @@ class Blueprint:
             strict_slashes = self.strict_slashes
 
         def decorator(handler):
+            nonlocal uri
+            nonlocal host
+            nonlocal strict_slashes
+            nonlocal version
+            nonlocal name
+
+            name = f"{self.name}.{name or handler.__name__}"
             route = FutureRoute(
                 handler, uri, [], host, strict_slashes, False, version, name
             )

View File

@@ -3,7 +3,6 @@ import signal
import subprocess import subprocess
import sys import sys
from multiprocessing import Process
from time import sleep from time import sleep
@@ -35,101 +34,26 @@ def _iter_module_files():
def _get_args_for_reloading(): def _get_args_for_reloading():
"""Returns the executable.""" """Returns the executable."""
rv = [sys.executable]
main_module = sys.modules["__main__"] main_module = sys.modules["__main__"]
mod_spec = getattr(main_module, "__spec__", None) mod_spec = getattr(main_module, "__spec__", None)
if sys.argv[0] in ("", "-c"):
raise RuntimeError(
f"Autoreloader cannot work with argv[0]={sys.argv[0]!r}"
)
if mod_spec: if mod_spec:
# Parent exe was launched as a module rather than a script # Parent exe was launched as a module rather than a script
rv.extend(["-m", mod_spec.name]) return [sys.executable, "-m", mod_spec.name] + sys.argv[1:]
if len(sys.argv) > 1: return [sys.executable] + sys.argv
rv.extend(sys.argv[1:])
else:
rv.extend(sys.argv)
return rv
def restart_with_reloader(): def restart_with_reloader():
"""Create a new process and a subprocess in it with the same arguments as """Create a new process and a subprocess in it with the same arguments as
this one. this one.
""" """
cwd = os.getcwd() return subprocess.Popen(
args = _get_args_for_reloading() _get_args_for_reloading(),
new_environ = os.environ.copy() env={**os.environ, "SANIC_SERVER_RUNNING": "true"},
new_environ["SANIC_SERVER_RUNNING"] = "true"
cmd = " ".join(args)
worker_process = Process(
target=subprocess.call,
args=(cmd,),
kwargs={"cwd": cwd, "shell": True, "env": new_environ},
) )
worker_process.start()
return worker_process
def kill_process_children_unix(pid):
"""Find and kill child processes of a process (maximum two level).
:param pid: PID of parent process (process ID)
:return: Nothing
"""
root_process_path = f"/proc/{pid}/task/{pid}/children"
if not os.path.isfile(root_process_path):
return
with open(root_process_path) as children_list_file:
children_list_pid = children_list_file.read().split()
for child_pid in children_list_pid:
children_proc_path = "/proc/%s/task/%s/children" % (
child_pid,
child_pid,
)
if not os.path.isfile(children_proc_path):
continue
with open(children_proc_path) as children_list_file_2:
children_list_pid_2 = children_list_file_2.read().split()
for _pid in children_list_pid_2:
try:
os.kill(int(_pid), signal.SIGTERM)
except ProcessLookupError:
continue
try:
os.kill(int(child_pid), signal.SIGTERM)
except ProcessLookupError:
continue
def kill_process_children_osx(pid):
"""Find and kill child processes of a process.
:param pid: PID of parent process (process ID)
:return: Nothing
"""
subprocess.run(["pkill", "-P", str(pid)])
def kill_process_children(pid):
"""Find and kill child processes of a process.
:param pid: PID of parent process (process ID)
:return: Nothing
"""
if sys.platform == "darwin":
kill_process_children_osx(pid)
elif sys.platform == "linux":
kill_process_children_unix(pid)
else:
pass # should signal error here
def kill_program_completly(proc):
"""Kill worker and it's child processes and exit.
:param proc: worker process (process ID)
:return: Nothing
"""
kill_process_children(proc.pid)
proc.terminate()
os._exit(0)
def watchdog(sleep_interval): def watchdog(sleep_interval):
@@ -138,30 +62,42 @@ def watchdog(sleep_interval):
:param sleep_interval: interval in second. :param sleep_interval: interval in second.
:return: Nothing :return: Nothing
""" """
def interrupt_self(*args):
raise KeyboardInterrupt
mtimes = {} mtimes = {}
signal.signal(signal.SIGTERM, interrupt_self)
if os.name == "nt":
signal.signal(signal.SIGBREAK, interrupt_self)
worker_process = restart_with_reloader() worker_process = restart_with_reloader()
signal.signal(
signal.SIGTERM, lambda *args: kill_program_completly(worker_process)
)
signal.signal(
signal.SIGINT, lambda *args: kill_program_completly(worker_process)
)
while True:
for filename in _iter_module_files():
try:
mtime = os.stat(filename).st_mtime
except OSError:
continue
old_time = mtimes.get(filename) try:
if old_time is None: while True:
mtimes[filename] = mtime need_reload = False
continue
elif mtime > old_time: for filename in _iter_module_files():
kill_process_children(worker_process.pid) try:
mtime = os.stat(filename).st_mtime
except OSError:
continue
old_time = mtimes.get(filename)
if old_time is None:
mtimes[filename] = mtime
elif mtime > old_time:
mtimes[filename] = mtime
need_reload = True
if need_reload:
worker_process.terminate() worker_process.terminate()
worker_process.wait()
worker_process = restart_with_reloader() worker_process = restart_with_reloader()
mtimes[filename] = mtime
break
sleep(sleep_interval) sleep(sleep_interval)
except KeyboardInterrupt:
pass
finally:
worker_process.terminate()
worker_process.wait()
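The watchdog above is started from app.run() when auto_reload is enabled (per the app.py diff it now defaults to on in debug mode); a minimal sketch of turning it on:

    from sanic import Sanic

    app = Sanic(__name__)

    if __name__ == "__main__":
        # auto_reload defaults to True when debug=True; it can also be passed explicitly
        app.run(host="127.0.0.1", port=8000, debug=True, auto_reload=True)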

View File

@@ -87,6 +87,7 @@ class Request:
"_socket", "_socket",
"app", "app",
"body", "body",
"conn_info",
"ctx", "ctx",
"endpoint", "endpoint",
"headers", "headers",
@@ -117,6 +118,7 @@ class Request:
# Init but do not inhale # Init but do not inhale
self.body_init() self.body_init()
self.conn_info = None
self.ctx = SimpleNamespace() self.ctx = SimpleNamespace()
self.parsed_forwarded = None self.parsed_forwarded = None
self.parsed_json = None self.parsed_json = None
@@ -349,56 +351,55 @@ class Request:
self._cookies = {} self._cookies = {}
return self._cookies return self._cookies
@property
def content_type(self):
return self.headers.get("Content-Type", DEFAULT_HTTP_CONTENT_TYPE)
@property
def match_info(self):
"""return matched info after resolving route"""
return self.app.router.get(self)[2]
# Transport properties (obtained from local interface only)
@property @property
def ip(self): def ip(self):
""" """
:return: peer ip of the socket :return: peer ip of the socket
""" """
if not hasattr(self, "_socket"): return self.conn_info.client if self.conn_info else ""
self._get_address()
return self._ip
@property @property
def port(self): def port(self):
""" """
:return: peer port of the socket :return: peer port of the socket
""" """
if not hasattr(self, "_socket"): return self.conn_info.client_port if self.conn_info else 0
self._get_address()
return self._port
@property @property
def socket(self): def socket(self):
if not hasattr(self, "_socket"): return self.conn_info.peername if self.conn_info else (None, None)
self._get_address()
return self._socket
def _get_address(self):
self._socket = self.transport.get_extra_info("peername") or (
None,
None,
)
self._ip = self._socket[0]
self._port = self._socket[1]
@property @property
def server_name(self): def path(self) -> str:
""" """Path of the local HTTP request."""
Attempt to get the server's external hostname in this order: return self._parsed_url.path.decode("utf-8")
`config.SERVER_NAME`, proxied or direct Host headers
:func:`Request.host`
:return: the server name without port number # Proxy properties (using SERVER_NAME/forwarded/request/transport info)
:rtype: str
"""
server_name = self.app.config.get("SERVER_NAME")
if server_name:
host = server_name.split("//", 1)[-1].split("/", 1)[0]
return parse_host(host)[0]
return parse_host(self.host)[0]
@property @property
def forwarded(self): def forwarded(self):
"""
Active proxy information obtained from request headers, as specified in
Sanic configuration.
Field names by, for, proto, host, port and path are normalized.
- for and by IPv6 addresses are bracketed
- port (int) is only set by port headers, not from host.
- path is url-unencoded
Additional values may be available from new style Forwarded headers.
"""
if self.parsed_forwarded is None: if self.parsed_forwarded is None:
self.parsed_forwarded = ( self.parsed_forwarded = (
parse_forwarded(self.headers, self.app.config) parse_forwarded(self.headers, self.app.config)
@@ -408,50 +409,30 @@ class Request:
return self.parsed_forwarded return self.parsed_forwarded
@property @property
def server_port(self): def remote_addr(self) -> str:
""" """
Attempt to get the server's external port number in this order: Client IP address, if available.
`config.SERVER_NAME`, proxied or direct Host headers 1. proxied remote address `self.forwarded['for']`
:func:`Request.host`, 2. local remote address `self.ip`
actual port used by the transport layer socket. :return: IPv4, bracketed IPv6, UNIX socket name or arbitrary string
:return: server port
:rtype: int
"""
if self.forwarded:
return self.forwarded.get("port") or (
80 if self.scheme in ("http", "ws") else 443
)
return (
parse_host(self.host)[1]
or self.transport.get_extra_info("sockname")[1]
)
@property
def remote_addr(self):
"""Attempt to return the original client ip based on `forwarded`,
`x-forwarded-for` or `x-real-ip`. If HTTP headers are unavailable or
untrusted, returns an empty string.
:return: original client ip.
""" """
if not hasattr(self, "_remote_addr"): if not hasattr(self, "_remote_addr"):
self._remote_addr = self.forwarded.get("for", "") self._remote_addr = self.forwarded.get("for", "") # or self.ip
return self._remote_addr return self._remote_addr
@property @property
def scheme(self): def scheme(self) -> str:
""" """
Attempt to get the request scheme. Determine request scheme.
Seeking the value in this order: 1. `config.SERVER_NAME` if in full URL format
`forwarded` header, `x-forwarded-proto` header, 2. proxied proto/scheme
`x-scheme` header, the sanic app itself. 3. local connection protocol
:return: http|https|ws|wss or arbitrary value given by the headers. :return: http|https|ws|wss or arbitrary value given by the headers.
:rtype: str
""" """
forwarded_proto = self.forwarded.get("proto") if "//" in self.app.config.get("SERVER_NAME", ""):
if forwarded_proto: return self.app.config.SERVER_NAME.split("//")[0]
return forwarded_proto if "proto" in self.forwarded:
return self.forwarded["proto"]
if ( if (
self.app.websocket_enabled self.app.websocket_enabled
@@ -467,25 +448,41 @@ class Request:
return scheme return scheme
@property @property
def host(self): def host(self) -> str:
""" """
:return: proxied or direct Host header. Hostname and port number may be The currently effective server 'host' (hostname or hostname:port).
separated by sanic.headers.parse_host(request.host). 1. `config.SERVER_NAME` overrides any client headers
2. proxied host of original request
3. request host header
hostname and port may be separated by
`sanic.headers.parse_host(request.host)`.
:return: the first matching host found, or empty string
""" """
return self.forwarded.get("host", self.headers.get("Host", "")) server_name = self.app.config.get("SERVER_NAME")
if server_name:
return server_name.split("//", 1)[-1].split("/", 1)[0]
return self.forwarded.get("host") or self.headers.get("host", "")
@property @property
def content_type(self): def server_name(self) -> str:
return self.headers.get("Content-Type", DEFAULT_HTTP_CONTENT_TYPE) """The hostname the client connected to, by `request.host`."""
return parse_host(self.host)[0] or ""
@property @property
def match_info(self): def server_port(self) -> int:
"""return matched info after resolving route""" """
return self.app.router.get(self)[2] The port the client connected to, by forwarded `port` or
`request.host`.
Default port is returned as 80 and 443 based on `request.scheme`.
"""
port = self.forwarded.get("port") or parse_host(self.host)[1]
return port or (80 if self.scheme in ("http", "ws") else 443)
@property @property
def path(self): def server_path(self) -> str:
return self._parsed_url.path.decode("utf-8") """Full path of current URL. Uses proxied or local path."""
return self.forwarded.get("path") or self.path
@property @property
def query_string(self): def query_string(self):
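As the host docstring above notes, hostname and port can be split with sanic.headers.parse_host; a small sketch of its (hostname, port) return value, assuming the helper behaves as referenced here:

    from sanic.headers import parse_host

    # parse_host returns a (hostname, port) tuple; port is None when absent
    assert parse_host("example.com:8080") == ("example.com", 8080)
    assert parse_host("example.com") == ("example.com", None)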

View File

@@ -149,6 +149,12 @@ class HTTPResponse(BaseHTTPResponse):
         self.headers = Header(headers or {})
         self._cookies = None
 
+        if body_bytes:
+            warnings.warn(
+                "Parameter `body_bytes` is deprecated, use `body` instead",
+                DeprecationWarning,
+            )
+
     def output(self, version="1.1", keep_alive=False, keep_alive_timeout=None):
         body = b""
         if has_message_body(self.status):
@@ -173,7 +179,7 @@
     :param status Response code.
     :param headers Custom Headers.
     """
-    return HTTPResponse(body_bytes=b"", status=status, headers=headers)
+    return HTTPResponse(body=b"", status=status, headers=headers)
 
 
 def json(
@@ -243,10 +249,7 @@
     :param content_type: the content type (string) of the response.
     """
     return HTTPResponse(
-        body_bytes=body,
-        status=status,
-        headers=headers,
-        content_type=content_type,
+        body=body, status=status, headers=headers, content_type=content_type,
     )
@@ -306,10 +309,10 @@
     mime_type = mime_type or guess_type(filename)[0] or "text/plain"
 
     return HTTPResponse(
+        body=out_stream,
         status=status,
         headers=headers,
         content_type=mime_type,
-        body_bytes=out_stream,
     )

View File

@@ -1,15 +1,18 @@
import asyncio import asyncio
import multiprocessing import multiprocessing
import os import os
import secrets
import socket
import stat
import sys import sys
import traceback import traceback
from collections import deque from collections import deque
from functools import partial from functools import partial
from inspect import isawaitable from inspect import isawaitable
from ipaddress import ip_address
from signal import SIG_IGN, SIGINT, SIGTERM, Signals from signal import SIG_IGN, SIGINT, SIGTERM, Signals
from signal import signal as signal_func from signal import signal as signal_func
from socket import SO_REUSEADDR, SOL_SOCKET, socket
from time import time from time import time
from httptools import HttpRequestParser # type: ignore from httptools import HttpRequestParser # type: ignore
@@ -44,6 +47,41 @@ class Signal:
stopped = False stopped = False
class ConnInfo:
"""Local and remote addresses and SSL status info."""
__slots__ = (
"sockname",
"peername",
"server",
"server_port",
"client",
"client_port",
"ssl",
)
def __init__(self, transport, unix=None):
self.ssl = bool(transport.get_extra_info("sslcontext"))
self.server = self.client = ""
self.server_port = self.client_port = 0
self.peername = None
self.sockname = addr = transport.get_extra_info("sockname")
if isinstance(addr, str): # UNIX socket
self.server = unix or addr
return
# IPv4 (ip, port) or IPv6 (ip, port, flowinfo, scopeid)
if isinstance(addr, tuple):
self.server = addr[0] if len(addr) == 2 else f"[{addr[0]}]"
self.server_port = addr[1]
# self.server gets non-standard port appended
if addr[1] != (443 if self.ssl else 80):
self.server = f"{self.server}:{addr[1]}"
self.peername = addr = transport.get_extra_info("peername")
if isinstance(addr, tuple):
self.client = addr[0] if len(addr) == 2 else f"[{addr[0]}]"
self.client_port = addr[1]
class HttpProtocol(asyncio.Protocol): class HttpProtocol(asyncio.Protocol):
""" """
This class provides a basic HTTP implementation of the sanic framework. This class provides a basic HTTP implementation of the sanic framework.
@@ -57,6 +95,7 @@ class HttpProtocol(asyncio.Protocol):
"transport", "transport",
"connections", "connections",
"signal", "signal",
"conn_info",
# request params # request params
"parser", "parser",
"request", "request",
@@ -88,6 +127,7 @@ class HttpProtocol(asyncio.Protocol):
"_keep_alive", "_keep_alive",
"_header_fragment", "_header_fragment",
"state", "state",
"_unix",
"_body_chunks", "_body_chunks",
) )
@@ -99,6 +139,7 @@ class HttpProtocol(asyncio.Protocol):
signal=Signal(), signal=Signal(),
connections=None, connections=None,
state=None, state=None,
unix=None,
**kwargs, **kwargs,
): ):
asyncio.set_event_loop(loop) asyncio.set_event_loop(loop)
@@ -106,6 +147,7 @@ class HttpProtocol(asyncio.Protocol):
deprecated_loop = self.loop if sys.version_info < (3, 7) else None deprecated_loop = self.loop if sys.version_info < (3, 7) else None
self.app = app self.app = app
self.transport = None self.transport = None
self.conn_info = None
self.request = None self.request = None
self.parser = None self.parser = None
self.url = None self.url = None
@@ -139,6 +181,7 @@ class HttpProtocol(asyncio.Protocol):
self.state = state if state else {} self.state = state if state else {}
if "requests_count" not in self.state: if "requests_count" not in self.state:
self.state["requests_count"] = 0 self.state["requests_count"] = 0
self._unix = unix
self._not_paused.set() self._not_paused.set()
self._body_chunks = deque() self._body_chunks = deque()
@@ -167,6 +210,7 @@ class HttpProtocol(asyncio.Protocol):
self.request_timeout, self.request_timeout_callback self.request_timeout, self.request_timeout_callback
) )
self.transport = transport self.transport = transport
self.conn_info = ConnInfo(transport, unix=self._unix)
self._last_request_time = time() self._last_request_time = time()
def connection_lost(self, exc): def connection_lost(self, exc):
@@ -304,6 +348,7 @@ class HttpProtocol(asyncio.Protocol):
transport=self.transport, transport=self.transport,
app=self.app, app=self.app,
) )
self.request.conn_info = self.conn_info
# Remove any existing KeepAlive handler here, # Remove any existing KeepAlive handler here,
# It will be recreated if required on the new request. # It will be recreated if required on the new request.
if self._keep_alive_timeout_handler: if self._keep_alive_timeout_handler:
@@ -750,6 +795,7 @@ def serve(
after_stop=None, after_stop=None,
ssl=None, ssl=None,
sock=None, sock=None,
unix=None,
reuse_port=False, reuse_port=False,
loop=None, loop=None,
protocol=HttpProtocol, protocol=HttpProtocol,
@@ -778,6 +824,7 @@ def serve(
`app` instance and `loop` `app` instance and `loop`
:param ssl: SSLContext :param ssl: SSLContext
:param sock: Socket for the server to accept connections from :param sock: Socket for the server to accept connections from
:param unix: Unix socket to listen on instead of TCP port
:param reuse_port: `True` for multiple workers :param reuse_port: `True` for multiple workers
:param loop: asyncio compatible event loop :param loop: asyncio compatible event loop
:param run_async: bool: Do not create a new event loop for the server, :param run_async: bool: Do not create a new event loop for the server,
@@ -804,14 +851,18 @@ def serve(
signal=signal, signal=signal,
app=app, app=app,
state=state, state=state,
unix=unix,
) )
asyncio_server_kwargs = ( asyncio_server_kwargs = (
asyncio_server_kwargs if asyncio_server_kwargs else {} asyncio_server_kwargs if asyncio_server_kwargs else {}
) )
# UNIX sockets are always bound by us (to preserve semantics between modes)
if unix:
sock = bind_unix_socket(unix, backlog=backlog)
server_coroutine = loop.create_server( server_coroutine = loop.create_server(
server, server,
host, None if sock else host,
port, None if sock else port,
ssl=ssl, ssl=ssl,
reuse_port=reuse_port, reuse_port=reuse_port,
sock=sock, sock=sock,
@@ -894,6 +945,85 @@ def serve(
trigger_events(after_stop, loop) trigger_events(after_stop, loop)
loop.close() loop.close()
remove_unix_socket(unix)
def bind_socket(host: str, port: int, *, backlog=100) -> socket.socket:
"""Create TCP server socket.
:param host: IPv4, IPv6 or hostname may be specified
:param port: TCP port number
:param backlog: Maximum number of connections to queue
:return: socket.socket object
"""
try: # IP address: family must be specified for IPv6 at least
ip = ip_address(host)
host = str(ip)
sock = socket.socket(
socket.AF_INET6 if ip.version == 6 else socket.AF_INET
)
except ValueError: # Hostname, may become AF_INET or AF_INET6
sock = socket.socket()
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind((host, port))
sock.listen(backlog)
return sock
def bind_unix_socket(path: str, *, mode=0o666, backlog=100) -> socket.socket:
"""Create unix socket.
:param path: filesystem path
:param backlog: Maximum number of connections to queue
:return: socket.socket object
"""
"""Open or atomically replace existing socket with zero downtime."""
# Sanitise and pre-verify socket path
path = os.path.abspath(path)
folder = os.path.dirname(path)
if not os.path.isdir(folder):
raise FileNotFoundError(f"Socket folder does not exist: {folder}")
try:
if not stat.S_ISSOCK(os.stat(path, follow_symlinks=False).st_mode):
raise FileExistsError(f"Existing file is not a socket: {path}")
except FileNotFoundError:
pass
# Create new socket with a random temporary name
tmp_path = f"{path}.{secrets.token_urlsafe()}"
sock = socket.socket(socket.AF_UNIX)
try:
# Critical section begins (filename races)
sock.bind(tmp_path)
try:
os.chmod(tmp_path, mode)
# Start listening before rename to avoid connection failures
sock.listen(backlog)
os.rename(tmp_path, path)
except: # noqa: E722
try:
os.unlink(tmp_path)
finally:
raise
except: # noqa: E722
try:
sock.close()
finally:
raise
return sock
def remove_unix_socket(path: str) -> None:
"""Remove dead unix socket during server exit."""
if not path:
return
try:
if stat.S_ISSOCK(os.stat(path, follow_symlinks=False).st_mode):
# Is it actually dead (doesn't belong to a new server instance)?
with socket.socket(socket.AF_UNIX) as testsock:
try:
testsock.connect(path)
except ConnectionRefusedError:
os.unlink(path)
except FileNotFoundError:
pass
def serve_multiple(server_settings, workers): def serve_multiple(server_settings, workers):
@@ -908,11 +1038,17 @@ def serve_multiple(server_settings, workers):
server_settings["reuse_port"] = True server_settings["reuse_port"] = True
server_settings["run_multiple"] = True server_settings["run_multiple"] = True
# Handling when custom socket is not provided. # Create a listening socket or use the one in settings
if server_settings.get("sock") is None: sock = server_settings.get("sock")
sock = socket() unix = server_settings["unix"]
sock.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1) backlog = server_settings["backlog"]
sock.bind((server_settings["host"], server_settings["port"])) if unix:
sock = bind_unix_socket(unix, backlog=backlog)
server_settings["unix"] = unix
if sock is None:
sock = bind_socket(
server_settings["host"], server_settings["port"], backlog=backlog
)
sock.set_inheritable(True) sock.set_inheritable(True)
server_settings["sock"] = sock server_settings["sock"] = sock
server_settings["host"] = None server_settings["host"] = None
@@ -927,7 +1063,7 @@ def serve_multiple(server_settings, workers):
signal_func(SIGINT, lambda s, f: sig_handler(s, f)) signal_func(SIGINT, lambda s, f: sig_handler(s, f))
signal_func(SIGTERM, lambda s, f: sig_handler(s, f)) signal_func(SIGTERM, lambda s, f: sig_handler(s, f))
mp = multiprocessing.get_context("fork") mp = multiprocessing.get_context("spawn")
for _ in range(workers): for _ in range(workers):
process = mp.Process(target=serve, kwargs=server_settings) process = mp.Process(target=serve, kwargs=server_settings)
@@ -941,4 +1077,6 @@ def serve_multiple(server_settings, workers):
# the above processes will block this until they're stopped # the above processes will block this until they're stopped
for process in processes: for process in processes:
process.terminate() process.terminate()
server_settings.get("sock").close()
sock.close()
remove_unix_socket(unix)
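A small standalone sketch of the new socket helpers above (assumes this branch's sanic.server is importable; the paths are illustrative):

    import socket

    from sanic.server import bind_socket, bind_unix_socket, remove_unix_socket

    # TCP: bind an ephemeral port on loopback
    tcp = bind_socket("127.0.0.1", 0, backlog=100)
    print("listening on", tcp.getsockname())
    tcp.close()

    # UNIX: bind, connect once, then clean up the dead socket file
    path = "/tmp/demo.sock"
    srv = bind_unix_socket(path, backlog=100)
    with socket.socket(socket.AF_UNIX) as client:
        client.connect(path)
    srv.close()
    remove_unix_socket(path)  # unlinks only if no live server still owns the path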

View File

@@ -1,3 +1,4 @@
from functools import partial, wraps
from mimetypes import guess_type from mimetypes import guess_type
from os import path from os import path
from re import sub from re import sub
@@ -15,6 +16,89 @@ from sanic.handlers import ContentRangeHandler
from sanic.response import HTTPResponse, file, file_stream from sanic.response import HTTPResponse, file, file_stream
async def _static_request_handler(
file_or_directory,
use_modified_since,
use_content_range,
stream_large_files,
request,
content_type=None,
file_uri=None,
):
# Using this to determine if the URL is trying to break out of the path
# served. os.path.realpath seems to be very slow
if file_uri and "../" in file_uri:
raise InvalidUsage("Invalid URL")
# Merge served directory and requested file if provided
# Strip all / that in the beginning of the URL to help prevent python
# from herping a derp and treating the uri as an absolute path
root_path = file_path = file_or_directory
if file_uri:
file_path = path.join(file_or_directory, sub("^[/]*", "", file_uri))
# URL decode the path sent by the browser otherwise we won't be able to
# match filenames which got encoded (filenames with spaces etc)
file_path = path.abspath(unquote(file_path))
if not file_path.startswith(path.abspath(unquote(root_path))):
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
try:
headers = {}
# Check if the client has been sent this file before
# and it has not been modified since
stats = None
if use_modified_since:
stats = await stat_async(file_path)
modified_since = strftime(
"%a, %d %b %Y %H:%M:%S GMT", gmtime(stats.st_mtime)
)
if request.headers.get("If-Modified-Since") == modified_since:
return HTTPResponse(status=304)
headers["Last-Modified"] = modified_since
_range = None
if use_content_range:
_range = None
if not stats:
stats = await stat_async(file_path)
headers["Accept-Ranges"] = "bytes"
headers["Content-Length"] = str(stats.st_size)
if request.method != "HEAD":
try:
_range = ContentRangeHandler(request, stats)
except HeaderNotFound:
pass
else:
del headers["Content-Length"]
for key, value in _range.headers.items():
headers[key] = value
headers["Content-Type"] = (
content_type or guess_type(file_path)[0] or "text/plain"
)
if request.method == "HEAD":
return HTTPResponse(headers=headers)
else:
if stream_large_files:
if type(stream_large_files) == int:
threshold = stream_large_files
else:
threshold = 1024 * 1024
if not stats:
stats = await stat_async(file_path)
if stats.st_size >= threshold:
return await file_stream(
file_path, headers=headers, _range=_range
)
return await file(file_path, headers=headers, _range=_range)
except ContentRangeError:
raise
except Exception:
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
def register(
app,
uri,
@@ -56,86 +140,21 @@ def register(
if not path.isfile(file_or_directory):
uri += "<file_uri:" + pattern + ">"
async def _handler(request, file_uri=None):
# Using this to determine if the URL is trying to break out of the path
# served. os.path.realpath seems to be very slow
if file_uri and "../" in file_uri:
raise InvalidUsage("Invalid URL")
# Merge served directory and requested file if provided
# Strip all / that in the beginning of the URL to help prevent python
# from herping a derp and treating the uri as an absolute path
root_path = file_path = file_or_directory
if file_uri:
file_path = path.join(
file_or_directory, sub("^[/]*", "", file_uri)
)
# URL decode the path sent by the browser otherwise we won't be able to
# match filenames which got encoded (filenames with spaces etc)
file_path = path.abspath(unquote(file_path))
if not file_path.startswith(path.abspath(unquote(root_path))):
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
try:
headers = {}
# Check if the client has been sent this file before
# and it has not been modified since
stats = None
if use_modified_since:
stats = await stat_async(file_path)
modified_since = strftime(
"%a, %d %b %Y %H:%M:%S GMT", gmtime(stats.st_mtime)
)
if request.headers.get("If-Modified-Since") == modified_since:
return HTTPResponse(status=304)
headers["Last-Modified"] = modified_since
_range = None
if use_content_range:
_range = None
if not stats:
stats = await stat_async(file_path)
headers["Accept-Ranges"] = "bytes"
headers["Content-Length"] = str(stats.st_size)
if request.method != "HEAD":
try:
_range = ContentRangeHandler(request, stats)
except HeaderNotFound:
pass
else:
del headers["Content-Length"]
for key, value in _range.headers.items():
headers[key] = value
headers["Content-Type"] = (
content_type or guess_type(file_path)[0] or "text/plain"
)
if request.method == "HEAD":
return HTTPResponse(headers=headers)
else:
if stream_large_files:
if type(stream_large_files) == int:
threshold = stream_large_files
else:
threshold = 1024 * 1024
if not stats:
stats = await stat_async(file_path)
if stats.st_size >= threshold:
return await file_stream(
file_path, headers=headers, _range=_range
)
return await file(file_path, headers=headers, _range=_range)
except ContentRangeError:
raise
except Exception:
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
# special prefix for static files
if not name.startswith("_static_"):
name = f"_static_{name}"
_handler = wraps(_static_request_handler)(
partial(
_static_request_handler,
file_or_directory,
use_modified_since,
use_content_range,
stream_large_files,
content_type=content_type,
)
)
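The handler is now built from the module-level _static_request_handler via functools.partial plus wraps, instead of a closure defined inside register, which is what makes apps with static routes picklable (see the new test_pickle_app_with_static below). An illustrative contrast, independent of Sanic:

    import pickle
    from functools import partial, wraps

    def module_level_handler(prefix, name):
        return f"{prefix}:{name}"

    # A partial of an importable function pickles by reference and round-trips cleanly.
    bound = wraps(module_level_handler)(partial(module_level_handler, "static"))
    restored = pickle.loads(pickle.dumps(bound))
    assert restored("style.css") == "static:style.css"

    def make_closure(prefix):
        def handler(name):
            return f"{prefix}:{name}"
        return handler

    try:
        pickle.dumps(make_closure("static"))  # local objects cannot be pickled by reference
    except (pickle.PicklingError, AttributeError) as exc:
        print("closure is not picklable:", exc)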
app.route(
uri,
methods=["GET", "HEAD"],


@@ -113,7 +113,7 @@ class WebSocketProtocol(HttpProtocol):
# hook up the websocket protocol
self.websocket = WebSocketCommonProtocol(
- timeout=self.websocket_timeout,
+ close_timeout=self.websocket_timeout,
max_size=self.websocket_max_size,
max_queue=self.websocket_max_queue,
read_limit=self.websocket_read_limit,


@@ -5,7 +5,6 @@ import codecs
import os
import re
import sys
from distutils.util import strtobool
from setuptools import setup
@@ -39,9 +38,7 @@ def open_local(paths, mode="r", encoding="utf8"):
with open_local(["sanic", "__version__.py"], encoding="latin1") as fp:
try:
- version = re.findall(
- r"^__version__ = \"([^']+)\"\r?$", fp.read(), re.M
- )[0]
+ version = re.findall(r"^__version__ = \"([^']+)\"\r?$", fp.read(), re.M)[0]
except IndexError:
raise RuntimeError("Unable to determine version.")
@@ -70,11 +67,10 @@ setup_kwargs = {
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
],
"entry_points": {"console_scripts": ["sanic = sanic.__main__:main"]},
}
- env_dependency = (
- '; sys_platform != "win32" ' 'and implementation_name == "cpython"'
- )
+ env_dependency = '; sys_platform != "win32" ' 'and implementation_name == "cpython"'
ujson = "ujson>=1.35" + env_dependency
uvloop = "uvloop>=0.5.3" + env_dependency
@@ -83,7 +79,7 @@ requirements = [
uvloop,
ujson,
"aiofiles>=0.3.0",
- "websockets>=7.0,<9.0",
+ "websockets>=8.1,<9.0",
"multidict>=4.0,<5.0",
"httpx==0.11.1",
]
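The env_dependency string above is a PEP 508 environment marker, so ujson and uvloop are only required on CPython outside Windows. A quick way to sanity-check such a requirement string (assuming the packaging library, which setuptools depends on, is available):

    from packaging.requirements import Requirement

    req = Requirement(
        'ujson>=1.35; sys_platform != "win32" and implementation_name == "cpython"'
    )
    print(req.name, req.specifier)  # ujson >=1.35
    print(req.marker.evaluate())    # True only on non-Windows CPython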


@@ -56,6 +56,7 @@ def test_asyncio_server_no_start_serving(app):
srv = loop.run_until_complete(asyncio_srv_coro)
assert srv.is_serving() is False
@pytest.mark.skipif(
sys.version_info < (3, 7), reason="requires python3.7 or higher"
)
@@ -75,6 +76,7 @@ def test_asyncio_server_start_serving(app):
loop.run_until_complete(wait_close)
# Looks like we can't easily test `serve_forever()`
def test_app_loop_not_running(app):
with pytest.raises(SanicException) as excinfo:
app.loop
@@ -125,7 +127,10 @@ def test_app_handle_request_handler_is_none(app, monkeypatch):
request, response = app.test_client.get("/test")
- assert "'None' was returned while requesting a handler from the router" in response.text
+ assert (
+ "'None' was returned while requesting a handler from the router"
+ in response.text
+ )
@pytest.mark.parametrize("websocket_enabled", [True, False])


@@ -84,8 +84,8 @@ def test_listeners_triggered(app):
all_tasks = (
asyncio.Task.all_tasks()
- if sys.version_info < (3, 7) else
- asyncio.all_tasks(asyncio.get_event_loop())
+ if sys.version_info < (3, 7)
+ else asyncio.all_tasks(asyncio.get_event_loop())
)
for task in all_tasks:
task.cancel()
@@ -134,8 +134,8 @@ def test_listeners_triggered_async(app):
all_tasks = (
asyncio.Task.all_tasks()
- if sys.version_info < (3, 7) else
- asyncio.all_tasks(asyncio.get_event_loop())
+ if sys.version_info < (3, 7)
+ else asyncio.all_tasks(asyncio.get_event_loop())
)
for task in all_tasks:
task.cancel()


@@ -252,6 +252,88 @@ def test_several_bp_with_host(app):
assert response.text == "Hello3" assert response.text == "Hello3"
def test_bp_with_host_list(app):
bp = Blueprint(
"test_bp_host",
url_prefix="/test1",
host=["example.com", "sub.example.com"],
)
@bp.route("/")
def handler1(request):
return text("Hello")
@bp.route("/", host=["sub1.example.com"])
def handler2(request):
return text("Hello subdomain!")
app.blueprint(bp)
headers = {"Host": "example.com"}
request, response = app.test_client.get("/test1/", headers=headers)
assert response.text == "Hello"
headers = {"Host": "sub.example.com"}
request, response = app.test_client.get("/test1/", headers=headers)
assert response.text == "Hello"
headers = {"Host": "sub1.example.com"}
request, response = app.test_client.get("/test1/", headers=headers)
assert response.text == "Hello subdomain!"
def test_several_bp_with_host_list(app):
bp = Blueprint(
"test_text",
url_prefix="/test",
host=["example.com", "sub.example.com"],
)
bp2 = Blueprint(
"test_text2",
url_prefix="/test",
host=["sub1.example.com", "sub2.example.com"],
)
@bp.route("/")
def handler(request):
return text("Hello")
@bp2.route("/")
def handler1(request):
return text("Hello2")
@bp2.route("/other/")
def handler2(request):
return text("Hello3")
app.blueprint(bp)
app.blueprint(bp2)
assert bp.host == ["example.com", "sub.example.com"]
headers = {"Host": "example.com"}
request, response = app.test_client.get("/test/", headers=headers)
assert response.text == "Hello"
assert bp.host == ["example.com", "sub.example.com"]
headers = {"Host": "sub.example.com"}
request, response = app.test_client.get("/test/", headers=headers)
assert response.text == "Hello"
assert bp2.host == ["sub1.example.com", "sub2.example.com"]
headers = {"Host": "sub1.example.com"}
request, response = app.test_client.get("/test/", headers=headers)
assert response.text == "Hello2"
request, response = app.test_client.get("/test/other/", headers=headers)
assert response.text == "Hello3"
assert bp2.host == ["sub1.example.com", "sub2.example.com"]
headers = {"Host": "sub2.example.com"}
request, response = app.test_client.get("/test/", headers=headers)
assert response.text == "Hello2"
request, response = app.test_client.get("/test/other/", headers=headers)
assert response.text == "Hello3"
def test_bp_middleware(app):
blueprint = Blueprint("test_bp_middleware")
@@ -270,24 +352,31 @@ def test_bp_middleware(app):
assert response.status == 200
assert response.text == "FAIL"
def test_bp_middleware_order(app):
blueprint = Blueprint("test_bp_middleware_order")
order = list()
@blueprint.middleware("request")
def mw_1(request):
order.append(1)
@blueprint.middleware("request")
def mw_2(request):
order.append(2)
@blueprint.middleware("request")
def mw_3(request):
order.append(3)
@blueprint.middleware("response")
def mw_4(request, response):
order.append(6)
@blueprint.middleware("response")
def mw_5(request, response):
order.append(5)
@blueprint.middleware("response")
def mw_6(request, response):
order.append(4)
@@ -303,6 +392,7 @@ def test_bp_middleware_order(app):
assert response.status == 200
assert order == [1, 2, 3, 4, 5, 6]
def test_bp_exception_handler(app):
blueprint = Blueprint("test_middleware")
@@ -585,9 +675,7 @@ def test_bp_group_with_default_url_prefix(app):
from uuid import uuid4
resource_id = str(uuid4())
- request, response = app.test_client.get(
- f"/api/v1/resources/{resource_id}"
- )
+ request, response = app.test_client.get(f"/api/v1/resources/{resource_id}")
assert response.json == {"resource_id": resource_id}


@@ -9,6 +9,7 @@ from sanic import Sanic, server
from sanic.response import text
from sanic.testing import HOST, SanicTestClient
CONFIG_FOR_TESTS = {"KEEP_ALIVE_TIMEOUT": 2, "KEEP_ALIVE": True}
old_conn = None
@@ -46,7 +47,7 @@ class ReusableSanicConnectionPool(
cert=self.cert,
verify=self.verify,
trust_env=self.trust_env,
- http2=self.http2
+ http2=self.http2,
)
connection = httpx.dispatch.connection.HTTPConnection(
origin,
@@ -166,9 +167,7 @@ class ReuseableSanicTestClient(SanicTestClient):
try:
return results[-1]
except Exception:
- raise ValueError(
- f"Request object expected, got ({results})"
- )
+ raise ValueError(f"Request object expected, got ({results})")
def kill_server(self):
try:


@@ -87,3 +87,15 @@ def test_pickle_app_with_bp(app, protocol):
request, response = up_p_app.test_client.get("/") request, response = up_p_app.test_client.get("/")
assert up_p_app.is_request_stream is False assert up_p_app.is_request_stream is False
assert response.text == "Hello" assert response.text == "Hello"
@pytest.mark.parametrize("protocol", [3, 4])
def test_pickle_app_with_static(app, protocol):
app.route("/")(handler)
app.static("/static", "/tmp/static")
p_app = pickle.dumps(app, protocol=protocol)
del app
up_p_app = pickle.loads(p_app)
assert up_p_app
request, response = up_p_app.test_client.get("/static/missing.txt")
assert response.status == 404

tests/test_reloader.py (new file, 108 lines)

@@ -0,0 +1,108 @@
import os
import secrets
import sys
from contextlib import suppress
from subprocess import PIPE, Popen, TimeoutExpired
from tempfile import TemporaryDirectory
from textwrap import dedent
from threading import Timer
from time import sleep
import pytest
# We need to interrupt the autoreloader without killing it, so that the server gets terminated
# https://stefan.sofa-rockers.org/2013/08/15/handling-sub-process-hierarchies-python-linux-os-x/
try:
from signal import CTRL_BREAK_EVENT
from subprocess import CREATE_NEW_PROCESS_GROUP
flags = CREATE_NEW_PROCESS_GROUP
except ImportError:
flags = 0
def terminate(proc):
if flags:
proc.send_signal(CTRL_BREAK_EVENT)
else:
proc.terminate()
def write_app(filename, **runargs):
text = secrets.token_urlsafe()
with open(filename, "w") as f:
f.write(
dedent(
f"""\
import os
from sanic import Sanic
app = Sanic(__name__)
@app.listener("after_server_start")
def complete(*args):
print("complete", os.getpid(), {text!r})
if __name__ == "__main__":
app.run(**{runargs!r})
"""
)
)
return text
def scanner(proc):
for line in proc.stdout:
line = line.decode().strip()
print(">", line)
if line.startswith("complete"):
yield line
argv = dict(
script=[sys.executable, "reloader.py"],
module=[sys.executable, "-m", "reloader"],
sanic=[
sys.executable,
"-m",
"sanic",
"--port",
"42104",
"--debug",
"reloader.app",
],
)
@pytest.mark.parametrize(
"runargs, mode",
[
(dict(port=42102, auto_reload=True), "script"),
(dict(port=42103, debug=True), "module"),
(dict(), "sanic"),
],
)
async def test_reloader_live(runargs, mode):
with TemporaryDirectory() as tmpdir:
filename = os.path.join(tmpdir, "reloader.py")
text = write_app(filename, **runargs)
proc = Popen(argv[mode], cwd=tmpdir, stdout=PIPE, creationflags=flags)
try:
timeout = Timer(5, terminate, [proc])
timeout.start()
# Python apparently keeps using the old source sometimes if
# we don't sleep before rewrite (pycache timestamp problem?)
sleep(1)
line = scanner(proc)
assert text in next(line)
# Edit source code and try again
text = write_app(filename, **runargs)
assert text in next(line)
finally:
timeout.cancel()
terminate(proc)
with suppress(TimeoutExpired):
proc.wait(timeout=3)


@@ -614,6 +614,7 @@ def test_request_stream(app):
assert response.status == 200
assert response.text == data
def test_streaming_new_api(app):
@app.post("/non-stream")
async def handler(request):


@@ -17,7 +17,7 @@ class DelayableHTTPConnection(httpx.dispatch.connection.HTTPConnection):
async def send(self, request, timeout=None):
if self.connection is None:
- self.connection = (await self.connect(timeout=timeout))
+ self.connection = await self.connect(timeout=timeout)
if self._request_delay:
await asyncio.sleep(self._request_delay)


@@ -454,11 +454,13 @@ def test_standard_forwarded(app):
"X-Real-IP": "127.0.0.2", "X-Real-IP": "127.0.0.2",
"X-Forwarded-For": "127.0.1.1", "X-Forwarded-For": "127.0.1.1",
"X-Scheme": "ws", "X-Scheme": "ws",
"Host": "local.site",
} }
request, response = app.test_client.get("/", headers=headers) request, response = app.test_client.get("/", headers=headers)
assert response.json == {"for": "127.0.0.2", "proto": "ws"} assert response.json == {"for": "127.0.0.2", "proto": "ws"}
assert request.remote_addr == "127.0.0.2" assert request.remote_addr == "127.0.0.2"
assert request.scheme == "ws" assert request.scheme == "ws"
assert request.server_name == "local.site"
assert request.server_port == 80 assert request.server_port == 80
app.config.FORWARDED_SECRET = "mySecret" app.config.FORWARDED_SECRET = "mySecret"
@@ -1807,13 +1809,17 @@ def test_request_port(app):
port = request.port port = request.port
assert isinstance(port, int) assert isinstance(port, int)
delattr(request, "_socket")
delattr(request, "_port") @pytest.mark.asyncio
async def test_request_port_asgi(app):
@app.get("/")
def handler(request):
return text("OK")
request, response = await app.asgi_client.get("/")
port = request.port port = request.port
assert isinstance(port, int) assert isinstance(port, int)
assert hasattr(request, "_socket")
assert hasattr(request, "_port")
def test_request_socket(app): def test_request_socket(app):
@@ -1832,12 +1838,6 @@ def test_request_socket(app):
assert ip == request.ip assert ip == request.ip
assert port == request.port assert port == request.port
delattr(request, "_socket")
socket = request.socket
assert isinstance(socket, tuple)
assert hasattr(request, "_socket")
def test_request_server_name(app): def test_request_server_name(app):
@app.get("/") @app.get("/")
@@ -1866,7 +1866,7 @@ def test_request_server_name_in_host_header(app):
request, response = app.test_client.get( request, response = app.test_client.get(
"/", headers={"Host": "mal_formed"} "/", headers={"Host": "mal_formed"}
) )
assert request.server_name == None # For now (later maybe 127.0.0.1) assert request.server_name == ""
def test_request_server_name_forwarded(app): def test_request_server_name_forwarded(app):
@@ -1893,7 +1893,7 @@ def test_request_server_port(app):
test_client = SanicTestClient(app) test_client = SanicTestClient(app)
request, response = test_client.get("/", headers={"Host": "my-server"}) request, response = test_client.get("/", headers={"Host": "my-server"})
assert request.server_port == test_client.port assert request.server_port == 80
def test_request_server_port_in_host_header(app): def test_request_server_port_in_host_header(app):
@@ -1952,12 +1952,12 @@ def test_server_name_and_url_for(app):
def handler(request): def handler(request):
return text("ok") return text("ok")
app.config.SERVER_NAME = "my-server" app.config.SERVER_NAME = "my-server" # This means default port
assert app.url_for("handler", _external=True) == "http://my-server/foo" assert app.url_for("handler", _external=True) == "http://my-server/foo"
request, response = app.test_client.get("/foo") request, response = app.test_client.get("/foo")
assert ( assert (
request.url_for("handler") request.url_for("handler")
== f"http://my-server:{request.server_port}/foo" == f"http://my-server/foo"
) )
app.config.SERVER_NAME = "https://my-server/path" app.config.SERVER_NAME = "https://my-server/path"


@@ -1,6 +1,7 @@
import asyncio
import inspect
import os
+ import warnings
from collections import namedtuple
from mimetypes import guess_type
@@ -242,7 +243,7 @@ def test_non_chunked_streaming_adds_correct_headers(non_chunked_streaming_app):
def test_non_chunked_streaming_returns_correct_content(
- non_chunked_streaming_app
+ non_chunked_streaming_app,
):
request, response = non_chunked_streaming_app.test_client.get("/")
assert response.text == "foo,bar"
@@ -257,7 +258,7 @@ def test_stream_response_status_returns_correct_headers(status):
@pytest.mark.parametrize("keep_alive_timeout", [10, 20, 30])
def test_stream_response_keep_alive_returns_correct_headers(
- keep_alive_timeout
+ keep_alive_timeout,
):
response = StreamingHTTPResponse(sample_streaming_fn)
headers = response.get_headers(
@@ -286,7 +287,7 @@ def test_stream_response_does_not_include_chunked_header_if_disabled():
def test_stream_response_writes_correct_content_to_transport_when_chunked(
- streaming_app
+ streaming_app,
):
response = StreamingHTTPResponse(sample_streaming_fn)
response.protocol = MagicMock(HttpProtocol)
@@ -434,9 +435,10 @@ def test_file_response_custom_filename(
request, response = app.test_client.get(f"/files/{source}")
assert response.status == 200
assert response.body == get_file_content(static_file_directory, source)
- assert response.headers[
- "Content-Disposition"
- ] == f'attachment; filename="{dest}"'
+ assert (
+ response.headers["Content-Disposition"]
+ == f'attachment; filename="{dest}"'
+ )
@pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"])
@@ -510,9 +512,10 @@ def test_file_stream_response_custom_filename(
request, response = app.test_client.get(f"/files/{source}")
assert response.status == 200
assert response.body == get_file_content(static_file_directory, source)
- assert response.headers[
- "Content-Disposition"
- ] == f'attachment; filename="{dest}"'
+ assert (
+ response.headers["Content-Disposition"]
+ == f'attachment; filename="{dest}"'
+ )
@pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"])
@@ -581,7 +584,10 @@ def test_file_stream_response_range(
request, response = app.test_client.get(f"/files/{file_name}")
assert response.status == 206
assert "Content-Range" in response.headers
- assert response.headers["Content-Range"] == f"bytes {range.start}-{range.end}/{range.total}"
+ assert (
+ response.headers["Content-Range"]
+ == f"bytes {range.start}-{range.end}/{range.total}"
+ )
def test_raw_response(app):
@@ -602,3 +608,17 @@ def test_empty_response(app):
request, response = app.test_client.get("/test")
assert response.content_type is None
assert response.body == b""
def test_response_body_bytes_deprecated(app):
with warnings.catch_warnings(record=True) as w:
warnings.simplefilter("always")
HTTPResponse(body_bytes=b"bytes")
assert len(w) == 1
assert issubclass(w[0].category, DeprecationWarning)
assert (
"Parameter `body_bytes` is deprecated, use `body` instead"
in str(w[0].message)
)
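The new test above pins down the wording of the body_bytes deprecation. A migration sketch of what the warning asks calling code to do (parameter names taken from the message; behaviour otherwise assumed):

    from sanic.response import HTTPResponse

    old_style = HTTPResponse(body_bytes=b"payload")  # still accepted, but emits DeprecationWarning
    new_style = HTTPResponse(body=b"payload")        # the replacement named in the warning
    assert new_style.body == b"payload"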


@@ -531,6 +531,19 @@ def test_add_webscoket_route(app, strict_slashes):
assert ev.is_set()
def test_add_webscoket_route_with_version(app):
ev = asyncio.Event()
async def handler(request, ws):
assert ws.subprotocol is None
ev.set()
app.add_websocket_route(handler, "/ws", version=1)
request, response = app.test_client.websocket("/v1/ws")
assert response.opened is True
assert ev.is_set()
def test_route_duplicate(app):
with pytest.raises(RouteExists):
@@ -580,7 +593,7 @@ async def test_websocket_route_asgi(app):
ev.clear()
request, response = await app.asgi_client.websocket("/test/1")
second_set = ev.is_set()
- assert(first_set and second_set)
+ assert first_set and second_set
def test_method_not_allowed(app):


@@ -33,9 +33,7 @@ def after(app, loop):
calledq.put(mock.called)
- @pytest.mark.skipif(
- os.name == "nt", reason="May hang CI on py38/windows"
- )
+ @pytest.mark.skipif(os.name == "nt", reason="May hang CI on py38/windows")
def test_register_system_signals(app):
"""Test if sanic register system signals"""
@@ -51,9 +49,7 @@ def test_register_system_signals(app):
assert calledq.get() is True
- @pytest.mark.skipif(
- os.name == "nt", reason="May hang CI on py38/windows"
- )
+ @pytest.mark.skipif(os.name == "nt", reason="May hang CI on py38/windows")
def test_dont_register_system_signals(app):
"""Test if sanic don't register system signals"""
@@ -69,9 +65,7 @@ def test_dont_register_system_signals(app):
assert calledq.get() is False
- @pytest.mark.skipif(
- os.name == "nt", reason="windows cannot SIGINT processes"
- )
+ @pytest.mark.skipif(os.name == "nt", reason="windows cannot SIGINT processes")
def test_windows_workaround():
"""Test Windows workaround (on any other OS)"""
# At least some code coverage, even though this test doesn't work on


@@ -97,9 +97,7 @@ def test_static_file_content_type(app, static_file_directory, file_name):
def test_static_directory(app, file_name, base_uri, static_file_directory):
app.static(base_uri, static_file_directory)
- request, response = app.test_client.get(
- uri=f"{base_uri}/{file_name}"
- )
+ request, response = app.test_client.get(uri=f"{base_uri}/{file_name}")
assert response.status == 200
assert response.body == get_file_content(static_file_directory, file_name)

tests/test_unix_socket.py (new file, 235 lines)

@@ -0,0 +1,235 @@
import asyncio
import logging
import os
import subprocess
import sys
import httpx
import pytest
from sanic import Sanic
from sanic.response import text
pytestmark = pytest.mark.skipif(os.name != "posix", reason="UNIX only")
SOCKPATH = "/tmp/sanictest.sock"
SOCKPATH2 = "/tmp/sanictest2.sock"
@pytest.fixture(autouse=True)
def socket_cleanup():
try:
os.unlink(SOCKPATH)
except FileNotFoundError:
pass
try:
os.unlink(SOCKPATH2)
except FileNotFoundError:
pass
# Run test function
yield
try:
os.unlink(SOCKPATH2)
except FileNotFoundError:
pass
try:
os.unlink(SOCKPATH)
except FileNotFoundError:
pass
def test_unix_socket_creation(caplog):
from socket import AF_UNIX, socket
with socket(AF_UNIX) as sock:
sock.bind(SOCKPATH)
assert os.path.exists(SOCKPATH)
ino = os.stat(SOCKPATH).st_ino
app = Sanic(name=__name__)
@app.listener("after_server_start")
def running(app, loop):
assert os.path.exists(SOCKPATH)
assert ino != os.stat(SOCKPATH).st_ino
app.stop()
with caplog.at_level(logging.INFO):
app.run(unix=SOCKPATH)
assert (
"sanic.root",
logging.INFO,
f"Goin' Fast @ {SOCKPATH} http://...",
) in caplog.record_tuples
assert not os.path.exists(SOCKPATH)
def test_invalid_paths():
app = Sanic(name=__name__)
with pytest.raises(FileExistsError):
app.run(unix=".")
with pytest.raises(FileNotFoundError):
app.run(unix="no-such-directory/sanictest.sock")
def test_dont_replace_file():
with open(SOCKPATH, "w") as f:
f.write("File, not socket")
app = Sanic(name=__name__)
@app.listener("after_server_start")
def stop(app, loop):
app.stop()
with pytest.raises(FileExistsError):
app.run(unix=SOCKPATH)
def test_dont_follow_symlink():
from socket import AF_UNIX, socket
with socket(AF_UNIX) as sock:
sock.bind(SOCKPATH2)
os.symlink(SOCKPATH2, SOCKPATH)
app = Sanic(name=__name__)
@app.listener("after_server_start")
def stop(app, loop):
app.stop()
with pytest.raises(FileExistsError):
app.run(unix=SOCKPATH)
def test_socket_deleted_while_running():
app = Sanic(name=__name__)
@app.listener("after_server_start")
async def hack(app, loop):
os.unlink(SOCKPATH)
app.stop()
app.run(host="myhost.invalid", unix=SOCKPATH)
def test_socket_replaced_with_file():
app = Sanic(name=__name__)
@app.listener("after_server_start")
async def hack(app, loop):
os.unlink(SOCKPATH)
with open(SOCKPATH, "w") as f:
f.write("Not a socket")
app.stop()
app.run(host="myhost.invalid", unix=SOCKPATH)
def test_unix_connection():
app = Sanic(name=__name__)
@app.get("/")
def handler(request):
return text(f"{request.conn_info.server}")
@app.listener("after_server_start")
async def client(app, loop):
try:
async with httpx.AsyncClient(uds=SOCKPATH) as client:
r = await client.get("http://myhost.invalid/")
assert r.status_code == 200
assert r.text == os.path.abspath(SOCKPATH)
finally:
app.stop()
app.run(host="myhost.invalid", unix=SOCKPATH)
app_multi = Sanic(name=__name__)
def handler(request):
return text(f"{request.conn_info.server}")
async def client(app, loop):
try:
async with httpx.AsyncClient(uds=SOCKPATH) as client:
r = await client.get("http://myhost.invalid/")
assert r.status_code == 200
assert r.text == os.path.abspath(SOCKPATH)
finally:
app.stop()
def test_unix_connection_multiple_workers():
app_multi.get("/")(handler)
app_multi.listener("after_server_start")(client)
app_multi.run(host="myhost.invalid", unix=SOCKPATH, workers=2)
async def test_zero_downtime():
"""Graceful server termination and socket replacement on restarts"""
from signal import SIGINT
from time import monotonic as current_time
async def client():
for _ in range(40):
async with httpx.AsyncClient(uds=SOCKPATH) as client:
r = await client.get("http://localhost/sleep/0.1")
assert r.status_code == 200
assert r.text == f"Slept 0.1 seconds.\n"
def spawn():
command = [
sys.executable,
"-m",
"sanic",
"--unix",
SOCKPATH,
"examples.delayed_response.app",
]
DN = subprocess.DEVNULL
return subprocess.Popen(
command, stdin=DN, stdout=DN, stderr=subprocess.PIPE
)
try:
processes = [spawn()]
while not os.path.exists(SOCKPATH):
if processes[0].poll() is not None:
raise Exception("Worker did not start properly")
await asyncio.sleep(0.0001)
ino = os.stat(SOCKPATH).st_ino
task = asyncio.get_event_loop().create_task(client())
start_time = current_time()
while current_time() < start_time + 4:
# Start a new one and wait until the socket is replaced
processes.append(spawn())
while ino == os.stat(SOCKPATH).st_ino:
await asyncio.sleep(0.001)
ino = os.stat(SOCKPATH).st_ino
# Graceful termination of the previous one
processes[-2].send_signal(SIGINT)
# Wait until client has completed all requests
await task
processes[-1].send_signal(SIGINT)
for worker in processes:
try:
worker.wait(1.0)
except subprocess.TimeoutExpired:
raise Exception(
f"Worker would not terminate:\n{worker.stderr}"
)
finally:
for worker in processes:
worker.kill()
# Test for clean run and termination
assert len(processes) > 5
assert [worker.poll() for worker in processes] == len(processes) * [0]
assert not os.path.exists(SOCKPATH)


@@ -1,3 +1,8 @@
import asyncio
from sanic.blueprints import Blueprint
def test_routes_with_host(app):
@app.route("/")
@app.route("/", name="hostindex", host="example.com")
@@ -9,4 +14,50 @@ def test_routes_with_host(app):
assert app.url_for("hostindex") == "/"
assert app.url_for("hostpath") == "/path"
assert app.url_for("hostindex", _external=True) == "http://example.com/"
- assert app.url_for("hostpath", _external=True) == "http://path.example.com/path"
+ assert (
+ app.url_for("hostpath", _external=True)
+ == "http://path.example.com/path"
+ )
def test_websocket_bp_route_name(app):
"""Tests that blueprint websocket route is named."""
event = asyncio.Event()
bp = Blueprint("test_bp", url_prefix="/bp")
@bp.get("/main")
async def main(request):
...
@bp.websocket("/route")
async def test_route(request, ws):
event.set()
@bp.websocket("/route2")
async def test_route2(request, ws):
event.set()
@bp.websocket("/route3", name="foobar_3")
async def test_route3(request, ws):
event.set()
app.blueprint(bp)
uri = app.url_for("test_bp.main")
assert uri == "/bp/main"
uri = app.url_for("test_bp.test_route")
assert uri == "/bp/route"
request, response = app.test_client.websocket(uri)
assert response.opened is True
assert event.is_set()
event.clear()
uri = app.url_for("test_bp.test_route2")
assert uri == "/bp/route2"
request, response = app.test_client.websocket(uri)
assert response.opened is True
assert event.is_set()
uri = app.url_for("test_bp.foobar_3")
assert uri == "/bp/route3"


@@ -151,8 +151,7 @@ def test_with_custom_class_methods(app):
def get(self, request):
self._iternal_method()
return text(
- f"I am get method and global var "
- f"is {self.global_var}"
+ f"I am get method and global var " f"is {self.global_var}"
)
app.add_route(DummyView.as_view(), "/")


@@ -128,9 +128,11 @@ def test_handle_quit(worker):
assert not worker.alive
assert worker.exit_code == 0
async def _a_noop(*a, **kw):
pass
def test_run_max_requests_exceeded(worker):
loop = asyncio.new_event_loop()
worker.ppid = 1


@@ -19,7 +19,7 @@ deps =
gunicorn
pytest-benchmark
uvicorn
- websockets>=7.0,<8.0
+ websockets>=8.1,<9.0
commands =
pytest {posargs:tests --cov sanic}
- coverage combine --append