Compare commits

...

113 Commits

Author SHA1 Message Date
Adam Hopkins
5928c50057 Version 20.9 (#1940) 2020-09-30 17:30:21 +03:00
Tomasz Drożdż
1de4bcef55 Update config (#1903)
* New approach for uploading sanic app config.

* Update config.rst

Co-authored-by: tigerthelion <bjt.thompson@gmail.com>
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-09-30 16:44:09 +03:00
Adam Hopkins
7b7559309d Add issue config.yml (#1936)
* Add issue config.yml

* Update SECURITY.md
2020-09-30 15:38:08 +03:00
Adam Hopkins
066df2c142 Add text and json fallback error handlers (#1937)
* Add text and json fallback error handlers

* Add tests and auto-detect error fallback type
2020-09-30 15:11:27 +03:00
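A minimal sketch of opting into the new fallback handlers, assuming the renderer is selected through the ``FALLBACK_ERROR_FORMAT`` config value introduced alongside this change (the app name and route are illustrative):

    from sanic import Sanic

    app = Sanic("error_format_example")

    # Assumed config key for selecting the fallback renderer; "html" stays the
    # default in 20.9, with "auto", "text", and "json" as the new options.
    app.config.FALLBACK_ERROR_FORMAT = "json"

    @app.route("/boom")
    async def boom(request):
        raise RuntimeError("unhandled")  # rendered by the JSON fallback handler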
Adam Hopkins
0c4a9b1dce Merge pull request #1909 from brooklet/master
fix websocket ping variables issues
2020-09-29 01:08:04 +03:00
Adam Hopkins
65a7060d3b Merge branch 'master' into master 2020-09-29 00:41:22 +03:00
Adam Hopkins
3483e7b061 Fix linting issues 2020-09-29 00:40:24 +03:00
Adam Hopkins
13094e02bc Revert check for websocket protocol to use hasattr 2020-09-29 00:24:00 +03:00
Ashley Sommer
ed777e9d5b Merge pull request #1935 from huge-success/httpx-upgrade
Upgrade httpx
2020-09-28 09:06:37 +10:00
Adam Hopkins
8ad80a282a Merge branch 'master' into httpx-upgrade 2020-09-27 11:20:07 +03:00
Adam Hopkins
0b7eb49839 Merge pull request #1924 from tomaszdrozdz/strict_markers_for_pytest
Adding --strict-markers for pytest
2020-09-27 11:18:24 +03:00
Adam Hopkins
de3b40c2e6 Merge branch 'master' into strict_markers_for_pytest 2020-09-27 10:57:31 +03:00
Adam Hopkins
efa0aaf2c2 Add asyncio markers to tox.ini 2020-09-27 10:46:51 +03:00
Adam Hopkins
bd4e1cdc1e squash 2020-09-27 10:27:12 +03:00
Adam Hopkins
eb8df1fc18 Upgrade httpx 2020-09-27 02:58:36 +03:00
tomaszdrozdz
9a8e49751d Adding --strict-markers for pytest 2020-09-08 13:08:49 +02:00
raphaelauv
58e15134fd Add explicit ASGI compliance to the README (#1922) 2020-09-02 23:22:02 +03:00
Adam Hopkins
875be11ae5 Update README.rst (#1917) 2020-08-27 10:28:56 +03:00
Andrew Scott
3f7c9ea3f5 feat: fixes exception due to unread bytes in stream (#1897)
* feat: fixes exception due to unread bytes in stream

* feat: additional unit tests to cover changes

* fix: automated changes by `make fix-import`

* fix: additional changes by `make fix-import`

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-08-27 10:22:02 +03:00
brook
33aa4daac8 fixed the problem that the websocket ping_timeout and ping_interval parameter settings did not take effect 2020-08-13 14:39:55 +08:00
Shawn Hill
58e4087d4b Add websocket ping variables (#1906)
* Add config params for websocket ping_timeout & ping_interval

* Include changelog

* Pass websocket config values to WebSocketProtocol init, test

* Linting

* Improve docs

Co-authored-by: shawnhill <shawn.hill@equipmentshare.com>
2020-08-07 06:37:59 +03:00
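A minimal sketch of the two new configuration values (names and 20-second defaults match the config table further down; the route is illustrative):

    from sanic import Sanic

    app = Sanic("ws_ping_example")

    # Both values default to 20 seconds and are passed through to the
    # underlying websocket protocol (#1906 / #1909).
    app.config.WEBSOCKET_PING_INTERVAL = 10  # send a Ping frame every 10 s
    app.config.WEBSOCKET_PING_TIMEOUT = 30   # close if no Pong arrives within 30 s

    @app.websocket("/feed")
    async def feed(request, ws):
        await ws.send("hello")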
Ashley Sommer
0072fd1573 Add an additional component to the request_data context test. This checks whether items stored in request.ctx can be accessed from a response-middleware after a response is issued. (#1888)
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-07-29 14:25:31 +03:00
Lee Tat Wai David
5d5ed10a45 Websocket subprotocol (#1887)
* Added fix to include subprotocols from scope

* Added unit test to validate fix

* Changes by black

* Made changes to WebsocketConnection protocol

* Linter changes

* Added unit tests

* Fixing bugs in linting due to isort import checks

* Reverting compat import changes

* Fixing linter errors in compat.py

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-07-29 14:09:26 +03:00
Ashley Sommer
5ee8ee7b04 Merge pull request #1894 from huge-success/test_mode
Add a test_mode boolean variable to the sanic `app`, which is set to True when using the Sanic TestClient or ASGIClient, and False at all other times.
2020-07-15 22:46:23 +10:00
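A hypothetical sketch of how a handler might branch on the new flag; the route and payload are illustrative only:

    from sanic import Sanic
    from sanic.response import json

    app = Sanic("test_mode_example")

    @app.route("/data")
    async def data(request):
        # app.test_mode is True under the Sanic TestClient / ASGIClient
        # and False at all other times.
        source = "stub" if request.app.test_mode else "live"
        return json({"source": source})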
Adam Hopkins
521ae7f60e squash 2020-07-14 10:41:28 +03:00
Adam Hopkins
27c8c12420 squash 2020-07-14 10:30:48 +03:00
Adam Hopkins
3d1f100781 squash 2020-07-14 10:30:01 +03:00
Adam Hopkins
16d36fc17f squash 2020-07-14 10:25:56 +03:00
Adam Hopkins
eddb5bad91 squash 2020-07-14 10:25:30 +03:00
Adam Hopkins
23e1b5ee3f squash 2020-07-14 10:23:31 +03:00
Adam Hopkins
9e053bef19 squash 2020-07-14 10:13:30 +03:00
Adam Hopkins
cf234fca15 squash this 2020-07-13 23:59:45 +03:00
Adam Hopkins
050a563e1d Add documentation on test mode 2020-07-09 14:57:42 +03:00
Adam Hopkins
c347ff742e Add app.test_mode which is set on testing calls 2020-07-09 14:52:58 +03:00
Adam Hopkins
db1c819fe1 Merge branch 'master' of github.com:huge-success/sanic 2020-07-09 14:24:06 +03:00
Egor
9f2818ee29 Remove version section (#1893) 2020-07-09 07:17:50 +03:00
Adam Hopkins
26aa6d23c7 Fix imports and isort to remove from Makefile deprecated options (#1891)
* Version

* Version 20.6.1

* Fix imports and isort to remove from Makefile deprecated options

* duplicate the mypy ignore hint across both lines

After splitting the `from trio import ...` statement onto two lines, the mypy ignore hint needs to be duplicated across both lines to keep mypy from complaining.

Co-authored-by: Ashley Sommer <ashleysommer@gmail.com>
2020-07-07 16:13:03 +03:00
Adam Hopkins
ec7e894eb3 Merge branch 'master' of github.com:huge-success/sanic 2020-07-07 08:46:01 +03:00
Ashley Sommer
71a08382d6 Adjust isort options and invocation to work on isort 5.0.0 (#1890)
isort 5.0.0 removed command line option `recursive` and removed config option `not_skip`.
2020-07-07 08:43:33 +03:00
Adam Hopkins
09224f8676 Merge branch 'master' of github.com:huge-success/sanic 2020-06-29 15:19:32 +03:00
Adam Hopkins
008b8ac394 V2.6.3 changelog (#1886)
* Version

* Version 20.6.1

* v2.6.3 changelog and version
2020-06-29 15:16:06 +03:00
Adam Hopkins
a357add14e Merge branch 'master' of github.com:huge-success/sanic 2020-06-29 14:55:52 +03:00
Adam Hopkins
0cfd7b528b V20.6.2 changelog (#1885)
* Version

* Version 20.6.1

* CHANGELOG for v20.6.2
2020-06-29 14:54:44 +03:00
Adam Hopkins
9ba4fe05df Merge branch 'master' of github.com:huge-success/sanic 2020-06-29 14:54:02 +03:00
Ashley Sommer
35786b4b74 Revert change to multiprocessing mode (#1884)
Revert change to multiprocessing mode accidentally included in https://github.com/huge-success/sanic/pull/1853

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-29 14:51:30 +03:00
Ashley Sommer
c7430d805a Revert change to multiprocessing mode (#1884)
Revert change to multiprocessing mode accidentally included in https://github.com/huge-success/sanic/pull/1853

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-29 13:51:55 +03:00
Adam Hopkins
8a3fbb555f Merge branch 'master' of github.com:huge-success/sanic 2020-06-29 08:56:20 +03:00
L. Kärkkäinen
a62c84a954 Socket binding implemented properly for IPv6 and UNIX sockets. (#1641)
* Socket binding implemented properly for IPv6 and UNIX sockets.

- app.run("::1") for IPv6
- app.run("unix:/tmp/server.sock") for UNIX sockets
- app.run("localhost") retains old functionality (randomly either IPv4 or IPv6)

Do note that IPv6 and UNIX sockets are not fully supported by other Sanic facilities.
In particular, request.server_name and request.server_port are currently unreliable.

* Fix Windows compatibility by not referring to socket.AF_UNIX unless needed.

* Compatibility fix.

* Fix test of existing unix socket.

* Cleaner unix socket removal.

* Remove unix socket on exit also with workers=1.

* More pedantic UNIX socket implementation.

* Refactor app to take unix= argument instead of unix:-prefixed host. Goin' fast @ unix-socket fixed.

* Linter

* Proxy properties cleanup. Slight changes of semantics. SERVER_NAME now overrides everything.

* Have server fill in connection info instead of request asking the socket.

- Would be a good idea to remove request.transport entirely but I didn't dare to touch it yet.

* Linter 💣🌟💀

* Fix typing issues. request.server_name returns empty string if host header is missing.

* Fix tests

* Tests were failing, fix connection info.

* Linter nazi says you need that empty line.

* Rename a to addr, leave client empty for unix sockets.

* Add --unix support when sanic is run as module.

* Remove remove_route, deprecated in 19.6.

* Improved unix socket binding.

* More robust creating and unlinking of sockets. Show proper and not temporary name in conn_info.

* Add comprehensive tests for unix socket mode.

* Hide some imports inside functions to avoid Windows failure.

* Mention unix socket mode in deployment docs.

* Fix merge commit.

* Make test_unix_connection_multiple_workers pickleable for spawn mode multiprocessing.

Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-29 08:55:32 +03:00
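A minimal sketch of the resulting binding options, assuming the final `unix=` keyword from the refactor described above (paths and ports are illustrative):

    from sanic import Sanic
    from sanic.response import text

    app = Sanic("socket_binding_example")

    @app.route("/")
    async def index(request):
        return text("hello")

    if __name__ == "__main__":
        # IPv6 loopback; plain "localhost" keeps the old behaviour of
        # resolving to either IPv4 or IPv6.
        app.run(host="::1", port=8000)
        # UNIX socket binding via the unix= keyword added in this PR:
        # app.run(unix="/tmp/server.sock")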
Adam Hopkins
4aba74d050 V20.6 version (#1882)
* Version

* Version 20.6.1

Co-authored-by: 7 <yunxu1992@gmail.com>
2020-06-29 00:17:24 +03:00
Adam Hopkins
ab2cb88cf4 CHANGELOG for v 20.6 and documentation change for sanic command (#1881)
* CHANGELOG for v 20.6 and documentation change for sanic command

* Update CHANGELOG.rst

20.6.0 and 20.6.1 are the same release. One change from `blueprints` had accidentally been left out, hence the immediate follow-up release.
2020-06-28 11:42:12 -07:00
Adam Hopkins
e79ec7d7e0 Version 20.6.1 2020-06-28 17:21:48 +03:00
Adam Hopkins
ccdb74a9a7 Merge branch 'master' of github.com:huge-success/sanic 2020-06-28 17:21:12 +03:00
Adam Hopkins
7b96d633db Version 2020-06-28 17:19:57 +03:00
Adam Hopkins
938c49b899 Add handler names for websockets for url_for usage (#1880) 2020-06-28 14:45:52 +03:00
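A minimal sketch of what #1880 enables, assuming the handler name doubles as the route name for url_for (path and name are illustrative):

    from sanic import Sanic

    app = Sanic("ws_url_for_example")

    @app.websocket("/feed")
    async def feed(request, ws):
        await ws.send("hello")

    # With a handler name registered for the websocket route,
    # url_for can now resolve it.
    assert app.url_for("feed") == "/feed"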
Ashley Sommer
761eef7d96 Fix pickle error when attempting to pickle an application which contains websocket routes. (#1853)
Moves the websocket_handler subfunction out to a class-level method, which can be more easily pickled by the built-in python Pickler.
Also includes a similar fix for the add_task deferred task scheduler subfunction.

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 11:05:06 +03:00
David Bordeynik
83511a0ba7 fix-#1851: correct step name (#1852)
* fix-#1851: correct step name

* fix-#1851: correct step name elsewhere as well

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 10:52:43 +03:00
Damian Jimenez
cf9ccdae47 Bug fix for host parameter issue with lists (#1776)
* Bug fix for host parameter issue with lists

As explained in #1772 there is an issue when using a list as an argument for the host parameter in the Blueprint.route() decorator. I've traced the issue back to this line, and the if conditional should ensure that the name attribute isn't accessed when route is None.

* Unit tests for blueprint.route host parameter set to list.

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 09:42:18 +03:00
Kiril Yershov
d81096fdc0 Clarified response middleware execution order in the documentation (#1846)
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 09:29:48 +03:00
Adam Hopkins
6c8e20a859 Add version parameter to websocket routes (#1760)
* Add version parameter to websockets

* Run black and cleanup code
2020-06-28 09:17:18 +03:00
Liran Nuna
6239fa4f56 Deprecate body_bytes to merge into body (#1739)
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-28 08:59:23 +03:00
David Bordeynik
1b324ae981 fix-#1856: adjust websockets version to setup.py and make nightly (py39) tests pass (#1857)
* fix-#1856: adjust websockets version to setup.py and make nightly (py39) tests pass

* fix-#1856: set min websockets version to 8.1

* fix-#1856: suppress timeout for CI to pass

* fix-#1856: timeout -> close_timeout due to deprecation warning

Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
Co-authored-by: 7 <yunxu1992@gmail.com>
2020-06-28 08:43:12 +03:00
Linus Groh
bedf68a9b2 Wrap run()'s "protocol" type annotation in Optional[] (#1869)
As the default is None and the function will determine a sane value
in that case, the correct annotation is "Optional[Type[Protocol]]".
2020-06-11 11:40:12 -07:00
Adam Hopkins
496e87e4ba Add sanic as an entry point command (#1866)
* Add sanic as an entry point command

* Fix linting issue in imports

Co-authored-by: 7 <yunxu1992@gmail.com>
2020-06-05 07:14:18 -07:00
Luca Fabbri
fa4f85eb32 Fixing rst format issue (#1865)
Co-authored-by: 7 <yunxu1992@gmail.com>
2020-06-04 17:08:14 -07:00
Adam Hopkins
1b1dfedc74 Add changes from version 20.3 to CHANGELOG (#1867) 2020-06-04 15:45:55 -07:00
L. Kärkkäinen
230941ff4f Fix reloader on OSX py38 and Windows (#1827)
* Fix watchdog reload worker repeatedly if there are multiple changed files

* Simplify autoreloader, don't need multiprocessing.Process. Now works on OSX py38.

* Allow autoreloader with multiple workers and run it earlier.

* This works OK on Windows too.

* I don't see how cwd could be different here.

* app.run and app.create_server argument fixup.

* Add test for auto_reload (coverage not working unfortunately).

* Reloader cleanup, don't use external kill commands and exit normally.

* Strip newlines on test output (Windows-compat).

* Report failures in test_auto_reload to avoid timeouts.

* Use different test server ports to avoid binding problems on Windows.

* Fix previous commit

* Listen on same port after reload.

* Show Goin' Fast banner on reloads.

* More robust testing, also -m sanic.

* Add a timeout to terminate process

* Try a workaround for tmpdir deletion on Windows.

* Join process also on error (context manager doesn't).

* Cleaner autoreloader termination on Windows.

* Remove unused code.

* Rename test.

* Longer timeout on test exit.

Co-authored-by: Hùng X. Lê <lexhung@gmail.com>
Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
Co-authored-by: Adam Hopkins <admhpkns@gmail.com>
2020-06-03 16:45:07 +03:00
Adam Hopkins
4658e0f2f3 Merge pull request #1842 from ashleysommer/fix_pickle_again
Fix static _handler pickling error.
2020-06-03 15:53:17 +03:00
Ashley Sommer
7c3c532dae Merge branch 'master' into fix_pickle_again 2020-05-14 20:48:06 +10:00
Ashley Sommer
7c04c9a227 Merge pull request #1848 from ashleysommer/fix_named_response_middleware
Reverse named_response_middleware execution order, to match normal response middleware execution order.
2020-05-14 20:45:29 +10:00
Ashley Sommer
44973125c1 Reverse named_response_middleware execution order, to match normal response middleware execution order.
Fixes #1847
Adds a test to ensure the fix is correct
Adds an example which demonstrates correct blueprint-middleware execution order behavior.
2020-05-14 09:54:47 +10:00
Adam Hopkins
6aaccd1e8b Merge branch 'master' into fix_pickle_again 2020-05-13 15:46:37 +03:00
7
e7001b0074 release 20.3.0 (#1844) 2020-05-12 16:58:42 -07:00
Ashley Sommer
aacbd022cf Fix static _handler pickling error.
Moves the subfunction _handler out to a module-level function, and parameterizes it with functools.partial().
Fixes the case when pickling a sanic app which has a registered static route handler. This is usually encountered when attempting to use multiprocessing or auto_reload on OSX or Windows.
Fixes #1774
2020-05-07 11:58:36 +10:00
WH-2099
ae1874ce34 Delete unnecessary isolated blanks and letters. (#1838) 2020-04-30 10:07:06 -07:00
L. Kärkkäinen
8abba597a8 Do not set content-type and content-length headers in exceptions. (#1820)
Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
2020-04-25 20:18:59 -07:00
Prasanna Walimbe
9987893963 Update docs for order of listeners #1805 (#1834) 2020-04-25 17:03:23 -07:00
Jacob
638322d905 docs: Fix doc build (#1833)
* docs: Fix doc build

* docs: Use python-3.8 instead

* test: Remove pytest-asyncio form tox.ini
2020-04-24 14:13:35 -07:00
wangqr
ae40f960ff Import ASGIDispatch from top-level httpx (#1806)
As importing from submodules of httpx is deprecated and removed in 0.12.0
2020-04-10 12:03:51 -07:00
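The change amounts to switching the import path; the old submodule path shown in the comment is an assumption based on the commit description, and the top-level import applies to the httpx version pinned at the time:

    # Old, deprecated style (submodule path assumed; removed in httpx 0.12.0):
    # from httpx.dispatch.asgi import ASGIDispatch

    # New style per #1806: import from the top-level package.
    from httpx import ASGIDispatch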
koug44
d969fdc19f [Doc] Update getting_started.rst (#1814)
* Update getting_started.rst

Replacing command to install Sanic without uvloop as the provided one is not working (at least in my case)

* Same thing as oneliner

* Update getting_started.rst

Dummy commit for Travis
2020-04-09 10:07:07 -07:00
L. Kärkkäinen
710024125e Remove server config args that can be read directly from app. (#1807)
* Remove server config args that can be read directly from app.

* Linter

Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
2020-04-08 22:10:58 -07:00
Mykhailo Yusko
9a39aff803 Replaced str.format() method in core functionality (#1819)
* Replaced str.format() method in core functionality

* Fixed linter checks
2020-04-06 12:45:25 -07:00
L. Kärkkäinen
78e912ea45 Update docs with changes done in 20.3 (#1822)
* Remove raw_args from docs (deprecated feature removed in Sanic 20.3).

* Add missing Sanic(name) arguments in docs. Merge async/non-async class view examples.

Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
2020-03-31 10:57:09 -07:00
L. Kärkkäinen
aa6ea5b5a0 Updated deployment docs (#1821)
* Updated deployment docs

* Wording and formatting.

Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
2020-03-28 11:43:42 -07:00
L. Kärkkäinen
48800e657f Deprecation and test cleanup (#1818)
* Remove remove_route, deprecated in 19.6.

* No need for py35 compat anymore.

* Rewrite asyncio.coroutines with async/await.

* Remove deprecated request.raw_args.

* response.text() takes str only: avoid deprecation warning in all but one test.

* Remove unused import.

* Revert unnecessary deprecation warning.

* Remove apparently unnecessary py38 compat.

* Avoid asyncio.Task.all_tasks deprecation warning.

* Avoid warning on a test that tests deprecated response.text(int).

* Add pytest-asyncio to tox deps.

* Run the coroutine returned by AsyncioServer.close.

Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
2020-03-28 11:43:14 -07:00
L. Kärkkäinen
120f0262f7 Fix Ctrl+C and tests on Windows. (#1808)
* Fix Ctrl+C on Windows.

* Disable testing of a function N/A on Windows.

* Add test for coverage, avoid crash on missing _stopping.

* Initialise StreamingHTTPResponse.protocol = None

* Improved comments.

* Reduce amount of data in test_request_stream to avoid failures on Windows.

* The Windows test doesn't work on Windows :(

* Use port numbers more likely to be free than 8000.

* Disable the other signal tests on Windows as well.

* Windows doesn't properly support SO_REUSEADDR, so that's disabled in Python, and thus rebinding fails. For successful testing, reuse port instead.

* app.run argument handling: added server kwargs (alike create_server), added warning on extra kwargs, made auto_reload explicit argument. Another go at Windows tests

* Revert "app.run argument handling: added server kwargs (alike create_server), added warning on extra kwargs, made auto_reload explicit argument. Another go at Windows tests"

This reverts commit dc5d682448.

* Use random test server port on most tests. Should avoid port/addr reuse issues.

* Another test to random port instead of 8000.

* Fix deprecation warnings about missing name on Sanic() in tests.

* Linter and typing

* Increase test coverage

* Rewrite test for ctrlc_windows_workaround

* py36 compat

* py36 compat

* py36 compat

* Don't rely on loop internals but add a stopping flag to app.

* App may be restarted.

* py36 compat

* Linter

* Add a constant for OS checking.

Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
2020-03-25 21:42:46 -07:00
L. Kärkkäinen
4db075ffc1 Streaming migration for 20.3 release (#1800)
* Compatibility and deprecations for Sanic 20.3 in preparation of the streaming branch.

* Add test for new API.

* isort tests

* More coverage

* json takes str, not bytes

Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
2020-03-24 10:11:09 -07:00
Kevin Guillaumond
60b4efad67 Update config docs to match DEFAULT_CONFIG (#1803)
* Set REAL_IP_HEADER's default value to "X-Real-IP"

* Update config instead
2020-03-14 08:57:39 -07:00
L. Kärkkäinen
319388d78b Remove the old request context API deprecated in 19.9. Use request.ctx instead. (#1801)
Co-authored-by: L. Kärkkäinen <tronic@users.noreply.github.com>
2020-03-05 21:40:46 -08:00
Subham Roy
ce71514d71 bump httpx dependency version to 0.11.1 (#1794) 2020-03-01 11:42:11 -08:00
L. Kärkkäinen
7833d70d9e Allow multiple workers on MacOS with Python 3.8. Fallback to single worker on Windows until pickling can be fixed. (#1798) 2020-03-01 11:41:09 -08:00
Mykhailo Yusko
16961fab9d Use f-strings instead of str.format() (#1793) 2020-02-25 14:01:13 -06:00
L. Kärkkäinen
861e87347a Fix #1788 incorrect url_for for routes with hosts, added tests. (#1789)
* Fix #1788 incorrect url_for for routes with hosts, added tests.

* Linter

* Remove debug print
2020-02-21 09:10:22 -08:00
Tim Gates
91f6abaa81 Fix simple typo: viewes -> views (#1783)
Closes #1782
2020-02-17 10:16:58 -06:00
Eli Uriegas
d380b52f9a Merge pull request #1784 from gdub/changelog_correction
Corrected changelog for docs move of MD to RST (#1691).
2020-02-15 17:09:41 -08:00
Gary Wilson Jr
d656a06a19 Corrected changelog for docs move of MD to RST (#1691). 2020-02-11 11:45:56 -06:00
Adam Hopkins
258dbee3b9 Py38 tox env (#1752)
* Set version

Set version

* Add Python 3.8 to tests and package classifiers

Add Python3.8 to Appveyor config
2020-02-05 13:17:55 -06:00
Sudeep Mandal
6b9287b076 Update README re: experimental support for Windows (#1778)
As mentioned in #1517, Windows support is "experimental" and does not currently support multiple workers.
2020-02-03 10:27:56 -06:00
L. Kärkkäinen
dac0514441 Code cleanup in file responses (#1769)
* Code cleanup in file responses.

* Lint
2020-01-26 22:08:34 -08:00
L. Kärkkäinen
bffdb3b5c2 More robust response datatype handling (#1674)
* HTTP1 header formatting moved to headers.format_headers and rewritten.

- New implementation is one line of code and twice as fast as the old one.
- Whole header block encoded to UTF-8 in one pass.
- No longer supports custom encode method on header values.
- Cookie objects now have __str__ in addition to encode, to work with this.

* Linter

* format_http1_response

* Replace encode_body with faster implementation based on f-string.

Benchmarks:

def encode_body(data):
    try:
        # Try to encode it regularly
        return data.encode()
    except AttributeError:
        # Convert it to a str if you can't
        return str(data).encode()

def encode_body2(data):
    return f"{data}".encode()

def encode_body3(data):
    return str(data).encode()

data_str, data_int = "foo", 123

%timeit encode_body(data_int)
928 ns ± 2.96 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

%timeit encode_body2(data_int)
280 ns ± 2.09 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

%timeit encode_body3(data_int)
387 ns ± 1.7 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

%timeit encode_body(data_str)
202 ns ± 1.9 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

%timeit encode_body2(data_str)
197 ns ± 0.507 ns per loop (mean ± std. dev. of 7 runs, 10000000 loops each)

%timeit encode_body3(data_str)
313 ns ± 1.28 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)

* Wtf linter

* Content-type fixes.

* Body encoding sanitation, first pass.
- body/data type autodetection fixed.
- do not repr(body).encode() bytes-ish values.
- support __html__ and _repr_html_ in sanic.response.html().

* <any type>-to-str response autoconversion limited to sanic.response.text() only.

* Workaround MyPy issue.

* Add an empty line to make isort happy.

* Add html test for __html__ and _repr_html_.

* Remove StreamingHTTPResponse.get_headers helper function.

* Add back HTTPResponse Keep-Alive removed by earlier merge or something.

* Revert "Remove StreamingHTTPResponse.get_headers helper function."

Tests depend on this otherwise useless function.

This reverts commit 9651e6ae01.

* Add deprecation warnings; instead of assert for wrong HTTP version, and for non-string response.text.

* Add back missing import.

* Avoid duplicate response header tweaking code.

* Linter errors
2020-01-20 10:34:32 -06:00
L. Kärkkäinen
e908ca8cef [Trio] Quick fixes to make Sanic usable on hypercorn -k trio myweb.app (#1767)
* Quick fixes to make Sanic usable on hypercorn -k trio myweb.app

* Quick'n dirty compatibility and autodetection of hypercorn trio mode.

* mypy ignore for aiofiles/trio.

* lint
2020-01-20 10:29:06 -06:00
Ashley Sommer
801595e24a Add server.start_serving and server.serve_forever to AsyncioServer proxy object, to match asyncio-python3.7 example doc, fixes #1754 (#1762) 2020-01-20 09:00:48 -06:00
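A minimal sketch of the asyncio-style serving pattern this aligns with, assuming the documented `return_asyncio_server=True` path to obtain the AsyncioServer proxy (host and port are illustrative):

    import asyncio

    from sanic import Sanic
    from sanic.response import text

    app = Sanic("asyncio_server_example")

    @app.route("/")
    async def index(request):
        return text("ok")

    async def main():
        # create_server with return_asyncio_server=True yields the AsyncioServer
        # proxy, which after #1762 also exposes start_serving() and serve_forever().
        server = await app.create_server(
            host="127.0.0.1", port=8000, return_asyncio_server=True
        )
        await server.start_serving()
        await server.serve_forever()

    asyncio.run(main())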
L. Kärkkäinen
ba9b432993 No tracebacks on normal errors and prettier error pages (#1768)
* Default error handler now only logs traceback on 500 errors and all responses are HTML formatted.

* Tests passing.

* Ability to flag any exception object with self.quiet = True following @ashleysommer suggestion.

* Refactor HTML formatting into errorpages.py. String escapes for debug tracebacks.

* Remove extra includes

* Auto-set quiet flag also when decorator is used.

* Cleanup, make error pages (probably) HTML5-compliant and similar for debug and non-debug modes.

* Fix lookup of non-existent status codes

* No logging of 503 errors after all.

* lint
2020-01-20 08:58:14 -06:00
Ashley Sommer
b565072ed9 Allow route decorators to stack up again (#1764)
* Allow route decorators to stack up without causing a function signature inspection crash
Fix #1742

* Apply fix to @websocket routes decorator too
Add test for double-stacked websocket decorator
remove introduction of new variable in route wrapper, extend routes in-place.
Add explanation of why a handler will be a tuple in the case of a double-stacked route decorator
2020-01-10 21:50:16 -08:00
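A minimal sketch of the stacking pattern that #1764 restores (paths are illustrative):

    from sanic import Sanic
    from sanic.response import text

    app = Sanic("stacked_routes_example")

    # Two route decorators on one handler: the case that previously crashed
    # during signature inspection and works again after this fix.
    @app.route("/one")
    @app.route("/two")
    async def handler(request):
        return text("reachable from both paths")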
Ashley Sommer
caa1b4d69b Fix dangling comma in arguments list for HTTPResponse in response.empty() (#1761)
* Fix dangling comma in arguments list for HTTPResponse in response.empty()

* Found another black error, including another dangling comma
2020-01-10 19:58:01 -08:00
Liran Nuna
865536c5c4 Simplify status code to text lookup (#1738) 2020-01-10 08:43:44 -06:00
Eli Uriegas
784d5cce52 Merge pull request #1755 from Lagicrus/empty-response
Update docs
2020-01-04 19:15:24 -08:00
Lagicrus
0fd08c6114 Update response.rst 2020-01-04 21:26:03 +00:00
Lagicrus
cd779b6e4f Update response.rst 2020-01-04 19:51:50 +00:00
好风
3430907046 fix 1748 : Drop DeprecationWarning in python 3.8 (#1750)
https://github.com/huge-success/sanic/issues/1748
2020-01-03 20:20:42 -08:00
Eli Uriegas
2f776eba85 Release v19.12.0 (#1747)
Release v19.12.0
2020-01-03 11:50:33 -08:00
Adam Hopkins
b9cd2ed1f1 Merge pull request #1751 from huge-success/master
Move Release into LTS Branch
2020-01-02 23:45:08 +02:00
Stephen Sadowski
3411a12c40 Pipfile crud removed 2019-12-26 18:50:52 -06:00
Stephen Sadowski
28899356c8 Bumping up version from 19.12.0 to 19.12.0 2019-12-26 18:47:56 -06:00
103 changed files with 3923 additions and 1826 deletions

View File

@@ -12,6 +12,11 @@ environment:
PYTHON_VERSION: "3.7.x" PYTHON_VERSION: "3.7.x"
PYTHON_ARCH: "64" PYTHON_ARCH: "64"
- TOXENV: py38-no-ext
PYTHON: "C:\\Python38-x64"
PYTHON_VERSION: "3.8.x"
PYTHON_ARCH: "64"
init: SET "PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%" init: SET "PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%"
install: install:

.github/ISSUE_TEMPLATE/config.yml (new file, 5 lines)
View File

@@ -0,0 +1,5 @@
blank_issues_enabled: true
contact_links:
- name: Questions and Help
url: https://community.sanicframework.org/c/questions-and-help
about: Do you need help with Sanic? Ask your questions here.

View File

@@ -1,13 +0,0 @@
---
name: Help wanted
about: Do you need help? Try community.sanicframework.org
---
*DELETE ALL BEFORE POSTING*
*Post your HELP WANTED questions on [the community forum](https://community.sanicframework.org/)*.
Checkout the community forum before posting any question here.
We prefer if you put these kinds of questions here:
https://community.sanicframework.org/c/questions-and-help

View File

@@ -21,23 +21,46 @@ matrix:
dist: xenial dist: xenial
sudo: true sudo: true
name: "Python 3.7 without Extensions" name: "Python 3.7 without Extensions"
- env: TOX_ENV=py38
python: 3.8
dist: xenial
sudo: true
name: "Python 3.8 with Extensions"
- env: TOX_ENV=py38-no-ext
python: 3.8
dist: xenial
sudo: true
name: "Python 3.8 without Extensions"
- env: TOX_ENV=type-checking - env: TOX_ENV=type-checking
python: 3.6 python: 3.6
name: "Python 3.6 Type checks" name: "Python 3.6 Type checks"
- env: TOX_ENV=type-checking - env: TOX_ENV=type-checking
python: 3.7 python: 3.7
name: "Python 3.7 Type checks" name: "Python 3.7 Type checks"
- env: TOX_ENV=type-checking
python: 3.8
name: "Python 3.8 Type checks"
- env: TOX_ENV=lint - env: TOX_ENV=lint
python: 3.6 python: 3.6
name: "Python 3.6 Linter checks" name: "Python 3.6 Linter checks"
- env: TOX_ENV=check - env: TOX_ENV=check
python: 3.6 python: 3.6
name: "Python 3.6 Package checks" name: "Python 3.6 Package checks"
- env: TOX_ENV=security
python: 3.6
dist: xenial
sudo: true
name: "Python 3.6 Bandit security scan"
- env: TOX_ENV=security - env: TOX_ENV=security
python: 3.7 python: 3.7
dist: xenial dist: xenial
sudo: true sudo: true
name: "Python 3.7 Bandit security scan" name: "Python 3.7 Bandit security scan"
- env: TOX_ENV=security
python: 3.8
dist: xenial
sudo: true
name: "Python 3.8 Bandit security scan"
- env: TOX_ENV=docs - env: TOX_ENV=docs
python: 3.7 python: 3.7
dist: xenial dist: xenial
@@ -48,14 +71,14 @@ matrix:
name: "Python nightly with Extensions" name: "Python nightly with Extensions"
- env: TOX_ENV=pyNightly-no-ext - env: TOX_ENV=pyNightly-no-ext
python: 'nightly' python: 'nightly'
name: "Python nightly Extensions" name: "Python nightly without Extensions"
allow_failures: allow_failures:
- env: TOX_ENV=pyNightly - env: TOX_ENV=pyNightly
python: 'nightly' python: 'nightly'
name: "Python nightly with Extensions" name: "Python nightly with Extensions"
- env: TOX_ENV=pyNightly-no-ext - env: TOX_ENV=pyNightly-no-ext
python: 'nightly' python: 'nightly'
name: "Python nightly Extensions" name: "Python nightly without Extensions"
install: install:
- pip install -U tox - pip install -U tox
- pip install codecov - pip install codecov

View File

@@ -1,3 +1,308 @@
Version 20.9.0
===============
Features
********
*
`#1887 <https://github.com/huge-success/sanic/pull/1887>`_
Pass subprotocols in websockets (both sanic server and ASGI)
*
`#1894 <https://github.com/huge-success/sanic/pull/1894>`_
Automatically set ``test_mode`` flag on app instance
*
`#1903 <https://github.com/huge-success/sanic/pull/1903>`_
Add new unified method for updating app values
*
`#1906 <https://github.com/huge-success/sanic/pull/1906>`_,
`#1909 <https://github.com/huge-success/sanic/pull/1909>`_
Adds WEBSOCKET_PING_TIMEOUT and WEBSOCKET_PING_INTERVAL configuration values
*
`#1935 <https://github.com/huge-success/sanic/pull/1935>`_
httpx version dependency updated; it is slated for removal as a dependency in v20.12
*
`#1937 <https://github.com/huge-success/sanic/pull/1937>`_
Added auto, text, and json fallback error handlers (in v21.3, the default will change from html to auto)
Bugfixes
********
*
`#1897 <https://github.com/huge-success/sanic/pull/1897>`_
Resolves exception from unread bytes in stream
Deprecations and Removals
*************************
*
`#1903 <https://github.com/huge-success/sanic/pull/1903>`_
config.from_envvar, config.from_pyfile, and config.from_object are deprecated and set to be removed in v21.3
Developer infrastructure
************************
*
`#1890 <https://github.com/huge-success/sanic/pull/1890>`_,
`#1891 <https://github.com/huge-success/sanic/pull/1891>`_
Update isort calls to be compatible with new API
*
`#1893 <https://github.com/huge-success/sanic/pull/1893>`_
Remove version section from setup.cfg
*
`#1924 <https://github.com/huge-success/sanic/pull/1924>`_
Adding --strict-markers for pytest
Improved Documentation
**********************
*
`#1922 <https://github.com/huge-success/sanic/pull/1922>`_
Add explicit ASGI compliance to the README
Version 20.6.3
===============
Bugfixes
********
*
`#1884 <https://github.com/huge-success/sanic/pull/1884>`_
Revert change to multiprocessing mode
Version 20.6.2
===============
Features
********
*
`#1641 <https://github.com/huge-success/sanic/pull/1641>`_
Socket binding implemented properly for IPv6 and UNIX sockets
Version 20.6.1
===============
Features
********
*
`#1760 <https://github.com/huge-success/sanic/pull/1760>`_
Add version parameter to websocket routes
*
`#1866 <https://github.com/huge-success/sanic/pull/1866>`_
Add ``sanic`` as an entry point command
*
`#1880 <https://github.com/huge-success/sanic/pull/1880>`_
Add handler names for websockets for url_for usage
Bugfixes
********
*
`#1776 <https://github.com/huge-success/sanic/pull/1776>`_
Bug fix for host parameter issue with lists
*
`#1842 <https://github.com/huge-success/sanic/pull/1842>`_
Fix static _handler pickling error
*
`#1827 <https://github.com/huge-success/sanic/pull/1827>`_
Fix reloader on OSX py38 and Windows
*
`#1848 <https://github.com/huge-success/sanic/pull/1848>`_
Reverse named_response_middleware execution order, to match normal response middleware execution order
*
`#1853 <https://github.com/huge-success/sanic/pull/1853>`_
Fix pickle error when attempting to pickle an application which contains websocket routes
Deprecations and Removals
*************************
*
`#1739 <https://github.com/huge-success/sanic/pull/1739>`_
Deprecate body_bytes to merge into body
Developer infrastructure
************************
*
`#1852 <https://github.com/huge-success/sanic/pull/1852>`_
Fix naming of CI test env on Python nightlies
*
`#1857 <https://github.com/huge-success/sanic/pull/1857>`_
Adjust websockets version to setup.py
*
`#1869 <https://github.com/huge-success/sanic/pull/1869>`_
Wrap run()'s "protocol" type annotation in Optional[]
Improved Documentation
**********************
*
`#1846 <https://github.com/huge-success/sanic/pull/1846>`_
Update docs to clarify response middleware execution order
*
`#1865 <https://github.com/huge-success/sanic/pull/1865>`_
Fixing rst format issue that was hiding documentation
Version 20.6.0
===============
*Released, but unintentionally omitting PR #1880, so it was replaced by 20.6.1*
Version 20.3.0
===============
Features
********
*
`#1762 <https://github.com/huge-success/sanic/pull/1762>`_
Add ``srv.start_serving()`` and ``srv.serve_forever()`` to ``AsyncioServer``
*
`#1767 <https://github.com/huge-success/sanic/pull/1767>`_
Make Sanic usable on ``hypercorn -k trio myweb.app``
*
`#1768 <https://github.com/huge-success/sanic/pull/1768>`_
No tracebacks on normal errors and prettier error pages
*
`#1769 <https://github.com/huge-success/sanic/pull/1769>`_
Code cleanup in file responses
*
`#1793 <https://github.com/huge-success/sanic/pull/1793>`_ and
`#1819 <https://github.com/huge-success/sanic/pull/1819>`_
Upgrade ``str.format()`` to f-strings
*
`#1798 <https://github.com/huge-success/sanic/pull/1798>`_
Allow multiple workers on MacOS with Python 3.8
*
`#1820 <https://github.com/huge-success/sanic/pull/1820>`_
Do not set content-type and content-length headers in exceptions
Bugfixes
********
*
`#1748 <https://github.com/huge-success/sanic/pull/1748>`_
Remove loop argument in ``asyncio.Event`` in Python 3.8
*
`#1764 <https://github.com/huge-success/sanic/pull/1764>`_
Allow route decorators to stack up again
*
`#1789 <https://github.com/huge-success/sanic/pull/1789>`_
Fix tests using hosts yielding incorrect ``url_for``
*
`#1808 <https://github.com/huge-success/sanic/pull/1808>`_
Fix Ctrl+C and tests on Windows
Deprecations and Removals
*************************
*
`#1800 <https://github.com/huge-success/sanic/pull/1800>`_
Begin deprecation in way of first-class streaming, removal of ``body_init``, ``body_push``, and ``body_finish``
*
`#1801 <https://github.com/huge-success/sanic/pull/1801>`_
Complete deprecation from `#1666 <https://github.com/huge-success/sanic/pull/1666>`_ of dictionary context on ``request`` objects.
*
`#1807 <https://github.com/huge-success/sanic/pull/1807>`_
Remove server config args that can be read directly from app
*
`#1818 <https://github.com/huge-success/sanic/pull/1818>`_
Complete deprecation of ``app.remove_route`` and ``request.raw_args``
Dependencies
************
*
`#1794 <https://github.com/huge-success/sanic/pull/1794>`_
Bump ``httpx`` to 0.11.1
*
`#1806 <https://github.com/huge-success/sanic/pull/1806>`_
Import ``ASGIDispatch`` from top-level ``httpx`` (due to a third-party deprecation)
Developer infrastructure
************************
*
`#1833 <https://github.com/huge-success/sanic/pull/1833>`_
Resolve broken documentation builds
Improved Documentation
**********************
*
`#1755 <https://github.com/huge-success/sanic/pull/1755>`_
Usage of ``response.empty()``
*
`#1778 <https://github.com/huge-success/sanic/pull/1778>`_
Update README
*
`#1783 <https://github.com/huge-success/sanic/pull/1783>`_
Fix typo
*
`#1784 <https://github.com/huge-success/sanic/pull/1784>`_
Corrected changelog for docs move of MD to RST (`#1691 <https://github.com/huge-success/sanic/pull/1691>`_)
*
`#1803 <https://github.com/huge-success/sanic/pull/1803>`_
Update config docs to match DEFAULT_CONFIG
*
`#1814 <https://github.com/huge-success/sanic/pull/1814>`_
Update getting_started.rst
*
`#1821 <https://github.com/huge-success/sanic/pull/1821>`_
Update to deployment
*
`#1822 <https://github.com/huge-success/sanic/pull/1822>`_
Update docs with changes done in 20.3
*
`#1834 <https://github.com/huge-success/sanic/pull/1834>`_
Order of listeners
Version 19.12.0 Version 19.12.0
=============== ===============
@@ -24,7 +329,7 @@ Bugfixes
Improved Documentation Improved Documentation
********************** **********************
- Move docs from RST to MD - Move docs from MD to RST
Moved all docs from markdown to restructured text like the rest of the docs to unify the scheme and make it easier in Moved all docs from markdown to restructured text like the rest of the docs to unify the scheme and make it easier in
the future to update documentation. (`#1691 <https://github.com/huge-success/sanic/issues/1691>`__) the future to update documentation. (`#1691 <https://github.com/huge-success/sanic/issues/1691>`__)
@@ -64,7 +369,7 @@ Features
* *
`#1562 <https://github.com/huge-success/sanic/pull/1562>`_ `#1562 <https://github.com/huge-success/sanic/pull/1562>`_
Remove ``aiohttp`` dependencey and create new ``SanicTestClient`` based upon Remove ``aiohttp`` dependency and create new ``SanicTestClient`` based upon
`requests-async <https://github.com/encode/requests-async>`_ `requests-async <https://github.com/encode/requests-async>`_
* *

View File

@@ -71,7 +71,7 @@ black:
black --config ./.black.toml sanic tests black --config ./.black.toml sanic tests
fix-import: black fix-import: black
isort -rc sanic tests isort sanic tests
docs-clean: docs-clean:

View File

@@ -58,6 +58,8 @@ Sanic | Build fast. Run fast.
Sanic is a **Python 3.6+** web server and web framework that's written to go fast. It allows the usage of the ``async/await`` syntax added in Python 3.5, which makes your code non-blocking and speedy. Sanic is a **Python 3.6+** web server and web framework that's written to go fast. It allows the usage of the ``async/await`` syntax added in Python 3.5, which makes your code non-blocking and speedy.
Sanic is also ASGI compliant, so you can deploy it with an `alternative ASGI webserver <https://sanic.readthedocs.io/en/latest/sanic/deploying.html#running-via-asgi>`_.
`Source code on GitHub <https://github.com/huge-success/sanic/>`_ | `Help and discussion board <https://community.sanicframework.org/>`_. `Source code on GitHub <https://github.com/huge-success/sanic/>`_ | `Help and discussion board <https://community.sanicframework.org/>`_.
The project is maintained by the community, for the community. **Contributions are welcome!** The project is maintained by the community, for the community. **Contributions are welcome!**
@@ -83,6 +85,9 @@ Installation
If you are running on a clean install of Fedora 28 or above, please make sure you have the ``redhat-rpm-config`` package installed in case if you want to If you are running on a clean install of Fedora 28 or above, please make sure you have the ``redhat-rpm-config`` package installed in case if you want to
use ``sanic`` with ``ujson`` dependency. use ``sanic`` with ``ujson`` dependency.
.. note::
Windows support is currently "experimental" and on a best-effort basis. Multiple workers are also not currently supported on Windows (see `Issue #1517 <https://github.com/huge-success/sanic/issues/1517>`_), but setting ``workers=1`` should launch the server successfully.
Hello World Example Hello World Example
------------------- -------------------
@@ -101,7 +106,7 @@ Hello World Example
if __name__ == '__main__': if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000) app.run(host='0.0.0.0', port=8000)
Sanic can now be easily run using ``python3 hello.py``. Sanic can now be easily run using ``sanic hello.app``.
.. code:: .. code::

View File

@@ -4,19 +4,27 @@
Sanic releases long term support release once a year in December. LTS releases receive bug and security updates for **24 months**. Interim releases throughout the year occur every three months, and are supported until the subsequent interim release. Sanic releases long term support release once a year in December. LTS releases receive bug and security updates for **24 months**. Interim releases throughout the year occur every three months, and are supported until the subsequent interim release.
| Version | LTS | Supported | | Version | LTS | Supported |
| ------- | ------------------ | ------------------ | | ------- | ------------- | ------------------ |
| 19.6.0 | | :white_check_mark: | | 20.9 | | :heavy_check_mark: |
| 19.3.1 | | :heavy_check_mark: | | 20.6 | | :x: |
| 18.12.0 | :heavy_check_mark: | :heavy_check_mark: | | 20.3 | | :x: |
| 0.8.3 | | :x: | | 19.12 | until 2021-12 | :white_check_mark: |
| 0.7.0 | | :x: | | 19.9 | | :x: |
| 0.6.0 | | :x: | | 19.6 | | :x: |
| 0.5.4 | | :x: | | 19.3 | | :x: |
| 0.4.1 | | :x: | | 18.12 | until 2020-12 | :white_check_mark: |
| 0.3.1 | | :x: | | 0.8.3 | | :x: |
| 0.2.0 | | :x: | | 0.7.0 | | :x: |
| 0.1.9 | | :x: | | 0.6.0 | | :x: |
| 0.5.4 | | :x: |
| 0.4.1 | | :x: |
| 0.3.1 | | :x: |
| 0.2.0 | | :x: |
| 0.1.9 | | :x: |
:white_check_mark: = security/bug fixes
:heavy_check_mark: = full support
## Reporting a Vulnerability ## Reporting a Vulnerability

View File

@@ -0,0 +1 @@
Remove [version] section.

View File

@@ -0,0 +1,3 @@
Adds WEBSOCKET_PING_TIMEOUT and WEBSOCKET_PING_INTERVAL configuration values
Allows setting the ping_interval and ping_timeout arguments when initializing `WebSocketCommonProtocol`.

View File

@@ -28,6 +28,7 @@ Guides
sanic/debug_mode sanic/debug_mode
sanic/testing sanic/testing
sanic/deploying sanic/deploying
sanic/nginx
sanic/extensions sanic/extensions
sanic/examples sanic/examples
sanic/changelog sanic/changelog

View File

@@ -28,14 +28,15 @@ using all these methods would look like the following.
from sanic.views import HTTPMethodView from sanic.views import HTTPMethodView
from sanic.response import text from sanic.response import text
app = Sanic('some_name') app = Sanic("class_views_example")
class SimpleView(HTTPMethodView): class SimpleView(HTTPMethodView):
def get(self, request): def get(self, request):
return text('I am get method') return text('I am get method')
def post(self, request): # You can also use async syntax
async def post(self, request):
return text('I am post method') return text('I am post method')
def put(self, request): def put(self, request):
@@ -49,22 +50,6 @@ using all these methods would look like the following.
app.add_route(SimpleView.as_view(), '/') app.add_route(SimpleView.as_view(), '/')
You can also use `async` syntax.
.. code-block:: python
from sanic import Sanic
from sanic.views import HTTPMethodView
from sanic.response import text
app = Sanic('some_name')
class SimpleAsyncView(HTTPMethodView):
async def get(self, request):
return text('I am async get method')
app.add_route(SimpleAsyncView.as_view(), '/')
URL parameters URL parameters
-------------- --------------
@@ -154,7 +139,7 @@ lambda:
from sanic.views import CompositionView from sanic.views import CompositionView
from sanic.response import text from sanic.response import text
app = Sanic(__name__) app = Sanic("composition_example")
def get_handler(request): def get_handler(request):
return text('I am a get method') return text('I am a get method')

View File

@@ -12,9 +12,9 @@ Sanic holds the configuration in the `config` attribute of the application objec
app = Sanic('myapp') app = Sanic('myapp')
app.config.DB_NAME = 'appdb' app.config.DB_NAME = 'appdb'
app.config.DB_USER = 'appuser' app.config['DB_USER'] = 'appuser'
Since the config object actually is a dictionary, you can use its `update` method in order to set several values at once: Since the config object has a type that inherits from dictionary, you can use its ``update`` method in order to set several values at once:
.. code-block:: python .. code-block:: python
@@ -39,17 +39,98 @@ Any variables defined with the `SANIC_` prefix will be applied to the sanic conf
.. code-block:: python .. code-block:: python
app = Sanic(load_env='MYAPP_') app = Sanic(__name__, load_env='MYAPP_')
Then the above variable would be `MYAPP_REQUEST_TIMEOUT`. If you want to disable loading from environment variables you can set it to `False` instead: Then the above variable would be `MYAPP_REQUEST_TIMEOUT`. If you want to disable loading from environment variables you can set it to `False` instead:
.. code-block:: python .. code-block:: python
app = Sanic(load_env=False) app = Sanic(__name__, load_env=False)
From file, dict, or any object (having __dict__ attribute).
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
You can store app configurations in: (1) a Python file, (2) a dictionary, or (3) in some other type of custom object.
In order to load configuration from one of those, you can use ``app.update_config()``.
**1) From file**
Let's say you have ``my_config.py`` file that looks like this:
.. code-block:: python
# my_config.py
A = 1
B = 2
Loading config from this file is as easy as:
.. code-block:: python
app.update_config("/path/to/my_config.py")
You can also use environment variables in the path name here.
Let's say you have an environment variable like this:
.. code-block:: shell
$ export my_path="/path/to"
Then you can use it like this:
.. code-block:: python
app.update_config("${my_path}/my_config.py")
.. note::
Just remember that you have to provide environment variables in the format ${environment_variable} and that $environment_variable is not expanded (is treated as "plain" text).
**2) From dict**
You can also set your app config by providing a ``dict``:
.. code-block:: python
d = {"A": 1, "B": 2}
app.update_config(d)
**3) From _any_ object**
App config can be taken from an object. Internally, it uses ``__dict__`` to retrieve keys and values.
For example, pass the class:
.. code-block:: python
class C:
A = 1
B = 2
app.update_config(C)
or, it can be instantiated:
.. code-block:: python
c = C()
app.update_config(c)
- From an object (having __dict__ attribute)
From an Object From an Object
~~~~~~~~~~~~~~ ~~~~~~~~~~~~~~
.. note::
Deprecated, will be removed in version 21.3.
If there are a lot of configuration values and they have sensible defaults it might be helpful to put them into a module: If there are a lot of configuration values and they have sensible defaults it might be helpful to put them into a module:
.. code-block:: python .. code-block:: python
@@ -71,6 +152,10 @@ You could use a class or any other object as well.
From a File From a File
~~~~~~~~~~~ ~~~~~~~~~~~
.. note::
Deprecated, will be removed in version 21.3.
Usually you will want to load configuration from a file that is not part of the distributed application. You can load configuration from a file using `from_pyfile(/path/to/config_file)`. However, that requires the program to know the path to the config file. So instead you can specify the location of the config file in an environment variable and tell Sanic to use that to find the config file: Usually you will want to load configuration from a file that is not part of the distributed application. You can load configuration from a file using `from_pyfile(/path/to/config_file)`. However, that requires the program to know the path to the config file. So instead you can specify the location of the config file in an environment variable and tell Sanic to use that to find the config file:
.. code-block:: python .. code-block:: python
@@ -98,7 +183,7 @@ The config files are regular Python files which are executed in order to load th
Builtin Configuration Values Builtin Configuration Values
---------------------------- ----------------------------
Out of the box there are just a few predefined values which can be overwritten when creating the application. Out of the box there are just a few predefined values which can be overwritten when creating the application. Note that websocket configuration values will have no impact if running in ASGI mode.
+---------------------------+-------------------+-----------------------------------------------------------------------------+ +---------------------------+-------------------+-----------------------------------------------------------------------------+
| Variable | Default | Description | | Variable | Default | Description |
@@ -115,15 +200,29 @@ Out of the box there are just a few predefined values which can be overwritten w
+---------------------------+-------------------+-----------------------------------------------------------------------------+ +---------------------------+-------------------+-----------------------------------------------------------------------------+
| KEEP_ALIVE_TIMEOUT | 5 | How long to hold a TCP connection open (sec) | | KEEP_ALIVE_TIMEOUT | 5 | How long to hold a TCP connection open (sec) |
+---------------------------+-------------------+-----------------------------------------------------------------------------+ +---------------------------+-------------------+-----------------------------------------------------------------------------+
| WEBSOCKET_MAX_SIZE | 2^20 | Maximum size for incoming messages (bytes) |
+---------------------------+-------------------+-----------------------------------------------------------------------------+
| WEBSOCKET_MAX_QUEUE | 32 | Maximum length of the queue that holds incoming messages |
+---------------------------+-------------------+-----------------------------------------------------------------------------+
| WEBSOCKET_READ_LIMIT | 2^16 | High-water limit of the buffer for incoming bytes |
+---------------------------+-------------------+-----------------------------------------------------------------------------+
| WEBSOCKET_WRITE_LIMIT | 2^16 | High-water limit of the buffer for outgoing bytes |
+---------------------------+-------------------+-----------------------------------------------------------------------------+
| WEBSOCKET_PING_INTERVAL | 20 | A Ping frame is sent every ping_interval seconds. |
+---------------------------+-------------------+-----------------------------------------------------------------------------+
| WEBSOCKET_PING_TIMEOUT | 20 | Connection is closed when Pong is not received after ping_timeout seconds |
+---------------------------+-------------------+-----------------------------------------------------------------------------+
| GRACEFUL_SHUTDOWN_TIMEOUT | 15.0 | How long to wait to force close non-idle connection (sec) | | GRACEFUL_SHUTDOWN_TIMEOUT | 15.0 | How long to wait to force close non-idle connection (sec) |
+---------------------------+-------------------+-----------------------------------------------------------------------------+ +---------------------------+-------------------+-----------------------------------------------------------------------------+
| ACCESS_LOG | True | Disable or enable access log | | ACCESS_LOG | True | Disable or enable access log |
+---------------------------+-------------------+-----------------------------------------------------------------------------+ +---------------------------+-------------------+-----------------------------------------------------------------------------+
| PROXIES_COUNT | -1 | The number of proxy servers in front of the app (e.g. nginx; see below) | | FORWARDED_SECRET | None | Used to securely identify a specific proxy server (see below) |
+---------------------------+-------------------+-----------------------------------------------------------------------------+
| PROXIES_COUNT | None | The number of proxy servers in front of the app (e.g. nginx; see below) |
+---------------------------+-------------------+-----------------------------------------------------------------------------+ +---------------------------+-------------------+-----------------------------------------------------------------------------+
| FORWARDED_FOR_HEADER | "X-Forwarded-For" | The name of "X-Forwarded-For" HTTP header that contains client and proxy ip | | FORWARDED_FOR_HEADER | "X-Forwarded-For" | The name of "X-Forwarded-For" HTTP header that contains client and proxy ip |
+---------------------------+-------------------+-----------------------------------------------------------------------------+ +---------------------------+-------------------+-----------------------------------------------------------------------------+
| REAL_IP_HEADER | "X-Real-IP" | The name of "X-Real-IP" HTTP header that contains real client ip | | REAL_IP_HEADER | None | The name of "X-Real-IP" HTTP header that contains real client ip |
+---------------------------+-------------------+-----------------------------------------------------------------------------+ +---------------------------+-------------------+-----------------------------------------------------------------------------+
The different Timeout variables: The different Timeout variables:
@@ -228,9 +327,7 @@ Proxy config if using ...
* a proxy that supports `forwarded`: set `FORWARDED_SECRET` to the value that the proxy inserts in the header * a proxy that supports `forwarded`: set `FORWARDED_SECRET` to the value that the proxy inserts in the header
* Apache Traffic Server: `CONFIG proxy.config.http.insert_forwarded STRING for|proto|host|by=_secret` * Apache Traffic Server: `CONFIG proxy.config.http.insert_forwarded STRING for|proto|host|by=_secret`
* NGHTTPX: `nghttpx --add-forwarded=for,proto,host,by --forwarded-for=ip --forwarded-by=_secret` * NGHTTPX: `nghttpx --add-forwarded=for,proto,host,by --forwarded-for=ip --forwarded-by=_secret`
* NGINX: after `the official instructions <https://www.nginx.com/resources/wiki/start/topics/examples/forwarded/>`_, add anywhere in your config: * NGINX: :ref:`nginx`.
.. proxy_set_header Forwarded "$proxy_add_forwarded;by=\"_$server_name\";proto=$scheme;host=\"$http_host\";path=\"$request_uri\";secret=_secret";
* a custom header with client IP: set `REAL_IP_HEADER` to the name of that header * a custom header with client IP: set `REAL_IP_HEADER` to the name of that header
* `x-forwarded-for`: set `PROXIES_COUNT` to `1` for a single proxy, or a greater number to allow Sanic to select the correct IP * `x-forwarded-for`: set `PROXIES_COUNT` to `1` for a single proxy, or a greater number to allow Sanic to select the correct IP

View File

@@ -21,7 +21,7 @@ and the Automatic Reloader will be activated.
from sanic import Sanic from sanic import Sanic
from sanic.response import json from sanic.response import json
app = Sanic() app = Sanic(__name__)
@app.route('/') @app.route('/')
async def hello_world(request): async def hello_world(request):
@@ -43,7 +43,7 @@ the ``auto_reload`` argument will activate or deactivate the Automatic Reloader.
from sanic import Sanic from sanic import Sanic
from sanic.response import json from sanic.response import json
app = Sanic() app = Sanic(__name__)
@app.route('/') @app.route('/')
async def hello_world(request): async def hello_world(request):

View File

@@ -1,9 +1,12 @@
Deploying Deploying
========= =========
Deploying Sanic is very simple using one of three options: the inbuilt webserver, Sanic has three serving options: the inbuilt webserver,
an `ASGI webserver <https://asgi.readthedocs.io/en/latest/implementations.html>`_, or `gunicorn`. an `ASGI webserver <https://asgi.readthedocs.io/en/latest/implementations.html>`_, or `gunicorn`.
It is also very common to place Sanic behind a reverse proxy, like `nginx`.
Sanic's own webserver is the fastest option, and it can be securely run on
the Internet. Still, it is also very common to place Sanic behind a reverse
proxy, as shown in :ref:`nginx`.
Running via Sanic webserver Running via Sanic webserver
--------------------------- ---------------------------
@@ -13,6 +16,7 @@ keyword arguments:
- `host` *(default `"127.0.0.1"`)*: Address to host the server on. - `host` *(default `"127.0.0.1"`)*: Address to host the server on.
- `port` *(default `8000`)*: Port to host the server on. - `port` *(default `8000`)*: Port to host the server on.
- `unix` *(default `None`)*: Unix socket name to host the server on (instead of TCP).
- `debug` *(default `False`)*: Enables debug output (slows server). - `debug` *(default `False`)*: Enables debug output (slows server).
- `ssl` *(default `None`)*: `SSLContext` for SSL encryption of worker(s). - `ssl` *(default `None`)*: `SSLContext` for SSL encryption of worker(s).
- `sock` *(default `None`)*: Socket for the server to accept connections from. - `sock` *(default `None`)*: Socket for the server to accept connections from.
@@ -47,7 +51,15 @@ If you like using command line arguments, you can launch a Sanic webserver by
executing the module. For example, if you initialized Sanic as `app` in a file executing the module. For example, if you initialized Sanic as `app` in a file
named `server.py`, you could run the server like so: named `server.py`, you could run the server like so:
.. python -m sanic server.app --host=0.0.0.0 --port=1337 --workers=4 ::
sanic server.app --host=0.0.0.0 --port=1337 --workers=4
It can also be called directly as a module.
::
python -m sanic server.app --host=0.0.0.0 --port=1337 --workers=4
With this way of running sanic, it is not necessary to invoke `app.run` in your With this way of running sanic, it is not necessary to invoke `app.run` in your
Python file. If you do, make sure you wrap it so that it only executes when Python file. If you do, make sure you wrap it so that it only executes when
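For reference, a minimal sketch of such a `server.py` (the route is illustrative), with `app.run` wrapped in the usual `__main__` guard so it only runs when the file is executed directly:
.. code-block:: python
from sanic import Sanic
from sanic.response import json
app = Sanic("server")
@app.route("/")
async def index(request):
    return json({"hello": "world"})
if __name__ == "__main__":
    # Not reached when the app is launched via `sanic server.app ...`
    app.run(host="0.0.0.0", port=1337, workers=4)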
@@ -85,7 +97,11 @@ before shutdown, and after shutdown. Therefore, in ASGI mode, the startup and sh
run consecutively and not actually around the server process beginning and ending (since that run consecutively and not actually around the server process beginning and ending (since that
is now controlled by the ASGI server). Therefore, it is best to use `after_server_start` and is now controlled by the ASGI server). Therefore, it is best to use `after_server_start` and
`before_server_stop`. `before_server_stop`.
3. ASGI mode is still in "beta" as of Sanic v19.6.
Sanic has experimental support for running on `Trio <https://trio.readthedocs.io/en/stable/>`_ with::
hypercorn -k trio myapp:app
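Whichever ASGI server is used, the listener advice above can be followed as in this minimal sketch (the handlers are illustrative):
.. code-block:: python
from sanic import Sanic
from sanic.response import text
app = Sanic("asgi_listener_example")
@app.listener("after_server_start")
async def notify_started(app, loop):
    # In ASGI mode this still fires once the ASGI server has started the app
    print("server started")
@app.listener("before_server_stop")
async def notify_stopping(app, loop):
    print("server stopping")
@app.route("/")
async def index(request):
    return text("ok")
The app can then be served by any ASGI server, for example ``uvicorn myapp:app`` or the hypercorn command above.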
Running via Gunicorn Running via Gunicorn
-------------------- --------------------
@@ -110,28 +126,6 @@ See the `Gunicorn Docs <http://docs.gunicorn.org/en/latest/settings.html#max-req
Other deployment considerations Other deployment considerations
------------------------------- -------------------------------
Running behind a reverse proxy
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Sanic can be used with a reverse proxy (e.g. nginx). There's a simple example of nginx configuration:
::
server {
listen 80;
server_name example.org;
location / {
proxy_pass http://127.0.0.1:8000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
}
If you want to get real client ip, you should configure `X-Real-IP` and `X-Forwarded-For` HTTP headers and set `app.config.PROXIES_COUNT` to `1`; see the configuration page for more information.
Disable debug logging for performance Disable debug logging for performance
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

View File

@@ -25,7 +25,7 @@ A simple sanic application with a single ``async`` method with ``text`` and ``js
Simple App with ``Sanic Views`` Simple App with ``Sanic Views``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Showcasing the simple mechanism of using :class:`sanic.viewes.HTTPMethodView` as well as a way to extend the same Showcasing the simple mechanism of using :class:`sanic.views.HTTPMethodView` as well as a way to extend the same
into providing a custom ``async`` behavior for ``view``. into providing a custom ``async`` behavior for ``view``.
.. literalinclude:: ../../examples/simple_async_view.py .. literalinclude:: ../../examples/simple_async_view.py

View File

@@ -59,7 +59,7 @@ You can also add an exception handler as such:
async def server_error_handler(request, exception): async def server_error_handler(request, exception):
return text("Oops, server error", status=500) return text("Oops, server error", status=500)
app = Sanic() app = Sanic("error_handler_example")
app.error_handler.add(Exception, server_error_handler) app.error_handler.add(Exception, server_error_handler)
In some cases, you might want to add some more error handling In some cases, you might want to add some more error handling
@@ -77,7 +77,7 @@ can subclass Sanic's default error handler as such:
# You custom error handling logic... # You custom error handling logic...
return super().default(request, exception) return super().default(request, exception)
app = Sanic() app = Sanic("custom_error_handler_example")
app.error_handler = CustomErrorHandler() app.error_handler = CustomErrorHandler()
Useful exceptions Useful exceptions

View File

@@ -8,19 +8,19 @@ syntax, so earlier versions of python won't work.
1. Install Sanic 1. Install Sanic
---------------- ----------------
> If you are running on a clean install of Fedora 28 or above, please make sure you have the ``redhat-rpm-config`` package installed in case if you want to use ``sanic`` with ``ujson`` dependency. If you are running on a clean install of Fedora 28 or above, please make sure you have the ``redhat-rpm-config`` package installed in case if you want to use ``sanic`` with ``ujson`` dependency.
.. code-block:: bash .. code-block:: bash
pip3 install sanic pip3 install sanic
To install sanic without `uvloop` or `ujson` using bash, you can provide either or both of these environmental variables To install sanic without `uvloop` or `ujson` using bash, you can provide either or both of these environmental variables
using any truthy string like `'y', 'yes', 't', 'true', 'on', '1'` and setting the `SANIC_NO_X` (`X` = `UVLOOP`/`UJSON`) using any truthy string like `'y', 'yes', 't', 'true', 'on', '1'` and setting the `SANIC_NO_X` (with `X` = `UVLOOP`/`UJSON`)
to true will stop that feature's installation. to true will stop that feature's installation.
.. code-block:: bash .. code-block:: bash
SANIC_NO_UVLOOP=true SANIC_NO_UJSON=true pip3 install sanic SANIC_NO_UVLOOP=true SANIC_NO_UJSON=true pip3 install --no-binary :all: sanic
You can also install Sanic from `conda-forge <https://anaconda.org/conda-forge/sanic>`_ You can also install Sanic from `conda-forge <https://anaconda.org/conda-forge/sanic>`_
@@ -37,7 +37,7 @@ You can also install Sanic from `conda-forge <https://anaconda.org/conda-forge/s
from sanic import Sanic from sanic import Sanic
from sanic.response import json from sanic.response import json
app = Sanic() app = Sanic("hello_example")
@app.route("/") @app.route("/")
async def test(request): async def test(request):

View File

@@ -15,7 +15,7 @@ Sanic aspires to be simple
from sanic import Sanic from sanic import Sanic
from sanic.response import json from sanic.response import json
app = Sanic() app = Sanic("App Name")
@app.route("/") @app.route("/")
async def test(request): async def test(request):

View File

@@ -17,7 +17,7 @@ A simple example using default settings would be like this:
from sanic.log import logger from sanic.log import logger
from sanic.response import text from sanic.response import text
app = Sanic('test') app = Sanic('logging_example')
@app.route('/') @app.route('/')
async def test(request): async def test(request):
@@ -47,7 +47,7 @@ initialize ``Sanic`` app:
.. code:: python .. code:: python
app = Sanic('test', log_config=LOGGING_CONFIG) app = Sanic('logging_example', log_config=LOGGING_CONFIG)
And to close logging, simply assign access_log=False: And to close logging, simply assign access_log=False:
@@ -100,4 +100,4 @@ Log Context Parameter Parameter Value Datatype
The default access log format is ``%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: %(request)s %(message)s %(status)d %(byte)d`` The default access log format is ``%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: %(request)s %(message)s %(status)d %(byte)d``
.. _python3 logging API: https://docs.python.org/3/howto/logging.html .. _python3 logging API: https://docs.python.org/3/howto/logging.html

View File

@@ -14,8 +14,8 @@ There are two types of middleware: request and response. Both are declared
using the `@app.middleware` decorator, with the decorator's parameter being a using the `@app.middleware` decorator, with the decorator's parameter being a
string representing its type: `'request'` or `'response'`. string representing its type: `'request'` or `'response'`.
* Request middleware receives only the `request` as argument. * Request middleware receives only the `request` as an argument and is executed in the order it was added.
* Response middleware receives both the `request` and `response`. * Response middleware receives both the `request` and `response` and is executed in *reverse* order.
The simplest middleware doesn't modify the request or response at all: The simplest middleware doesn't modify the request or response at all:
@@ -64,12 +64,12 @@ this.
app.run(host="0.0.0.0", port=8000) app.run(host="0.0.0.0", port=8000)
The three middlewares are executed in order: The three middlewares are executed in the following order:
1. The first request middleware **add_key** adds a new key `foo` into request context. 1. The first request middleware **add_key** adds a new key `foo` into request context.
2. Request is routed to handler **index**, which gets the key from context and returns a text response. 2. Request is routed to handler **index**, which gets the key from context and returns a text response.
3. The first response middleware **custom_banner** changes the HTTP response header *Server* to say *Fake-Server* 3. The second response middleware **prevent_xss** adds the HTTP header for preventing Cross-Site-Scripting (XSS) attacks.
4. The second response middleware **prevent_xss** adds the HTTP header for preventing Cross-Site-Scripting (XSS) attacks. 4. The first response middleware **custom_banner** changes the HTTP response header *Server* to say *Fake-Server*
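A minimal sketch consistent with that ordering (the middleware bodies here are illustrative stand-ins for the full example, which is elided on this page):
.. code-block:: python
from sanic import Sanic
from sanic.response import text
app = Sanic("middleware_order_example")
@app.middleware("request")
async def add_key(request):
    # Request middleware run in the order they were added
    request.ctx.foo = "bar"
@app.middleware("response")
async def custom_banner(request, response):
    # Added first, so it runs *last* among response middleware
    response.headers["Server"] = "Fake-Server"
@app.middleware("response")
async def prevent_xss(request, response):
    # Added second, so it runs before custom_banner
    response.headers["x-xss-protection"] = "1; mode=block"
@app.route("/")
async def index(request):
    return text(request.ctx.foo)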
Responding early Responding early
---------------- ----------------
@@ -132,13 +132,24 @@ For example:
async def close_db(app, loop): async def close_db(app, loop):
await app.db.close() await app.db.close()
Note:
The listeners are deconstructed in the reverse order of being constructed.
For example:
If the first `before_server_start` listener sets up a database connection,
listeners registered after it can rely on that connection being alive both when they are
started and stopped, because stopping is done in reverse order and the database connection
is torn down last.
It's also possible to register a listener using the `register_listener` method. It's also possible to register a listener using the `register_listener` method.
This may be useful if you define your listeners in another module besides This may be useful if you define your listeners in another module besides
the one you instantiate your app in. the one you instantiate your app in.
.. code-block:: python .. code-block:: python
app = Sanic() app = Sanic(__name__)
async def setup_db(app, loop): async def setup_db(app, loop):
app.db = await db_setup() app.db = await db_setup()
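The snippet is truncated on this page; a self-contained sketch of how it presumably continues, with `db_setup` stubbed out for illustration:
.. code-block:: python
from sanic import Sanic
app = Sanic(__name__)
async def db_setup():
    return object()  # stand-in for real database setup
async def setup_db(app, loop):
    app.db = await db_setup()
app.register_listener(setup_db, "before_server_start")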

docs/sanic/nginx.rst Normal file
View File

@@ -0,0 +1,222 @@
.. _nginx:
Nginx Deployment
================
Introduction
~~~~~~~~~~~~
Although Sanic can be run directly on the Internet, it may be useful to use a proxy
server such as Nginx in front of it. This is particularly useful for running
multiple virtual hosts on the same IP, serving NodeJS or other services alongside
a single Sanic app, and it also allows for efficient serving of static files.
SSL and HTTP/2 are also easily implemented on such a proxy.
We configure the Sanic app to serve only locally at `127.0.0.1:8000`, while the
Nginx installation is responsible for providing the service to the public Internet
on the domain `example.com`. Static files will be served from `/var/www/`.
Proxied Sanic app
~~~~~~~~~~~~~~~~~
The app needs to be set up with a secret key used to identify a trusted proxy,
so that the real client IP and other information can be identified. This protects
against anyone on the Internet sending fake headers to spoof their IP address
and other details. Choose any random string and configure it both in the app
and in the Nginx config.
.. code-block:: python
from sanic import Sanic
from sanic.response import text
app = Sanic("proxied_example")
app.config.FORWARDED_SECRET = "YOUR SECRET"
@app.get("/")
def index(request):
# This should display external (public) addresses:
return text(
f"{request.remote_addr} connected to {request.url_for('index')}\n"
f"Forwarded: {request.forwarded}\n"
)
if __name__ == '__main__':
app.run(host='127.0.0.1', port=8000, workers=8, access_log=False)
Since this is going to be a system service, save your code to
`/srv/sanicexample/sanicexample.py`.
For testing, run your app in a terminal.
Nginx configuration
~~~~~~~~~~~~~~~~~~~
Quite a lot of configuration is required to allow fast, transparent proxying, but
for the most part these settings don't need to be modified, so bear with me.
Upstream servers need to be configured in a separate `upstream` block to enable
HTTP keep-alive, which can drastically improve performance, so we use this
instead of directly providing an upstream address in the `proxy_pass` directive. In
this example, the upstream section is named by `server_name`, i.e. the public
domain name, which then also gets passed to Sanic in the `Host` header. You may
change the naming as you see fit. Multiple servers may also be provided for
load balancing and failover.
Change the two occurrences of `example.com` to your true domain name, and
instead of `YOUR SECRET` use the secret you chose for your app.
::
upstream example.com {
keepalive 100;
server 127.0.0.1:8000;
#server unix:/tmp/sanic.sock;
}
server {
server_name example.com;
listen 443 ssl http2 default_server;
listen [::]:443 ssl http2 default_server;
# Serve static files if found, otherwise proxy to Sanic
location / {
root /var/www;
try_files $uri @sanic;
}
location @sanic {
proxy_pass http://$server_name;
# Allow fast streaming HTTP/1.1 pipes (keep-alive, unbuffered)
proxy_http_version 1.1;
proxy_request_buffering off;
proxy_buffering off;
# Proxy forwarding (password configured in app.config.FORWARDED_SECRET)
proxy_set_header forwarded "$proxy_forwarded;secret=\"YOUR SECRET\"";
# Allow websockets
proxy_set_header connection "upgrade";
proxy_set_header upgrade $http_upgrade;
}
}
To avoid cookie visibility issues and inconsistent addresses on search engines,
it is a good idea to redirect all visitors to one true domain, always using
HTTPS:
::
# Redirect all HTTP to HTTPS with no-WWW
server {
listen 80 default_server;
listen [::]:80 default_server;
server_name ~^(?:www\.)?(.*)$;
return 301 https://$1$request_uri;
}
# Redirect WWW to no-WWW
server {
listen 443 ssl http2;
listen [::]:443 ssl http2;
server_name ~^www\.(.*)$;
return 301 $scheme://$1$request_uri;
}
The above config sections may be placed in `/etc/nginx/sites-available/default`
or in other site configs (be sure to symlink them to `sites-enabled` if you
create new ones).
Make sure that your SSL certificates are configured in the main config, or
add the `ssl_certificate` and `ssl_certificate_key` directives to each
`server` section that listens on SSL.
Additionally, copy and paste all of this into `nginx/conf.d/forwarded.conf`:
::
# RFC 7239 Forwarded header for Nginx proxy_pass
# Add within your server or location block:
# proxy_set_header forwarded "$proxy_forwarded;secret=\"YOUR SECRET\"";
# Configure your upstream web server to identify this proxy by that password
# because otherwise anyone on the Internet could spoof these headers and fake
# their real IP address and other information to your service.
# Provide the full proxy chain in $proxy_forwarded
map $proxy_add_forwarded $proxy_forwarded {
default "$proxy_add_forwarded;by=\"_$hostname\";proto=$scheme;host=\"$http_host\";path=\"$request_uri\"";
}
# The following mappings are based on
# https://www.nginx.com/resources/wiki/start/topics/examples/forwarded/
map $remote_addr $proxy_forwarded_elem {
# IPv4 addresses can be sent as-is
~^[0-9.]+$ "for=$remote_addr";
# IPv6 addresses need to be bracketed and quoted
~^[0-9A-Fa-f:.]+$ "for=\"[$remote_addr]\"";
# Unix domain socket names cannot be represented in RFC 7239 syntax
default "for=unknown";
}
map $http_forwarded $proxy_add_forwarded {
# If the incoming Forwarded header is syntactically valid, append to it
"~^(,[ \\t]*)*([!#$%&'*+.^_`|~0-9A-Za-z-]+=([!#$%&'*+.^_`|~0-9A-Za-z-]+|\"([\\t \\x21\\x23-\\x5B\\x5D-\\x7E\\x80-\\xFF]|\\\\[\\t \\x21-\\x7E\\x80-\\xFF])*\"))?(;([!#$%&'*+.^_`|~0-9A-Za-z-]+=([!#$%&'*+.^_`|~0-9A-Za-z-]+|\"([\\t \\x21\\x23-\\x5B\\x5D-\\x7E\\x80-\\xFF]|\\\\[\\t \\x21-\\x7E\\x80-\\xFF])*\"))?)*([ \\t]*,([ \\t]*([!#$%&'*+.^_`|~0-9A-Za-z-]+=([!#$%&'*+.^_`|~0-9A-Za-z-]+|\"([\\t \\x21\\x23-\\x5B\\x5D-\\x7E\\x80-\\xFF]|\\\\[\\t \\x21-\\x7E\\x80-\\xFF])*\"))?(;([!#$%&'*+.^_`|~0-9A-Za-z-]+=([!#$%&'*+.^_`|~0-9A-Za-z-]+|\"([\\t \\x21\\x23-\\x5B\\x5D-\\x7E\\x80-\\xFF]|\\\\[\\t \\x21-\\x7E\\x80-\\xFF])*\"))?)*)?)*$" "$http_forwarded, $proxy_forwarded_elem";
# Otherwise, replace it
default "$proxy_forwarded_elem";
}
For installs that don't use `conf.d` and `sites-available`, all of the above
configs may also be placed inside the `http` section of the main `nginx.conf`.
Reload Nginx config after changes:
::
sudo nginx -s reload
Now you should be able to connect to your app at `https://example.com/`. Any 404
errors and such will be handled by Sanic's error pages, and whenever a static
file is present at a given path, it will be served by Nginx.
SSL certificates
~~~~~~~~~~~~~~~~
If you haven't already configured valid certificates on your server, now is a
good time to do so. Install `certbot` and `python3-certbot-nginx`, then run
::
certbot --nginx -d example.com -d www.example.com
`<https://www.nginx.com/blog/using-free-ssltls-certificates-from-lets-encrypt-with-nginx/>`_
Running as a service
~~~~~~~~~~~~~~~~~~~~
This part is for Linux distributions based on `systemd`. Create a unit file
`/etc/systemd/system/sanicexample.service`::
[Unit]
Description=Sanic Example
[Service]
User=nobody
WorkingDirectory=/srv/sanicexample
ExecStart=/usr/bin/env python3 sanicexample.py
Restart=always
[Install]
WantedBy=multi-user.target
Then reload service files, start your service and enable it on boot::
sudo systemctl daemon-reload
sudo systemctl start sanicexample
sudo systemctl enable sanicexample

View File

@@ -56,7 +56,6 @@ The difference between Request.args and Request.query_args for the queryset `?ke
"url": request.url, "url": request.url,
"query_string": request.query_string, "query_string": request.query_string,
"args": request.args, "args": request.args,
"raw_args": request.raw_args,
"query_args": request.query_args, "query_args": request.query_args,
}) })
@@ -72,12 +71,9 @@ The difference between Request.args and Request.query_args for the queryset `?ke
"url":"http:\/\/0.0.0.0:8000\/test_request_args?key1=value1&key2=value2&key1=value3", "url":"http:\/\/0.0.0.0:8000\/test_request_args?key1=value1&key2=value2&key1=value3",
"query_string":"key1=value1&key2=value2&key1=value3", "query_string":"key1=value1&key2=value2&key1=value3",
"args":{"key1":["value1","value3"],"key2":["value2"]}, "args":{"key1":["value1","value3"],"key2":["value2"]},
"raw_args":{"key1":"value1","key2":"value2"},
"query_args":[["key1","value1"],["key2","value2"],["key1","value3"]] "query_args":[["key1","value1"],["key2","value2"],["key1","value3"]]
} }
- `raw_args` contains only the first entry of `key1`. Will be deprecated in the future versions.
- `files` (dictionary of `File` objects) - List of files that have a name, body, and type - `files` (dictionary of `File` objects) - List of files that have a name, body, and type
.. code-block:: python .. code-block:: python
@@ -206,7 +202,7 @@ The output will be:
Accessing values using `get` and `getlist` Accessing values using `get` and `getlist`
------------------------------------------ ------------------------------------------
The `request.args` returns a subclass of `dict` called `RequestParameters`. The `request.args` returns a subclass of `dict` called `RequestParameters`.
The key difference when using this object is the distinction between the `get` and `getlist` methods. The key difference when using this object is the distinction between the `get` and `getlist` methods.
- `get(key, default=None)` operates as normal, except that when the value of - `get(key, default=None)` operates as normal, except that when the value of
@@ -228,14 +224,14 @@ The key difference when using this object is the distinction between the `get` a
from sanic import Sanic from sanic import Sanic
from sanic.response import json from sanic.response import json
app = Sanic(name="example") app = Sanic(__name__)
@app.route("/") @app.route("/")
def get_handler(request): def get_handler(request):
return json({ return json({
"p1": request.args.getlist("p1") "p1": request.args.getlist("p1")
}) })
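For illustration, a small sketch of the `get`/`getlist` difference using `RequestParameters` directly (the values are arbitrary):
.. code-block:: python
from sanic.request import RequestParameters
args = RequestParameters()
args["p1"] = ["a", "b"]
args.get("p1")      # "a" -- only the first value
args.getlist("p1")  # ["a", "b"] -- all values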
Accessing the handler name with the request.endpoint attribute Accessing the handler name with the request.endpoint attribute
-------------------------------------------------------------- --------------------------------------------------------------
@@ -247,7 +243,7 @@ route will return "hello".
from sanic.response import text from sanic.response import text
from sanic import Sanic from sanic import Sanic
app = Sanic() app = Sanic(__name__)
@app.get("/") @app.get("/")
def hello(request): def hello(request):

View File

@@ -107,6 +107,19 @@ Response without encoding the body
def handle_request(request): def handle_request(request):
return response.raw(b'raw data') return response.raw(b'raw data')
Empty
--------------
For responding with an empty message as defined by `RFC 2616 <https://tools.ietf.org/search/rfc2616#section-7.2.1>`_
.. code-block:: python
from sanic import response
@app.route('/empty')
async def handle_request(request):
return response.empty()
Modify headers or status Modify headers or status
------------------------ ------------------------

View File

@@ -406,7 +406,7 @@ Build URL for static files
========================== ==========================
Sanic supports using `url_for` method to build static file urls. In case if the static url Sanic supports using `url_for` method to build static file urls. In case if the static url
is pointing to a directory, `filename` parameter to the `url_for` can be ignored. q is pointing to a directory, `filename` parameter to the `url_for` can be ignored.
.. code-block:: python .. code-block:: python

View File

@@ -16,7 +16,7 @@ IPv6 example:
sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM) sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
sock.bind(('::', 7777)) sock.bind(('::', 7777))
app = Sanic() app = Sanic("ipv6_example")
@app.route("/") @app.route("/")
@@ -46,7 +46,7 @@ UNIX socket example:
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.bind(server_socket) sock.bind(server_socket)
app = Sanic() app = Sanic("unix_socket_example")
@app.route("/") @app.route("/")

View File

@@ -16,7 +16,7 @@ Sanic allows you to get request data by stream, as below. When the request ends,
from sanic.response import stream, text from sanic.response import stream, text
bp = Blueprint('blueprint_request_stream') bp = Blueprint('blueprint_request_stream')
app = Sanic('request_stream') app = Sanic(__name__)
class SimpleView(HTTPMethodView): class SimpleView(HTTPMethodView):

View File

@@ -58,6 +58,36 @@ More information about
the available arguments to `httpx` can be found the available arguments to `httpx` can be found
[in the documentation for `httpx <https://www.encode.io/httpx/>`_. [in the documentation for `httpx <https://www.encode.io/httpx/>`_.
Additionally, Sanic has an asynchronous testing client. The difference is that the async client will not stand up an
instance of your application, but will instead reach inside it using ASGI. All listeners and middleware are still
executed.
.. code-block:: python
@pytest.mark.asyncio
async def test_index_returns_200():
request, response = await app.asgi_client.put('/')
assert response.status == 200
.. note::
Whenever one of the test clients runs, you can check `app.test_mode` on your app instance
to determine whether it is in testing mode.
Using a random port Using a random port
------------------- -------------------

View File

@@ -12,7 +12,7 @@ To setup a WebSocket:
from sanic.response import json from sanic.response import json
from sanic.websocket import WebSocketProtocol from sanic.websocket import WebSocketProtocol
app = Sanic() app = Sanic("websocket_example")
@app.websocket('/feed') @app.websocket('/feed')
async def feed(request, ws): async def feed(request, ws):
@@ -51,5 +51,9 @@ You could setup your own WebSocket configuration through ``app.config``, like
app.config.WEBSOCKET_MAX_QUEUE = 32 app.config.WEBSOCKET_MAX_QUEUE = 32
app.config.WEBSOCKET_READ_LIMIT = 2 ** 16 app.config.WEBSOCKET_READ_LIMIT = 2 ** 16
app.config.WEBSOCKET_WRITE_LIMIT = 2 ** 16 app.config.WEBSOCKET_WRITE_LIMIT = 2 ** 16
app.config.WEBSOCKET_PING_INTERVAL = 20
app.config.WEBSOCKET_PING_TIMEOUT = 20
These settings will have no impact if running in ASGI mode.
Find more in ``Configuration`` section. Find more in ``Configuration`` section.
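Putting the settings together, a minimal echo-feed sketch (the handler body is illustrative):
.. code-block:: python
from sanic import Sanic
from sanic.websocket import WebSocketProtocol
app = Sanic("websocket_example")
app.config.WEBSOCKET_PING_INTERVAL = 20
app.config.WEBSOCKET_PING_TIMEOUT = 20
@app.websocket("/feed")
async def feed(request, ws):
    while True:
        data = await ws.recv()
        await ws.send(data)
if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000, protocol=WebSocketProtocol)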

View File

@@ -1,19 +0,0 @@
name: py36
dependencies:
- pip=18.1=py36_0
- python=3.6=0
- setuptools=40.4.3=py36_0
- pip:
- httptools>=0.0.10
- uvloop>=0.5.3
- ujson>=1.35
- aiofiles>=0.3.0
- websockets>=6.0,<7.0
- multidict>=4.0,<5.0
- sphinx==1.8.3
- sphinx_rtd_theme==0.4.2
- recommonmark==0.5.0
- httpx==0.9.3
- sphinxcontrib-asyncio>=0.2.0
- docutils==0.14
- pygments==2.3.1

View File

@@ -0,0 +1,43 @@
from sanic import Sanic, Blueprint
from sanic.response import text
'''
Demonstrates that blueprint request middleware are executed in the order they
are added, and blueprint response middleware are executed in _reverse_ order.
On a valid request, it should print "1 2 3 6 5 4" to the terminal.
'''
app = Sanic(__name__)
bp = Blueprint("bp_"+__name__)
@bp.middleware('request')
def request_middleware_1(request):
print('1')
@bp.middleware('request')
def request_middleware_2(request):
print('2')
@bp.middleware('request')
def request_middleware_3(request):
print('3')
@bp.middleware('response')
def resp_middleware_4(request, response):
print('4')
@bp.middleware('response')
def resp_middleware_5(request, response):
print('5')
@bp.middleware('response')
def resp_middleware_6(request, response):
print('6')
@bp.route('/')
def pop_handler(request):
return text('hello world')
app.blueprint(bp, url_prefix='/bp')
app.run(host="0.0.0.0", port=8000, debug=True, auto_reload=False)

View File

@@ -0,0 +1,18 @@
from asyncio import sleep
from sanic import Sanic, response
app = Sanic(__name__, strict_slashes=True)
@app.get("/")
async def handler(request):
return response.redirect("/sleep/3")
@app.get("/sleep/<t:number>")
async def handler2(request, t=0.3):
await sleep(t)
return response.text(f"Slept {t:.1f} seconds.\n")
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,2 +1,9 @@
conda: version: 2
file: environment.yml python:
version: 3.8
install:
- method: pip
path: .
extra_requirements:
- docs
system_packages: true

View File

@@ -1,3 +1,6 @@
import os
import sys
from argparse import ArgumentParser from argparse import ArgumentParser
from importlib import import_module from importlib import import_module
from typing import Any, Dict, Optional from typing import Any, Dict, Optional
@@ -6,10 +9,11 @@ from sanic.app import Sanic
from sanic.log import logger from sanic.log import logger
if __name__ == "__main__": def main():
parser = ArgumentParser(prog="sanic") parser = ArgumentParser(prog="sanic")
parser.add_argument("--host", dest="host", type=str, default="127.0.0.1") parser.add_argument("--host", dest="host", type=str, default="127.0.0.1")
parser.add_argument("--port", dest="port", type=int, default=8000) parser.add_argument("--port", dest="port", type=int, default=8000)
parser.add_argument("--unix", dest="unix", type=str, default="")
parser.add_argument( parser.add_argument(
"--cert", dest="cert", type=str, help="location of certificate for SSL" "--cert", dest="cert", type=str, help="location of certificate for SSL"
) )
@@ -22,18 +26,22 @@ if __name__ == "__main__":
args = parser.parse_args() args = parser.parse_args()
try: try:
module_path = os.path.abspath(os.getcwd())
if module_path not in sys.path:
sys.path.append(module_path)
module_parts = args.module.split(".") module_parts = args.module.split(".")
module_name = ".".join(module_parts[:-1]) module_name = ".".join(module_parts[:-1])
app_name = module_parts[-1] app_name = module_parts[-1]
module = import_module(module_name) module = import_module(module_name)
app = getattr(module, app_name, None) app = getattr(module, app_name, None)
app_name = type(app).__name__
if not isinstance(app, Sanic): if not isinstance(app, Sanic):
raise ValueError( raise ValueError(
"Module is not a Sanic app, it is a {}. " f"Module is not a Sanic app, it is a {app_name}. "
"Perhaps you meant {}.app?".format( f"Perhaps you meant {args.module}.app?"
type(app).__name__, args.module
)
) )
if args.cert is not None or args.key is not None: if args.cert is not None or args.key is not None:
ssl = { ssl = {
@@ -46,15 +54,20 @@ if __name__ == "__main__":
app.run( app.run(
host=args.host, host=args.host,
port=args.port, port=args.port,
unix=args.unix,
workers=args.workers, workers=args.workers,
debug=args.debug, debug=args.debug,
ssl=ssl, ssl=ssl,
) )
except ImportError as e: except ImportError as e:
logger.error( logger.error(
"No module named {} found.\n" f"No module named {e.name} found.\n"
" Example File: project/sanic_server.py -> app\n" f" Example File: project/sanic_server.py -> app\n"
" Example Module: project.sanic_server.app".format(e.name) f" Example Module: project.sanic_server.app"
) )
except ValueError: except ValueError:
logger.exception("Failed to run app") logger.exception("Failed to run app")
if __name__ == "__main__":
main()

View File

@@ -1 +1 @@
__version__ = "19.12.0" __version__ = "20.9.0"

View File

@@ -81,6 +81,7 @@ class Sanic:
self.sock = None self.sock = None
self.strict_slashes = strict_slashes self.strict_slashes = strict_slashes
self.listeners = defaultdict(list) self.listeners = defaultdict(list)
self.is_stopping = False
self.is_running = False self.is_running = False
self.is_request_stream = False self.is_request_stream = False
self.websocket_enabled = False self.websocket_enabled = False
@@ -89,6 +90,7 @@ class Sanic:
self.named_response_middleware = {} self.named_response_middleware = {}
# Register alternative method names # Register alternative method names
self.go_fast = self.run self.go_fast = self.run
self.test_mode = False
@property @property
def loop(self): def loop(self):
@@ -116,24 +118,12 @@ class Sanic:
:param task: future, couroutine or awaitable :param task: future, couroutine or awaitable
""" """
try: try:
if callable(task): loop = self.loop # Will raise SanicError if loop is not started
try: self._loop_add_task(task, self, loop)
self.loop.create_task(task(self))
except TypeError:
self.loop.create_task(task())
else:
self.loop.create_task(task)
except SanicException: except SanicException:
self.listener("before_server_start")(
@self.listener("before_server_start") partial(self._loop_add_task, task)
def run(app, loop): )
if callable(task):
try:
loop.create_task(task(self))
except TypeError:
loop.create_task(task())
else:
loop.create_task(task)
# Decorator # Decorator
def listener(self, event): def listener(self, event):
@@ -194,25 +184,35 @@ class Sanic:
strict_slashes = self.strict_slashes strict_slashes = self.strict_slashes
def response(handler): def response(handler):
if isinstance(handler, tuple):
# if a handler fn is already wrapped in a route, the handler
# variable will be a tuple of (existing routes, handler fn)
routes, handler = handler
else:
routes = []
args = list(signature(handler).parameters.keys()) args = list(signature(handler).parameters.keys())
if not args: if not args:
handler_name = handler.__name__
raise ValueError( raise ValueError(
"Required parameter `request` missing " f"Required parameter `request` missing "
"in the {0}() route?".format(handler.__name__) f"in the {handler_name}() route?"
) )
if stream: if stream:
handler.is_stream = stream handler.is_stream = stream
routes = self.router.add( routes.extend(
uri=uri, self.router.add(
methods=methods, uri=uri,
handler=handler, methods=methods,
host=host, handler=handler,
strict_slashes=strict_slashes, host=host,
version=version, strict_slashes=strict_slashes,
name=name, version=version,
name=name,
)
) )
return routes, handler return routes, handler
@@ -451,7 +451,13 @@ class Sanic:
# Decorator # Decorator
def websocket( def websocket(
self, uri, host=None, strict_slashes=None, subprotocols=None, name=None self,
uri,
host=None,
strict_slashes=None,
subprotocols=None,
version=None,
name=None,
): ):
""" """
Decorate a function to be registered as a websocket route Decorate a function to be registered as a websocket route
@@ -476,53 +482,28 @@ class Sanic:
strict_slashes = self.strict_slashes strict_slashes = self.strict_slashes
def response(handler): def response(handler):
async def websocket_handler(request, *args, **kwargs): if isinstance(handler, tuple):
request.app = self # if a handler fn is already wrapped in a route, the handler
if not getattr(handler, "__blueprintname__", False): # variable will be a tuple of (existing routes, handler fn)
request.endpoint = handler.__name__ routes, handler = handler
else: else:
request.endpoint = ( routes = []
getattr(handler, "__blueprintname__", "") websocket_handler = partial(
+ handler.__name__ self._websocket_handler, handler, subprotocols=subprotocols
) )
websocket_handler.__name__ = (
pass "websocket_handler_" + handler.__name__
)
if self.asgi: routes.extend(
ws = request.transport.get_websocket_connection() self.router.add(
else: uri=uri,
try: handler=websocket_handler,
protocol = request.transport.get_protocol() methods=frozenset({"GET"}),
except AttributeError: host=host,
# On Python3.5 the Transport classes in asyncio do not strict_slashes=strict_slashes,
# have a get_protocol() method as in uvloop version=version,
protocol = request.transport._protocol name=name,
protocol.app = self )
ws = await protocol.websocket_handshake(
request, subprotocols
)
# schedule the application handler
# its future is kept in self.websocket_tasks in case it
# needs to be cancelled due to the server being stopped
fut = ensure_future(handler(request, ws, *args, **kwargs))
self.websocket_tasks.add(fut)
try:
await fut
except (CancelledError, ConnectionClosed):
pass
finally:
self.websocket_tasks.remove(fut)
await ws.close()
routes = self.router.add(
uri=uri,
handler=websocket_handler,
methods=frozenset({"GET"}),
host=host,
strict_slashes=strict_slashes,
name=name,
) )
return routes, handler return routes, handler
@@ -535,6 +516,7 @@ class Sanic:
host=None, host=None,
strict_slashes=None, strict_slashes=None,
subprotocols=None, subprotocols=None,
version=None,
name=None, name=None,
): ):
""" """
@@ -562,6 +544,7 @@ class Sanic:
host=host, host=host,
strict_slashes=strict_slashes, strict_slashes=strict_slashes,
subprotocols=subprotocols, subprotocols=subprotocols,
version=version,
name=name, name=name,
)(handler) )(handler)
@@ -574,36 +557,10 @@ class Sanic:
if not self.websocket_enabled: if not self.websocket_enabled:
# if the server is stopped, we want to cancel any ongoing # if the server is stopped, we want to cancel any ongoing
# websocket tasks, to allow the server to exit promptly # websocket tasks, to allow the server to exit promptly
@self.listener("before_server_stop") self.listener("before_server_stop")(self._cancel_websocket_tasks)
def cancel_websocket_tasks(app, loop):
for task in self.websocket_tasks:
task.cancel()
self.websocket_enabled = enable self.websocket_enabled = enable
def remove_route(self, uri, clean_cache=True, host=None):
"""
This method provides the app user a mechanism by which an already
existing route can be removed from the :class:`Sanic` object
.. warning::
remove_route is deprecated in v19.06 and will be removed
from future versions.
:param uri: URL Path to be removed from the app
:param clean_cache: Instruct sanic if it needs to clean up the LRU
route cache
:param host: IP address or FQDN specific to the host
:return: None
"""
warnings.warn(
"remove_route is deprecated and will be removed "
"from future versions.",
DeprecationWarning,
stacklevel=2,
)
self.router.remove(uri, clean_cache, host)
# Decorator # Decorator
def exception(self, *exceptions): def exception(self, *exceptions):
"""Decorate a function to be registered as a handler for exceptions """Decorate a function to be registered as a handler for exceptions
@@ -661,7 +618,7 @@ class Sanic:
if _rn not in self.named_response_middleware: if _rn not in self.named_response_middleware:
self.named_response_middleware[_rn] = deque() self.named_response_middleware[_rn] = deque()
if middleware not in self.named_response_middleware[_rn]: if middleware not in self.named_response_middleware[_rn]:
self.named_response_middleware[_rn].append(middleware) self.named_response_middleware[_rn].appendleft(middleware)
# Decorator # Decorator
def middleware(self, middleware_or_request): def middleware(self, middleware_or_request):
@@ -810,9 +767,17 @@ class Sanic:
uri, route = self.router.find_route_by_view_name(view_name, **kw) uri, route = self.router.find_route_by_view_name(view_name, **kw)
if not (uri and route): if not (uri and route):
raise URLBuildError( raise URLBuildError(
"Endpoint with name `{}` was not found".format(view_name) f"Endpoint with name `{view_name}` was not found"
) )
# If the route has host defined, split that off
# TODO: Retain netloc and path separately in Route objects
host = uri.find("/")
if host > 0:
host, uri = uri[:host], uri[host:]
else:
host = None
if view_name == "static" or view_name.endswith(".static"): if view_name == "static" or view_name.endswith(".static"):
filename = kwargs.pop("filename", None) filename = kwargs.pop("filename", None)
# it's static folder # it's static folder
@@ -824,7 +789,7 @@ class Sanic:
if filename.startswith("/"): if filename.startswith("/"):
filename = filename[1:] filename = filename[1:]
uri = "{}/{}".format(folder_, filename) uri = f"{folder_}/{filename}"
if uri != "/" and uri.endswith("/"): if uri != "/" and uri.endswith("/"):
uri = uri[:-1] uri = uri[:-1]
@@ -845,7 +810,7 @@ class Sanic:
netloc = kwargs.pop("_server", None) netloc = kwargs.pop("_server", None)
if netloc is None and external: if netloc is None and external:
netloc = self.config.get("SERVER_NAME", "") netloc = host or self.config.get("SERVER_NAME", "")
if external: if external:
if not scheme: if not scheme:
@@ -860,7 +825,7 @@ class Sanic:
for match in matched_params: for match in matched_params:
name, _type, pattern = self.router.parse_parameter_string(match) name, _type, pattern = self.router.parse_parameter_string(match)
# we only want to match against each individual parameter # we only want to match against each individual parameter
specific_pattern = "^{}$".format(pattern) specific_pattern = f"^{pattern}$"
supplied_param = None supplied_param = None
if name in kwargs: if name in kwargs:
@@ -868,9 +833,7 @@ class Sanic:
del kwargs[name] del kwargs[name]
else: else:
raise URLBuildError( raise URLBuildError(
"Required parameter `{}` was not passed to url_for".format( f"Required parameter `{name}` was not passed to url_for"
name
)
) )
supplied_param = str(supplied_param) supplied_param = str(supplied_param)
@@ -880,23 +843,22 @@ class Sanic:
if not passes_pattern: if not passes_pattern:
if _type != str: if _type != str:
type_name = _type.__name__
msg = ( msg = (
'Value "{}" for parameter `{}` does not ' f'Value "{supplied_param}" '
"match pattern for type `{}`: {}".format( f"for parameter `{name}` does not "
supplied_param, name, _type.__name__, pattern f"match pattern for type `{type_name}`: {pattern}"
)
) )
else: else:
msg = ( msg = (
'Value "{}" for parameter `{}` ' f'Value "{supplied_param}" for parameter `{name}` '
"does not satisfy pattern {}".format( f"does not satisfy pattern {pattern}"
supplied_param, name, pattern
)
) )
raise URLBuildError(msg) raise URLBuildError(msg)
# replace the parameter in the URL with the supplied value # replace the parameter in the URL with the supplied value
replacement_regex = "(<{}.*?>)".format(name) replacement_regex = f"(<{name}.*?>)"
out = re.sub(replacement_regex, supplied_param, out) out = re.sub(replacement_regex, supplied_param, out)
@@ -997,9 +959,8 @@ class Sanic:
) )
elif self.debug: elif self.debug:
response = HTTPResponse( response = HTTPResponse(
"Error while handling error: {}\nStack: {}".format( f"Error while "
e, format_exc() f"handling error: {e}\nStack: {format_exc()}",
),
status=500, status=500,
) )
else: else:
@@ -1062,16 +1023,19 @@ class Sanic:
self, self,
host: Optional[str] = None, host: Optional[str] = None,
port: Optional[int] = None, port: Optional[int] = None,
*,
debug: bool = False, debug: bool = False,
auto_reload: Optional[bool] = None,
ssl: Union[dict, SSLContext, None] = None, ssl: Union[dict, SSLContext, None] = None,
sock: Optional[socket] = None, sock: Optional[socket] = None,
workers: int = 1, workers: int = 1,
protocol: Type[Protocol] = None, protocol: Optional[Type[Protocol]] = None,
backlog: int = 100, backlog: int = 100,
stop_event: Any = None, stop_event: Any = None,
register_sys_signals: bool = True, register_sys_signals: bool = True,
access_log: Optional[bool] = None, access_log: Optional[bool] = None,
**kwargs: Any unix: Optional[str] = None,
loop: None = None,
) -> None: ) -> None:
"""Run the HTTP Server and listen until keyboard interrupt or term """Run the HTTP Server and listen until keyboard interrupt or term
signal. On termination, drain connections before closing. signal. On termination, drain connections before closing.
@@ -1082,6 +1046,9 @@ class Sanic:
:type port: int :type port: int
:param debug: Enables debug output (slows server) :param debug: Enables debug output (slows server)
:type debug: bool :type debug: bool
:param auto_reload: Reload app whenever its source code is changed.
Enabled by default in debug mode.
:type auto_reload: bool
:param ssl: SSLContext, or location of certificate and key :param ssl: SSLContext, or location of certificate and key
for SSL encryption of worker(s) for SSL encryption of worker(s)
:type ssl: SSLContext or dict :type ssl: SSLContext or dict
@@ -1101,9 +1068,11 @@ class Sanic:
:type register_sys_signals: bool :type register_sys_signals: bool
:param access_log: Enables writing access logs (slows server) :param access_log: Enables writing access logs (slows server)
:type access_log: bool :type access_log: bool
:param unix: Unix socket to listen on instead of TCP port
:type unix: str
:return: Nothing :return: Nothing
""" """
if "loop" in kwargs: if loop is not None:
raise TypeError( raise TypeError(
"loop is not a valid argument. To use an existing loop, " "loop is not a valid argument. To use an existing loop, "
"change to create_server().\nSee more: " "change to create_server().\nSee more: "
@@ -1111,13 +1080,9 @@ class Sanic:
"#asynchronous-support" "#asynchronous-support"
) )
# Default auto_reload to false if auto_reload or auto_reload is None and debug:
auto_reload = False if os.environ.get("SANIC_SERVER_RUNNING") != "true":
# If debug is set, default it to true (unless on windows) return reloader_helpers.watchdog(1.0)
if debug and os.name == "posix":
auto_reload = True
# Allow for overriding either of the defaults
auto_reload = kwargs.get("auto_reload", auto_reload)
if sock is None: if sock is None:
host, port = host or "127.0.0.1", port or 8000 host, port = host or "127.0.0.1", port or 8000
@@ -1143,6 +1108,7 @@ class Sanic:
debug=debug, debug=debug,
ssl=ssl, ssl=ssl,
sock=sock, sock=sock,
unix=unix,
workers=workers, workers=workers,
protocol=protocol, protocol=protocol,
backlog=backlog, backlog=backlog,
@@ -1152,19 +1118,15 @@ class Sanic:
try: try:
self.is_running = True self.is_running = True
self.is_stopping = False
if workers > 1 and os.name != "posix":
logger.warn(
f"Multiprocessing is currently not supported on {os.name},"
" using workers=1 instead"
)
workers = 1
if workers == 1: if workers == 1:
if auto_reload and os.name != "posix": serve(**server_settings)
# This condition must be removed after implementing
# auto reloader for other operating systems.
raise NotImplementedError
if (
auto_reload
and os.environ.get("SANIC_SERVER_RUNNING") != "true"
):
reloader_helpers.watchdog(2)
else:
serve(**server_settings)
else: else:
serve_multiple(server_settings, workers) serve_multiple(server_settings, workers)
except BaseException: except BaseException:
@@ -1178,12 +1140,15 @@ class Sanic:
def stop(self): def stop(self):
"""This kills the Sanic""" """This kills the Sanic"""
get_event_loop().stop() if not self.is_stopping:
self.is_stopping = True
get_event_loop().stop()
async def create_server( async def create_server(
self, self,
host: Optional[str] = None, host: Optional[str] = None,
port: Optional[int] = None, port: Optional[int] = None,
*,
debug: bool = False, debug: bool = False,
ssl: Union[dict, SSLContext, None] = None, ssl: Union[dict, SSLContext, None] = None,
sock: Optional[socket] = None, sock: Optional[socket] = None,
@@ -1191,6 +1156,7 @@ class Sanic:
backlog: int = 100, backlog: int = 100,
stop_event: Any = None, stop_event: Any = None,
access_log: Optional[bool] = None, access_log: Optional[bool] = None,
unix: Optional[str] = None,
return_asyncio_server=False, return_asyncio_server=False,
asyncio_server_kwargs=None, asyncio_server_kwargs=None,
) -> Optional[AsyncioServer]: ) -> Optional[AsyncioServer]:
@@ -1260,6 +1226,7 @@ class Sanic:
debug=debug, debug=debug,
ssl=ssl, ssl=ssl,
sock=sock, sock=sock,
unix=unix,
loop=get_event_loop(), loop=get_event_loop(),
protocol=protocol, protocol=protocol,
backlog=backlog, backlog=backlog,
@@ -1325,6 +1292,7 @@ class Sanic:
debug=False, debug=False,
ssl=None, ssl=None,
sock=None, sock=None,
unix=None,
workers=1, workers=1,
loop=None, loop=None,
protocol=HttpProtocol, protocol=HttpProtocol,
@@ -1363,33 +1331,16 @@ class Sanic:
server_settings = { server_settings = {
"protocol": protocol, "protocol": protocol,
"request_class": self.request_class,
"is_request_stream": self.is_request_stream,
"router": self.router,
"host": host, "host": host,
"port": port, "port": port,
"sock": sock, "sock": sock,
"unix": unix,
"ssl": ssl, "ssl": ssl,
"app": self, "app": self,
"signal": Signal(), "signal": Signal(),
"debug": debug,
"request_handler": self.handle_request,
"error_handler": self.error_handler,
"request_timeout": self.config.REQUEST_TIMEOUT,
"response_timeout": self.config.RESPONSE_TIMEOUT,
"keep_alive_timeout": self.config.KEEP_ALIVE_TIMEOUT,
"request_max_size": self.config.REQUEST_MAX_SIZE,
"request_buffer_queue_size": self.config.REQUEST_BUFFER_QUEUE_SIZE,
"keep_alive": self.config.KEEP_ALIVE,
"loop": loop, "loop": loop,
"register_sys_signals": register_sys_signals, "register_sys_signals": register_sys_signals,
"backlog": backlog, "backlog": backlog,
"access_log": self.config.ACCESS_LOG,
"websocket_max_size": self.config.WEBSOCKET_MAX_SIZE,
"websocket_max_queue": self.config.WEBSOCKET_MAX_QUEUE,
"websocket_read_limit": self.config.WEBSOCKET_READ_LIMIT,
"websocket_write_limit": self.config.WEBSOCKET_WRITE_LIMIT,
"graceful_shutdown_timeout": self.config.GRACEFUL_SHUTDOWN_TIMEOUT,
} }
# -------------------------------------------- # # -------------------------------------------- #
@@ -1426,11 +1377,14 @@ class Sanic:
server_settings["run_async"] = True server_settings["run_async"] = True
# Serve # Serve
if host and port and os.environ.get("SANIC_SERVER_RUNNING") != "true": if host and port:
proto = "http" proto = "http"
if ssl is not None: if ssl is not None:
proto = "https" proto = "https"
logger.info("Goin' Fast @ {}://{}:{}".format(proto, host, port)) if unix:
logger.info(f"Goin' Fast @ {unix} {proto}://...")
else:
logger.info(f"Goin' Fast @ {proto}://{host}:{port}")
return server_settings return server_settings
@@ -1438,6 +1392,55 @@ class Sanic:
parts = [self.name, *parts] parts = [self.name, *parts]
return ".".join(parts) return ".".join(parts)
@classmethod
def _loop_add_task(cls, task, app, loop):
if callable(task):
try:
loop.create_task(task(app))
except TypeError:
loop.create_task(task())
else:
loop.create_task(task)
@classmethod
def _cancel_websocket_tasks(cls, app, loop):
for task in app.websocket_tasks:
task.cancel()
async def _websocket_handler(
self, handler, request, *args, subprotocols=None, **kwargs
):
request.app = self
if not getattr(handler, "__blueprintname__", False):
request.endpoint = handler.__name__
else:
request.endpoint = (
getattr(handler, "__blueprintname__", "") + handler.__name__
)
pass
if self.asgi:
ws = request.transport.get_websocket_connection()
else:
protocol = request.transport.get_protocol()
protocol.app = self
ws = await protocol.websocket_handshake(request, subprotocols)
# schedule the application handler
# its future is kept in self.websocket_tasks in case it
# needs to be cancelled due to the server being stopped
fut = ensure_future(handler(request, ws, *args, **kwargs))
self.websocket_tasks.add(fut)
try:
await fut
except (CancelledError, ConnectionClosed):
pass
finally:
self.websocket_tasks.remove(fut)
await ws.close()
# -------------------------------------------------------------------- # # -------------------------------------------------------------------- #
# ASGI # ASGI
# -------------------------------------------------------------------- # # -------------------------------------------------------------------- #
@@ -1449,3 +1452,13 @@ class Sanic:
self.asgi = True self.asgi = True
asgi_app = await ASGIApp.create(self, scope, receive, send) asgi_app = await ASGIApp.create(self, scope, receive, send)
await asgi_app() await asgi_app()
# -------------------------------------------------------------------- #
# Configuration
# -------------------------------------------------------------------- #
def update_config(self, config: Union[bytes, str, dict, Any]):
"""Update app.config.
Please refer to config.py::Config.update_config for documentation."""
self.config.update_config(config)

View File

@@ -22,7 +22,7 @@ from sanic.exceptions import InvalidUsage, ServerError
from sanic.log import logger from sanic.log import logger
from sanic.request import Request from sanic.request import Request
from sanic.response import HTTPResponse, StreamingHTTPResponse from sanic.response import HTTPResponse, StreamingHTTPResponse
from sanic.server import StreamBuffer from sanic.server import ConnInfo, StreamBuffer
from sanic.websocket import WebSocketConnection from sanic.websocket import WebSocketConnection
@@ -98,7 +98,9 @@ class MockTransport:
def create_websocket_connection( def create_websocket_connection(
self, send: ASGISend, receive: ASGIReceive self, send: ASGISend, receive: ASGIReceive
) -> WebSocketConnection: ) -> WebSocketConnection:
self._websocket_connection = WebSocketConnection(send, receive) self._websocket_connection = WebSocketConnection(
send, receive, self.scope.get("subprotocols", [])
)
return self._websocket_connection return self._websocket_connection
def add_task(self) -> None: def add_task(self) -> None:
@@ -255,6 +257,7 @@ class ASGIApp:
instance.transport, instance.transport,
sanic_app, sanic_app,
) )
instance.request.conn_info = ConnInfo(instance.transport)
if sanic_app.is_request_stream: if sanic_app.is_request_stream:
is_stream_handler = sanic_app.router.is_stream_handler( is_stream_handler = sanic_app.router.is_stream_handler(

View File

@@ -143,7 +143,7 @@ class Blueprint:
if _routes: if _routes:
routes += _routes routes += _routes
route_names = [route.name for route in routes] route_names = [route.name for route in routes if route]
# Middleware # Middleware
for future in self.middlewares: for future in self.middlewares:
if future.args or future.kwargs: if future.args or future.kwargs:
@@ -151,7 +151,7 @@ class Blueprint:
future.middleware, future.middleware,
route_names, route_names,
*future.args, *future.args,
**future.kwargs **future.kwargs,
) )
else: else:
app.register_named_middleware(future.middleware, route_names) app.register_named_middleware(future.middleware, route_names)
@@ -283,6 +283,13 @@ class Blueprint:
strict_slashes = self.strict_slashes strict_slashes = self.strict_slashes
def decorator(handler): def decorator(handler):
nonlocal uri
nonlocal host
nonlocal strict_slashes
nonlocal version
nonlocal name
name = f"{self.name}.{name or handler.__name__}"
route = FutureRoute( route = FutureRoute(
handler, uri, [], host, strict_slashes, False, version, name handler, uri, [], host, strict_slashes, False, version, name
) )
@@ -376,7 +383,7 @@ class Blueprint:
""" """
name = kwargs.pop("name", "static") name = kwargs.pop("name", "static")
if not name.startswith(self.name + "."): if not name.startswith(self.name + "."):
name = "{}.{}".format(self.name, name) name = f"{self.name}.{name}"
kwargs.update(name=name) kwargs.update(name=name)
strict_slashes = kwargs.get("strict_slashes") strict_slashes = kwargs.get("strict_slashes")

View File

@@ -1,6 +1,53 @@
import asyncio
import signal
from sys import argv
from multidict import CIMultiDict # type: ignore from multidict import CIMultiDict # type: ignore
class Header(CIMultiDict): class Header(CIMultiDict):
def get_all(self, key): def get_all(self, key):
return self.getall(key, default=[]) return self.getall(key, default=[])
use_trio = argv[0].endswith("hypercorn") and "trio" in argv
if use_trio:
from trio import Path # type: ignore
from trio import open_file as open_async # type: ignore
def stat_async(path):
return Path(path).stat()
else:
from aiofiles import open as aio_open # type: ignore
from aiofiles.os import stat as stat_async # type: ignore # noqa: F401
async def open_async(file, mode="r", **kwargs):
return aio_open(file, mode, **kwargs)
def ctrlc_workaround_for_windows(app):
async def stay_active(app):
"""Asyncio wakeups to allow receiving SIGINT in Python"""
while not die:
# If someone else stopped the app, just exit
if app.is_stopping:
return
# Windows Python blocks signal handlers while the event loop is
# waiting for I/O. Frequent wakeups keep interrupts flowing.
await asyncio.sleep(0.1)
# Can't be called from signal handler, so call it from here
app.stop()
def ctrlc_handler(sig, frame):
nonlocal die
if die:
raise KeyboardInterrupt("Non-graceful Ctrl+C")
die = True
die = False
signal.signal(signal.SIGINT, ctrlc_handler)
app.add_task(stay_active)

View File

@@ -1,8 +1,15 @@
import os from os import environ
import types from typing import Any, Union
from sanic.exceptions import PyFileError # NOTE(tomaszdrozdz): remove in version: 21.3
from sanic.helpers import import_string # We replace from_envvar(), from_object(), from_pyfile() config object methods
# with one simpler update_config() method.
# We also replace "loading module from file code" in from_pyfile()
# in a favour of load_module_from_file_location().
# Please see pull request: 1903
# and issue: 1895
from .deprecated import from_envvar, from_object, from_pyfile # noqa
from .utils import load_module_from_file_location, str_to_bool
SANIC_PREFIX = "SANIC_" SANIC_PREFIX = "SANIC_"
@@ -20,16 +27,19 @@ DEFAULT_CONFIG = {
"RESPONSE_TIMEOUT": 60, # 60 seconds "RESPONSE_TIMEOUT": 60, # 60 seconds
"KEEP_ALIVE": True, "KEEP_ALIVE": True,
"KEEP_ALIVE_TIMEOUT": 5, # 5 seconds "KEEP_ALIVE_TIMEOUT": 5, # 5 seconds
"WEBSOCKET_MAX_SIZE": 2 ** 20, # 1 megabytes "WEBSOCKET_MAX_SIZE": 2 ** 20, # 1 megabyte
"WEBSOCKET_MAX_QUEUE": 32, "WEBSOCKET_MAX_QUEUE": 32,
"WEBSOCKET_READ_LIMIT": 2 ** 16, "WEBSOCKET_READ_LIMIT": 2 ** 16,
"WEBSOCKET_WRITE_LIMIT": 2 ** 16, "WEBSOCKET_WRITE_LIMIT": 2 ** 16,
"WEBSOCKET_PING_TIMEOUT": 20,
"WEBSOCKET_PING_INTERVAL": 20,
"GRACEFUL_SHUTDOWN_TIMEOUT": 15.0, # 15 sec "GRACEFUL_SHUTDOWN_TIMEOUT": 15.0, # 15 sec
"ACCESS_LOG": True, "ACCESS_LOG": True,
"FORWARDED_SECRET": None, "FORWARDED_SECRET": None,
"REAL_IP_HEADER": None, "REAL_IP_HEADER": None,
"PROXIES_COUNT": None, "PROXIES_COUNT": None,
"FORWARDED_FOR_HEADER": "X-Forwarded-For", "FORWARDED_FOR_HEADER": "X-Forwarded-For",
"FALLBACK_ERROR_FORMAT": "html",
} }
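
Three new keys appear here: the two websocket ping settings and the fallback error format. A sketch of overriding them on a hypothetical app:

from sanic import Sanic

app = Sanic("example")  # hypothetical name

app.config.WEBSOCKET_PING_INTERVAL = 10    # seconds between keepalive pings
app.config.WEBSOCKET_PING_TIMEOUT = 30     # seconds to wait for the pong
app.config.FALLBACK_ERROR_FORMAT = "auto"  # or "html" (default), "json", "text"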
@@ -51,81 +61,28 @@ class Config(dict):
        try:
            return self[attr]
        except KeyError as ke:
-            raise AttributeError("Config has no '{}'".format(ke.args[0]))
+            raise AttributeError(f"Config has no '{ke.args[0]}'")

    def __setattr__(self, attr, value):
        self[attr] = value

-    def from_envvar(self, variable_name):
-        """Load a configuration from an environment variable pointing to
-        a configuration file.
-
-        :param variable_name: name of the environment variable
-        :return: bool. ``True`` if able to load config, ``False`` otherwise.
-        """
-        config_file = os.environ.get(variable_name)
-        if not config_file:
-            raise RuntimeError(
-                "The environment variable %r is not set and "
-                "thus configuration could not be loaded." % variable_name
-            )
-        return self.from_pyfile(config_file)
-
-    def from_pyfile(self, filename):
-        """Update the values in the config from a Python file.
-        Only the uppercase variables in that module are stored in the config.
-
-        :param filename: an absolute path to the config file
-        """
-        module = types.ModuleType("config")
-        module.__file__ = filename
-        try:
-            with open(filename) as config_file:
-                exec(  # nosec
-                    compile(config_file.read(), filename, "exec"),
-                    module.__dict__,
-                )
-        except IOError as e:
-            e.strerror = "Unable to load configuration file (%s)" % e.strerror
-            raise
-        except Exception as e:
-            raise PyFileError(filename) from e
-
-        self.from_object(module)
-        return True
-
-    def from_object(self, obj):
-        """Update the values from the given object.
-        Objects are usually either modules or classes.
-        Just the uppercase variables in that object are stored in the config.
-
-        Example usage::
-
-            from yourapplication import default_config
-            app.config.from_object(default_config)
-
-        or also:
-            app.config.from_object('myproject.config.MyConfigClass')
-
-        You should not use this function to load the actual configuration but
-        rather configuration defaults. The actual config should be loaded
-        with :meth:`from_pyfile` and ideally from a location not within the
-        package because the package might be installed system wide.
-
-        :param obj: an object holding the configuration
-        """
-        if isinstance(obj, str):
-            obj = import_string(obj)
-        for key in dir(obj):
-            if key.isupper():
-                self[key] = getattr(obj, key)
+    # NOTE(tomaszdrozdz): remove in version: 21.3
+    # We replace from_envvar(), from_object(), from_pyfile() config object
+    # methods with one simpler update_config() method.
+    # We also replace "loading module from file code" in from_pyfile()
+    # in a favour of load_module_from_file_location().
+    # Please see pull request: 1903
+    # and issue: 1895
+    from_envvar = from_envvar
+    from_pyfile = from_pyfile
+    from_object = from_object
    def load_environment_vars(self, prefix=SANIC_PREFIX):
        """
        Looks for prefixed environment variables and applies
        them to the configuration if present.
        """
-        for k, v in os.environ.items():
+        for k, v in environ.items():
            if k.startswith(prefix):
                _, config_key = k.split(prefix, 1)
                try:
@@ -135,23 +92,47 @@ class Config(dict):
                        self[config_key] = float(v)
                    except ValueError:
                        try:
-                            self[config_key] = strtobool(v)
+                            self[config_key] = str_to_bool(v)
                        except ValueError:
                            self[config_key] = v
+    def update_config(self, config: Union[bytes, str, dict, Any]):
+        """Update app.config.
+
+        Note:: only upper case settings are considered.
+
+        You can upload app config by providing path to py file
+        holding settings.
+
+            # /some/py/file
+            A = 1
+            B = 2
+
+            config.update_config("${some}/py/file")
+
+        Yes you can put environment variable here, but they must be provided
+        in format: ${some_env_var}, and mark that $some_env_var is treated
+        as plain string.
+
+        You can upload app config by providing dict holding settings.
+
+            d = {"A": 1, "B": 2}
+            config.update_config(d)
+
+        You can upload app config by providing any object holding settings,
+        but in such case config.__dict__ will be used as dict holding settings.
+
+            class C:
+                A = 1
+                B = 2
+
+            config.update_config(C)"""
+
+        if isinstance(config, (bytes, str)):
+            config = load_module_from_file_location(location=config)
+
+        if not isinstance(config, dict):
+            config = config.__dict__
+
+        config = dict(filter(lambda i: i[0].isupper(), config.items()))
+
+        self.update(config)
-def strtobool(val):
-    """
-    This function was borrowed from distutils.utils. While distutils
-    is part of stdlib, it feels odd to use distutils in main application code.
-
-    The function was modified to walk its talk and actually return bool
-    and not int.
-    """
-    val = val.lower()
-    if val in ("y", "yes", "t", "true", "on", "1"):
-        return True
-    elif val in ("n", "no", "f", "false", "off", "0"):
-        return False
-    else:
-        raise ValueError("invalid truth value %r" % (val,))
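
The docstring above already shows the three accepted inputs; a compact sketch combining them (the module path, keys and class are hypothetical):

from sanic import Sanic

app = Sanic("example")  # hypothetical name

# 1. Path to a Python file; "${...}" segments are expanded from the environment
app.config.update_config("${APP_ROOT}/settings.py")   # hypothetical path

# 2. A plain dict -- only UPPERCASE keys are kept
app.config.update_config({"DB_URL": "postgres://db/example", "debug_flag": True})

# 3. Any object or class -- its __dict__ is filtered the same way
class ProdSettings:
    REQUEST_TIMEOUT = 120
    KEEP_ALIVE_TIMEOUT = 10

app.config.update_config(ProdSettings)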

sanic/deprecated.py (new file, 106 lines)

@@ -0,0 +1,106 @@
# NOTE(tomaszdrozdz): remove in version: 21.3
# We replace from_envvar(), from_object(), from_pyfile() config object methods
# with one simpler update_config() method.
# We also replace "loading module from file code" in from_pyfile()
# in a favour of load_module_from_file_location().
# Please see pull request: 1903
# and issue: 1895
import types
from os import environ
from typing import Any
from warnings import warn
from sanic.exceptions import PyFileError
from sanic.helpers import import_string
def from_envvar(self, variable_name: str) -> bool:
"""Load a configuration from an environment variable pointing to
a configuration file.
:param variable_name: name of the environment variable
:return: bool. ``True`` if able to load config, ``False`` otherwise.
"""
warn(
"Using `from_envvar` method is deprecated and will be removed in "
"v21.3, use `app.update_config` method instead.",
DeprecationWarning,
stacklevel=2,
)
config_file = environ.get(variable_name)
if not config_file:
raise RuntimeError(
f"The environment variable {variable_name} is not set and "
f"thus configuration could not be loaded."
)
return self.from_pyfile(config_file)
def from_pyfile(self, filename: str) -> bool:
"""Update the values in the config from a Python file.
Only the uppercase variables in that module are stored in the config.
:param filename: an absolute path to the config file
"""
warn(
"Using `from_pyfile` method is deprecated and will be removed in "
"v21.3, use `app.update_config` method instead.",
DeprecationWarning,
stacklevel=2,
)
module = types.ModuleType("config")
module.__file__ = filename
try:
with open(filename) as config_file:
exec( # nosec
compile(config_file.read(), filename, "exec"),
module.__dict__,
)
except IOError as e:
e.strerror = f"Unable to load configuration file ({e.strerror})"
raise
except Exception as e:
raise PyFileError(filename) from e
self.from_object(module)
return True
def from_object(self, obj: Any) -> None:
"""Update the values from the given object.
Objects are usually either modules or classes.
Just the uppercase variables in that object are stored in the config.
Example usage::
from yourapplication import default_config
app.config.from_object(default_config)
or also:
app.config.from_object('myproject.config.MyConfigClass')
You should not use this function to load the actual configuration but
rather configuration defaults. The actual config should be loaded
with :meth:`from_pyfile` and ideally from a location not within the
package because the package might be installed system wide.
:param obj: an object holding the configuration
"""
warn(
"Using `from_object` method is deprecated and will be removed in "
"v21.3, use `app.update_config` method instead.",
DeprecationWarning,
stacklevel=2,
)
if isinstance(obj, str):
obj = import_string(obj)
for key in dir(obj):
if key.isupper():
self[key] = getattr(obj, key)
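
These shims keep the old config API importable until 21.3 while steering users toward update_config. A sketch showing the warning (names hypothetical):

import warnings

from sanic import Sanic

app = Sanic("example")  # hypothetical name

class Settings:
    WORKERS = 4

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    app.config.from_object(Settings)      # still works, but warns
assert caught[0].category is DeprecationWarning

app.config.update_config(Settings)        # the recommended replacement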

sanic/errorpages.py (new file, 330 lines)

@@ -0,0 +1,330 @@
import sys
import typing as t
from functools import partial
from traceback import extract_tb
from sanic.exceptions import InvalidUsage, SanicException
from sanic.helpers import STATUS_CODES
from sanic.request import Request
from sanic.response import HTTPResponse, html, json, text
try:
from ujson import dumps
dumps = partial(dumps, escape_forward_slashes=False)
except ImportError: # noqa
from json import dumps # type: ignore
FALLBACK_TEXT = (
"The server encountered an internal error and "
"cannot complete your request."
)
FALLBACK_STATUS = 500
class BaseRenderer:
def __init__(self, request, exception, debug):
self.request = request
self.exception = exception
self.debug = debug
@property
def headers(self):
if isinstance(self.exception, SanicException):
return getattr(self.exception, "headers", {})
return {}
@property
def status(self):
if isinstance(self.exception, SanicException):
return getattr(self.exception, "status_code", FALLBACK_STATUS)
return FALLBACK_STATUS
@property
def text(self):
if self.debug or isinstance(self.exception, SanicException):
return str(self.exception)
return FALLBACK_TEXT
@property
def title(self):
status_text = STATUS_CODES.get(self.status, b"Error Occurred").decode()
return f"{self.status}{status_text}"
def render(self):
output = (
self.full
if self.debug and not getattr(self.exception, "quiet", False)
else self.minimal
)
return output()
def minimal(self): # noqa
raise NotImplementedError
def full(self): # noqa
raise NotImplementedError
class HTMLRenderer(BaseRenderer):
TRACEBACK_STYLE = """
html { font-family: sans-serif }
h2 { color: #888; }
.tb-wrapper p { margin: 0 }
.frame-border { margin: 1rem }
.frame-line > * { padding: 0.3rem 0.6rem }
.frame-line { margin-bottom: 0.3rem }
.frame-code { font-size: 16px; padding-left: 4ch }
.tb-wrapper { border: 1px solid #eee }
.tb-header { background: #eee; padding: 0.3rem; font-weight: bold }
.frame-descriptor { background: #e2eafb; font-size: 14px }
"""
TRACEBACK_WRAPPER_HTML = (
"<div class=tb-header>{exc_name}: {exc_value}</div>"
"<div class=tb-wrapper>{frame_html}</div>"
)
TRACEBACK_BORDER = (
"<div class=frame-border>"
"The above exception was the direct cause of the following exception:"
"</div>"
)
TRACEBACK_LINE_HTML = (
"<div class=frame-line>"
"<p class=frame-descriptor>"
"File {0.filename}, line <i>{0.lineno}</i>, "
"in <code><b>{0.name}</b></code>"
"<p class=frame-code><code>{0.line}</code>"
"</div>"
)
OUTPUT_HTML = (
"<!DOCTYPE html><html lang=en>"
"<meta charset=UTF-8><title>{title}</title>\n"
"<style>{style}</style>\n"
"<h1>{title}</h1><p>{text}\n"
"{body}"
)
def full(self):
return html(
self.OUTPUT_HTML.format(
title=self.title,
text=self.text,
style=self.TRACEBACK_STYLE,
body=self._generate_body(),
),
status=self.status,
)
def minimal(self):
return html(
self.OUTPUT_HTML.format(
title=self.title,
text=self.text,
style=self.TRACEBACK_STYLE,
body="",
),
status=self.status,
headers=self.headers,
)
@property
def text(self):
return escape(super().text)
@property
def title(self):
return escape(f"⚠️ {super().title}")
def _generate_body(self):
_, exc_value, __ = sys.exc_info()
exceptions = []
while exc_value:
exceptions.append(self._format_exc(exc_value))
exc_value = exc_value.__cause__
traceback_html = self.TRACEBACK_BORDER.join(reversed(exceptions))
appname = escape(self.request.app.name)
name = escape(self.exception.__class__.__name__)
value = escape(self.exception)
path = escape(self.request.path)
lines = [
f"<h2>Traceback of {appname} (most recent call last):</h2>",
f"{traceback_html}",
"<div class=summary><p>",
f"<b>{name}: {value}</b> while handling path <code>{path}</code>",
"</div>",
]
return "\n".join(lines)
def _format_exc(self, exc):
frames = extract_tb(exc.__traceback__)
frame_html = "".join(
self.TRACEBACK_LINE_HTML.format(frame) for frame in frames
)
return self.TRACEBACK_WRAPPER_HTML.format(
exc_name=escape(exc.__class__.__name__),
exc_value=escape(exc),
frame_html=frame_html,
)
class TextRenderer(BaseRenderer):
OUTPUT_TEXT = "{title}\n{bar}\n{text}\n\n{body}"
SPACER = " "
def full(self):
return text(
self.OUTPUT_TEXT.format(
title=self.title,
text=self.text,
bar=("=" * len(self.title)),
body=self._generate_body(),
),
status=self.status,
)
def minimal(self):
return text(
self.OUTPUT_TEXT.format(
title=self.title,
text=self.text,
bar=("=" * len(self.title)),
body="",
),
status=self.status,
headers=self.headers,
)
@property
def title(self):
return f"⚠️ {super().title}"
def _generate_body(self):
_, exc_value, __ = sys.exc_info()
exceptions = []
# traceback_html = self.TRACEBACK_BORDER.join(reversed(exceptions))
lines = [
f"{self.exception.__class__.__name__}: {self.exception} while "
f"handling path {self.request.path}",
f"Traceback of {self.request.app.name} (most recent call last):\n",
]
while exc_value:
exceptions.append(self._format_exc(exc_value))
exc_value = exc_value.__cause__
return "\n".join(lines + exceptions[::-1])
def _format_exc(self, exc):
frames = "\n\n".join(
[
f"{self.SPACER * 2}File {frame.filename}, "
f"line {frame.lineno}, in "
f"{frame.name}\n{self.SPACER * 2}{frame.line}"
for frame in extract_tb(exc.__traceback__)
]
)
return f"{self.SPACER}{exc.__class__.__name__}: {exc}\n{frames}"
class JSONRenderer(BaseRenderer):
def full(self):
output = self._generate_output(full=True)
return json(output, status=self.status, dumps=dumps)
def minimal(self):
output = self._generate_output(full=False)
return json(output, status=self.status, dumps=dumps)
def _generate_output(self, *, full):
output = {
"description": self.title,
"status": self.status,
"message": self.text,
}
if full:
_, exc_value, __ = sys.exc_info()
exceptions = []
while exc_value:
exceptions.append(
{
"type": exc_value.__class__.__name__,
"exception": str(exc_value),
"frames": [
{
"file": frame.filename,
"line": frame.lineno,
"name": frame.name,
"src": frame.line,
}
for frame in extract_tb(exc_value.__traceback__)
],
}
)
exc_value = exc_value.__cause__
output["path"] = self.request.path
output["args"] = self.request.args
output["exceptions"] = exceptions[::-1]
return output
@property
def title(self):
return STATUS_CODES.get(self.status, b"Error Occurred").decode()
def escape(text):
"""Minimal HTML escaping, not for attribute values (unlike html.escape)."""
return f"{text}".replace("&", "&amp;").replace("<", "&lt;")
RENDERERS_BY_CONFIG = {
"html": HTMLRenderer,
"json": JSONRenderer,
"text": TextRenderer,
}
RENDERERS_BY_CONTENT_TYPE = {
"multipart/form-data": HTMLRenderer,
"application/json": JSONRenderer,
"text/plain": TextRenderer,
}
def exception_response(
request: Request,
exception: Exception,
debug: bool,
renderer: t.Type[t.Optional[BaseRenderer]] = None,
) -> HTTPResponse:
"""Render a response for the default FALLBACK exception handler"""
if not renderer:
renderer = HTMLRenderer
if request:
if request.app.config.FALLBACK_ERROR_FORMAT == "auto":
try:
renderer = JSONRenderer if request.json else HTMLRenderer
except InvalidUsage:
renderer = HTMLRenderer
content_type, *_ = request.headers.get(
"content-type", ""
).split(";")
renderer = RENDERERS_BY_CONTENT_TYPE.get(
content_type, renderer
)
else:
render_format = request.app.config.FALLBACK_ERROR_FORMAT
renderer = RENDERERS_BY_CONFIG.get(render_format, renderer)
renderer = t.cast(t.Type[BaseRenderer], renderer)
return renderer(request, exception, debug).render()
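
How the fallback renderer is chosen: FALLBACK_ERROR_FORMAT picks html/json/text directly, while "auto" inspects the request's JSON body and Content-Type. A sketch with a hypothetical route:

from sanic import Sanic

app = Sanic("example")                       # hypothetical name
app.config.FALLBACK_ERROR_FORMAT = "json"    # or "html", "text", "auto"

@app.route("/boom")
async def boom(request):
    raise ValueError("something broke")      # unhandled -> fallback error page

# With "json" (and debug off) the client gets roughly:
#   {"description": "Internal Server Error", "status": 500, "message": "..."}
# With "auto", a JSON body or application/json Content-Type selects
# JSONRenderer, text/plain selects TextRenderer, otherwise HTML is used.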

sanic/exceptions.py

@@ -1,133 +1,18 @@
from sanic.helpers import STATUS_CODES from sanic.helpers import STATUS_CODES
TRACEBACK_STYLE = """
<style>
body {
padding: 20px;
font-family: Arial, sans-serif;
}
p {
margin: 0;
}
.summary {
padding: 10px;
}
h1 {
margin-bottom: 0;
}
h3 {
margin-top: 10px;
}
h3 code {
font-size: 24px;
}
.frame-line > * {
padding: 5px 10px;
}
.frame-line {
margin-bottom: 5px;
}
.frame-code {
font-size: 16px;
padding-left: 30px;
}
.tb-wrapper {
border: 1px solid #f3f3f3;
}
.tb-header {
background-color: #f3f3f3;
padding: 5px 10px;
}
.tb-border {
padding-top: 20px;
}
.frame-descriptor {
background-color: #e2eafb;
}
.frame-descriptor {
font-size: 14px;
}
</style>
"""
TRACEBACK_WRAPPER_HTML = """
<html>
<head>
{style}
</head>
<body>
{inner_html}
<div class="summary">
<p>
<b>{exc_name}: {exc_value}</b>
while handling path <code>{path}</code>
</p>
</div>
</body>
</html>
"""
TRACEBACK_WRAPPER_INNER_HTML = """
<h1>{exc_name}</h1>
<h3><code>{exc_value}</code></h3>
<div class="tb-wrapper">
<p class="tb-header">Traceback (most recent call last):</p>
{frame_html}
</div>
"""
TRACEBACK_BORDER = """
<div class="tb-border">
<b><i>
The above exception was the direct cause of the
following exception:
</i></b>
</div>
"""
TRACEBACK_LINE_HTML = """
<div class="frame-line">
<p class="frame-descriptor">
File {0.filename}, line <i>{0.lineno}</i>,
in <code><b>{0.name}</b></code>
</p>
<p class="frame-code"><code>{0.line}</code></p>
</div>
"""
INTERNAL_SERVER_ERROR_HTML = """
<h1>Internal Server Error</h1>
<p>
The server encountered an internal error and cannot complete
your request.
</p>
"""
_sanic_exceptions = {} _sanic_exceptions = {}
def add_status_code(code): def add_status_code(code, quiet=None):
""" """
Decorator used for adding exceptions to :class:`SanicException`. Decorator used for adding exceptions to :class:`SanicException`.
""" """
def class_decorator(cls): def class_decorator(cls):
cls.status_code = code cls.status_code = code
if quiet or quiet is None and code != 500:
cls.quiet = True
_sanic_exceptions[code] = cls _sanic_exceptions[code] = cls
return cls return cls
@@ -135,12 +20,16 @@ def add_status_code(code):
class SanicException(Exception): class SanicException(Exception):
def __init__(self, message, status_code=None): def __init__(self, message, status_code=None, quiet=None):
super().__init__(message) super().__init__(message)
if status_code is not None: if status_code is not None:
self.status_code = status_code self.status_code = status_code
# quiet=None/False/True with None meaning choose by status
if quiet or quiet is None and status_code not in (None, 500):
self.quiet = True
@add_status_code(404) @add_status_code(404)
class NotFound(SanicException): class NotFound(SanicException):
@@ -156,10 +45,7 @@ class InvalidUsage(SanicException):
class MethodNotSupported(SanicException): class MethodNotSupported(SanicException):
def __init__(self, message, method, allowed_methods): def __init__(self, message, method, allowed_methods):
super().__init__(message) super().__init__(message)
self.headers = dict() self.headers = {"Allow": ", ".join(allowed_methods)}
self.headers["Allow"] = ", ".join(allowed_methods)
if method in ["HEAD", "PATCH", "PUT", "DELETE"]:
self.headers["Content-Length"] = 0
@add_status_code(500) @add_status_code(500)
@@ -212,10 +98,7 @@ class HeaderNotFound(InvalidUsage):
class ContentRangeError(SanicException): class ContentRangeError(SanicException):
def __init__(self, message, content_range): def __init__(self, message, content_range):
super().__init__(message) super().__init__(message)
self.headers = { self.headers = {"Content-Range": f"bytes */{content_range.total}"}
"Content-Type": "text/plain",
"Content-Range": "bytes */%s" % (content_range.total,),
}
@add_status_code(417) @add_status_code(417)
@@ -282,10 +165,14 @@ class Unauthorized(SanicException):
challenge = ", ".join(values) challenge = ", ".join(values)
self.headers = { self.headers = {
"WWW-Authenticate": "{} {}".format(scheme, challenge).rstrip() "WWW-Authenticate": f"{scheme} {challenge}".rstrip()
} }
class LoadFileException(SanicException):
pass
def abort(status_code, message=None): def abort(status_code, message=None):
""" """
Raise an exception based on SanicException. Returns the HTTP response Raise an exception based on SanicException. Returns the HTTP response
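
With quiet, non-500 exceptions (and anything explicitly marked quiet) no longer dump a traceback through the default error handler. A sketch (routes hypothetical):

from sanic import Sanic
from sanic.exceptions import SanicException, abort

app = Sanic("example")  # hypothetical name

@app.route("/missing")
async def missing(request):
    abort(404, "no such thing")          # 404 is quiet by default: no traceback logged

@app.route("/busy")
async def busy(request):
    raise SanicException("try later", status_code=503, quiet=True)  # silenced explicitly

@app.route("/broken")
async def broken(request):
    raise SanicException("unexpected")   # plain 500 stays noisy and is logged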

sanic/handlers.py

@@ -1,21 +1,13 @@
import sys from traceback import format_exc
from traceback import extract_tb, format_exc
from sanic.errorpages import exception_response
from sanic.exceptions import ( from sanic.exceptions import (
INTERNAL_SERVER_ERROR_HTML,
TRACEBACK_BORDER,
TRACEBACK_LINE_HTML,
TRACEBACK_STYLE,
TRACEBACK_WRAPPER_HTML,
TRACEBACK_WRAPPER_INNER_HTML,
ContentRangeError, ContentRangeError,
HeaderNotFound, HeaderNotFound,
InvalidRangeType, InvalidRangeType,
SanicException,
) )
from sanic.log import logger from sanic.log import logger
from sanic.response import html, text from sanic.response import text
class ErrorHandler: class ErrorHandler:
@@ -40,35 +32,6 @@ class ErrorHandler:
self.cached_handlers = {} self.cached_handlers = {}
self.debug = False self.debug = False
def _render_exception(self, exception):
frames = extract_tb(exception.__traceback__)
frame_html = []
for frame in frames:
frame_html.append(TRACEBACK_LINE_HTML.format(frame))
return TRACEBACK_WRAPPER_INNER_HTML.format(
exc_name=exception.__class__.__name__,
exc_value=exception,
frame_html="".join(frame_html),
)
def _render_traceback_html(self, exception, request):
exc_type, exc_value, tb = sys.exc_info()
exceptions = []
while exc_value:
exceptions.append(self._render_exception(exc_value))
exc_value = exc_value.__cause__
return TRACEBACK_WRAPPER_HTML.format(
style=TRACEBACK_STYLE,
exc_name=exception.__class__.__name__,
exc_value=exception,
inner_html=TRACEBACK_BORDER.join(reversed(exceptions)),
path=request.path,
)
def add(self, exception, handler): def add(self, exception, handler):
""" """
Add a new exception handler to an already existing handler object. Add a new exception handler to an already existing handler object.
@@ -166,27 +129,17 @@ class ErrorHandler:
:class:`Exception` :class:`Exception`
:return: :return:
""" """
-        self.log(format_exc())
-        try:
-            url = repr(request.url)
-        except AttributeError:
-            url = "unknown"
-
-        response_message = "Exception occurred while handling uri: %s"
-        logger.exception(response_message, url)
-
-        if issubclass(type(exception), SanicException):
-            return text(
-                "Error: {}".format(exception),
-                status=getattr(exception, "status_code", 500),
-                headers=getattr(exception, "headers", dict()),
-            )
-        elif self.debug:
-            html_output = self._render_traceback_html(exception, request)
-            return html(html_output, status=500)
-        else:
-            return html(INTERNAL_SERVER_ERROR_HTML, status=500)
+        quiet = getattr(exception, "quiet", False)
+        if quiet is False:
+            try:
+                url = repr(request.url)
+            except AttributeError:
+                url = "unknown"
+
+            self.log(format_exc())
+            logger.exception("Exception occurred while handling uri: %s", url)
+
+        return exception_response(request, exception, self.debug)
class ContentRangeHandler: class ContentRangeHandler:

sanic/headers.py

@@ -3,6 +3,8 @@ import re
from typing import Any, Dict, Iterable, List, Optional, Tuple, Union from typing import Any, Dict, Iterable, List, Optional, Tuple, Union
from urllib.parse import unquote from urllib.parse import unquote
from sanic.helpers import STATUS_CODES
HeaderIterable = Iterable[Tuple[str, Any]] # Values convertible to str HeaderIterable = Iterable[Tuple[str, Any]] # Values convertible to str
Options = Dict[str, Union[int, str]] # key=value fields in various headers Options = Dict[str, Union[int, str]] # key=value fields in various headers
@@ -180,3 +182,19 @@ def format_http1(headers: HeaderIterable) -> bytes:
- Values are converted into strings if necessary. - Values are converted into strings if necessary.
""" """
return "".join(f"{name}: {val}\r\n" for name, val in headers).encode() return "".join(f"{name}: {val}\r\n" for name, val in headers).encode()
def format_http1_response(
status: int, headers: HeaderIterable, body=b""
) -> bytes:
"""Format a full HTTP/1.1 response.
- If `body` is included, content-length must be specified in headers.
"""
headerbytes = format_http1(headers)
return b"HTTP/1.1 %d %b\r\n%b\r\n%b" % (
status,
STATUS_CODES.get(status, b"UNKNOWN"),
headerbytes,
body,
)
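
format_http1_response builds the complete HTTP/1.1 byte response that the response classes now delegate to. A quick sketch:

from sanic.headers import format_http1_response

body = b"hello"
raw = format_http1_response(
    200,
    [("Content-Length", len(body)), ("Content-Type", "text/plain")],
    body,
)
# raw == b"HTTP/1.1 200 OK\r\nContent-Length: 5\r\n"
#        b"Content-Type: text/plain\r\n\r\nhello"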

sanic/reloader.py

@@ -3,7 +3,6 @@ import signal
import subprocess import subprocess
import sys import sys
from multiprocessing import Process
from time import sleep from time import sleep
@@ -35,101 +34,26 @@ def _iter_module_files():
def _get_args_for_reloading(): def _get_args_for_reloading():
"""Returns the executable.""" """Returns the executable."""
rv = [sys.executable]
main_module = sys.modules["__main__"] main_module = sys.modules["__main__"]
mod_spec = getattr(main_module, "__spec__", None) mod_spec = getattr(main_module, "__spec__", None)
if sys.argv[0] in ("", "-c"):
raise RuntimeError(
f"Autoreloader cannot work with argv[0]={sys.argv[0]!r}"
)
if mod_spec: if mod_spec:
# Parent exe was launched as a module rather than a script # Parent exe was launched as a module rather than a script
rv.extend(["-m", mod_spec.name]) return [sys.executable, "-m", mod_spec.name] + sys.argv[1:]
if len(sys.argv) > 1: return [sys.executable] + sys.argv
rv.extend(sys.argv[1:])
else:
rv.extend(sys.argv)
return rv
def restart_with_reloader(): def restart_with_reloader():
"""Create a new process and a subprocess in it with the same arguments as """Create a new process and a subprocess in it with the same arguments as
this one. this one.
""" """
cwd = os.getcwd() return subprocess.Popen(
args = _get_args_for_reloading() _get_args_for_reloading(),
new_environ = os.environ.copy() env={**os.environ, "SANIC_SERVER_RUNNING": "true"},
new_environ["SANIC_SERVER_RUNNING"] = "true"
cmd = " ".join(args)
worker_process = Process(
target=subprocess.call,
args=(cmd,),
kwargs={"cwd": cwd, "shell": True, "env": new_environ},
) )
worker_process.start()
return worker_process
def kill_process_children_unix(pid):
"""Find and kill child processes of a process (maximum two level).
:param pid: PID of parent process (process ID)
:return: Nothing
"""
root_process_path = "/proc/{pid}/task/{pid}/children".format(pid=pid)
if not os.path.isfile(root_process_path):
return
with open(root_process_path) as children_list_file:
children_list_pid = children_list_file.read().split()
for child_pid in children_list_pid:
children_proc_path = "/proc/%s/task/%s/children" % (
child_pid,
child_pid,
)
if not os.path.isfile(children_proc_path):
continue
with open(children_proc_path) as children_list_file_2:
children_list_pid_2 = children_list_file_2.read().split()
for _pid in children_list_pid_2:
try:
os.kill(int(_pid), signal.SIGTERM)
except ProcessLookupError:
continue
try:
os.kill(int(child_pid), signal.SIGTERM)
except ProcessLookupError:
continue
def kill_process_children_osx(pid):
"""Find and kill child processes of a process.
:param pid: PID of parent process (process ID)
:return: Nothing
"""
subprocess.run(["pkill", "-P", str(pid)])
def kill_process_children(pid):
"""Find and kill child processes of a process.
:param pid: PID of parent process (process ID)
:return: Nothing
"""
if sys.platform == "darwin":
kill_process_children_osx(pid)
elif sys.platform == "linux":
kill_process_children_unix(pid)
else:
pass # should signal error here
def kill_program_completly(proc):
"""Kill worker and it's child processes and exit.
:param proc: worker process (process ID)
:return: Nothing
"""
kill_process_children(proc.pid)
proc.terminate()
os._exit(0)
def watchdog(sleep_interval): def watchdog(sleep_interval):
@@ -138,30 +62,42 @@ def watchdog(sleep_interval):
:param sleep_interval: interval in second. :param sleep_interval: interval in second.
:return: Nothing :return: Nothing
""" """
def interrupt_self(*args):
raise KeyboardInterrupt
mtimes = {} mtimes = {}
signal.signal(signal.SIGTERM, interrupt_self)
if os.name == "nt":
signal.signal(signal.SIGBREAK, interrupt_self)
worker_process = restart_with_reloader() worker_process = restart_with_reloader()
signal.signal(
signal.SIGTERM, lambda *args: kill_program_completly(worker_process)
)
signal.signal(
signal.SIGINT, lambda *args: kill_program_completly(worker_process)
)
while True:
for filename in _iter_module_files():
try:
mtime = os.stat(filename).st_mtime
except OSError:
continue
old_time = mtimes.get(filename) try:
if old_time is None: while True:
mtimes[filename] = mtime need_reload = False
continue
elif mtime > old_time: for filename in _iter_module_files():
kill_process_children(worker_process.pid) try:
mtime = os.stat(filename).st_mtime
except OSError:
continue
old_time = mtimes.get(filename)
if old_time is None:
mtimes[filename] = mtime
elif mtime > old_time:
mtimes[filename] = mtime
need_reload = True
if need_reload:
worker_process.terminate() worker_process.terminate()
worker_process.wait()
worker_process = restart_with_reloader() worker_process = restart_with_reloader()
mtimes[filename] = mtime
break
sleep(sleep_interval) sleep(sleep_interval)
except KeyboardInterrupt:
pass
finally:
worker_process.terminate()
worker_process.wait()
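
The rewritten reloader runs the worker via subprocess.Popen, restarts it when a watched module's mtime changes, and terminates it cleanly on exit. It is still driven by the usual flag; a sketch of a hypothetical entry point:

from sanic import Sanic
from sanic.response import text

app = Sanic("example")  # hypothetical name

@app.route("/")
async def index(request):
    return text("ok")

if __name__ == "__main__":
    # auto_reload starts the watchdog above in the parent process
    app.run(host="127.0.0.1", port=8000, auto_reload=True)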

sanic/request.py

@@ -1,6 +1,5 @@
import asyncio import asyncio
import email.utils import email.utils
import warnings
from collections import defaultdict, namedtuple from collections import defaultdict, namedtuple
from http.cookies import SimpleCookie from http.cookies import SimpleCookie
@@ -56,6 +55,14 @@ class StreamBuffer:
self._queue.task_done() self._queue.task_done()
return payload return payload
async def __aiter__(self):
"""Support `async for data in request.stream`"""
while True:
data = await self.read()
if not data:
break
yield data
async def put(self, payload): async def put(self, payload):
await self._queue.put(payload) await self._queue.put(payload)
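
With __aiter__, a streaming handler can consume the request body with a plain async for. Sketch (the route is hypothetical):

from sanic import Sanic
from sanic.response import text

app = Sanic("example")  # hypothetical name

@app.post("/upload", stream=True)
async def upload(request):
    received = 0
    async for chunk in request.stream:   # new in this release
        received += len(chunk)
    return text(f"received {received} bytes")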
@@ -80,6 +87,7 @@ class Request:
"_socket", "_socket",
"app", "app",
"body", "body",
"conn_info",
"ctx", "ctx",
"endpoint", "endpoint",
"headers", "headers",
@@ -110,6 +118,7 @@ class Request:
# Init but do not inhale # Init but do not inhale
self.body_init() self.body_init()
self.conn_info = None
self.ctx = SimpleNamespace() self.ctx = SimpleNamespace()
self.parsed_forwarded = None self.parsed_forwarded = None
self.parsed_json = None self.parsed_json = None
@@ -123,44 +132,37 @@ class Request:
self.endpoint = None self.endpoint = None
    def __repr__(self):
-        return "<{0}: {1} {2}>".format(
-            self.__class__.__name__, self.method, self.path
-        )
+        class_name = self.__class__.__name__
+        return f"<{class_name}: {self.method} {self.path}>"
def get(self, key, default=None):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
return self.ctx.__dict__.get(key, default)
def __contains__(self, key):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
return key in self.ctx.__dict__
def __getitem__(self, key):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
return self.ctx.__dict__[key]
def __delitem__(self, key):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
del self.ctx.__dict__[key]
def __setitem__(self, key, value):
""".. deprecated:: 19.9
Custom context is now stored in `request.custom_context.yourkey`"""
setattr(self.ctx, key, value)
def body_init(self): def body_init(self):
""".. deprecated:: 20.3"""
self.body = [] self.body = []
def body_push(self, data): def body_push(self, data):
""".. deprecated:: 20.3"""
self.body.append(data) self.body.append(data)
def body_finish(self): def body_finish(self):
""".. deprecated:: 20.3"""
self.body = b"".join(self.body) self.body = b"".join(self.body)
async def receive_body(self):
"""Receive request.body, if not already received.
Streaming handlers may call this to receive the full body.
This is added as a compatibility shim in Sanic 20.3 because future
versions of Sanic will make all requests streaming and will use this
function instead of the non-async body_init/push/finish functions.
Please make an issue if your code depends on the old functionality and
cannot be upgraded to the new API.
"""
if not self.stream:
return
self.body = b"".join([data async for data in self.stream])
@property @property
def json(self): def json(self):
if self.parsed_json is None: if self.parsed_json is None:
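
receive_body lets a streaming handler pull the full body into request.body before using the usual accessors. Sketch (route hypothetical):

from sanic import Sanic
from sanic.response import json

app = Sanic("example")  # hypothetical name

@app.post("/echo", stream=True)
async def echo(request):
    await request.receive_body()             # drains request.stream into request.body
    return json({"size": len(request.body)})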
@@ -282,18 +284,6 @@ class Request:
args = property(get_args) args = property(get_args)
@property
def raw_args(self) -> dict:
if self.app.debug: # pragma: no cover
warnings.simplefilter("default")
warnings.warn(
"Use of raw_args will be deprecated in "
"the future versions. Please use args or query_args "
"properties instead",
DeprecationWarning,
)
return {k: v[0] for k, v in self.args.items()}
def get_query_args( def get_query_args(
self, self,
keep_blank_values: bool = False, keep_blank_values: bool = False,
@@ -361,56 +351,55 @@ class Request:
self._cookies = {} self._cookies = {}
return self._cookies return self._cookies
@property
def content_type(self):
return self.headers.get("Content-Type", DEFAULT_HTTP_CONTENT_TYPE)
@property
def match_info(self):
"""return matched info after resolving route"""
return self.app.router.get(self)[2]
# Transport properties (obtained from local interface only)
@property @property
def ip(self): def ip(self):
""" """
:return: peer ip of the socket :return: peer ip of the socket
""" """
if not hasattr(self, "_socket"): return self.conn_info.client if self.conn_info else ""
self._get_address()
return self._ip
@property @property
def port(self): def port(self):
""" """
:return: peer port of the socket :return: peer port of the socket
""" """
if not hasattr(self, "_socket"): return self.conn_info.client_port if self.conn_info else 0
self._get_address()
return self._port
@property @property
def socket(self): def socket(self):
if not hasattr(self, "_socket"): return self.conn_info.peername if self.conn_info else (None, None)
self._get_address()
return self._socket
def _get_address(self):
self._socket = self.transport.get_extra_info("peername") or (
None,
None,
)
self._ip = self._socket[0]
self._port = self._socket[1]
@property @property
def server_name(self): def path(self) -> str:
""" """Path of the local HTTP request."""
Attempt to get the server's external hostname in this order: return self._parsed_url.path.decode("utf-8")
`config.SERVER_NAME`, proxied or direct Host headers
:func:`Request.host`
:return: the server name without port number # Proxy properties (using SERVER_NAME/forwarded/request/transport info)
:rtype: str
"""
server_name = self.app.config.get("SERVER_NAME")
if server_name:
host = server_name.split("//", 1)[-1].split("/", 1)[0]
return parse_host(host)[0]
return parse_host(self.host)[0]
@property @property
def forwarded(self): def forwarded(self):
"""
Active proxy information obtained from request headers, as specified in
Sanic configuration.
Field names by, for, proto, host, port and path are normalized.
- for and by IPv6 addresses are bracketed
- port (int) is only set by port headers, not from host.
- path is url-unencoded
Additional values may be available from new style Forwarded headers.
"""
if self.parsed_forwarded is None: if self.parsed_forwarded is None:
self.parsed_forwarded = ( self.parsed_forwarded = (
parse_forwarded(self.headers, self.app.config) parse_forwarded(self.headers, self.app.config)
@@ -420,50 +409,30 @@ class Request:
return self.parsed_forwarded return self.parsed_forwarded
@property @property
def server_port(self): def remote_addr(self) -> str:
""" """
Attempt to get the server's external port number in this order: Client IP address, if available.
`config.SERVER_NAME`, proxied or direct Host headers 1. proxied remote address `self.forwarded['for']`
:func:`Request.host`, 2. local remote address `self.ip`
actual port used by the transport layer socket. :return: IPv4, bracketed IPv6, UNIX socket name or arbitrary string
:return: server port
:rtype: int
"""
if self.forwarded:
return self.forwarded.get("port") or (
80 if self.scheme in ("http", "ws") else 443
)
return (
parse_host(self.host)[1]
or self.transport.get_extra_info("sockname")[1]
)
@property
def remote_addr(self):
"""Attempt to return the original client ip based on `forwarded`,
`x-forwarded-for` or `x-real-ip`. If HTTP headers are unavailable or
untrusted, returns an empty string.
:return: original client ip.
""" """
if not hasattr(self, "_remote_addr"): if not hasattr(self, "_remote_addr"):
self._remote_addr = self.forwarded.get("for", "") self._remote_addr = self.forwarded.get("for", "") # or self.ip
return self._remote_addr return self._remote_addr
@property @property
def scheme(self): def scheme(self) -> str:
""" """
Attempt to get the request scheme. Determine request scheme.
Seeking the value in this order: 1. `config.SERVER_NAME` if in full URL format
`forwarded` header, `x-forwarded-proto` header, 2. proxied proto/scheme
`x-scheme` header, the sanic app itself. 3. local connection protocol
:return: http|https|ws|wss or arbitrary value given by the headers. :return: http|https|ws|wss or arbitrary value given by the headers.
:rtype: str
""" """
forwarded_proto = self.forwarded.get("proto") if "//" in self.app.config.get("SERVER_NAME", ""):
if forwarded_proto: return self.app.config.SERVER_NAME.split("//")[0]
return forwarded_proto if "proto" in self.forwarded:
return self.forwarded["proto"]
if ( if (
self.app.websocket_enabled self.app.websocket_enabled
@@ -479,25 +448,41 @@ class Request:
return scheme return scheme
@property @property
def host(self): def host(self) -> str:
""" """
:return: proxied or direct Host header. Hostname and port number may be The currently effective server 'host' (hostname or hostname:port).
separated by sanic.headers.parse_host(request.host). 1. `config.SERVER_NAME` overrides any client headers
2. proxied host of original request
3. request host header
hostname and port may be separated by
`sanic.headers.parse_host(request.host)`.
:return: the first matching host found, or empty string
""" """
return self.forwarded.get("host", self.headers.get("Host", "")) server_name = self.app.config.get("SERVER_NAME")
if server_name:
return server_name.split("//", 1)[-1].split("/", 1)[0]
return self.forwarded.get("host") or self.headers.get("host", "")
@property @property
def content_type(self): def server_name(self) -> str:
return self.headers.get("Content-Type", DEFAULT_HTTP_CONTENT_TYPE) """The hostname the client connected to, by `request.host`."""
return parse_host(self.host)[0] or ""
@property @property
def match_info(self): def server_port(self) -> int:
"""return matched info after resolving route""" """
return self.app.router.get(self)[2] The port the client connected to, by forwarded `port` or
`request.host`.
Default port is returned as 80 and 443 based on `request.scheme`.
"""
port = self.forwarded.get("port") or parse_host(self.host)[1]
return port or (80 if self.scheme in ("http", "ws") else 443)
@property @property
def path(self): def server_path(self) -> str:
return self._parsed_url.path.decode("utf-8") """Full path of current URL. Uses proxied or local path."""
return self.forwarded.get("path") or self.path
@property @property
def query_string(self): def query_string(self):
@@ -538,7 +523,7 @@ class Request:
): ):
netloc = host netloc = host
else: else:
netloc = "{}:{}".format(host, port) netloc = f"{host}:{port}"
return self.app.url_for( return self.app.url_for(
view_name, _external=True, _scheme=scheme, _server=netloc, **kwargs view_name, _external=True, _scheme=scheme, _server=netloc, **kwargs
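
The reworked properties resolve scheme/host/port from SERVER_NAME first, then forwarded headers, then the transport. A sketch of the SERVER_NAME override (values hypothetical):

from sanic import Sanic

app = Sanic("example")                          # hypothetical name
app.config.SERVER_NAME = "https://example.com"  # full-URL form also fixes the scheme

# For any request handled by this app:
#   request.host        -> "example.com"   (SERVER_NAME wins over client headers)
#   request.server_name -> "example.com"
#   request.server_port -> 443             (no explicit port; non-http scheme default)
#   request.server_path -> request.path    (no proxied path in play)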

sanic/response.py

@@ -1,34 +1,29 @@
import warnings
from functools import partial from functools import partial
from mimetypes import guess_type from mimetypes import guess_type
from os import path from os import path
from urllib.parse import quote_plus from urllib.parse import quote_plus
from aiofiles import open as open_async # type: ignore from sanic.compat import Header, open_async
from sanic.compat import Header
from sanic.cookies import CookieJar from sanic.cookies import CookieJar
from sanic.headers import format_http1 from sanic.headers import format_http1, format_http1_response
from sanic.helpers import STATUS_CODES, has_message_body, remove_entity_headers from sanic.helpers import has_message_body, remove_entity_headers
try: try:
from ujson import dumps as json_dumps from ujson import dumps as json_dumps
except ImportError: except ImportError:
from json import dumps
# This is done in order to ensure that the JSON response is # This is done in order to ensure that the JSON response is
# kept consistent across both ujson and inbuilt json usage. # kept consistent across both ujson and inbuilt json usage.
from json import dumps
json_dumps = partial(dumps, separators=(",", ":")) json_dumps = partial(dumps, separators=(",", ":"))
class BaseHTTPResponse: class BaseHTTPResponse:
    def _encode_body(self, data):
-        try:
-            # Try to encode it regularly
-            return data.encode()
-        except AttributeError:
-            # Convert it to a str if you can't
-            return str(data).encode()
+        return data.encode() if hasattr(data, "encode") else data
def _parse_headers(self): def _parse_headers(self):
return format_http1(self.headers.items()) return format_http1(self.headers.items())
@@ -39,6 +34,32 @@ class BaseHTTPResponse:
self._cookies = CookieJar(self.headers) self._cookies = CookieJar(self.headers)
return self._cookies return self._cookies
def get_headers(
self,
version="1.1",
keep_alive=False,
keep_alive_timeout=None,
body=b"",
):
""".. deprecated:: 20.3:
This function is not public API and will be removed."""
# self.headers get priority over content_type
if self.content_type and "Content-Type" not in self.headers:
self.headers["Content-Type"] = self.content_type
if keep_alive:
self.headers["Connection"] = "keep-alive"
if keep_alive_timeout is not None:
self.headers["Keep-Alive"] = keep_alive_timeout
else:
self.headers["Connection"] = "close"
if self.status in (304, 412):
self.headers = remove_entity_headers(self.headers)
return format_http1_response(self.status, self.headers.items(), body)
class StreamingHTTPResponse(BaseHTTPResponse): class StreamingHTTPResponse(BaseHTTPResponse):
__slots__ = ( __slots__ = (
@@ -56,7 +77,7 @@ class StreamingHTTPResponse(BaseHTTPResponse):
streaming_fn, streaming_fn,
status=200, status=200,
headers=None, headers=None,
content_type="text/plain", content_type="text/plain; charset=utf-8",
chunked=True, chunked=True,
): ):
self.content_type = content_type self.content_type = content_type
@@ -65,14 +86,14 @@ class StreamingHTTPResponse(BaseHTTPResponse):
self.headers = Header(headers or {}) self.headers = Header(headers or {})
self.chunked = chunked self.chunked = chunked
self._cookies = None self._cookies = None
self.protocol = None
    async def write(self, data):
        """Writes a chunk of data to the streaming response.

-        :param data: bytes-ish data to be written.
+        :param data: str or bytes-ish data to be written.
        """
-        if type(data) != bytes:
-            data = self._encode_body(data)
+        data = self._encode_body(data)
if self.chunked: if self.chunked:
await self.protocol.push_data(b"%x\r\n%b\r\n" % (len(data), data)) await self.protocol.push_data(b"%x\r\n%b\r\n" % (len(data), data))
@@ -104,33 +125,11 @@ class StreamingHTTPResponse(BaseHTTPResponse):
def get_headers( def get_headers(
self, version="1.1", keep_alive=False, keep_alive_timeout=None self, version="1.1", keep_alive=False, keep_alive_timeout=None
): ):
# This is all returned in a kind-of funky way
# We tried to make this as fast as possible in pure python
timeout_header = b""
if keep_alive and keep_alive_timeout is not None:
timeout_header = b"Keep-Alive: %d\r\n" % keep_alive_timeout
if self.chunked and version == "1.1": if self.chunked and version == "1.1":
self.headers["Transfer-Encoding"] = "chunked" self.headers["Transfer-Encoding"] = "chunked"
self.headers.pop("Content-Length", None) self.headers.pop("Content-Length", None)
self.headers["Content-Type"] = self.headers.get(
"Content-Type", self.content_type
)
headers = self._parse_headers() return super().get_headers(version, keep_alive, keep_alive_timeout)
if self.status == 200:
status = b"OK"
else:
status = STATUS_CODES.get(self.status)
return (b"HTTP/%b %d %b\r\n" b"%b" b"%b\r\n") % (
version.encode(),
self.status,
status,
timeout_header,
headers,
)
class HTTPResponse(BaseHTTPResponse): class HTTPResponse(BaseHTTPResponse):
@@ -145,23 +144,18 @@ class HTTPResponse(BaseHTTPResponse):
body_bytes=b"", body_bytes=b"",
): ):
self.content_type = content_type self.content_type = content_type
self.body = body_bytes if body is None else self._encode_body(body)
if body is not None:
self.body = self._encode_body(body)
else:
self.body = body_bytes
self.status = status self.status = status
self.headers = Header(headers or {}) self.headers = Header(headers or {})
self._cookies = None self._cookies = None
def output(self, version="1.1", keep_alive=False, keep_alive_timeout=None): if body_bytes:
# This is all returned in a kind-of funky way warnings.warn(
# We tried to make this as fast as possible in pure python "Parameter `body_bytes` is deprecated, use `body` instead",
timeout_header = b"" DeprecationWarning,
if keep_alive and keep_alive_timeout is not None: )
timeout_header = b"Keep-Alive: %d\r\n" % keep_alive_timeout
def output(self, version="1.1", keep_alive=False, keep_alive_timeout=None):
body = b"" body = b""
if has_message_body(self.status): if has_message_body(self.status):
body = self.body body = self.body
@@ -169,31 +163,7 @@ class HTTPResponse(BaseHTTPResponse):
"Content-Length", len(self.body) "Content-Length", len(self.body)
) )
# self.headers get priority over content_type return self.get_headers(version, keep_alive, keep_alive_timeout, body)
if self.content_type and "Content-Type" not in self.headers:
self.headers["Content-Type"] = self.content_type
if self.status in (304, 412):
self.headers = remove_entity_headers(self.headers)
headers = self._parse_headers()
if self.status == 200:
status = b"OK"
else:
status = STATUS_CODES.get(self.status, b"UNKNOWN RESPONSE")
return (
b"HTTP/%b %d %b\r\n" b"Connection: %b\r\n" b"%b" b"%b\r\n" b"%b"
) % (
version.encode(),
self.status,
status,
b"keep-alive" if keep_alive else b"close",
timeout_header,
headers,
body,
)
@property @property
def cookies(self): def cookies(self):
@@ -202,16 +172,14 @@ class HTTPResponse(BaseHTTPResponse):
return self._cookies return self._cookies
-def empty(
-    status=204, headers=None,
-):
+def empty(status=204, headers=None):
    """
    Returns an empty response to the client.

    :param status Response code.
    :param headers Custom Headers.
    """
-    return HTTPResponse(body_bytes=b"", status=status, headers=headers,)
+    return HTTPResponse(body=b"", status=status, headers=headers)
def json( def json(
@@ -220,7 +188,7 @@ def json(
headers=None, headers=None,
content_type="application/json", content_type="application/json",
dumps=json_dumps, dumps=json_dumps,
**kwargs **kwargs,
): ):
""" """
Returns response object with body in json format. Returns response object with body in json format.
@@ -249,6 +217,21 @@ def text(
:param headers: Custom Headers. :param headers: Custom Headers.
:param content_type: the content type (string) of the response :param content_type: the content type (string) of the response
""" """
if not isinstance(body, str):
warnings.warn(
"Types other than str will be deprecated in future versions for"
f" response.text, got type {type(body).__name__})",
DeprecationWarning,
)
# Type conversions are deprecated and quite b0rked but still supported for
# text() until applications get fixed. This try-except should be removed.
try:
# Avoid repr(body).encode() b0rkage for body that is already encoded.
# memoryview used only to test bytes-ishness.
with memoryview(body):
pass
except TypeError:
body = f"{body}" # no-op if body is already str
return HTTPResponse( return HTTPResponse(
body, status=status, headers=headers, content_type=content_type body, status=status, headers=headers, content_type=content_type
) )
@@ -266,7 +249,7 @@ def raw(
:param content_type: the content type (string) of the response. :param content_type: the content type (string) of the response.
""" """
return HTTPResponse( return HTTPResponse(
body_bytes=body, body=body,
status=status, status=status,
headers=headers, headers=headers,
content_type=content_type, content_type=content_type,
@@ -277,10 +260,14 @@ def html(body, status=200, headers=None):
""" """
Returns response object with body in html format. Returns response object with body in html format.
:param body: Response data to be encoded. :param body: str or bytes-ish, or an object with __html__ or _repr_html_.
:param status: Response code. :param status: Response code.
:param headers: Custom Headers. :param headers: Custom Headers.
""" """
if hasattr(body, "__html__"):
body = body.__html__()
elif hasattr(body, "_repr_html_"):
body = body._repr_html_()
return HTTPResponse( return HTTPResponse(
body, body,
status=status, status=status,
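
html() now also accepts objects exposing __html__ or _repr_html_ (as many template engines do). Sketch with a hypothetical class:

from sanic import Sanic
from sanic.response import html

app = Sanic("example")  # hypothetical name

class Page:
    def __init__(self, title):
        self.title = title

    def __html__(self):              # picked up automatically by html()
        return f"<h1>{self.title}</h1>"

@app.route("/")
async def index(request):
    return html(Page("Hello"))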
@@ -308,29 +295,27 @@ async def file(
headers = headers or {} headers = headers or {}
if filename: if filename:
headers.setdefault( headers.setdefault(
"Content-Disposition", 'attachment; filename="{}"'.format(filename) "Content-Disposition", f'attachment; filename="{filename}"'
) )
filename = filename or path.split(location)[-1] filename = filename or path.split(location)[-1]
async with open_async(location, mode="rb") as _file: async with await open_async(location, mode="rb") as f:
if _range: if _range:
await _file.seek(_range.start) await f.seek(_range.start)
out_stream = await _file.read(_range.size) out_stream = await f.read(_range.size)
headers["Content-Range"] = "bytes %s-%s/%s" % ( headers[
_range.start, "Content-Range"
_range.end, ] = f"bytes {_range.start}-{_range.end}/{_range.total}"
_range.total,
)
status = 206 status = 206
else: else:
out_stream = await _file.read() out_stream = await f.read()
mime_type = mime_type or guess_type(filename)[0] or "text/plain" mime_type = mime_type or guess_type(filename)[0] or "text/plain"
return HTTPResponse( return HTTPResponse(
body=out_stream,
status=status, status=status,
headers=headers, headers=headers,
content_type=mime_type, content_type=mime_type,
body_bytes=out_stream,
) )
@@ -357,43 +342,36 @@ async def file_stream(
headers = headers or {} headers = headers or {}
if filename: if filename:
headers.setdefault( headers.setdefault(
"Content-Disposition", 'attachment; filename="{}"'.format(filename) "Content-Disposition", f'attachment; filename="{filename}"'
) )
filename = filename or path.split(location)[-1] filename = filename or path.split(location)[-1]
mime_type = mime_type or guess_type(filename)[0] or "text/plain"
if _range:
start = _range.start
end = _range.end
total = _range.total
_file = await open_async(location, mode="rb") headers["Content-Range"] = f"bytes {start}-{end}/{total}"
status = 206
async def _streaming_fn(response): async def _streaming_fn(response):
nonlocal _file, chunk_size async with await open_async(location, mode="rb") as f:
try:
if _range: if _range:
chunk_size = min((_range.size, chunk_size)) await f.seek(_range.start)
await _file.seek(_range.start)
to_send = _range.size to_send = _range.size
while to_send > 0: while to_send > 0:
content = await _file.read(chunk_size) content = await f.read(min((_range.size, chunk_size)))
if len(content) < 1: if len(content) < 1:
break break
to_send -= len(content) to_send -= len(content)
await response.write(content) await response.write(content)
else: else:
while True: while True:
content = await _file.read(chunk_size) content = await f.read(chunk_size)
if len(content) < 1: if len(content) < 1:
break break
await response.write(content) await response.write(content)
finally:
await _file.close()
return # Returning from this fn closes the stream
mime_type = mime_type or guess_type(filename)[0] or "text/plain"
if _range:
headers["Content-Range"] = "bytes %s-%s/%s" % (
_range.start,
_range.end,
_range.total,
)
status = 206
return StreamingHTTPResponse( return StreamingHTTPResponse(
streaming_fn=_streaming_fn, streaming_fn=_streaming_fn,
status=status, status=status,
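
Both file helpers now open files through sanic.compat.open_async and build Content-Range with f-strings; file_stream also opens the file lazily inside the streaming function. Typical handler usage (paths hypothetical):

from sanic import Sanic
from sanic.response import file, file_stream

app = Sanic("example")  # hypothetical name

@app.route("/report")
async def report(request):
    # Reads the whole file; pass _range for partial content if needed
    return await file("/srv/data/report.pdf", filename="report.pdf")

@app.route("/video")
async def video(request):
    # Streams in chunks; the file handle lives only inside the response
    return await file_stream("/srv/data/video.mp4", chunk_size=8192)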

sanic/router.py

@@ -109,7 +109,7 @@ class Router:
name, pattern = parameter_string.split(":", 1) name, pattern = parameter_string.split(":", 1)
if not name: if not name:
raise ValueError( raise ValueError(
"Invalid parameter syntax: {}".format(parameter_string) f"Invalid parameter syntax: {parameter_string}"
) )
default = (str, pattern) default = (str, pattern)
@@ -143,7 +143,7 @@ class Router:
routes = [] routes = []
if version is not None: if version is not None:
version = re.escape(str(version).strip("/").lstrip("v")) version = re.escape(str(version).strip("/").lstrip("v"))
uri = "/".join(["/v{}".format(version), uri.lstrip("/")]) uri = "/".join([f"/v{version}", uri.lstrip("/")])
# add regular version # add regular version
routes.append(self._add(uri, methods, handler, host, name)) routes.append(self._add(uri, methods, handler, host, name))
@@ -203,8 +203,8 @@ class Router:
else: else:
if not isinstance(host, Iterable): if not isinstance(host, Iterable):
raise ValueError( raise ValueError(
"Expected either string or Iterable of " f"Expected either string or Iterable of "
"host strings, not {!r}".format(host) f"host strings, not {host!r}"
) )
for host_ in host: for host_ in host:
@@ -225,8 +225,7 @@ class Router:
if name in parameter_names: if name in parameter_names:
raise ParameterNameConflicts( raise ParameterNameConflicts(
"Multiple parameter named <{name}> " f"Multiple parameter named <{name}> " f"in route uri {uri}"
"in route uri {uri}".format(name=name, uri=uri)
) )
parameter_names.add(name) parameter_names.add(name)
@@ -240,23 +239,23 @@ class Router:
elif re.search(r"/", pattern): elif re.search(r"/", pattern):
properties["unhashable"] = True properties["unhashable"] = True
return "({})".format(pattern) return f"({pattern})"
pattern_string = re.sub(self.parameter_pattern, add_parameter, uri) pattern_string = re.sub(self.parameter_pattern, add_parameter, uri)
pattern = re.compile(r"^{}$".format(pattern_string)) pattern = re.compile(fr"^{pattern_string}$")
def merge_route(route, methods, handler): def merge_route(route, methods, handler):
# merge to the existing route when possible. # merge to the existing route when possible.
if not route.methods or not methods: if not route.methods or not methods:
# method-unspecified routes are not mergeable. # method-unspecified routes are not mergeable.
raise RouteExists("Route already registered: {}".format(uri)) raise RouteExists(f"Route already registered: {uri}")
elif route.methods.intersection(methods): elif route.methods.intersection(methods):
# already existing method is not overloadable. # already existing method is not overloadable.
duplicated = methods.intersection(route.methods) duplicated = methods.intersection(route.methods)
duplicated_methods = ",".join(list(duplicated))
raise RouteExists( raise RouteExists(
"Route already registered: {} [{}]".format( f"Route already registered: {uri} [{duplicated_methods}]"
uri, ",".join(list(duplicated))
)
) )
if isinstance(route.handler, CompositionView): if isinstance(route.handler, CompositionView):
view = route.handler view = route.handler
@@ -296,9 +295,9 @@ class Router:
name = name.split("_static_", 1)[-1] name = name.split("_static_", 1)[-1]
if hasattr(handler, "__blueprintname__"): if hasattr(handler, "__blueprintname__"):
handler_name = "{}.{}".format( bp_name = handler.__blueprintname__
handler.__blueprintname__, name or handler.__name__
) handler_name = f"{bp_name}.{name or handler.__name__}"
else: else:
handler_name = name or getattr(handler, "__name__", None) handler_name = name or getattr(handler, "__name__", None)
@@ -352,37 +351,6 @@ class Router:
else: else:
return -1, None return -1, None
def remove(self, uri, clean_cache=True, host=None):
if host is not None:
uri = host + uri
try:
route = self.routes_all.pop(uri)
for handler_name, pairs in self.routes_names.items():
if pairs[0] == uri:
self.routes_names.pop(handler_name)
break
for handler_name, pairs in self.routes_static_files.items():
if pairs[0] == uri:
self.routes_static_files.pop(handler_name)
break
except KeyError:
raise RouteDoesNotExist("Route was not registered: {}".format(uri))
if route in self.routes_always_check:
self.routes_always_check.remove(route)
elif (
url_hash(uri) in self.routes_dynamic
and route in self.routes_dynamic[url_hash(uri)]
):
self.routes_dynamic[url_hash(uri)].remove(route)
else:
self.routes_static.pop(uri)
if clean_cache:
self._get.cache_clear()
@lru_cache(maxsize=ROUTER_CACHE_SIZE) @lru_cache(maxsize=ROUTER_CACHE_SIZE)
def find_route_by_view_name(self, view_name, name=None): def find_route_by_view_name(self, view_name, name=None):
"""Find a route in the router based on the specified view name. """Find a route in the router based on the specified view name.
@@ -442,7 +410,7 @@ class Router:
# Check against known static routes # Check against known static routes
route = self.routes_static.get(url) route = self.routes_static.get(url)
method_not_supported = MethodNotSupported( method_not_supported = MethodNotSupported(
"Method {} not allowed for URL {}".format(method, url), f"Method {method} not allowed for URL {url}",
method=method, method=method,
allowed_methods=self.get_supported_methods(url), allowed_methods=self.get_supported_methods(url),
) )
@@ -472,7 +440,7 @@ class Router:
# Route was found but the methods didn't match # Route was found but the methods didn't match
if route_found: if route_found:
raise method_not_supported raise method_not_supported
raise NotFound("Requested URL {} not found".format(url)) raise NotFound(f"Requested URL {url} not found")
kwargs = { kwargs = {
p.name: p.cast(value) p.name: p.cast(value)
@@ -484,7 +452,7 @@ class Router:
return route_handler, [], kwargs, route.uri, route.name return route_handler, [], kwargs, route.uri, route.name
def is_stream_handler(self, request): def is_stream_handler(self, request):
""" Handler for request is stream or not. """Handler for request is stream or not.
:param request: Request object :param request: Request object
:return: bool :return: bool
""" """

View File

@@ -1,20 +1,26 @@
import asyncio import asyncio
import multiprocessing
import os import os
import secrets
import socket
import stat
import sys
import traceback import traceback
from collections import deque from collections import deque
from functools import partial from functools import partial
from inspect import isawaitable from inspect import isawaitable
from multiprocessing import Process from ipaddress import ip_address
from signal import SIG_IGN, SIGINT, SIGTERM, Signals from signal import SIG_IGN, SIGINT, SIGTERM, Signals
from signal import signal as signal_func from signal import signal as signal_func
from socket import SO_REUSEADDR, SOL_SOCKET, socket
from time import time from time import time
from typing import Dict, Type, Union
from httptools import HttpRequestParser # type: ignore from httptools import HttpRequestParser # type: ignore
from httptools.parser.errors import HttpParserError # type: ignore from httptools.parser.errors import HttpParserError # type: ignore
from sanic.compat import Header from sanic.compat import Header, ctrlc_workaround_for_windows
from sanic.config import Config
from sanic.exceptions import ( from sanic.exceptions import (
HeaderExpectationFailed, HeaderExpectationFailed,
InvalidUsage, InvalidUsage,
@@ -36,11 +42,48 @@ try:
except ImportError: except ImportError:
pass pass
OS_IS_WINDOWS = os.name == "nt"
class Signal: class Signal:
stopped = False stopped = False
class ConnInfo:
"""Local and remote addresses and SSL status info."""
__slots__ = (
"sockname",
"peername",
"server",
"server_port",
"client",
"client_port",
"ssl",
)
def __init__(self, transport, unix=None):
self.ssl = bool(transport.get_extra_info("sslcontext"))
self.server = self.client = ""
self.server_port = self.client_port = 0
self.peername = None
self.sockname = addr = transport.get_extra_info("sockname")
if isinstance(addr, str): # UNIX socket
self.server = unix or addr
return
# IPv4 (ip, port) or IPv6 (ip, port, flowinfo, scopeid)
if isinstance(addr, tuple):
self.server = addr[0] if len(addr) == 2 else f"[{addr[0]}]"
self.server_port = addr[1]
# self.server gets non-standard port appended
if addr[1] != (443 if self.ssl else 80):
self.server = f"{self.server}:{addr[1]}"
self.peername = addr = transport.get_extra_info("peername")
if isinstance(addr, tuple):
self.client = addr[0] if len(addr) == 2 else f"[{addr[0]}]"
self.client_port = addr[1]
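An illustrative sketch (not part of any file in this changeset) with a made-up transport object, showing how the parsing above renders a non-standard port into the server string:

class _FakeTransport:
    # exposes only what ConnInfo reads
    _extra = {"sockname": ("127.0.0.1", 8000), "peername": ("10.0.0.5", 54321)}

    def get_extra_info(self, name):
        return self._extra.get(name)

info = ConnInfo(_FakeTransport())
assert info.ssl is False                 # no "sslcontext" in the extra info
assert info.server == "127.0.0.1:8000"   # port differs from 80/443, so it is appended
assert info.client == "10.0.0.5" and info.client_port == 54321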
class HttpProtocol(asyncio.Protocol): class HttpProtocol(asyncio.Protocol):
""" """
This class provides a basic HTTP implementation of the sanic framework. This class provides a basic HTTP implementation of the sanic framework.
@@ -54,6 +97,7 @@ class HttpProtocol(asyncio.Protocol):
"transport", "transport",
"connections", "connections",
"signal", "signal",
"conn_info",
# request params # request params
"parser", "parser",
"request", "request",
@@ -68,7 +112,6 @@ class HttpProtocol(asyncio.Protocol):
"request_buffer_queue_size", "request_buffer_queue_size",
"request_class", "request_class",
"is_request_stream", "is_request_stream",
"router",
"error_handler", "error_handler",
# enable or disable access log purpose # enable or disable access log purpose
"access_log", "access_log",
@@ -86,7 +129,7 @@ class HttpProtocol(asyncio.Protocol):
"_keep_alive", "_keep_alive",
"_header_fragment", "_header_fragment",
"state", "state",
"_debug", "_unix",
"_body_chunks", "_body_chunks",
) )
@@ -95,46 +138,38 @@ class HttpProtocol(asyncio.Protocol):
*, *,
loop, loop,
app, app,
request_handler,
error_handler,
signal=Signal(), signal=Signal(),
connections=None, connections=None,
request_timeout=60,
response_timeout=60,
keep_alive_timeout=5,
request_max_size=None,
request_buffer_queue_size=100,
request_class=None,
access_log=True,
keep_alive=True,
is_request_stream=False,
router=None,
state=None, state=None,
debug=False, unix=None,
**kwargs **kwargs,
): ):
asyncio.set_event_loop(loop)
self.loop = loop self.loop = loop
deprecated_loop = self.loop if sys.version_info < (3, 7) else None
self.app = app self.app = app
self.transport = None self.transport = None
self.conn_info = None
self.request = None self.request = None
self.parser = None self.parser = None
self.url = None self.url = None
self.headers = None self.headers = None
self.router = router
self.signal = signal self.signal = signal
self.access_log = access_log self.access_log = self.app.config.ACCESS_LOG
self.connections = connections if connections is not None else set() self.connections = connections if connections is not None else set()
self.request_handler = request_handler self.request_handler = self.app.handle_request
self.error_handler = error_handler self.error_handler = self.app.error_handler
self.request_timeout = request_timeout self.request_timeout = self.app.config.REQUEST_TIMEOUT
self.request_buffer_queue_size = request_buffer_queue_size self.request_buffer_queue_size = (
self.response_timeout = response_timeout self.app.config.REQUEST_BUFFER_QUEUE_SIZE
self.keep_alive_timeout = keep_alive_timeout )
self.request_max_size = request_max_size self.response_timeout = self.app.config.RESPONSE_TIMEOUT
self.request_class = request_class or Request self.keep_alive_timeout = self.app.config.KEEP_ALIVE_TIMEOUT
self.is_request_stream = is_request_stream self.request_max_size = self.app.config.REQUEST_MAX_SIZE
self.request_class = self.app.request_class or Request
self.is_request_stream = self.app.is_request_stream
self._is_stream_handler = False self._is_stream_handler = False
self._not_paused = asyncio.Event(loop=loop) self._not_paused = asyncio.Event(loop=deprecated_loop)
self._total_request_size = 0 self._total_request_size = 0
self._request_timeout_handler = None self._request_timeout_handler = None
self._response_timeout_handler = None self._response_timeout_handler = None
@@ -143,12 +178,12 @@ class HttpProtocol(asyncio.Protocol):
self._last_response_time = None self._last_response_time = None
self._request_handler_task = None self._request_handler_task = None
self._request_stream_task = None self._request_stream_task = None
self._keep_alive = keep_alive self._keep_alive = self.app.config.KEEP_ALIVE
self._header_fragment = b"" self._header_fragment = b""
self.state = state if state else {} self.state = state if state else {}
if "requests_count" not in self.state: if "requests_count" not in self.state:
self.state["requests_count"] = 0 self.state["requests_count"] = 0
self._debug = debug self._unix = unix
self._not_paused.set() self._not_paused.set()
self._body_chunks = deque() self._body_chunks = deque()
@@ -177,6 +212,7 @@ class HttpProtocol(asyncio.Protocol):
self.request_timeout, self.request_timeout_callback self.request_timeout, self.request_timeout_callback
) )
self.transport = transport self.transport = transport
self.conn_info = ConnInfo(transport, unix=self._unix)
self._last_request_time = time() self._last_request_time = time()
def connection_lost(self, exc): def connection_lost(self, exc):
@@ -276,7 +312,7 @@ class HttpProtocol(asyncio.Protocol):
self.parser.feed_data(data) self.parser.feed_data(data)
except HttpParserError: except HttpParserError:
message = "Bad Request" message = "Bad Request"
if self._debug: if self.app.debug:
message += "\n" + traceback.format_exc() message += "\n" + traceback.format_exc()
self.write_error(InvalidUsage(message)) self.write_error(InvalidUsage(message))
@@ -314,6 +350,7 @@ class HttpProtocol(asyncio.Protocol):
transport=self.transport, transport=self.transport,
app=self.app, app=self.app,
) )
self.request.conn_info = self.conn_info
# Remove any existing KeepAlive handler here, # Remove any existing KeepAlive handler here,
# It will be recreated if required on the new request. # It will be recreated if required on the new request.
if self._keep_alive_timeout_handler: if self._keep_alive_timeout_handler:
@@ -324,7 +361,7 @@ class HttpProtocol(asyncio.Protocol):
self.expect_handler() self.expect_handler()
if self.is_request_stream: if self.is_request_stream:
self._is_stream_handler = self.router.is_stream_handler( self._is_stream_handler = self.app.router.is_stream_handler(
self.request self.request
) )
if self._is_stream_handler: if self._is_stream_handler:
@@ -343,9 +380,7 @@ class HttpProtocol(asyncio.Protocol):
self.transport.write(b"HTTP/1.1 100 Continue\r\n\r\n") self.transport.write(b"HTTP/1.1 100 Continue\r\n\r\n")
else: else:
self.write_error( self.write_error(
HeaderExpectationFailed( HeaderExpectationFailed(f"Unknown Expect: {expect}")
"Unknown Expect: {expect}".format(expect=expect)
)
) )
def on_body(self, body): def on_body(self, body):
@@ -383,12 +418,13 @@ class HttpProtocol(asyncio.Protocol):
async def stream_append(self): async def stream_append(self):
while self._body_chunks: while self._body_chunks:
body = self._body_chunks.popleft() body = self._body_chunks.popleft()
if self.request.stream.is_full(): if self.request:
self.transport.pause_reading() if self.request.stream.is_full():
await self.request.stream.put(body) self.transport.pause_reading()
self.transport.resume_reading() await self.request.stream.put(body)
else: self.transport.resume_reading()
await self.request.stream.put(body) else:
await self.request.stream.put(body)
def on_message_complete(self): def on_message_complete(self):
# Entire request (headers and whole body) is received. # Entire request (headers and whole body) is received.
@@ -452,13 +488,9 @@ class HttpProtocol(asyncio.Protocol):
extra["host"] = "UNKNOWN" extra["host"] = "UNKNOWN"
if self.request is not None: if self.request is not None:
if self.request.ip: if self.request.ip:
extra["host"] = "{0}:{1}".format( extra["host"] = f"{self.request.ip}:{self.request.port}"
self.request.ip, self.request.port
)
extra["request"] = "{0} {1}".format( extra["request"] = f"{self.request.method} {self.request.url}"
self.request.method, self.request.url
)
else: else:
extra["request"] = "nil" extra["request"] = "nil"
@@ -488,16 +520,14 @@ class HttpProtocol(asyncio.Protocol):
) )
self.write_error(ServerError("Invalid response type")) self.write_error(ServerError("Invalid response type"))
except RuntimeError: except RuntimeError:
if self._debug: if self.app.debug:
logger.error( logger.error(
"Connection lost before response written @ %s", "Connection lost before response written @ %s",
self.request.ip, self.request.ip,
) )
keep_alive = False keep_alive = False
except Exception as e: except Exception as e:
self.bail_out( self.bail_out(f"Writing response failed, connection closed {e!r}")
"Writing response failed, connection closed {}".format(repr(e))
)
finally: finally:
if not keep_alive: if not keep_alive:
self.transport.close() self.transport.close()
@@ -541,16 +571,14 @@ class HttpProtocol(asyncio.Protocol):
) )
self.write_error(ServerError("Invalid response type")) self.write_error(ServerError("Invalid response type"))
except RuntimeError: except RuntimeError:
if self._debug: if self.app.debug:
logger.error( logger.error(
"Connection lost before response written @ %s", "Connection lost before response written @ %s",
self.request.ip, self.request.ip,
) )
keep_alive = False keep_alive = False
except Exception as e: except Exception as e:
self.bail_out( self.bail_out(f"Writing response failed, connection closed {e!r}")
"Writing response failed, connection closed {}".format(repr(e))
)
finally: finally:
if not keep_alive: if not keep_alive:
self.transport.close() self.transport.close()
@@ -574,14 +602,14 @@ class HttpProtocol(asyncio.Protocol):
version = self.request.version if self.request else "1.1" version = self.request.version if self.request else "1.1"
self.transport.write(response.output(version)) self.transport.write(response.output(version))
except RuntimeError: except RuntimeError:
if self._debug: if self.app.debug:
logger.error( logger.error(
"Connection lost before error written @ %s", "Connection lost before error written @ %s",
self.request.ip if self.request else "Unknown", self.request.ip if self.request else "Unknown",
) )
except Exception as e: except Exception as e:
self.bail_out( self.bail_out(
"Writing error failed, connection closed {}".format(repr(e)), f"Writing error failed, connection closed {e!r}",
from_error=True, from_error=True,
) )
finally: finally:
@@ -642,7 +670,7 @@ class HttpProtocol(asyncio.Protocol):
:return: boolean - True if closed, false if staying open :return: boolean - True if closed, false if staying open
""" """
if not self.parser: if not self.parser and self.transport is not None:
self.transport.close() self.transport.close()
return True return True
return False return False
@@ -731,6 +759,26 @@ class AsyncioServer:
task = asyncio.ensure_future(coro, loop=self.loop) task = asyncio.ensure_future(coro, loop=self.loop)
return task return task
def start_serving(self):
if self.server:
try:
return self.server.start_serving()
except AttributeError:
raise NotImplementedError(
"server.start_serving not available in this version "
"of asyncio or uvloop."
)
def serve_forever(self):
if self.server:
try:
return self.server.serve_forever()
except AttributeError:
raise NotImplementedError(
"server.serve_forever not available in this version "
"of asyncio or uvloop."
)
def __await__(self): def __await__(self):
"""Starts the asyncio server, returns AsyncServerCoro""" """Starts the asyncio server, returns AsyncServerCoro"""
task = asyncio.ensure_future(self.serve_coro) task = asyncio.ensure_future(self.serve_coro)
@@ -744,20 +792,13 @@ def serve(
host, host,
port, port,
app, app,
request_handler,
error_handler,
before_start=None, before_start=None,
after_start=None, after_start=None,
before_stop=None, before_stop=None,
after_stop=None, after_stop=None,
debug=False,
request_timeout=60,
response_timeout=60,
keep_alive_timeout=5,
ssl=None, ssl=None,
sock=None, sock=None,
request_max_size=None, unix=None,
request_buffer_queue_size=100,
reuse_port=False, reuse_port=False,
loop=None, loop=None,
protocol=HttpProtocol, protocol=HttpProtocol,
@@ -767,25 +808,13 @@ def serve(
run_async=False, run_async=False,
connections=None, connections=None,
signal=Signal(), signal=Signal(),
request_class=None,
access_log=True,
keep_alive=True,
is_request_stream=False,
router=None,
websocket_max_size=None,
websocket_max_queue=None,
websocket_read_limit=2 ** 16,
websocket_write_limit=2 ** 16,
state=None, state=None,
graceful_shutdown_timeout=15.0,
asyncio_server_kwargs=None, asyncio_server_kwargs=None,
): ):
"""Start asynchronous HTTP Server on an individual process. """Start asynchronous HTTP Server on an individual process.
:param host: Address to host on :param host: Address to host on
:param port: Port to host on :param port: Port to host on
:param request_handler: Sanic request handler with middleware
:param error_handler: Sanic error handler with middleware
:param before_start: function to be executed before the server starts :param before_start: function to be executed before the server starts
listening. Takes arguments `app` instance and `loop` listening. Takes arguments `app` instance and `loop`
:param after_start: function to be executed after the server starts :param after_start: function to be executed after the server starts
@@ -796,35 +825,13 @@ def serve(
:param after_stop: function to be executed when a stop signal is :param after_stop: function to be executed when a stop signal is
received after it is respected. Takes arguments received after it is respected. Takes arguments
`app` instance and `loop` `app` instance and `loop`
:param debug: enables debug output (slows server)
:param request_timeout: time in seconds
:param response_timeout: time in seconds
:param keep_alive_timeout: time in seconds
:param ssl: SSLContext :param ssl: SSLContext
:param sock: Socket for the server to accept connections from :param sock: Socket for the server to accept connections from
:param request_max_size: size in bytes, `None` for no limit :param unix: Unix socket to listen on instead of TCP port
:param reuse_port: `True` for multiple workers :param reuse_port: `True` for multiple workers
:param loop: asyncio compatible event loop :param loop: asyncio compatible event loop
:param protocol: subclass of asyncio protocol class
:param run_async: bool: Do not create a new event loop for the server, :param run_async: bool: Do not create a new event loop for the server,
and return an AsyncServer object rather than running it and return an AsyncServer object rather than running it
:param request_class: Request class to use
:param access_log: disable/enable access log
:param websocket_max_size: enforces the maximum size for
incoming messages in bytes.
:param websocket_max_queue: sets the maximum length of the queue
that holds incoming messages.
:param websocket_read_limit: sets the high-water limit of the buffer for
incoming bytes, the low-water limit is half
the high-water limit.
:param websocket_write_limit: sets the high-water limit of the buffer for
outgoing bytes, the low-water limit is a
quarter of the high-water limit.
:param is_request_stream: disable/enable Request.stream
:param request_buffer_queue_size: streaming request buffer queue size
:param router: Router object
:param graceful_shutdown_timeout: How long take to Force close non-idle
connection
:param asyncio_server_kwargs: key-value args for asyncio/uvloop :param asyncio_server_kwargs: key-value args for asyncio/uvloop
create_server method create_server method
:return: Nothing :return: Nothing
@@ -834,59 +841,48 @@ def serve(
loop = asyncio.new_event_loop() loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop) asyncio.set_event_loop(loop)
if debug: if app.debug:
loop.set_debug(debug) loop.set_debug(app.debug)
app.asgi = False app.asgi = False
connections = connections if connections is not None else set() connections = connections if connections is not None else set()
protocol_kwargs = _build_protocol_kwargs(protocol, app.config)
server = partial( server = partial(
protocol, protocol,
loop=loop, loop=loop,
connections=connections, connections=connections,
signal=signal, signal=signal,
app=app, app=app,
request_handler=request_handler,
error_handler=error_handler,
request_timeout=request_timeout,
response_timeout=response_timeout,
keep_alive_timeout=keep_alive_timeout,
request_max_size=request_max_size,
request_buffer_queue_size=request_buffer_queue_size,
request_class=request_class,
access_log=access_log,
keep_alive=keep_alive,
is_request_stream=is_request_stream,
router=router,
websocket_max_size=websocket_max_size,
websocket_max_queue=websocket_max_queue,
websocket_read_limit=websocket_read_limit,
websocket_write_limit=websocket_write_limit,
state=state, state=state,
debug=debug, unix=unix,
**protocol_kwargs,
) )
asyncio_server_kwargs = ( asyncio_server_kwargs = (
asyncio_server_kwargs if asyncio_server_kwargs else {} asyncio_server_kwargs if asyncio_server_kwargs else {}
) )
# UNIX sockets are always bound by us (to preserve semantics between modes)
if unix:
sock = bind_unix_socket(unix, backlog=backlog)
server_coroutine = loop.create_server( server_coroutine = loop.create_server(
server, server,
host, None if sock else host,
port, None if sock else port,
ssl=ssl, ssl=ssl,
reuse_port=reuse_port, reuse_port=reuse_port,
sock=sock, sock=sock,
backlog=backlog, backlog=backlog,
**asyncio_server_kwargs **asyncio_server_kwargs,
) )
if run_async: if run_async:
return AsyncioServer( return AsyncioServer(
loop, loop=loop,
server_coroutine, serve_coro=server_coroutine,
connections, connections=connections,
after_start, after_start=after_start,
before_stop, before_stop=before_stop,
after_stop, after_stop=after_stop,
) )
trigger_events(before_start, loop) trigger_events(before_start, loop)
@@ -905,15 +901,11 @@ def serve(
# Register signals for graceful termination # Register signals for graceful termination
if register_sys_signals: if register_sys_signals:
_singals = (SIGTERM,) if run_multiple else (SIGINT, SIGTERM) if OS_IS_WINDOWS:
for _signal in _singals: ctrlc_workaround_for_windows(app)
try: else:
loop.add_signal_handler(_signal, loop.stop) for _signal in [SIGTERM] if run_multiple else [SIGINT, SIGTERM]:
except NotImplementedError: loop.add_signal_handler(_signal, app.stop)
logger.warning(
"Sanic tried to use loop.add_signal_handler "
"but it is not implemented on this platform."
)
pid = os.getpid() pid = os.getpid()
try: try:
logger.info("Starting worker [%s]", pid) logger.info("Starting worker [%s]", pid)
@@ -937,8 +929,9 @@ def serve(
# We should provide graceful_shutdown_timeout, # We should provide graceful_shutdown_timeout,
# instead of letting connections hang forever. # instead of letting connections hang forever.
# Let's roughly calculate time. # Let's roughly calculate time.
graceful = app.config.GRACEFUL_SHUTDOWN_TIMEOUT
start_shutdown = 0 start_shutdown = 0
while connections and (start_shutdown < graceful_shutdown_timeout): while connections and (start_shutdown < graceful):
loop.run_until_complete(asyncio.sleep(0.1)) loop.run_until_complete(asyncio.sleep(0.1))
start_shutdown = start_shutdown + 0.1 start_shutdown = start_shutdown + 0.1
@@ -951,12 +944,106 @@ def serve(
else: else:
conn.close() conn.close()
_shutdown = asyncio.gather(*coros, loop=loop) _shutdown = asyncio.gather(*coros)
loop.run_until_complete(_shutdown) loop.run_until_complete(_shutdown)
trigger_events(after_stop, loop) trigger_events(after_stop, loop)
loop.close() loop.close()
remove_unix_socket(unix)
def _build_protocol_kwargs(
protocol: Type[HttpProtocol], config: Config
) -> Dict[str, Union[int, float]]:
if hasattr(protocol, "websocket_handshake"):
return {
"websocket_max_size": config.WEBSOCKET_MAX_SIZE,
"websocket_max_queue": config.WEBSOCKET_MAX_QUEUE,
"websocket_read_limit": config.WEBSOCKET_READ_LIMIT,
"websocket_write_limit": config.WEBSOCKET_WRITE_LIMIT,
"websocket_ping_timeout": config.WEBSOCKET_PING_TIMEOUT,
"websocket_ping_interval": config.WEBSOCKET_PING_INTERVAL,
}
return {}
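A hedged sketch of the helper's behaviour, assuming WebSocketProtocol exposes the websocket_handshake attribute this check looks for and that Config carries the WEBSOCKET_* defaults:

from sanic.config import Config
from sanic.websocket import WebSocketProtocol

config = Config()
assert _build_protocol_kwargs(HttpProtocol, config) == {}   # plain HTTP: nothing injected
ws_kwargs = _build_protocol_kwargs(WebSocketProtocol, config)
assert "websocket_ping_interval" in ws_kwargs               # limits and ping settings pulled from config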
def bind_socket(host: str, port: int, *, backlog=100) -> socket.socket:
"""Create TCP server socket.
:param host: IPv4, IPv6 or hostname may be specified
:param port: TCP port number
:param backlog: Maximum number of connections to queue
:return: socket.socket object
"""
try: # IP address: family must be specified for IPv6 at least
ip = ip_address(host)
host = str(ip)
sock = socket.socket(
socket.AF_INET6 if ip.version == 6 else socket.AF_INET
)
except ValueError: # Hostname, may become AF_INET or AF_INET6
sock = socket.socket()
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind((host, port))
sock.listen(backlog)
return sock
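For illustration (addresses made up), the family selection above gives an AF_INET6 socket for an IPv6 literal and the default family for anything that fails ip_address():

sock_v4 = bind_socket("127.0.0.1", 0)   # AF_INET; port 0 lets the OS choose
sock_v6 = bind_socket("::1", 0)         # AF_INET6, detected via ip_address()
print(sock_v4.getsockname(), sock_v6.getsockname())
sock_v4.close()
sock_v6.close()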
def bind_unix_socket(path: str, *, mode=0o666, backlog=100) -> socket.socket:
"""Create unix socket.
:param path: filesystem path
:param backlog: Maximum number of connections to queue
:return: socket.socket object
"""
"""Open or atomically replace existing socket with zero downtime."""
# Sanitise and pre-verify socket path
path = os.path.abspath(path)
folder = os.path.dirname(path)
if not os.path.isdir(folder):
raise FileNotFoundError(f"Socket folder does not exist: {folder}")
try:
if not stat.S_ISSOCK(os.stat(path, follow_symlinks=False).st_mode):
raise FileExistsError(f"Existing file is not a socket: {path}")
except FileNotFoundError:
pass
# Create new socket with a random temporary name
tmp_path = f"{path}.{secrets.token_urlsafe()}"
sock = socket.socket(socket.AF_UNIX)
try:
# Critical section begins (filename races)
sock.bind(tmp_path)
try:
os.chmod(tmp_path, mode)
# Start listening before rename to avoid connection failures
sock.listen(backlog)
os.rename(tmp_path, path)
except: # noqa: E722
try:
os.unlink(tmp_path)
finally:
raise
except: # noqa: E722
try:
sock.close()
finally:
raise
return sock
def remove_unix_socket(path: str) -> None:
"""Remove dead unix socket during server exit."""
if not path:
return
try:
if stat.S_ISSOCK(os.stat(path, follow_symlinks=False).st_mode):
# Is it actually dead (doesn't belong to a new server instance)?
with socket.socket(socket.AF_UNIX) as testsock:
try:
testsock.connect(path)
except ConnectionRefusedError:
os.unlink(path)
except FileNotFoundError:
pass
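A minimal sketch (socket path made up) of the zero-downtime replacement these two helpers provide together:

old = bind_unix_socket("/tmp/demo.sock")
new = bind_unix_socket("/tmp/demo.sock")   # the rename atomically points the path at the new listener
old.close()                                # the old server keeps its accepted connections and exits later
new.close()
remove_unix_socket("/tmp/demo.sock")       # unlinks only if nothing is accepting on the path anymore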
def serve_multiple(server_settings, workers): def serve_multiple(server_settings, workers):
@@ -971,11 +1058,17 @@ def serve_multiple(server_settings, workers):
server_settings["reuse_port"] = True server_settings["reuse_port"] = True
server_settings["run_multiple"] = True server_settings["run_multiple"] = True
# Handling when custom socket is not provided. # Create a listening socket or use the one in settings
if server_settings.get("sock") is None: sock = server_settings.get("sock")
sock = socket() unix = server_settings["unix"]
sock.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1) backlog = server_settings["backlog"]
sock.bind((server_settings["host"], server_settings["port"])) if unix:
sock = bind_unix_socket(unix, backlog=backlog)
server_settings["unix"] = unix
if sock is None:
sock = bind_socket(
server_settings["host"], server_settings["port"], backlog=backlog
)
sock.set_inheritable(True) sock.set_inheritable(True)
server_settings["sock"] = sock server_settings["sock"] = sock
server_settings["host"] = None server_settings["host"] = None
@@ -990,9 +1083,10 @@ def serve_multiple(server_settings, workers):
signal_func(SIGINT, lambda s, f: sig_handler(s, f)) signal_func(SIGINT, lambda s, f: sig_handler(s, f))
signal_func(SIGTERM, lambda s, f: sig_handler(s, f)) signal_func(SIGTERM, lambda s, f: sig_handler(s, f))
mp = multiprocessing.get_context("fork")
for _ in range(workers): for _ in range(workers):
process = Process(target=serve, kwargs=server_settings) process = mp.Process(target=serve, kwargs=server_settings)
process.daemon = True process.daemon = True
process.start() process.start()
processes.append(process) processes.append(process)
@@ -1003,4 +1097,6 @@ def serve_multiple(server_settings, workers):
# the above processes will block this until they're stopped # the above processes will block this until they're stopped
for process in processes: for process in processes:
process.terminate() process.terminate()
server_settings.get("sock").close()
sock.close()
remove_unix_socket(unix)

View File

@@ -1,11 +1,11 @@
from functools import partial, wraps
from mimetypes import guess_type from mimetypes import guess_type
from os import path from os import path
from re import sub from re import sub
from time import gmtime, strftime from time import gmtime, strftime
from urllib.parse import unquote from urllib.parse import unquote
from aiofiles.os import stat # type: ignore from sanic.compat import stat_async
from sanic.exceptions import ( from sanic.exceptions import (
ContentRangeError, ContentRangeError,
FileNotFound, FileNotFound,
@@ -16,6 +16,89 @@ from sanic.handlers import ContentRangeHandler
from sanic.response import HTTPResponse, file, file_stream from sanic.response import HTTPResponse, file, file_stream
async def _static_request_handler(
file_or_directory,
use_modified_since,
use_content_range,
stream_large_files,
request,
content_type=None,
file_uri=None,
):
# Using this to determine if the URL is trying to break out of the path
# served. os.path.realpath seems to be very slow
if file_uri and "../" in file_uri:
raise InvalidUsage("Invalid URL")
# Merge served directory and requested file if provided
# Strip all / that in the beginning of the URL to help prevent python
# from herping a derp and treating the uri as an absolute path
root_path = file_path = file_or_directory
if file_uri:
file_path = path.join(file_or_directory, sub("^[/]*", "", file_uri))
# URL decode the path sent by the browser otherwise we won't be able to
# match filenames which got encoded (filenames with spaces etc)
file_path = path.abspath(unquote(file_path))
if not file_path.startswith(path.abspath(unquote(root_path))):
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
try:
headers = {}
# Check if the client has been sent this file before
# and it has not been modified since
stats = None
if use_modified_since:
stats = await stat_async(file_path)
modified_since = strftime(
"%a, %d %b %Y %H:%M:%S GMT", gmtime(stats.st_mtime)
)
if request.headers.get("If-Modified-Since") == modified_since:
return HTTPResponse(status=304)
headers["Last-Modified"] = modified_since
_range = None
if use_content_range:
_range = None
if not stats:
stats = await stat_async(file_path)
headers["Accept-Ranges"] = "bytes"
headers["Content-Length"] = str(stats.st_size)
if request.method != "HEAD":
try:
_range = ContentRangeHandler(request, stats)
except HeaderNotFound:
pass
else:
del headers["Content-Length"]
for key, value in _range.headers.items():
headers[key] = value
headers["Content-Type"] = (
content_type or guess_type(file_path)[0] or "text/plain"
)
if request.method == "HEAD":
return HTTPResponse(headers=headers)
else:
if stream_large_files:
if type(stream_large_files) == int:
threshold = stream_large_files
else:
threshold = 1024 * 1024
if not stats:
stats = await stat_async(file_path)
if stats.st_size >= threshold:
return await file_stream(
file_path, headers=headers, _range=_range
)
return await file(file_path, headers=headers, _range=_range)
except ContentRangeError:
raise
except Exception:
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
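This coroutine ends up behind the usual static-file registration; illustrative calls (paths made up) that reach the streaming branch might look like:

app.static("/assets", "./static", stream_large_files=True)        # stream files >= 1 MiB
app.static("/media", "./media", stream_large_files=10_000_000)    # or pass an explicit byte threshold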
def register( def register(
app, app,
uri, uri,
@@ -57,85 +140,20 @@ def register(
if not path.isfile(file_or_directory): if not path.isfile(file_or_directory):
uri += "<file_uri:" + pattern + ">" uri += "<file_uri:" + pattern + ">"
async def _handler(request, file_uri=None):
# Using this to determine if the URL is trying to break out of the path
# served. os.path.realpath seems to be very slow
if file_uri and "../" in file_uri:
raise InvalidUsage("Invalid URL")
# Merge served directory and requested file if provided
# Strip all / that in the beginning of the URL to help prevent python
# from herping a derp and treating the uri as an absolute path
root_path = file_path = file_or_directory
if file_uri:
file_path = path.join(
file_or_directory, sub("^[/]*", "", file_uri)
)
# URL decode the path sent by the browser otherwise we won't be able to
# match filenames which got encoded (filenames with spaces etc)
file_path = path.abspath(unquote(file_path))
if not file_path.startswith(path.abspath(unquote(root_path))):
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
try:
headers = {}
# Check if the client has been sent this file before
# and it has not been modified since
stats = None
if use_modified_since:
stats = await stat(file_path)
modified_since = strftime(
"%a, %d %b %Y %H:%M:%S GMT", gmtime(stats.st_mtime)
)
if request.headers.get("If-Modified-Since") == modified_since:
return HTTPResponse(status=304)
headers["Last-Modified"] = modified_since
_range = None
if use_content_range:
_range = None
if not stats:
stats = await stat(file_path)
headers["Accept-Ranges"] = "bytes"
headers["Content-Length"] = str(stats.st_size)
if request.method != "HEAD":
try:
_range = ContentRangeHandler(request, stats)
except HeaderNotFound:
pass
else:
del headers["Content-Length"]
for key, value in _range.headers.items():
headers[key] = value
headers["Content-Type"] = (
content_type or guess_type(file_path)[0] or "text/plain"
)
if request.method == "HEAD":
return HTTPResponse(headers=headers)
else:
if stream_large_files:
if type(stream_large_files) == int:
threshold = stream_large_files
else:
threshold = 1024 * 1024
if not stats:
stats = await stat(file_path)
if stats.st_size >= threshold:
return await file_stream(
file_path, headers=headers, _range=_range
)
return await file(file_path, headers=headers, _range=_range)
except ContentRangeError:
raise
except Exception:
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
# special prefix for static files # special prefix for static files
if not name.startswith("_static_"): if not name.startswith("_static_"):
name = "_static_{}".format(name) name = f"_static_{name}"
_handler = wraps(_static_request_handler)(
partial(
_static_request_handler,
file_or_directory,
use_modified_since,
use_content_range,
stream_large_files,
content_type=content_type,
)
)
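The same partial-plus-wraps pattern in miniature, with made-up names, showing why the handler keeps its original __name__ for route naming:

from functools import partial, wraps

async def _greet(greeting, request, name=None):
    return f"{greeting} {name or request}"

handler = wraps(_greet)(partial(_greet, "Hello"))
# Only the per-request arguments remain to be supplied,
# and handler.__name__ is still "_greet".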
app.route( app.route(
uri, uri,

View File

@@ -11,8 +11,10 @@ from sanic.response import text
ASGI_HOST = "mockserver" ASGI_HOST = "mockserver"
ASGI_PORT = 1234
ASGI_BASE_URL = f"http://{ASGI_HOST}:{ASGI_PORT}"
HOST = "127.0.0.1" HOST = "127.0.0.1"
PORT = 42101 PORT = None
class SanicTestClient: class SanicTestClient:
@@ -22,8 +24,16 @@ class SanicTestClient:
self.port = port self.port = port
self.host = host self.host = host
@app.listener("after_server_start")
def _start_test_mode(sanic, *args, **kwargs):
sanic.test_mode = True
@app.listener("before_server_end")
def _end_test_mode(sanic, *args, **kwargs):
sanic.test_mode = False
def get_new_session(self): def get_new_session(self):
return httpx.Client() return httpx.AsyncClient(verify=False)
async def _local_request(self, method, url, *args, **kwargs): async def _local_request(self, method, url, *args, **kwargs):
logger.info(url) logger.info(url)
@@ -38,20 +48,22 @@ class SanicTestClient:
try: try:
response = await getattr(session, method.lower())( response = await getattr(session, method.lower())(
url, verify=False, *args, **kwargs url, *args, **kwargs
) )
except NameError: except NameError:
raise Exception(response.status_code) raise Exception(response.status_code)
response.body = await response.aread()
response.status = response.status_code
response.content_type = response.headers.get("content-type")
# response can be decoded as json after response._content
# is set by response.aread()
try: try:
response.json = response.json() response.json = response.json()
except (JSONDecodeError, UnicodeDecodeError): except (JSONDecodeError, UnicodeDecodeError):
response.json = None response.json = None
response.body = await response.read()
response.status = response.status_code
response.content_type = response.headers.get("content-type")
if raw_cookies: if raw_cookies:
response.raw_cookies = {} response.raw_cookies = {}
@@ -93,7 +105,9 @@ class SanicTestClient:
if self.port: if self.port:
server_kwargs = dict( server_kwargs = dict(
host=host or self.host, port=self.port, **server_kwargs host=host or self.host,
port=self.port,
**server_kwargs,
) )
host, port = host or self.host, self.port host, port = host or self.host, self.port
else: else:
@@ -101,17 +115,19 @@ class SanicTestClient:
sock.bind((host or self.host, 0)) sock.bind((host or self.host, 0))
server_kwargs = dict(sock=sock, **server_kwargs) server_kwargs = dict(sock=sock, **server_kwargs)
host, port = sock.getsockname() host, port = sock.getsockname()
self.port = port
if uri.startswith( if uri.startswith(
("http:", "https:", "ftp:", "ftps://", "//", "ws:", "wss:") ("http:", "https:", "ftp:", "ftps://", "//", "ws:", "wss:")
): ):
url = uri url = uri
else: else:
uri = uri if uri.startswith("/") else "/{uri}".format(uri=uri) uri = uri if uri.startswith("/") else f"/{uri}"
scheme = "ws" if method == "websocket" else "http" scheme = "ws" if method == "websocket" else "http"
url = "{scheme}://{host}:{port}{uri}".format( url = f"{scheme}://{host}:{port}{uri}"
scheme=scheme, host=host, port=port, uri=uri # Tests construct URLs using PORT = None, which means random port not
) # known until this function is called, so fix that here
url = url.replace(":None/", f":{port}/")
@self.app.listener("after_server_start") @self.app.listener("after_server_start")
async def _collect_response(sanic, loop): async def _collect_response(sanic, loop):
@@ -129,7 +145,7 @@ class SanicTestClient:
self.app.listeners["after_server_start"].pop() self.app.listeners["after_server_start"].pop()
if exceptions: if exceptions:
raise ValueError("Exception during request: {}".format(exceptions)) raise ValueError(f"Exception during request: {exceptions}")
if gather_request: if gather_request:
try: try:
@@ -137,17 +153,13 @@ class SanicTestClient:
return request, response return request, response
except BaseException: # noqa except BaseException: # noqa
raise ValueError( raise ValueError(
"Request and response object expected, got ({})".format( f"Request and response object expected, got ({results})"
results
)
) )
else: else:
try: try:
return results[-1] return results[-1]
except BaseException: # noqa except BaseException: # noqa
raise ValueError( raise ValueError(f"Request object expected, got ({results})")
"Request object expected, got ({})".format(results)
)
def get(self, *args, **kwargs): def get(self, *args, **kwargs):
return self._sanic_endpoint_test("get", *args, **kwargs) return self._sanic_endpoint_test("get", *args, **kwargs)
@@ -185,30 +197,33 @@ async def app_call_with_return(self, scope, receive, send):
return await asgi_app() return await asgi_app()
class SanicASGIDispatch(httpx.dispatch.ASGIDispatch): class SanicASGITestClient(httpx.AsyncClient):
pass
class SanicASGITestClient(httpx.Client):
def __init__( def __init__(
self, self,
app, app,
base_url: str = "http://{}".format(ASGI_HOST), base_url: str = ASGI_BASE_URL,
suppress_exceptions: bool = False, suppress_exceptions: bool = False,
) -> None: ) -> None:
app.__class__.__call__ = app_call_with_return app.__class__.__call__ = app_call_with_return
app.asgi = True app.asgi = True
self.app = app self.app = app
transport = httpx.ASGITransport(app=app, client=(ASGI_HOST, ASGI_PORT))
dispatch = SanicASGIDispatch(app=app, client=(ASGI_HOST, PORT)) super().__init__(transport=transport, base_url=base_url)
super().__init__(dispatch=dispatch, base_url=base_url)
self.last_request = None self.last_request = None
def _collect_request(request): def _collect_request(request):
self.last_request = request self.last_request = request
@app.listener("after_server_start")
def _start_test_mode(sanic, *args, **kwargs):
sanic.test_mode = True
@app.listener("before_server_end")
def _end_test_mode(sanic, *args, **kwargs):
sanic.test_mode = False
app.request_middleware.appendleft(_collect_request) app.request_middleware.appendleft(_collect_request)
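Outside the test client, the equivalent httpx 0.15-style wiring would look roughly like this (host and port mirror the ASGI_HOST and ASGI_PORT constants above):

import httpx

transport = httpx.ASGITransport(app=app, client=("mockserver", 1234))
async with httpx.AsyncClient(
    transport=transport, base_url="http://mockserver:1234"
) as client:
    response = await client.get("/")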
async def request(self, method, url, gather_request=True, *args, **kwargs): async def request(self, method, url, gather_request=True, *args, **kwargs):
@@ -224,7 +239,7 @@ class SanicASGITestClient(httpx.Client):
async def websocket(self, uri, subprotocols=None, *args, **kwargs): async def websocket(self, uri, subprotocols=None, *args, **kwargs):
scheme = "ws" scheme = "ws"
path = uri path = uri
root_path = "{}://{}".format(scheme, ASGI_HOST) root_path = f"{scheme}://{ASGI_HOST}"
headers = kwargs.get("headers", {}) headers = kwargs.get("headers", {})
headers.setdefault("connection", "upgrade") headers.setdefault("connection", "upgrade")

sanic/utils.py Normal file
View File

@@ -0,0 +1,99 @@
from importlib.util import module_from_spec, spec_from_file_location
from os import environ as os_environ
from re import findall as re_findall
from typing import Union
from .exceptions import LoadFileException
def str_to_bool(val: str) -> bool:
"""Takes string and tries to turn it into bool as human would do.
If val is in case insensitive (
"y", "yes", "yep", "yup", "t",
"true", "on", "enable", "enabled", "1"
) returns True.
If val is in case insensitive (
"n", "no", "f", "false", "off", "disable", "disabled", "0"
) returns False.
Else Raise ValueError."""
val = val.lower()
if val in {
"y",
"yes",
"yep",
"yup",
"t",
"true",
"on",
"enable",
"enabled",
"1",
}:
return True
elif val in {"n", "no", "f", "false", "off", "disable", "disabled", "0"}:
return False
else:
raise ValueError(f"Invalid truth value {val}")
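A few illustrative calls matching the value sets above:

assert str_to_bool("Yes") is True
assert str_to_bool("off") is False
try:
    str_to_bool("maybe")
except ValueError:
    pass   # anything outside the two sets raises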
def load_module_from_file_location(
location: Union[bytes, str], encoding: str = "utf8", *args, **kwargs
):
"""Returns loaded module provided as a file path.
:param location:
Corresponds to the location parameter of
importlib.util.spec_from_file_location, but with these differences:
- It has to be of a string or bytes type.
- You can also use environment variables here,
in the format ${some_env_var}.
Note that $some_env_var will not be resolved as an environment variable.
:param encoding:
If the location parameter is of a bytes type, then use this encoding
to decode it into a string.
:param args:
Corresponds to the rest of the importlib.util.spec_from_file_location
parameters.
:param kwargs:
Corresponds to the rest of the importlib.util.spec_from_file_location
parameters.
For example, you can:
some_module = load_module_from_file_location(
"/some/path/${some_env_var}"
)
"""
# 1) Parse location.
if isinstance(location, bytes):
location = location.decode(encoding)
# A) Check if location contains any environment variables
# in format ${some_env_var}.
env_vars_in_location = set(re_findall(r"\${(.+?)}", location))
# B) Check these variables exists in environment.
not_defined_env_vars = env_vars_in_location.difference(os_environ.keys())
if not_defined_env_vars:
raise LoadFileException(
"The following environment variables are not set: "
f"{', '.join(not_defined_env_vars)}"
)
# C) Substitute them in location.
for env_var in env_vars_in_location:
location = location.replace("${" + env_var + "}", os_environ[env_var])
# 2) Load and return module.
name = location.split("/")[-1].split(".")[
0
] # get just the file name without path and .py extension
_mod_spec = spec_from_file_location(name, location, *args, **kwargs)
module = module_from_spec(_mod_spec)
_mod_spec.loader.exec_module(module) # type: ignore
return module
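A hedged usage sketch with made-up paths, assuming /tmp/my_config.py exists and CFG_DIR is exported beforehand:

import os

os.environ["CFG_DIR"] = "/tmp"
cfg = load_module_from_file_location("${CFG_DIR}/my_config.py")
print(cfg.__name__)   # "my_config", derived from the file name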

View File

@@ -96,14 +96,10 @@ class CompositionView:
handler.is_stream = stream handler.is_stream = stream
for method in methods: for method in methods:
if method not in HTTP_METHODS: if method not in HTTP_METHODS:
raise InvalidUsage( raise InvalidUsage(f"{method} is not a valid HTTP method.")
"{} is not a valid HTTP method.".format(method)
)
if method in self.handlers: if method in self.handlers:
raise InvalidUsage( raise InvalidUsage(f"Method {method} is already registered.")
"Method {} is already registered.".format(method)
)
self.handlers[method] = handler self.handlers[method] = handler
def __call__(self, request, *args, **kwargs): def __call__(self, request, *args, **kwargs):

View File

@@ -3,6 +3,7 @@ from typing import (
Awaitable, Awaitable,
Callable, Callable,
Dict, Dict,
List,
MutableMapping, MutableMapping,
Optional, Optional,
Union, Union,
@@ -34,6 +35,8 @@ class WebSocketProtocol(HttpProtocol):
websocket_max_queue=None, websocket_max_queue=None,
websocket_read_limit=2 ** 16, websocket_read_limit=2 ** 16,
websocket_write_limit=2 ** 16, websocket_write_limit=2 ** 16,
websocket_ping_interval=20,
websocket_ping_timeout=20,
**kwargs **kwargs
): ):
super().__init__(*args, **kwargs) super().__init__(*args, **kwargs)
@@ -44,6 +47,8 @@ class WebSocketProtocol(HttpProtocol):
self.websocket_max_queue = websocket_max_queue self.websocket_max_queue = websocket_max_queue
self.websocket_read_limit = websocket_read_limit self.websocket_read_limit = websocket_read_limit
self.websocket_write_limit = websocket_write_limit self.websocket_write_limit = websocket_write_limit
self.websocket_ping_interval = websocket_ping_interval
self.websocket_ping_timeout = websocket_ping_timeout
# timeouts make no sense for websocket routes # timeouts make no sense for websocket routes
def request_timeout_callback(self): def request_timeout_callback(self):
@@ -113,11 +118,13 @@ class WebSocketProtocol(HttpProtocol):
# hook up the websocket protocol # hook up the websocket protocol
self.websocket = WebSocketCommonProtocol( self.websocket = WebSocketCommonProtocol(
timeout=self.websocket_timeout, close_timeout=self.websocket_timeout,
max_size=self.websocket_max_size, max_size=self.websocket_max_size,
max_queue=self.websocket_max_queue, max_queue=self.websocket_max_queue,
read_limit=self.websocket_read_limit, read_limit=self.websocket_read_limit,
write_limit=self.websocket_write_limit, write_limit=self.websocket_write_limit,
ping_interval=self.websocket_ping_interval,
ping_timeout=self.websocket_ping_timeout,
) )
# Following two lines are required for websockets 8.x # Following two lines are required for websockets 8.x
self.websocket.is_client = False self.websocket.is_client = False
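With the two parameters above flowing in from _build_protocol_kwargs, ping behaviour becomes configurable per app; an illustrative configuration:

app = Sanic("ws_demo")
app.config.WEBSOCKET_PING_INTERVAL = 10   # seconds between keepalive pings
app.config.WEBSOCKET_PING_TIMEOUT = 30    # close the connection if no pong arrives in time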
@@ -137,9 +144,11 @@ class WebSocketConnection:
self, self,
send: Callable[[ASIMessage], Awaitable[None]], send: Callable[[ASIMessage], Awaitable[None]],
receive: Callable[[], Awaitable[ASIMessage]], receive: Callable[[], Awaitable[ASIMessage]],
subprotocols: Optional[List[str]] = None,
) -> None: ) -> None:
self._send = send self._send = send
self._receive = receive self._receive = receive
self.subprotocols = subprotocols or []
async def send(self, data: Union[str, bytes], *args, **kwargs) -> None: async def send(self, data: Union[str, bytes], *args, **kwargs) -> None:
message: Dict[str, Union[str, bytes]] = {"type": "websocket.send"} message: Dict[str, Union[str, bytes]] = {"type": "websocket.send"}
@@ -164,7 +173,14 @@ class WebSocketConnection:
receive = recv receive = recv
async def accept(self) -> None: async def accept(self) -> None:
await self._send({"type": "websocket.accept", "subprotocol": ""}) await self._send(
{
"type": "websocket.accept",
"subprotocol": ",".join(
[subprotocol for subprotocol in self.subprotocols]
),
}
)
async def close(self) -> None: async def close(self) -> None:
pass pass

View File

@@ -5,7 +5,7 @@ import signal
import sys import sys
import traceback import traceback
import gunicorn.workers.base as base # type: ignore from gunicorn.workers import base as base # type: ignore
from sanic.server import HttpProtocol, Signal, serve, trigger_events from sanic.server import HttpProtocol, Signal, serve, trigger_events
from sanic.websocket import WebSocketProtocol from sanic.websocket import WebSocketProtocol
@@ -174,7 +174,7 @@ class GunicornWorker(base.Worker):
@staticmethod @staticmethod
def _create_ssl_context(cfg): def _create_ssl_context(cfg):
""" Creates SSLContext instance for usage in asyncio.create_server. """Creates SSLContext instance for usage in asyncio.create_server.
See ssl.SSLSocket.__init__ for more details. See ssl.SSLSocket.__init__ for more details.
""" """
ctx = ssl.SSLContext(cfg.ssl_version) ctx = ssl.SSLContext(cfg.ssl_version)

View File

@@ -11,11 +11,3 @@ line_length = 79
lines_after_imports = 2 lines_after_imports = 2
lines_between_types = 1 lines_between_types = 1
multi_line_output = 3 multi_line_output = 3
not_skip = __init__.py
[version]
current_version = 19.12.0
files = sanic/__version__.py
current_version_pattern = __version__ = "{current_version}"
new_version_pattern = __version__ = "{new_version}"

View File

@@ -5,7 +5,6 @@ import codecs
import os import os
import re import re
import sys import sys
from distutils.util import strtobool from distutils.util import strtobool
from setuptools import setup from setuptools import setup
@@ -39,9 +38,7 @@ def open_local(paths, mode="r", encoding="utf8"):
with open_local(["sanic", "__version__.py"], encoding="latin1") as fp: with open_local(["sanic", "__version__.py"], encoding="latin1") as fp:
try: try:
version = re.findall( version = re.findall(r"^__version__ = \"([^']+)\"\r?$", fp.read(), re.M)[0]
r"^__version__ = \"([^']+)\"\r?$", fp.read(), re.M
)[0]
except IndexError: except IndexError:
raise RuntimeError("Unable to determine version.") raise RuntimeError("Unable to determine version.")
@@ -68,12 +65,12 @@ setup_kwargs = {
"License :: OSI Approved :: MIT License", "License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
], ],
"entry_points": {"console_scripts": ["sanic = sanic.__main__:main"]},
} }
env_dependency = ( env_dependency = '; sys_platform != "win32" ' 'and implementation_name == "cpython"'
'; sys_platform != "win32" ' 'and implementation_name == "cpython"'
)
ujson = "ujson>=1.35" + env_dependency ujson = "ujson>=1.35" + env_dependency
uvloop = "uvloop>=0.5.3" + env_dependency uvloop = "uvloop>=0.5.3" + env_dependency
@@ -82,9 +79,9 @@ requirements = [
uvloop, uvloop,
ujson, ujson,
"aiofiles>=0.3.0", "aiofiles>=0.3.0",
"websockets>=7.0,<9.0", "websockets>=8.1,<9.0",
"multidict>=4.0,<5.0", "multidict>=4.0,<5.0",
"httpx==0.9.3", "httpx==0.15.4",
] ]
tests_require = [ tests_require = [

View File

@@ -103,7 +103,7 @@ def sanic_router():
for method, route in route_details: for method, route in route_details:
try: try:
router._add( router._add(
uri="/{}".format(route), uri=f"/{route}",
methods=frozenset({method}), methods=frozenset({method}),
host="localhost", host="localhost",
handler=_handler, handler=_handler,

View File

@@ -1,7 +1,6 @@
# Run with: gunicorn --workers=1 --worker-class=meinheld.gmeinheld.MeinheldWorker simple_server:main # Run with: gunicorn --workers=1 --worker-class=meinheld.gmeinheld.MeinheldWorker simple_server:main
""" Minimal helloworld application. """ Minimal helloworld application.
""" """
import ujson import ujson
from wheezy.http import HTTPResponse, WSGIApplication from wheezy.http import HTTPResponse, WSGIApplication
@@ -39,6 +38,7 @@ main = WSGIApplication(
if __name__ == "__main__": if __name__ == "__main__":
import sys import sys
from wsgiref.simple_server import make_server from wsgiref.simple_server import make_server
try: try:

View File

@@ -0,0 +1 @@
TEST_SETTING_VALUE = 1

View File

@@ -3,9 +3,11 @@ import logging
import sys import sys
from inspect import isawaitable from inspect import isawaitable
from unittest.mock import patch
import pytest import pytest
from sanic import Sanic
from sanic.exceptions import SanicException from sanic.exceptions import SanicException
from sanic.response import text from sanic.response import text
@@ -44,10 +46,11 @@ def test_create_asyncio_server(app):
@pytest.mark.skipif( @pytest.mark.skipif(
sys.version_info < (3, 7), reason="requires python3.7 or higher" sys.version_info < (3, 7), reason="requires python3.7 or higher"
) )
def test_asyncio_server_start_serving(app): def test_asyncio_server_no_start_serving(app):
if not uvloop_installed(): if not uvloop_installed():
loop = asyncio.get_event_loop() loop = asyncio.get_event_loop()
asyncio_srv_coro = app.create_server( asyncio_srv_coro = app.create_server(
port=43123,
return_asyncio_server=True, return_asyncio_server=True,
asyncio_server_kwargs=dict(start_serving=False), asyncio_server_kwargs=dict(start_serving=False),
) )
@@ -55,6 +58,26 @@ def test_asyncio_server_start_serving(app):
assert srv.is_serving() is False assert srv.is_serving() is False
@pytest.mark.skipif(
sys.version_info < (3, 7), reason="requires python3.7 or higher"
)
def test_asyncio_server_start_serving(app):
if not uvloop_installed():
loop = asyncio.get_event_loop()
asyncio_srv_coro = app.create_server(
port=43124,
return_asyncio_server=True,
asyncio_server_kwargs=dict(start_serving=False),
)
srv = loop.run_until_complete(asyncio_srv_coro)
assert srv.is_serving() is False
loop.run_until_complete(srv.start_serving())
assert srv.is_serving() is True
wait_close = srv.close()
loop.run_until_complete(wait_close)
# Looks like we can't easily test `serve_forever()`
def test_app_loop_not_running(app): def test_app_loop_not_running(app):
with pytest.raises(SanicException) as excinfo: with pytest.raises(SanicException) as excinfo:
app.loop app.loop
@@ -103,11 +126,11 @@ def test_app_handle_request_handler_is_none(app, monkeypatch):
def handler(request): def handler(request):
return text("test") return text("test")
request, response = app.test_client.get("/test") _, response = app.test_client.get("/test")
assert ( assert (
response.text "'None' was returned while requesting a handler from the router"
== "Error: 'None' was returned while requesting a handler from the router" in response.text
) )
@@ -126,6 +149,43 @@ def test_app_enable_websocket(app, websocket_enabled, enable):
assert app.websocket_enabled == True assert app.websocket_enabled == True
@patch("sanic.app.WebSocketProtocol")
def test_app_websocket_parameters(websocket_protocol_mock, app):
app.config.WEBSOCKET_MAX_SIZE = 44
app.config.WEBSOCKET_MAX_QUEUE = 45
app.config.WEBSOCKET_READ_LIMIT = 46
app.config.WEBSOCKET_WRITE_LIMIT = 47
app.config.WEBSOCKET_PING_TIMEOUT = 48
app.config.WEBSOCKET_PING_INTERVAL = 50
@app.websocket("/ws")
async def handler(request, ws):
await ws.send("test")
try:
# This will fail because WebSocketProtocol is mocked and only the call kwargs matter
app.test_client.get("/ws")
except:
pass
websocket_protocol_call_args = websocket_protocol_mock.call_args
ws_kwargs = websocket_protocol_call_args[1]
assert ws_kwargs["websocket_max_size"] == app.config.WEBSOCKET_MAX_SIZE
assert ws_kwargs["websocket_max_queue"] == app.config.WEBSOCKET_MAX_QUEUE
assert ws_kwargs["websocket_read_limit"] == app.config.WEBSOCKET_READ_LIMIT
assert (
ws_kwargs["websocket_write_limit"] == app.config.WEBSOCKET_WRITE_LIMIT
)
assert (
ws_kwargs["websocket_ping_timeout"]
== app.config.WEBSOCKET_PING_TIMEOUT
)
assert (
ws_kwargs["websocket_ping_interval"]
== app.config.WEBSOCKET_PING_INTERVAL
)
def test_handle_request_with_nested_exception(app, monkeypatch): def test_handle_request_with_nested_exception(app, monkeypatch):
err_msg = "Mock Exception" err_msg = "Mock Exception"
@@ -166,9 +226,7 @@ def test_handle_request_with_nested_exception_debug(app, monkeypatch):
request, response = app.test_client.get("/", debug=True) request, response = app.test_client.get("/", debug=True)
assert response.status == 500 assert response.status == 500
assert response.text.startswith( assert response.text.startswith(
"Error while handling error: {}\nStack: Traceback (most recent call last):\n".format( f"Error while handling error: {err_msg}\nStack: Traceback (most recent call last):\n"
err_msg
)
) )
@@ -188,10 +246,42 @@ def test_handle_request_with_nested_sanic_exception(app, monkeypatch, caplog):
with caplog.at_level(logging.ERROR): with caplog.at_level(logging.ERROR):
request, response = app.test_client.get("/") request, response = app.test_client.get("/")
port = request.server_port
assert port > 0
assert response.status == 500 assert response.status == 500
assert response.text == "Error: Mock SanicException" assert "Mock SanicException" in response.text
assert ( assert (
"sanic.root", "sanic.root",
logging.ERROR, logging.ERROR,
"Exception occurred while handling uri: 'http://127.0.0.1:42101/'", f"Exception occurred while handling uri: 'http://127.0.0.1:{port}/'",
) in caplog.record_tuples ) in caplog.record_tuples
def test_app_name_required():
with pytest.deprecated_call():
Sanic()
def test_app_has_test_mode_sync():
app = Sanic("test")
@app.get("/")
def handler(request):
assert request.app.test_mode
return text("test")
_, response = app.test_client.get("/")
assert response.status == 200
# @pytest.mark.asyncio
# async def test_app_has_test_mode_async():
# app = Sanic("test")
# @app.get("/")
# async def handler(request):
# assert request.app.test_mode
# return text("test")
# _, response = await app.asgi_client.get("/")
# assert response.status == 200

View File

@@ -1,4 +1,5 @@
import asyncio import asyncio
import sys
from collections import deque, namedtuple from collections import deque, namedtuple
@@ -81,7 +82,12 @@ def test_listeners_triggered(app):
with pytest.warns(UserWarning): with pytest.warns(UserWarning):
server.run() server.run()
for task in asyncio.Task.all_tasks(): all_tasks = (
asyncio.Task.all_tasks()
if sys.version_info < (3, 7)
else asyncio.all_tasks(asyncio.get_event_loop())
)
for task in all_tasks:
task.cancel() task.cancel()
assert before_server_start assert before_server_start
@@ -126,7 +132,12 @@ def test_listeners_triggered_async(app):
with pytest.warns(UserWarning): with pytest.warns(UserWarning):
server.run() server.run()
for task in asyncio.Task.all_tasks(): all_tasks = (
asyncio.Task.all_tasks()
if sys.version_info < (3, 7)
else asyncio.all_tasks(asyncio.get_event_loop())
)
for task in all_tasks:
task.cancel() task.cancel()
assert before_server_start assert before_server_start
@@ -197,6 +208,53 @@ async def test_websocket_receive(send, receive, message_stack):
assert text == msg["text"] assert text == msg["text"]
@pytest.mark.asyncio
async def test_websocket_accept_with_no_subprotocols(
send, receive, message_stack
):
ws = WebSocketConnection(send, receive)
await ws.accept()
assert len(message_stack) == 1
message = message_stack.popleft()
assert message["type"] == "websocket.accept"
assert message["subprotocol"] == ""
assert "bytes" not in message
@pytest.mark.asyncio
async def test_websocket_accept_with_subprotocol(send, receive, message_stack):
subprotocols = ["graphql-ws"]
ws = WebSocketConnection(send, receive, subprotocols)
await ws.accept()
assert len(message_stack) == 1
message = message_stack.popleft()
assert message["type"] == "websocket.accept"
assert message["subprotocol"] == "graphql-ws"
assert "bytes" not in message
@pytest.mark.asyncio
async def test_websocket_accept_with_multiple_subprotocols(
send, receive, message_stack
):
subprotocols = ["graphql-ws", "hello", "world"]
ws = WebSocketConnection(send, receive, subprotocols)
await ws.accept()
assert len(message_stack) == 1
message = message_stack.popleft()
assert message["type"] == "websocket.accept"
assert message["subprotocol"] == "graphql-ws,hello,world"
assert "bytes" not in message
def test_improper_websocket_connection(transport, send, receive):
with pytest.raises(InvalidUsage):
transport.get_websocket_connection()
@@ -221,7 +279,7 @@ async def test_request_class_custom():
class MyCustomRequest(Request):
pass
-app = Sanic(request_class=MyCustomRequest)
+app = Sanic(name=__name__, request_class=MyCustomRequest)
@app.get("/custom")
def custom_request(request):


@@ -18,4 +18,4 @@ def test_bad_request_response(app):
app.run(host="127.0.0.1", port=42101, debug=False) app.run(host="127.0.0.1", port=42101, debug=False)
assert lines[0] == b"HTTP/1.1 400 Bad Request\r\n" assert lines[0] == b"HTTP/1.1 400 Bad Request\r\n"
assert lines[-1] == b"Error: Bad Request" assert b"Bad Request" in lines[-1]


@@ -40,9 +40,9 @@ def test_bp_group_with_additional_route_params(app: Sanic):
)
def blueprint_2_named_method(request: Request, param):
if request.method == "DELETE":
-return text("DELETE_{}".format(param))
+return text(f"DELETE_{param}")
elif request.method == "PATCH":
-return text("PATCH_{}".format(param))
+return text(f"PATCH_{param}")
blueprint_group = Blueprint.group(
blueprint_1, blueprint_2, url_prefix="/api"


@@ -46,19 +46,19 @@ def test_versioned_routes_get(app, method):
func = getattr(bp, method)
if callable(func):
-@func("/{}".format(method), version=1)
+@func(f"/{method}", version=1)
def handler(request):
return text("OK")
else:
print(func)
-raise Exception("{} is not callable".format(func))
+raise Exception(f"{func} is not callable")
app.blueprint(bp)
client_method = getattr(app.test_client, method)
-request, response = client_method("/v1/{}".format(method))
+request, response = client_method(f"/v1/{method}")
assert response.status == 200
@@ -252,8 +252,90 @@ def test_several_bp_with_host(app):
assert response.text == "Hello3" assert response.text == "Hello3"
def test_bp_with_host_list(app):
bp = Blueprint(
"test_bp_host",
url_prefix="/test1",
host=["example.com", "sub.example.com"],
)
@bp.route("/")
def handler1(request):
return text("Hello")
@bp.route("/", host=["sub1.example.com"])
def handler2(request):
return text("Hello subdomain!")
app.blueprint(bp)
headers = {"Host": "example.com"}
request, response = app.test_client.get("/test1/", headers=headers)
assert response.text == "Hello"
headers = {"Host": "sub.example.com"}
request, response = app.test_client.get("/test1/", headers=headers)
assert response.text == "Hello"
headers = {"Host": "sub1.example.com"}
request, response = app.test_client.get("/test1/", headers=headers)
assert response.text == "Hello subdomain!"
def test_several_bp_with_host_list(app):
bp = Blueprint(
"test_text",
url_prefix="/test",
host=["example.com", "sub.example.com"],
)
bp2 = Blueprint(
"test_text2",
url_prefix="/test",
host=["sub1.example.com", "sub2.example.com"],
)
@bp.route("/")
def handler(request):
return text("Hello")
@bp2.route("/")
def handler1(request):
return text("Hello2")
@bp2.route("/other/")
def handler2(request):
return text("Hello3")
app.blueprint(bp)
app.blueprint(bp2)
assert bp.host == ["example.com", "sub.example.com"]
headers = {"Host": "example.com"}
request, response = app.test_client.get("/test/", headers=headers)
assert response.text == "Hello"
assert bp.host == ["example.com", "sub.example.com"]
headers = {"Host": "sub.example.com"}
request, response = app.test_client.get("/test/", headers=headers)
assert response.text == "Hello"
assert bp2.host == ["sub1.example.com", "sub2.example.com"]
headers = {"Host": "sub1.example.com"}
request, response = app.test_client.get("/test/", headers=headers)
assert response.text == "Hello2"
request, response = app.test_client.get("/test/other/", headers=headers)
assert response.text == "Hello3"
assert bp2.host == ["sub1.example.com", "sub2.example.com"]
headers = {"Host": "sub2.example.com"}
request, response = app.test_client.get("/test/", headers=headers)
assert response.text == "Hello2"
request, response = app.test_client.get("/test/other/", headers=headers)
assert response.text == "Hello3"
def test_bp_middleware(app):
-blueprint = Blueprint("test_middleware")
+blueprint = Blueprint("test_bp_middleware")
@blueprint.middleware("response")
async def process_response(request, response):
@@ -271,6 +353,46 @@ def test_bp_middleware(app):
assert response.text == "FAIL" assert response.text == "FAIL"
def test_bp_middleware_order(app):
blueprint = Blueprint("test_bp_middleware_order")
order = list()
@blueprint.middleware("request")
def mw_1(request):
order.append(1)
@blueprint.middleware("request")
def mw_2(request):
order.append(2)
@blueprint.middleware("request")
def mw_3(request):
order.append(3)
@blueprint.middleware("response")
def mw_4(request, response):
order.append(6)
@blueprint.middleware("response")
def mw_5(request, response):
order.append(5)
@blueprint.middleware("response")
def mw_6(request, response):
order.append(4)
@blueprint.route("/")
def process_response(request):
return text("OK")
app.blueprint(blueprint)
order.clear()
request, response = app.test_client.get("/")
assert response.status == 200
assert order == [1, 2, 3, 4, 5, 6]
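test_bp_middleware_order encodes the dispatch rule its assertion relies on: request middleware run in the order they were registered, while response middleware run in reverse registration order, which is why mw_6 (appending 4) fires before mw_4 (appending 6). A minimal sketch of that ordering rule, independent of Sanic's internals:

# Hypothetical sketch of the ordering the test asserts: request middleware
# in registration order, response middleware in reverse registration order.
def dispatch(request_mw, response_mw, order):
    for mw in request_mw:  # registration order
        mw(order)
    for mw in reversed(response_mw):  # reverse registration order
        mw(order)


order = []
request_mw = [lambda o, n=n: o.append(n) for n in (1, 2, 3)]
response_mw = [lambda o, n=n: o.append(n) for n in (6, 5, 4)]
dispatch(request_mw, response_mw, order)
assert order == [1, 2, 3, 4, 5, 6]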
def test_bp_exception_handler(app):
blueprint = Blueprint("test_middleware")
@@ -553,9 +675,7 @@ def test_bp_group_with_default_url_prefix(app):
from uuid import uuid4
resource_id = str(uuid4())
-request, response = app.test_client.get(
-"/api/v1/resources/{0}".format(resource_id)
-)
+request, response = app.test_client.get(f"/api/v1/resources/{resource_id}")
assert response.json == {"resource_id": resource_id}
@@ -669,9 +789,9 @@ def test_duplicate_blueprint(app):
app.blueprint(bp1)
assert str(excinfo.value) == (
-'A blueprint with the name "{}" is already registered. '
+f'A blueprint with the name "{bp_name}" is already registered. '
"Blueprint names must be unique."
-).format(bp_name)
+)
@pytest.mark.parametrize("debug", [True, False, None])


@@ -44,42 +44,42 @@ def test_load_from_object_string_exception(app):
def test_auto_load_env():
environ["SANIC_TEST_ANSWER"] = "42"
-app = Sanic()
+app = Sanic(name=__name__)
assert app.config.TEST_ANSWER == 42
del environ["SANIC_TEST_ANSWER"]
def test_auto_load_bool_env():
environ["SANIC_TEST_ANSWER"] = "True"
-app = Sanic()
+app = Sanic(name=__name__)
assert app.config.TEST_ANSWER == True
del environ["SANIC_TEST_ANSWER"]
def test_dont_load_env():
environ["SANIC_TEST_ANSWER"] = "42"
-app = Sanic(load_env=False)
+app = Sanic(name=__name__, load_env=False)
assert getattr(app.config, "TEST_ANSWER", None) is None
del environ["SANIC_TEST_ANSWER"]
def test_load_env_prefix():
environ["MYAPP_TEST_ANSWER"] = "42"
-app = Sanic(load_env="MYAPP_")
+app = Sanic(name=__name__, load_env="MYAPP_")
assert app.config.TEST_ANSWER == 42
del environ["MYAPP_TEST_ANSWER"]
def test_load_env_prefix_float_values():
environ["MYAPP_TEST_ROI"] = "2.3"
-app = Sanic(load_env="MYAPP_")
+app = Sanic(name=__name__, load_env="MYAPP_")
assert app.config.TEST_ROI == 2.3
del environ["MYAPP_TEST_ROI"]
def test_load_env_prefix_string_value():
environ["MYAPP_TEST_TOKEN"] = "somerandomtesttoken"
-app = Sanic(load_env="MYAPP_")
+app = Sanic(name=__name__, load_env="MYAPP_")
assert app.config.TEST_TOKEN == "somerandomtesttoken"
del environ["MYAPP_TEST_TOKEN"]


@@ -15,7 +15,8 @@ from sanic.response import text
def test_cookies(app):
@app.route("/")
def handler(request):
-response = text("Cookies are: {}".format(request.cookies["test"]))
+cookie_value = request.cookies["test"]
+response = text(f"Cookies are: {cookie_value}")
response.cookies["right_back"] = "at you"
return response
@@ -31,7 +32,8 @@ def test_cookies(app):
async def test_cookies_asgi(app):
@app.route("/")
def handler(request):
-response = text("Cookies are: {}".format(request.cookies["test"]))
+cookie_value = request.cookies["test"]
+response = text(f"Cookies are: {cookie_value}")
response.cookies["right_back"] = "at you"
return response
@@ -52,7 +54,7 @@ def test_false_cookies_encoded(app, httponly, expected):
response = text("hello cookies") response = text("hello cookies")
response.cookies["hello"] = "world" response.cookies["hello"] = "world"
response.cookies["hello"]["httponly"] = httponly response.cookies["hello"]["httponly"] = httponly
return text(response.cookies["hello"].encode("utf8")) return text(response.cookies["hello"].encode("utf8").decode())
request, response = app.test_client.get("/") request, response = app.test_client.get("/")
@@ -78,7 +80,8 @@ def test_false_cookies(app, httponly, expected):
def test_http2_cookies(app):
@app.route("/")
async def handler(request):
-response = text("Cookies are: {}".format(request.cookies["test"]))
+cookie_value = request.cookies["test"]
+response = text(f"Cookies are: {cookie_value}")
return response
headers = {"cookie": "test=working!"}


@@ -17,12 +17,12 @@ def test_create_task(app):
@app.route("/early") @app.route("/early")
def not_set(request): def not_set(request):
return text(e.is_set()) return text(str(e.is_set()))
@app.route("/late") @app.route("/late")
async def set(request): async def set(request):
await asyncio.sleep(0.1) await asyncio.sleep(0.1)
return text(e.is_set()) return text(str(e.is_set()))
request, response = app.test_client.get("/early") request, response = app.test_client.get("/early")
assert response.body == b"False" assert response.body == b"False"


@@ -20,7 +20,7 @@ class CustomRequest(Request):
def test_custom_request():
-app = Sanic(request_class=CustomRequest)
+app = Sanic(name=__name__, request_class=CustomRequest)
@app.route("/post", methods=["POST"])
async def post_handler(request):

tests/test_errorpages.py Normal file

@@ -0,0 +1,86 @@
import pytest
from sanic import Sanic
from sanic.errorpages import exception_response
from sanic.exceptions import NotFound
from sanic.request import Request
from sanic.response import HTTPResponse
@pytest.fixture
def app():
app = Sanic("error_page_testing")
@app.route("/error", methods=["GET", "POST"])
def err(request):
raise Exception("something went wrong")
return app
@pytest.fixture
def fake_request(app):
return Request(b"/foobar", {}, "1.1", "GET", None, app)
@pytest.mark.parametrize(
"fallback,content_type, exception, status",
(
(None, "text/html; charset=utf-8", Exception, 500),
("html", "text/html; charset=utf-8", Exception, 500),
("auto", "text/html; charset=utf-8", Exception, 500),
("text", "text/plain; charset=utf-8", Exception, 500),
("json", "application/json", Exception, 500),
(None, "text/html; charset=utf-8", NotFound, 404),
("html", "text/html; charset=utf-8", NotFound, 404),
("auto", "text/html; charset=utf-8", NotFound, 404),
("text", "text/plain; charset=utf-8", NotFound, 404),
("json", "application/json", NotFound, 404),
),
)
def test_should_return_html_valid_setting(
fake_request, fallback, content_type, exception, status
):
if fallback:
fake_request.app.config.FALLBACK_ERROR_FORMAT = fallback
try:
raise exception("bad stuff")
except Exception as e:
response = exception_response(fake_request, e, True)
assert isinstance(response, HTTPResponse)
assert response.status == status
assert response.content_type == content_type
def test_auto_fallback_with_data(app):
app.config.FALLBACK_ERROR_FORMAT = "auto"
_, response = app.test_client.get("/error")
assert response.status == 500
assert response.content_type == "text/html; charset=utf-8"
_, response = app.test_client.post("/error", json={"foo": "bar"})
assert response.status == 500
assert response.content_type == "application/json"
_, response = app.test_client.post("/error", data={"foo": "bar"})
assert response.status == 500
assert response.content_type == "text/html; charset=utf-8"
def test_auto_fallback_with_content_type(app):
app.config.FALLBACK_ERROR_FORMAT = "auto"
_, response = app.test_client.get(
"/error", headers={"content-type": "application/json"}
)
assert response.status == 500
assert response.content_type == "application/json"
_, response = app.test_client.get(
"/error", headers={"content-type": "text/plain"}
)
assert response.status == 500
assert response.content_type == "text/plain; charset=utf-8"


@@ -172,7 +172,7 @@ def test_handled_unhandled_exception(exception_app):
request, response = exception_app.test_client.get("/divide_by_zero") request, response = exception_app.test_client.get("/divide_by_zero")
assert response.status == 500 assert response.status == 500
soup = BeautifulSoup(response.body, "html.parser") soup = BeautifulSoup(response.body, "html.parser")
assert soup.h1.text == "Internal Server Error" assert "Internal Server Error" in soup.h1.text
message = " ".join(soup.p.text.split()) message = " ".join(soup.p.text.split())
assert message == ( assert message == (
@@ -218,4 +218,4 @@ def test_abort(exception_app):
request, response = exception_app.test_client.get("/abort/message") request, response = exception_app.test_client.get("/abort/message")
assert response.status == 500 assert response.status == 500
assert response.text == "Error: Abort" assert "Abort" in response.text


@@ -43,7 +43,7 @@ def handler_6(request, arg):
try:
foo = 1 / arg
except Exception as e:
-raise e from ValueError("{}".format(arg))
+raise e from ValueError(f"{arg}")
return text(foo)
@@ -86,7 +86,7 @@ def test_html_traceback_output_in_debug_mode():
summary_text = " ".join(soup.select(".summary")[0].text.split())
assert (
-"NameError: name 'bar' " "is not defined while handling path /4"
+"NameError: name 'bar' is not defined while handling path /4"
) == summary_text
@@ -112,7 +112,7 @@ def test_chained_exception_handler():
summary_text = " ".join(soup.select(".summary")[0].text.split())
assert (
-"ZeroDivisionError: division by zero " "while handling path /6/0"
+"ZeroDivisionError: division by zero while handling path /6/0"
) == summary_text


@@ -3,56 +3,36 @@ import asyncio
from asyncio import sleep as aio_sleep
from json import JSONDecodeError
+import httpcore
import httpx
from sanic import Sanic, server
from sanic.response import text
-from sanic.testing import HOST, PORT, SanicTestClient
+from sanic.testing import HOST, SanicTestClient
CONFIG_FOR_TESTS = {"KEEP_ALIVE_TIMEOUT": 2, "KEEP_ALIVE": True}
-old_conn = None
+PORT = 42101 # test_keep_alive_timeout_reuse doesn't work with random port
+from httpcore._async.base import ConnectionState
+from httpcore._async.connection import AsyncHTTPConnection
+from httpcore._types import Origin
-class ReusableSanicConnectionPool(
-httpx.dispatch.connection_pool.ConnectionPool
-):
-async def acquire_connection(self, origin, timeout):
-global old_conn
-connection = self.pop_connection(origin)
-if connection is None:
-pool_timeout = None if timeout is None else timeout.pool_timeout
-await self.max_connections.acquire(timeout=pool_timeout)
-connection = httpx.dispatch.connection.HTTPConnection(
-origin,
-verify=self.verify,
-cert=self.cert,
-http2=self.http2,
-backend=self.backend,
-release_func=self.release_connection,
-trust_env=self.trust_env,
-uds=self.uds,
-)
-self.active_connections.add(connection)
-if old_conn is not None:
-if old_conn != connection:
-raise RuntimeError(
-"We got a new connection, wanted the same one!"
-)
-old_conn = connection
-return connection
+class ReusableSanicConnectionPool(httpcore.AsyncConnectionPool):
+last_reused_connection = None
+async def _get_connection_from_pool(self, *args, **kwargs):
+conn = await super()._get_connection_from_pool(*args, **kwargs)
+self.__class__.last_reused_connection = conn
+return conn
-class ResusableSanicSession(httpx.Client):
+class ResusableSanicSession(httpx.AsyncClient):
def __init__(self, *args, **kwargs) -> None:
-dispatch = ReusableSanicConnectionPool()
-super().__init__(dispatch=dispatch, *args, **kwargs)
+transport = ReusableSanicConnectionPool()
+super().__init__(transport=transport, *args, **kwargs)
class ReuseableSanicTestClient(SanicTestClient):
@@ -97,11 +77,9 @@ class ReuseableSanicTestClient(SanicTestClient):
):
url = uri
else:
-uri = uri if uri.startswith("/") else "/{uri}".format(uri=uri)
+uri = uri if uri.startswith("/") else f"/{uri}"
scheme = "http"
-url = "{scheme}://{host}:{port}{uri}".format(
-scheme=scheme, host=HOST, port=PORT, uri=uri
-)
+url = f"{scheme}://{HOST}:{PORT}{uri}"
@self.app.listener("after_server_start")
async def _collect_response(loop):
@@ -134,7 +112,7 @@ class ReuseableSanicTestClient(SanicTestClient):
self.app.listeners["after_server_start"].pop()
if exceptions:
-raise ValueError("Exception during request: {}".format(exceptions))
+raise ValueError(f"Exception during request: {exceptions}")
if gather_request:
self.app.request_middleware.pop()
@@ -143,17 +121,13 @@ class ReuseableSanicTestClient(SanicTestClient):
return request, response
except Exception:
raise ValueError(
-"Request and response object expected, got ({})".format(
-results
-)
+f"Request and response object expected, got ({results})"
)
else:
try:
return results[-1]
except Exception:
-raise ValueError(
-"Request object expected, got ({})".format(results)
-)
+raise ValueError(f"Request object expected, got ({results})")
def kill_server(self):
try:
@@ -163,7 +137,7 @@ class ReuseableSanicTestClient(SanicTestClient):
self._server = None
if self._session:
-self._loop.run_until_complete(self._session.close())
+self._loop.run_until_complete(self._session.aclose())
self._session = None
except Exception as e3:
@@ -182,7 +156,7 @@ class ReuseableSanicTestClient(SanicTestClient):
self._session = self.get_new_session()
try:
response = await getattr(self._session, method.lower())(
-url, verify=False, timeout=request_keepalive, *args, **kwargs
+url, timeout=request_keepalive, *args, **kwargs
)
except NameError:
raise Exception(response.status_code)
@@ -192,7 +166,7 @@ class ReuseableSanicTestClient(SanicTestClient):
except (JSONDecodeError, UnicodeDecodeError):
response.json = None
-response.body = await response.read()
+response.body = await response.aread()
response.status = response.status_code
response.content_type = response.headers.get("content-type")
@@ -230,8 +204,8 @@ async def handler3(request):
def test_keep_alive_timeout_reuse():
"""If the server keep-alive timeout and client keep-alive timeout are
both longer than the delay, the client _and_ server will successfully
reuse the existing connection."""
try:
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
@@ -244,6 +218,7 @@ def test_keep_alive_timeout_reuse():
request, response = client.get("/1") request, response = client.get("/1")
assert response.status == 200 assert response.status == 200
assert response.text == "OK" assert response.text == "OK"
assert ReusableSanicConnectionPool.last_reused_connection
finally: finally:
client.kill_server() client.kill_server()
@@ -256,20 +231,15 @@ def test_keep_alive_client_timeout():
asyncio.set_event_loop(loop)
client = ReuseableSanicTestClient(keep_alive_app_client_timeout, loop)
headers = {"Connection": "keep-alive"}
-try:
-request, response = client.get(
-"/1", headers=headers, request_keepalive=1
-)
-assert response.status == 200
-assert response.text == "OK"
-loop.run_until_complete(aio_sleep(2))
-exception = None
-request, response = client.get("/1", request_keepalive=1)
-except ValueError as e:
-exception = e
-assert exception is not None
-assert isinstance(exception, ValueError)
-assert "got a new connection" in exception.args[0]
+request, response = client.get(
+"/1", headers=headers, request_keepalive=1
+)
+assert response.status == 200
+assert response.text == "OK"
+loop.run_until_complete(aio_sleep(2))
+exception = None
+request, response = client.get("/1", request_keepalive=1)
+assert ReusableSanicConnectionPool.last_reused_connection is None
finally:
client.kill_server()
@@ -284,22 +254,14 @@ def test_keep_alive_server_timeout():
asyncio.set_event_loop(loop)
client = ReuseableSanicTestClient(keep_alive_app_server_timeout, loop)
headers = {"Connection": "keep-alive"}
-try:
-request, response = client.get(
-"/1", headers=headers, request_keepalive=60
-)
-assert response.status == 200
-assert response.text == "OK"
-loop.run_until_complete(aio_sleep(3))
-exception = None
-request, response = client.get("/1", request_keepalive=60)
-except ValueError as e:
-exception = e
-assert exception is not None
-assert isinstance(exception, ValueError)
-assert (
-"Connection reset" in exception.args[0]
-or "got a new connection" in exception.args[0]
-)
+request, response = client.get(
+"/1", headers=headers, request_keepalive=60
+)
+assert response.status == 200
+assert response.text == "OK"
+loop.run_until_complete(aio_sleep(3))
+exception = None
+request, response = client.get("/1", request_keepalive=60)
+assert ReusableSanicConnectionPool.last_reused_connection is None
finally:
client.kill_server()


@@ -0,0 +1,35 @@
from pathlib import Path
from types import ModuleType
import pytest
from sanic.exceptions import LoadFileException
from sanic.utils import load_module_from_file_location
@pytest.fixture
def loaded_module_from_file_location():
return load_module_from_file_location(
str(Path(__file__).parent / "static/app_test_config.py")
)
@pytest.mark.dependency(name="test_load_module_from_file_location")
def test_load_module_from_file_location(loaded_module_from_file_location):
assert isinstance(loaded_module_from_file_location, ModuleType)
@pytest.mark.dependency(depends=["test_load_module_from_file_location"])
def test_loaded_module_from_file_location_name(
loaded_module_from_file_location,
):
assert loaded_module_from_file_location.__name__ == "app_test_config"
def test_load_module_from_file_location_with_non_existing_env_variable():
with pytest.raises(
LoadFileException,
match="The following environment variables are not set: MuuMilk",
):
load_module_from_file_location("${MuuMilk}")
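test_load_module_from_file_location_with_non_existing_env_variable shows the contract for ${VAR} placeholders in the path: variables that are not set are reported by name in a LoadFileException. A small sketch of that kind of expansion, using a hypothetical helper and a plain ValueError rather than Sanic's own exception type:

# Hypothetical sketch of ${VAR} expansion with a helpful error for unset
# variables; load_module_from_file_location in sanic.utils is the real thing.
import os
import re


def expand_env_placeholders(location: str) -> str:
    missing = []

    def substitute(match):
        name = match.group(1)
        value = os.environ.get(name)
        if value is None:
            missing.append(name)
            return ""
        return value

    expanded = re.sub(r"\$\{([^}]+)\}", substitute, location)
    if missing:
        raise ValueError(
            "The following environment variables are not set: "
            + ", ".join(missing)
        )
    return expanded


try:
    expand_env_placeholders("${MuuMilk}/app.py")
except ValueError as e:
    assert "MuuMilk" in str(e)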


@@ -1,4 +1,5 @@
import logging
+import os
import uuid
from importlib import reload
@@ -12,6 +13,7 @@ import sanic
from sanic import Sanic
from sanic.log import LOGGING_CONFIG_DEFAULTS, logger
from sanic.response import text
+from sanic.testing import SanicTestClient
logging_format = """module: %(module)s; \
@@ -127,7 +129,7 @@ def test_log_connection_lost(app, debug, monkeypatch):
def test_logger(caplog):
rand_string = str(uuid.uuid4())
-app = Sanic()
+app = Sanic(name=__name__)
@app.get("/") @app.get("/")
def log_info(request): def log_info(request):
@@ -137,15 +139,67 @@ def test_logger(caplog):
with caplog.at_level(logging.INFO):
request, response = app.test_client.get("/")
port = request.server_port
# Note: testing with random port doesn't show the banner because it doesn't
# define host and port. This test supports both modes.
if caplog.record_tuples[0] == (
"sanic.root",
logging.INFO,
f"Goin' Fast @ http://127.0.0.1:{port}",
):
caplog.record_tuples.pop(0)
assert caplog.record_tuples[0] == (
"sanic.root",
logging.INFO,
-"Goin' Fast @ http://127.0.0.1:42101",
+f"http://127.0.0.1:{port}/",
)
assert caplog.record_tuples[1] == ("sanic.root", logging.INFO, rand_string)
assert caplog.record_tuples[-1] == (
"sanic.root",
logging.INFO,
"Server Stopped",
)
def test_logger_static_and_secure(caplog):
# Same as test_logger, except for more coverage:
# - test_client initialised separately for static port
# - using ssl
rand_string = str(uuid.uuid4())
app = Sanic(name=__name__)
@app.get("/")
def log_info(request):
logger.info(rand_string)
return text("hello")
current_dir = os.path.dirname(os.path.realpath(__file__))
ssl_cert = os.path.join(current_dir, "certs/selfsigned.cert")
ssl_key = os.path.join(current_dir, "certs/selfsigned.key")
ssl_dict = {"cert": ssl_cert, "key": ssl_key}
test_client = SanicTestClient(app, port=42101)
with caplog.at_level(logging.INFO):
request, response = test_client.get(
f"https://127.0.0.1:{test_client.port}/",
server_kwargs=dict(ssl=ssl_dict),
)
port = test_client.port
assert caplog.record_tuples[0] == (
"sanic.root",
logging.INFO,
f"Goin' Fast @ https://127.0.0.1:{port}",
)
assert caplog.record_tuples[1] == (
"sanic.root",
logging.INFO,
-"http://127.0.0.1:42101/",
+f"https://127.0.0.1:{port}/",
)
assert caplog.record_tuples[2] == ("sanic.root", logging.INFO, rand_string)
assert caplog.record_tuples[-1] == (


@@ -49,10 +49,10 @@ def test_logo_false(app, caplog):
loop.run_until_complete(_server.wait_closed())
app.stop()
+banner, port = caplog.record_tuples[ROW][2].rsplit(":", 1)
assert caplog.record_tuples[ROW][1] == logging.INFO
-assert caplog.record_tuples[ROW][
-2
-] == "Goin' Fast @ http://127.0.0.1:{}".format(PORT)
+assert banner == "Goin' Fast @ http://127.0.0.1"
+assert int(port) > 0
def test_logo_true(app, caplog):


@@ -2,7 +2,7 @@ import logging
from asyncio import CancelledError
-from sanic.exceptions import NotFound
+from sanic.exceptions import NotFound, SanicException
from sanic.request import Request
from sanic.response import HTTPResponse, text
@@ -93,7 +93,7 @@ def test_middleware_response_raise_cancelled_error(app, caplog):
"sanic.root", "sanic.root",
logging.ERROR, logging.ERROR,
"Exception occurred while handling uri: 'http://127.0.0.1:42101/'", "Exception occurred while handling uri: 'http://127.0.0.1:42101/'",
) in caplog.record_tuples ) not in caplog.record_tuples
def test_middleware_response_raise_exception(app, caplog):
@@ -102,14 +102,16 @@ def test_middleware_response_raise_exception(app, caplog):
raise Exception("Exception at response middleware") raise Exception("Exception at response middleware")
with caplog.at_level(logging.ERROR): with caplog.at_level(logging.ERROR):
reqrequest, response = app.test_client.get("/") reqrequest, response = app.test_client.get("/fail")
assert response.status == 404 assert response.status == 404
# 404 errors are not logged
assert ( assert (
"sanic.root", "sanic.root",
logging.ERROR, logging.ERROR,
"Exception occurred while handling uri: 'http://127.0.0.1:42101/'", "Exception occurred while handling uri: 'http://127.0.0.1:42101/'",
) in caplog.record_tuples ) not in caplog.record_tuples
# Middleware exception ignored but logged
assert ( assert (
"sanic.error", "sanic.error",
logging.ERROR, logging.ERROR,


@@ -87,3 +87,15 @@ def test_pickle_app_with_bp(app, protocol):
request, response = up_p_app.test_client.get("/") request, response = up_p_app.test_client.get("/")
assert up_p_app.is_request_stream is False assert up_p_app.is_request_stream is False
assert response.text == "Hello" assert response.text == "Hello"
@pytest.mark.parametrize("protocol", [3, 4])
def test_pickle_app_with_static(app, protocol):
app.route("/")(handler)
app.static("/static", "/tmp/static")
p_app = pickle.dumps(app, protocol=protocol)
del app
up_p_app = pickle.loads(p_app)
assert up_p_app
request, response = up_p_app.test_client.get("/static/missing.txt")
assert response.status == 404


@@ -21,13 +21,13 @@ def test_versioned_named_routes_get(app, method):
bp = Blueprint("test_bp", url_prefix="/bp") bp = Blueprint("test_bp", url_prefix="/bp")
method = method.lower() method = method.lower()
route_name = "route_{}".format(method) route_name = f"route_{method}"
route_name2 = "route2_{}".format(method) route_name2 = f"route2_{method}"
func = getattr(app, method) func = getattr(app, method)
if callable(func): if callable(func):
@func("/{}".format(method), version=1, name=route_name) @func(f"/{method}", version=1, name=route_name)
def handler(request): def handler(request):
return text("OK") return text("OK")
@@ -38,7 +38,7 @@ def test_versioned_named_routes_get(app, method):
func = getattr(bp, method)
if callable(func):
-@func("/{}".format(method), version=1, name=route_name2)
+@func(f"/{method}", version=1, name=route_name2)
def handler2(request):
return text("OK")
@@ -48,14 +48,14 @@ def test_versioned_named_routes_get(app, method):
app.blueprint(bp)
-assert app.router.routes_all["/v1/{}".format(method)].name == route_name
+assert app.router.routes_all[f"/v1/{method}"].name == route_name
-route = app.router.routes_all["/v1/bp/{}".format(method)]
+route = app.router.routes_all[f"/v1/bp/{method}"]
-assert route.name == "test_bp.{}".format(route_name2)
+assert route.name == f"test_bp.{route_name2}"
-assert app.url_for(route_name) == "/v1/{}".format(method)
+assert app.url_for(route_name) == f"/v1/{method}"
-url = app.url_for("test_bp.{}".format(route_name2))
+url = app.url_for(f"test_bp.{route_name2}")
-assert url == "/v1/bp/{}".format(method)
+assert url == f"/v1/bp/{method}"
with pytest.raises(URLBuildError):
app.url_for("handler")


@@ -27,7 +27,7 @@ def test_payload_too_large_at_data_received_default(app):
response = app.test_client.get("/1", gather_request=False) response = app.test_client.get("/1", gather_request=False)
assert response.status == 413 assert response.status == 413
assert response.text == "Error: Payload Too Large" assert "Payload Too Large" in response.text
def test_payload_too_large_at_on_header_default(app):
@@ -40,4 +40,4 @@ def test_payload_too_large_at_on_header_default(app):
data = "a" * 1000 data = "a" * 1000
response = app.test_client.post("/1", gather_request=False, data=data) response = app.test_client.post("/1", gather_request=False, data=data)
assert response.status == 413 assert response.status == 413
assert response.text == "Error: Payload Too Large" assert "Payload Too Large" in response.text


@@ -115,14 +115,14 @@ def test_redirect_with_params(app, test_str):
@app.route("/api/v1/test/<test>/") @app.route("/api/v1/test/<test>/")
async def init_handler(request, test): async def init_handler(request, test):
return redirect("/api/v2/test/{}/".format(use_in_uri)) return redirect(f"/api/v2/test/{use_in_uri}/")
@app.route("/api/v2/test/<test>/") @app.route("/api/v2/test/<test>/")
async def target_handler(request, test): async def target_handler(request, test):
assert test == test_str assert test == test_str
return text("OK") return text("OK")
_, response = app.test_client.get("/api/v1/test/{}/".format(use_in_uri)) _, response = app.test_client.get(f"/api/v1/test/{use_in_uri}/")
assert response.status == 200 assert response.status == 200
assert response.content == b"OK" assert response.content == b"OK"

tests/test_reloader.py Normal file

@@ -0,0 +1,108 @@
import os
import secrets
import sys
from contextlib import suppress
from subprocess import PIPE, Popen, TimeoutExpired
from tempfile import TemporaryDirectory
from textwrap import dedent
from threading import Timer
from time import sleep
import pytest
# We need to interrupt the autoreloader without killing it, so that the server gets terminated
# https://stefan.sofa-rockers.org/2013/08/15/handling-sub-process-hierarchies-python-linux-os-x/
try:
from signal import CTRL_BREAK_EVENT
from subprocess import CREATE_NEW_PROCESS_GROUP
flags = CREATE_NEW_PROCESS_GROUP
except ImportError:
flags = 0
def terminate(proc):
if flags:
proc.send_signal(CTRL_BREAK_EVENT)
else:
proc.terminate()
def write_app(filename, **runargs):
text = secrets.token_urlsafe()
with open(filename, "w") as f:
f.write(
dedent(
f"""\
import os
from sanic import Sanic
app = Sanic(__name__)
@app.listener("after_server_start")
def complete(*args):
print("complete", os.getpid(), {text!r})
if __name__ == "__main__":
app.run(**{runargs!r})
"""
)
)
return text
def scanner(proc):
for line in proc.stdout:
line = line.decode().strip()
print(">", line)
if line.startswith("complete"):
yield line
argv = dict(
script=[sys.executable, "reloader.py"],
module=[sys.executable, "-m", "reloader"],
sanic=[
sys.executable,
"-m",
"sanic",
"--port",
"42104",
"--debug",
"reloader.app",
],
)
@pytest.mark.parametrize(
"runargs, mode",
[
(dict(port=42102, auto_reload=True), "script"),
(dict(port=42103, debug=True), "module"),
(dict(), "sanic"),
],
)
async def test_reloader_live(runargs, mode):
with TemporaryDirectory() as tmpdir:
filename = os.path.join(tmpdir, "reloader.py")
text = write_app(filename, **runargs)
proc = Popen(argv[mode], cwd=tmpdir, stdout=PIPE, creationflags=flags)
try:
timeout = Timer(5, terminate, [proc])
timeout.start()
# Python apparently keeps using the old source sometimes if
# we don't sleep before rewrite (pycache timestamp problem?)
sleep(1)
line = scanner(proc)
assert text in next(line)
# Edit source code and try again
text = write_app(filename, **runargs)
assert text in next(line)
finally:
timeout.cancel()
terminate(proc)
with suppress(TimeoutExpired):
proc.wait(timeout=3)


@@ -33,6 +33,23 @@ def test_custom_context(app):
}
)
@app.middleware("response")
def modify(request, response):
# Using response-middleware to access request ctx
try:
user = request.ctx.user
except AttributeError as e:
user = str(e)
try:
invalid = request.ctx.missing
except AttributeError as e:
invalid = str(e)
j = loads(response.body)
j["response_mw_valid"] = user
j["response_mw_invalid"] = invalid
return json(j)
request, response = app.test_client.get("/") request, response = app.test_client.get("/")
assert response.json == { assert response.json == {
"user": "sanic", "user": "sanic",
@@ -41,47 +58,11 @@ def test_custom_context(app):
"has_session": True, "has_session": True,
"has_missing": False, "has_missing": False,
"invalid": "'types.SimpleNamespace' object has no attribute 'missing'", "invalid": "'types.SimpleNamespace' object has no attribute 'missing'",
"response_mw_valid": "sanic",
"response_mw_invalid": "'types.SimpleNamespace' object has no attribute 'missing'",
}
# Remove this once the deprecated API is abolished.
def test_custom_context_old(app):
@app.middleware("request")
def store(request):
try:
request["foo"]
except KeyError:
pass
request["user"] = "sanic"
sidekick = request.get("sidekick", "tails") # Item missing -> default
request["sidekick"] = sidekick
request["bar"] = request["sidekick"]
del request["sidekick"]
@app.route("/")
def handler(request):
return json(
{
"user": request.get("user"),
"sidekick": request.get("sidekick"),
"has_bar": "bar" in request,
"has_sidekick": "sidekick" in request,
}
)
request, response = app.test_client.get("/")
assert response.json == {
"user": "sanic",
"sidekick": None,
"has_bar": True,
"has_sidekick": False,
}
response_json = loads(response.text)
assert response_json["user"] == "sanic"
assert response_json.get("sidekick") is None
def test_app_injection(app):
expected = random.choice(range(0, 100))


@@ -1,14 +1,17 @@
import asyncio
import pytest
from sanic.blueprints import Blueprint
from sanic.exceptions import HeaderExpectationFailed
from sanic.request import StreamBuffer
-from sanic.response import stream, text
+from sanic.response import json, stream, text
from sanic.server import HttpProtocol
from sanic.views import CompositionView, HTTPMethodView
from sanic.views import stream as stream_decorator
-data = "abc" * 10000000
+data = "abc" * 1_000_000
def test_request_stream_method_view(app):
@@ -329,15 +332,28 @@ def test_request_stream_handle_exception(app):
# 404
request, response = app.test_client.post("/in_valid_post", data=data)
assert response.status == 404
-assert response.text == "Error: Requested URL /in_valid_post not found"
+assert "Requested URL /in_valid_post not found" in response.text
# 405
request, response = app.test_client.get("/post/random_id")
assert response.status == 405
-assert (
-response.text == "Error: Method GET not allowed for URL"
-" /post/random_id"
-)
+assert "Method GET not allowed for URL /post/random_id" in response.text
@pytest.mark.asyncio
async def test_request_stream_unread(app):
"""ensure no error is raised when leaving unread bytes in byte-buffer"""
err = None
protocol = HttpProtocol(loop=asyncio.get_event_loop(), app=app)
try:
protocol.request = None
protocol._body_chunks.append("this is a test")
await protocol.stream_append()
except AttributeError as e:
err = e
assert err is None and not protocol._body_chunks
def test_request_stream_blueprint(app):
@@ -616,3 +632,44 @@ def test_request_stream(app):
request, response = app.test_client.post("/bp_stream", data=data) request, response = app.test_client.post("/bp_stream", data=data)
assert response.status == 200 assert response.status == 200
assert response.text == data assert response.text == data
def test_streaming_new_api(app):
@app.post("/non-stream")
async def handler(request):
assert request.body == b"x"
await request.receive_body() # This should do nothing
assert request.body == b"x"
return text("OK")
@app.post("/1", stream=True)
async def handler(request):
assert request.stream
assert not request.body
await request.receive_body()
return text(request.body.decode().upper())
@app.post("/2", stream=True)
async def handler(request):
ret = []
async for data in request.stream:
# We should have no b"" or None, just proper chunks
assert data
assert isinstance(data, bytes)
ret.append(data.decode("ASCII"))
return json(ret)
request, response = app.test_client.post("/non-stream", data="x")
assert response.status == 200
request, response = app.test_client.post("/1", data="TEST data")
assert request.body == b"TEST data"
assert response.status == 200
assert response.text == "TEST DATA"
request, response = app.test_client.post("/2", data=data)
assert response.status == 200
res = response.json
assert isinstance(res, list)
assert len(res) > 1
assert "".join(res) == data


@@ -1,70 +1,54 @@
import asyncio
+from typing import cast
+import httpcore
import httpx
+from httpcore._async.base import (
+AsyncByteStream,
+AsyncHTTPTransport,
+ConnectionState,
+NewConnectionRequired,
+)
+from httpcore._async.connection import AsyncHTTPConnection
+from httpcore._async.connection_pool import ResponseByteStream
+from httpcore._exceptions import LocalProtocolError, UnsupportedProtocol
+from httpcore._types import TimeoutDict
+from httpcore._utils import url_to_origin
from sanic import Sanic
from sanic.response import text
from sanic.testing import SanicTestClient
-class DelayableHTTPConnection(httpx.dispatch.connection.HTTPConnection):
-def __init__(self, *args, **kwargs):
-self._request_delay = None
-if "request_delay" in kwargs:
-self._request_delay = kwargs.pop("request_delay")
-super().__init__(*args, **kwargs)
-async def send(self, request, verify=None, cert=None, timeout=None):
-if self.h11_connection is None and self.h2_connection is None:
-await self.connect(verify=verify, cert=cert, timeout=timeout)
+class DelayableHTTPConnection(httpcore._async.connection.AsyncHTTPConnection):
+async def arequest(self, *args, **kwargs):
+await asyncio.sleep(2)
+return await super().arequest(*args, **kwargs)
+async def _open_socket(self, *args, **kwargs):
+retval = await super()._open_socket(*args, **kwargs)
if self._request_delay:
await asyncio.sleep(self._request_delay)
+return retval
-if self.h2_connection is not None:
-response = await self.h2_connection.send(request, timeout=timeout)
-else:
-assert self.h11_connection is not None
-response = await self.h11_connection.send(request, timeout=timeout)
-return response
-class DelayableSanicConnectionPool(
-httpx.dispatch.connection_pool.ConnectionPool
-):
+class DelayableSanicConnectionPool(httpcore.AsyncConnectionPool):
def __init__(self, request_delay=None, *args, **kwargs):
self._request_delay = request_delay
super().__init__(*args, **kwargs)
-async def acquire_connection(self, origin, timeout=None):
-connection = self.pop_connection(origin)
-if connection is None:
-pool_timeout = None if timeout is None else timeout.pool_timeout
-await self.max_connections.acquire(timeout=pool_timeout)
-connection = DelayableHTTPConnection(
-origin,
-verify=self.verify,
-cert=self.cert,
-http2=self.http2,
-backend=self.backend,
-release_func=self.release_connection,
-trust_env=self.trust_env,
-uds=self.uds,
-request_delay=self._request_delay,
-)
-self.active_connections.add(connection)
-return connection
+async def _add_to_pool(self, connection, timeout):
+connection.__class__ = DelayableHTTPConnection
+connection._request_delay = self._request_delay
+await super()._add_to_pool(connection, timeout)
-class DelayableSanicSession(httpx.Client):
+class DelayableSanicSession(httpx.AsyncClient):
def __init__(self, request_delay=None, *args, **kwargs) -> None:
-dispatch = DelayableSanicConnectionPool(request_delay=request_delay)
-super().__init__(dispatch=dispatch, *args, **kwargs)
+transport = DelayableSanicConnectionPool(request_delay=request_delay)
+super().__init__(transport=transport, *args, **kwargs)
class DelayableSanicTestClient(SanicTestClient):
@@ -102,7 +86,7 @@ def test_default_server_error_request_timeout():
client = DelayableSanicTestClient(request_timeout_default_app, 2)
request, response = client.get("/1")
assert response.status == 408
-assert response.text == "Error: Request Timeout"
+assert "Request Timeout" in response.text
def test_default_server_error_request_dont_timeout():
@@ -125,4 +109,4 @@ def test_default_server_error_websocket_request_timeout():
request, response = client.get("/ws1", headers=headers) request, response = client.get("/ws1", headers=headers)
assert response.status == 408 assert response.status == 408
assert response.text == "Error: Request Timeout" assert "Request Timeout" in response.text


@@ -11,8 +11,15 @@ import pytest
from sanic import Blueprint, Sanic
from sanic.exceptions import ServerError
from sanic.request import DEFAULT_HTTP_CONTENT_TYPE, Request, RequestParameters
-from sanic.response import json, text
+from sanic.response import html, json, text
-from sanic.testing import ASGI_HOST, HOST, PORT
+from sanic.testing import (
ASGI_BASE_URL,
ASGI_HOST,
ASGI_PORT,
HOST,
PORT,
SanicTestClient,
)
# ------------------------------------------------------------ #
@@ -44,7 +51,7 @@ async def test_sync_asgi(app):
def test_ip(app):
@app.route("/")
def handler(request):
-return text("{}".format(request.ip))
+return text(f"{request.ip}")
request, response = app.test_client.get("/")
@@ -55,11 +62,14 @@ def test_ip(app):
async def test_ip_asgi(app):
@app.route("/")
def handler(request):
-return text("{}".format(request.url))
+return text(f"{request.url}")
request, response = await app.asgi_client.get("/")
-assert response.text == "http://mockserver/"
+if response.text.endswith("/") and not ASGI_BASE_URL.endswith("/"):
response.text[:-1] == ASGI_BASE_URL
else:
assert response.text == ASGI_BASE_URL
def test_text(app):
@@ -72,6 +82,41 @@ def test_text(app):
assert response.text == "Hello" assert response.text == "Hello"
def test_html(app):
class Foo:
def __html__(self):
return "<h1>Foo</h1>"
def _repr_html_(self):
return "<h1>Foo object repr</h1>"
class Bar:
def _repr_html_(self):
return "<h1>Bar object repr</h1>"
@app.route("/")
async def handler(request):
return html("<h1>Hello</h1>")
@app.route("/foo")
async def handler(request):
return html(Foo())
@app.route("/bar")
async def handler(request):
return html(Bar())
request, response = app.test_client.get("/")
assert response.content_type == "text/html; charset=utf-8"
assert response.text == "<h1>Hello</h1>"
request, response = app.test_client.get("/foo")
assert response.text == "<h1>Foo</h1>"
request, response = app.test_client.get("/bar")
assert response.text == "<h1>Bar object repr</h1>"
@pytest.mark.asyncio
async def test_text_asgi(app):
@app.route("/")
@@ -290,7 +335,7 @@ def test_token(app):
token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf" token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf"
headers = { headers = {
"content-type": "application/json", "content-type": "application/json",
"Authorization": "{}".format(token), "Authorization": f"{token}",
} }
request, response = app.test_client.get("/", headers=headers) request, response = app.test_client.get("/", headers=headers)
@@ -300,7 +345,7 @@ def test_token(app):
token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf" token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf"
headers = { headers = {
"content-type": "application/json", "content-type": "application/json",
"Authorization": "Token {}".format(token), "Authorization": f"Token {token}",
} }
request, response = app.test_client.get("/", headers=headers) request, response = app.test_client.get("/", headers=headers)
@@ -310,7 +355,7 @@ def test_token(app):
token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf" token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf"
headers = { headers = {
"content-type": "application/json", "content-type": "application/json",
"Authorization": "Bearer {}".format(token), "Authorization": f"Bearer {token}",
} }
request, response = app.test_client.get("/", headers=headers) request, response = app.test_client.get("/", headers=headers)
@@ -335,7 +380,7 @@ async def test_token_asgi(app):
token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf" token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf"
headers = { headers = {
"content-type": "application/json", "content-type": "application/json",
"Authorization": "{}".format(token), "Authorization": f"{token}",
} }
request, response = await app.asgi_client.get("/", headers=headers) request, response = await app.asgi_client.get("/", headers=headers)
@@ -345,7 +390,7 @@ async def test_token_asgi(app):
token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf" token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf"
headers = { headers = {
"content-type": "application/json", "content-type": "application/json",
"Authorization": "Token {}".format(token), "Authorization": f"Token {token}",
} }
request, response = await app.asgi_client.get("/", headers=headers) request, response = await app.asgi_client.get("/", headers=headers)
@@ -355,7 +400,7 @@ async def test_token_asgi(app):
token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf" token = "a1d895e0-553a-421a-8e22-5ff8ecb48cbf"
headers = { headers = {
"content-type": "application/json", "content-type": "application/json",
"Authorization": "Bearer {}".format(token), "Authorization": f"Bearer {token}",
} }
request, response = await app.asgi_client.get("/", headers=headers) request, response = await app.asgi_client.get("/", headers=headers)
@@ -419,11 +464,13 @@ def test_standard_forwarded(app):
"X-Real-IP": "127.0.0.2", "X-Real-IP": "127.0.0.2",
"X-Forwarded-For": "127.0.1.1", "X-Forwarded-For": "127.0.1.1",
"X-Scheme": "ws", "X-Scheme": "ws",
"Host": "local.site",
} }
request, response = app.test_client.get("/", headers=headers) request, response = app.test_client.get("/", headers=headers)
assert response.json == {"for": "127.0.0.2", "proto": "ws"} assert response.json == {"for": "127.0.0.2", "proto": "ws"}
assert request.remote_addr == "127.0.0.2" assert request.remote_addr == "127.0.0.2"
assert request.scheme == "ws" assert request.scheme == "ws"
assert request.server_name == "local.site"
assert request.server_port == 80 assert request.server_port == 80
app.config.FORWARDED_SECRET = "mySecret" app.config.FORWARDED_SECRET = "mySecret"
@@ -536,7 +583,7 @@ async def test_standard_forwarded_asgi(app):
assert response.json() == {"for": "127.0.0.2", "proto": "ws"}
assert request.remote_addr == "127.0.0.2"
assert request.scheme == "ws"
-assert request.server_port == 80
+assert request.server_port == ASGI_PORT
app.config.FORWARDED_SECRET = "mySecret"
request, response = await app.asgi_client.get("/", headers=headers)
@@ -993,8 +1040,8 @@ def test_url_attributes_no_ssl(app, path, query, expected_url):
app.add_route(handler, path)
-request, response = app.test_client.get(path + "?{}".format(query))
+request, response = app.test_client.get(path + f"?{query}")
-assert request.url == expected_url.format(HOST, PORT)
+assert request.url == expected_url.format(HOST, request.server_port)
parsed = urlparse(request.url)
@@ -1007,9 +1054,9 @@ def test_url_attributes_no_ssl(app, path, query, expected_url):
@pytest.mark.parametrize(
"path,query,expected_url",
[
-("/foo", "", "http://{}/foo"),
+("/foo", "", "{}/foo"),
-("/bar/baz", "", "http://{}/bar/baz"),
+("/bar/baz", "", "{}/bar/baz"),
-("/moo/boo", "arg1=val1", "http://{}/moo/boo?arg1=val1"),
+("/moo/boo", "arg1=val1", "{}/moo/boo?arg1=val1"),
],
)
@pytest.mark.asyncio
@@ -1019,8 +1066,8 @@ async def test_url_attributes_no_ssl_asgi(app, path, query, expected_url):
app.add_route(handler, path)
-request, response = await app.asgi_client.get(path + "?{}".format(query))
+request, response = await app.asgi_client.get(path + f"?{query}")
-assert request.url == expected_url.format(ASGI_HOST)
+assert request.url == expected_url.format(ASGI_BASE_URL)
parsed = urlparse(request.url)
@@ -1051,11 +1098,12 @@ def test_url_attributes_with_ssl_context(app, path, query, expected_url):
app.add_route(handler, path)
+port = app.test_client.port
request, response = app.test_client.get(
-"https://{}:{}".format(HOST, PORT) + path + "?{}".format(query),
+f"https://{HOST}:{PORT}" + path + f"?{query}",
server_kwargs={"ssl": context},
)
-assert request.url == expected_url.format(HOST, PORT)
+assert request.url == expected_url.format(HOST, request.server_port)
parsed = urlparse(request.url)
@@ -1087,10 +1135,10 @@ def test_url_attributes_with_ssl_dict(app, path, query, expected_url):
app.add_route(handler, path) app.add_route(handler, path)
request, response = app.test_client.get( request, response = app.test_client.get(
"https://{}:{}".format(HOST, PORT) + path + "?{}".format(query), f"https://{HOST}:{PORT}" + path + f"?{query}",
server_kwargs={"ssl": ssl_dict}, server_kwargs={"ssl": ssl_dict},
) )
assert request.url == expected_url.format(HOST, PORT) assert request.url == expected_url.format(HOST, request.server_port)
parsed = urlparse(request.url) parsed = urlparse(request.url)
@@ -1571,33 +1619,6 @@ async def test_request_args_no_query_string_await(app):
assert request.args == {} assert request.args == {}
def test_request_raw_args(app):
params = {"test": "OK"}
@app.get("/")
def handler(request):
return text("pass")
request, response = app.test_client.get("/", params=params)
assert request.raw_args == params
@pytest.mark.asyncio
async def test_request_raw_args_asgi(app):
params = {"test": "OK"}
@app.get("/")
def handler(request):
return text("pass")
request, response = await app.asgi_client.get("/", params=params)
assert request.raw_args == params
def test_request_query_args(app): def test_request_query_args(app):
# test multiple params with the same key # test multiple params with the same key
params = [("test", "value1"), ("test", "value2")] params = [("test", "value1"), ("test", "value2")]
@@ -1798,13 +1819,17 @@ def test_request_port(app):
port = request.port port = request.port
assert isinstance(port, int) assert isinstance(port, int)
delattr(request, "_socket")
delattr(request, "_port")
@pytest.mark.asyncio
async def test_request_port_asgi(app):
@app.get("/")
def handler(request):
return text("OK")
request, response = await app.asgi_client.get("/")
port = request.port port = request.port
assert isinstance(port, int) assert isinstance(port, int)
assert hasattr(request, "_socket")
assert hasattr(request, "_port")
def test_request_socket(app): def test_request_socket(app):
@@ -1823,12 +1848,6 @@ def test_request_socket(app):
assert ip == request.ip assert ip == request.ip
assert port == request.port assert port == request.port
delattr(request, "_socket")
socket = request.socket
assert isinstance(socket, tuple)
assert hasattr(request, "_socket")
def test_request_server_name(app): def test_request_server_name(app):
@app.get("/") @app.get("/")
@@ -1857,7 +1876,7 @@ def test_request_server_name_in_host_header(app):
request, response = app.test_client.get( request, response = app.test_client.get(
"/", headers={"Host": "mal_formed"} "/", headers={"Host": "mal_formed"}
) )
assert request.server_name == None # For now (later maybe 127.0.0.1) assert request.server_name == ""
def test_request_server_name_forwarded(app): def test_request_server_name_forwarded(app):
@@ -1882,8 +1901,9 @@ def test_request_server_port(app):
def handler(request): def handler(request):
return text("OK") return text("OK")
request, response = app.test_client.get("/", headers={"Host": "my-server"})
assert request.server_port == app.test_client.port
test_client = SanicTestClient(app)
request, response = test_client.get("/", headers={"Host": "my-server"})
assert request.server_port == 80
def test_request_server_port_in_host_header(app): def test_request_server_port_in_host_header(app):
@@ -1904,7 +1924,10 @@ def test_request_server_port_in_host_header(app):
request, response = app.test_client.get( request, response = app.test_client.get(
"/", headers={"Host": "mal_formed:5555"} "/", headers={"Host": "mal_formed:5555"}
) )
assert request.server_port == app.test_client.port
if PORT is None:
    assert request.server_port != 5555
else:
    assert request.server_port == app.test_client.port
def test_request_server_port_forwarded(app): def test_request_server_port_forwarded(app):
@@ -1939,13 +1962,10 @@ def test_server_name_and_url_for(app):
def handler(request): def handler(request):
return text("ok") return text("ok")
app.config.SERVER_NAME = "my-server" app.config.SERVER_NAME = "my-server" # This means default port
assert app.url_for("handler", _external=True) == "http://my-server/foo" assert app.url_for("handler", _external=True) == "http://my-server/foo"
request, response = app.test_client.get("/foo") request, response = app.test_client.get("/foo")
assert (
    request.url_for("handler")
    == f"http://my-server:{app.test_client.port}/foo"
)
assert request.url_for("handler") == f"http://my-server/foo"
app.config.SERVER_NAME = "https://my-server/path" app.config.SERVER_NAME = "https://my-server/path"
request, response = app.test_client.get("/foo") request, response = app.test_client.get("/foo")
@@ -2005,7 +2025,7 @@ async def test_request_form_invalid_content_type_asgi(app):
def test_endpoint_basic(): def test_endpoint_basic():
app = Sanic() app = Sanic(name=__name__)
@app.route("/") @app.route("/")
def my_unique_handler(request): def my_unique_handler(request):
@@ -2018,7 +2038,7 @@ def test_endpoint_basic():
@pytest.mark.asyncio @pytest.mark.asyncio
async def test_endpoint_basic_asgi(): async def test_endpoint_basic_asgi():
app = Sanic() app = Sanic(name=__name__)
@app.route("/") @app.route("/")
def my_unique_handler(request): def my_unique_handler(request):
@@ -2097,5 +2117,5 @@ def test_url_for_without_server_name(app):
request, response = app.test_client.get("/sample") request, response = app.test_client.get("/sample")
assert ( assert (
response.json["url"] response.json["url"]
== f"http://127.0.0.1:{app.test_client.port}/url-for" == f"http://127.0.0.1:{request.server_port}/url-for"
) )


@@ -1,6 +1,7 @@
import asyncio import asyncio
import inspect import inspect
import os import os
import warnings
from collections import namedtuple from collections import namedtuple
from mimetypes import guess_type from mimetypes import guess_type
@@ -15,13 +16,14 @@ from aiofiles import os as async_os
from sanic.response import ( from sanic.response import (
HTTPResponse, HTTPResponse,
StreamingHTTPResponse, StreamingHTTPResponse,
empty,
file, file,
file_stream, file_stream,
json, json,
raw, raw,
stream, stream,
text,
) )
from sanic.response import empty
from sanic.server import HttpProtocol from sanic.server import HttpProtocol
from sanic.testing import HOST, PORT from sanic.testing import HOST, PORT
@@ -29,13 +31,14 @@ from sanic.testing import HOST, PORT
JSON_DATA = {"ok": True} JSON_DATA = {"ok": True}
@pytest.mark.filterwarnings("ignore:Types other than str will be")
def test_response_body_not_a_string(app): def test_response_body_not_a_string(app):
"""Test when a response body sent from the application is not a string""" """Test when a response body sent from the application is not a string"""
random_num = choice(range(1000)) random_num = choice(range(1000))
@app.route("/hello") @app.route("/hello")
async def hello_route(request): async def hello_route(request):
return HTTPResponse(body=random_num) return text(random_num)
request, response = app.test_client.get("/hello") request, response = app.test_client.get("/hello")
assert response.text == str(random_num) assert response.text == str(random_num)
@@ -240,7 +243,7 @@ def test_non_chunked_streaming_adds_correct_headers(non_chunked_streaming_app):
def test_non_chunked_streaming_returns_correct_content( def test_non_chunked_streaming_returns_correct_content(
non_chunked_streaming_app non_chunked_streaming_app,
): ):
request, response = non_chunked_streaming_app.test_client.get("/") request, response = non_chunked_streaming_app.test_client.get("/")
assert response.text == "foo,bar" assert response.text == "foo,bar"
@@ -255,7 +258,7 @@ def test_stream_response_status_returns_correct_headers(status):
@pytest.mark.parametrize("keep_alive_timeout", [10, 20, 30]) @pytest.mark.parametrize("keep_alive_timeout", [10, 20, 30])
def test_stream_response_keep_alive_returns_correct_headers( def test_stream_response_keep_alive_returns_correct_headers(
keep_alive_timeout keep_alive_timeout,
): ):
response = StreamingHTTPResponse(sample_streaming_fn) response = StreamingHTTPResponse(sample_streaming_fn)
headers = response.get_headers( headers = response.get_headers(
@@ -284,7 +287,7 @@ def test_stream_response_does_not_include_chunked_header_if_disabled():
def test_stream_response_writes_correct_content_to_transport_when_chunked( def test_stream_response_writes_correct_content_to_transport_when_chunked(
streaming_app streaming_app,
): ):
response = StreamingHTTPResponse(sample_streaming_fn) response = StreamingHTTPResponse(sample_streaming_fn)
response.protocol = MagicMock(HttpProtocol) response.protocol = MagicMock(HttpProtocol)
@@ -406,7 +409,7 @@ def test_file_response(app, file_name, static_file_directory, status):
mime_type=guess_type(file_path)[0] or "text/plain", mime_type=guess_type(file_path)[0] or "text/plain",
) )
request, response = app.test_client.get("/files/{}".format(file_name)) request, response = app.test_client.get(f"/files/{file_name}")
assert response.status == status assert response.status == status
assert response.body == get_file_content(static_file_directory, file_name) assert response.body == get_file_content(static_file_directory, file_name)
assert "Content-Disposition" not in response.headers assert "Content-Disposition" not in response.headers
@@ -429,12 +432,13 @@ def test_file_response_custom_filename(
file_path = os.path.abspath(unquote(file_path)) file_path = os.path.abspath(unquote(file_path))
return file(file_path, filename=dest) return file(file_path, filename=dest)
request, response = app.test_client.get("/files/{}".format(source)) request, response = app.test_client.get(f"/files/{source}")
assert response.status == 200 assert response.status == 200
assert response.body == get_file_content(static_file_directory, source) assert response.body == get_file_content(static_file_directory, source)
assert response.headers[
    "Content-Disposition"
] == 'attachment; filename="{}"'.format(dest)
assert (
    response.headers["Content-Disposition"]
    == f'attachment; filename="{dest}"'
)
@pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"]) @pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"])
@@ -459,7 +463,7 @@ def test_file_head_response(app, file_name, static_file_directory):
mime_type=guess_type(file_path)[0] or "text/plain", mime_type=guess_type(file_path)[0] or "text/plain",
) )
request, response = app.test_client.head("/files/{}".format(file_name)) request, response = app.test_client.head(f"/files/{file_name}")
assert response.status == 200 assert response.status == 200
assert "Accept-Ranges" in response.headers assert "Accept-Ranges" in response.headers
assert "Content-Length" in response.headers assert "Content-Length" in response.headers
@@ -482,7 +486,7 @@ def test_file_stream_response(app, file_name, static_file_directory):
mime_type=guess_type(file_path)[0] or "text/plain", mime_type=guess_type(file_path)[0] or "text/plain",
) )
request, response = app.test_client.get("/files/{}".format(file_name)) request, response = app.test_client.get(f"/files/{file_name}")
assert response.status == 200 assert response.status == 200
assert response.body == get_file_content(static_file_directory, file_name) assert response.body == get_file_content(static_file_directory, file_name)
assert "Content-Disposition" not in response.headers assert "Content-Disposition" not in response.headers
@@ -505,12 +509,13 @@ def test_file_stream_response_custom_filename(
file_path = os.path.abspath(unquote(file_path)) file_path = os.path.abspath(unquote(file_path))
return file_stream(file_path, chunk_size=32, filename=dest) return file_stream(file_path, chunk_size=32, filename=dest)
request, response = app.test_client.get("/files/{}".format(source)) request, response = app.test_client.get(f"/files/{source}")
assert response.status == 200 assert response.status == 200
assert response.body == get_file_content(static_file_directory, source) assert response.body == get_file_content(static_file_directory, source)
assert response.headers[
    "Content-Disposition"
] == 'attachment; filename="{}"'.format(dest)
assert (
    response.headers["Content-Disposition"]
    == f'attachment; filename="{dest}"'
)
@pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"]) @pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"])
@@ -538,7 +543,7 @@ def test_file_stream_head_response(app, file_name, static_file_directory):
mime_type=guess_type(file_path)[0] or "text/plain", mime_type=guess_type(file_path)[0] or "text/plain",
) )
request, response = app.test_client.head("/files/{}".format(file_name)) request, response = app.test_client.head(f"/files/{file_name}")
assert response.status == 200 assert response.status == 200
# A HEAD request should never be streamed/chunked. # A HEAD request should never be streamed/chunked.
if "Transfer-Encoding" in response.headers: if "Transfer-Encoding" in response.headers:
@@ -576,11 +581,12 @@ def test_file_stream_response_range(
_range=range, _range=range,
) )
request, response = app.test_client.get("/files/{}".format(file_name)) request, response = app.test_client.get(f"/files/{file_name}")
assert response.status == 206 assert response.status == 206
assert "Content-Range" in response.headers assert "Content-Range" in response.headers
assert response.headers["Content-Range"] == "bytes {}-{}/{}".format(
    range.start, range.end, range.total
)
assert (
    response.headers["Content-Range"]
    == f"bytes {range.start}-{range.end}/{range.total}"
)
@@ -602,3 +608,17 @@ def test_empty_response(app):
request, response = app.test_client.get("/test") request, response = app.test_client.get("/test")
assert response.content_type is None assert response.content_type is None
assert response.body == b"" assert response.body == b""
def test_response_body_bytes_deprecated(app):
with warnings.catch_warnings(record=True) as w:
warnings.simplefilter("always")
HTTPResponse(body_bytes=b"bytes")
assert len(w) == 1
assert issubclass(w[0].category, DeprecationWarning)
assert (
"Parameter `body_bytes` is deprecated, use `body` instead"
in str(w[0].message)
)
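The new test above exercises the deprecation of the body_bytes parameter in favour of body. As a minimal sketch of the migration, assuming nothing beyond what the warning message states, the same response can be built either way, with the old keyword now emitting a DeprecationWarning:

from sanic.response import HTTPResponse

# Deprecated keyword -- still accepted, but warns (as the test asserts)
legacy = HTTPResponse(body_bytes=b"payload")

# Preferred keyword going forward
current = HTTPResponse(body=b"payload")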


@@ -40,7 +40,7 @@ async def handler_2(request):
def test_default_server_error_response_timeout(): def test_default_server_error_response_timeout():
request, response = response_timeout_default_app.test_client.get("/1") request, response = response_timeout_default_app.test_client.get("/1")
assert response.status == 503 assert response.status == 503
assert response.text == "Error: Response Timeout" assert "Response Timeout" in response.text
response_handler_cancelled_app.flag = False response_handler_cancelled_app.flag = False
@@ -65,5 +65,5 @@ async def handler_3(request):
def test_response_handler_cancelled(): def test_response_handler_cancelled():
request, response = response_handler_cancelled_app.test_client.get("/1") request, response = response_handler_cancelled_app.test_client.get("/1")
assert response.status == 503 assert response.status == 503
assert response.text == "Error: Response Timeout" assert "Response Timeout" in response.text
assert response_handler_cancelled_app.flag is False assert response_handler_cancelled_app.flag is False


@@ -6,6 +6,7 @@ from sanic import Sanic
from sanic.constants import HTTP_METHODS from sanic.constants import HTTP_METHODS
from sanic.response import json, text from sanic.response import json, text
from sanic.router import ParameterNameConflicts, RouteDoesNotExist, RouteExists from sanic.router import ParameterNameConflicts, RouteDoesNotExist, RouteExists
from sanic.testing import SanicTestClient
# ------------------------------------------------------------ # # ------------------------------------------------------------ #
@@ -20,17 +21,17 @@ def test_versioned_routes_get(app, method):
func = getattr(app, method) func = getattr(app, method)
if callable(func): if callable(func):
@func("/{}".format(method), version=1) @func(f"/{method}", version=1)
def handler(request): def handler(request):
return text("OK") return text("OK")
else: else:
print(func) print(func)
raise Exception("Method: {} is not callable".format(method)) raise Exception(f"Method: {method} is not callable")
client_method = getattr(app.test_client, method) client_method = getattr(app.test_client, method)
request, response = client_method("/v1/{}".format(method)) request, response = client_method(f"/v1/{method}")
assert response.status == 200 assert response.status == 200
@@ -167,35 +168,36 @@ def test_route_optional_slash(app):
def test_route_strict_slashes_set_to_false_and_host_is_a_list(app): def test_route_strict_slashes_set_to_false_and_host_is_a_list(app):
# Part of regression test for issue #1120 # Part of regression test for issue #1120
site1 = "127.0.0.1:{}".format(app.test_client.port)
test_client = SanicTestClient(app, port=42101)
site1 = f"127.0.0.1:{test_client.port}"
# before fix, this raises a RouteExists error # before fix, this raises a RouteExists error
@app.get("/get", host=[site1, "site2.com"], strict_slashes=False) @app.get("/get", host=[site1, "site2.com"], strict_slashes=False)
def get_handler(request): def get_handler(request):
return text("OK") return text("OK")
request, response = app.test_client.get("http://" + site1 + "/get") request, response = test_client.get("http://" + site1 + "/get")
assert response.text == "OK" assert response.text == "OK"
@app.post("/post", host=[site1, "site2.com"], strict_slashes=False) @app.post("/post", host=[site1, "site2.com"], strict_slashes=False)
def post_handler(request): def post_handler(request):
return text("OK") return text("OK")
request, response = app.test_client.post("http://" + site1 + "/post") request, response = test_client.post("http://" + site1 + "/post")
assert response.text == "OK" assert response.text == "OK"
@app.put("/put", host=[site1, "site2.com"], strict_slashes=False) @app.put("/put", host=[site1, "site2.com"], strict_slashes=False)
def put_handler(request): def put_handler(request):
return text("OK") return text("OK")
request, response = app.test_client.put("http://" + site1 + "/put") request, response = test_client.put("http://" + site1 + "/put")
assert response.text == "OK" assert response.text == "OK"
@app.delete("/delete", host=[site1, "site2.com"], strict_slashes=False) @app.delete("/delete", host=[site1, "site2.com"], strict_slashes=False)
def delete_handler(request): def delete_handler(request):
return text("OK") return text("OK")
request, response = app.test_client.delete("http://" + site1 + "/delete") request, response = test_client.delete("http://" + site1 + "/delete")
assert response.text == "OK" assert response.text == "OK"
@@ -412,7 +414,8 @@ def test_dynamic_route_uuid(app):
assert response.text == "OK" assert response.text == "OK"
assert type(results[0]) is uuid.UUID assert type(results[0]) is uuid.UUID
request, response = app.test_client.get("/quirky/{}".format(uuid.uuid4()))
generated_uuid = uuid.uuid4()
request, response = app.test_client.get(f"/quirky/{generated_uuid}")
assert response.status == 200 assert response.status == 200
request, response = app.test_client.get("/quirky/non-existing") request, response = app.test_client.get("/quirky/non-existing")
@@ -528,6 +531,19 @@ def test_add_webscoket_route(app, strict_slashes):
assert ev.is_set() assert ev.is_set()
def test_add_webscoket_route_with_version(app):
ev = asyncio.Event()
async def handler(request, ws):
assert ws.subprotocol is None
ev.set()
app.add_websocket_route(handler, "/ws", version=1)
request, response = app.test_client.websocket("/v1/ws")
assert response.opened is True
assert ev.is_set()
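The added test above registers a versioned websocket route, which is mounted under a /v1 prefix just like the versioned HTTP routes earlier in this file. A small illustrative sketch (the app and route names here are invented for the example):

from sanic import Sanic
from sanic.response import text

app = Sanic(name="versioned_example")

@app.get("/status", version=1)  # served at /v1/status
async def status(request):
    return text("OK")

async def feed(request, ws):
    await ws.send("hello")

app.add_websocket_route(feed, "/feed", version=1)  # served at /v1/feed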
def test_route_duplicate(app): def test_route_duplicate(app):
with pytest.raises(RouteExists): with pytest.raises(RouteExists):
@@ -551,6 +567,35 @@ def test_route_duplicate(app):
pass pass
def test_double_stack_route(app):
@app.route("/test/1")
@app.route("/test/2")
async def handler1(request):
return text("OK")
request, response = app.test_client.get("/test/1")
assert response.status == 200
request, response = app.test_client.get("/test/2")
assert response.status == 200
@pytest.mark.asyncio
async def test_websocket_route_asgi(app):
ev = asyncio.Event()
@app.websocket("/test/1")
@app.websocket("/test/2")
async def handler(request, ws):
ev.set()
request, response = await app.asgi_client.websocket("/test/1")
first_set = ev.is_set()
ev.clear()
request, response = await app.asgi_client.websocket("/test/1")
second_set = ev.is_set()
assert first_set and second_set
def test_method_not_allowed(app): def test_method_not_allowed(app):
@app.route("/test", methods=["GET"]) @app.route("/test", methods=["GET"])
async def handler(request): async def handler(request):
@@ -738,55 +783,6 @@ def test_add_route_method_not_allowed(app):
assert response.status == 405 assert response.status == 405
def test_remove_static_route(app):
async def handler1(request):
return text("OK1")
async def handler2(request):
return text("OK2")
app.add_route(handler1, "/test")
app.add_route(handler2, "/test2")
request, response = app.test_client.get("/test")
assert response.status == 200
request, response = app.test_client.get("/test2")
assert response.status == 200
app.remove_route("/test")
app.remove_route("/test2")
request, response = app.test_client.get("/test")
assert response.status == 404
request, response = app.test_client.get("/test2")
assert response.status == 404
def test_remove_dynamic_route(app):
async def handler(request, name):
return text("OK")
app.add_route(handler, "/folder/<name>")
request, response = app.test_client.get("/folder/test123")
assert response.status == 200
app.remove_route("/folder/<name>")
request, response = app.test_client.get("/folder/test123")
assert response.status == 404
def test_remove_inexistent_route(app):
uri = "/test"
with pytest.raises(RouteDoesNotExist) as excinfo:
app.remove_route(uri)
assert str(excinfo.value) == "Route was not registered: {}".format(uri)
def test_removing_slash(app): def test_removing_slash(app):
@app.get("/rest/<resource>") @app.get("/rest/<resource>")
def get(_): def get(_):
@@ -799,59 +795,6 @@ def test_removing_slash(app):
assert len(app.router.routes_all.keys()) == 2 assert len(app.router.routes_all.keys()) == 2
def test_remove_unhashable_route(app):
async def handler(request, unhashable):
return text("OK")
app.add_route(handler, "/folder/<unhashable:[A-Za-z0-9/]+>/end/")
request, response = app.test_client.get("/folder/test/asdf/end/")
assert response.status == 200
request, response = app.test_client.get("/folder/test///////end/")
assert response.status == 200
request, response = app.test_client.get("/folder/test/end/")
assert response.status == 200
app.remove_route("/folder/<unhashable:[A-Za-z0-9/]+>/end/")
request, response = app.test_client.get("/folder/test/asdf/end/")
assert response.status == 404
request, response = app.test_client.get("/folder/test///////end/")
assert response.status == 404
request, response = app.test_client.get("/folder/test/end/")
assert response.status == 404
def test_remove_route_without_clean_cache(app):
async def handler(request):
return text("OK")
app.add_route(handler, "/test")
request, response = app.test_client.get("/test")
assert response.status == 200
app.remove_route("/test", clean_cache=True)
app.remove_route("/test/", clean_cache=True)
request, response = app.test_client.get("/test")
assert response.status == 404
app.add_route(handler, "/test")
request, response = app.test_client.get("/test")
assert response.status == 200
app.remove_route("/test", clean_cache=False)
request, response = app.test_client.get("/test")
assert response.status == 200
def test_overload_routes(app): def test_overload_routes(app):
@app.route("/overload", methods=["GET"]) @app.route("/overload", methods=["GET"])
async def handler1(request): async def handler1(request):


@@ -1,6 +1,9 @@
import asyncio import asyncio
import signal import signal
from contextlib import closing
from socket import socket
import pytest import pytest
from sanic.testing import HOST, PORT from sanic.testing import HOST, PORT
@@ -22,7 +25,7 @@ skipif_no_alarm = pytest.mark.skipif(
def create_listener(listener_name, in_list): def create_listener(listener_name, in_list):
async def _listener(app, loop): async def _listener(app, loop):
print("DEBUG MESSAGE FOR PYTEST for {}".format(listener_name)) print(f"DEBUG MESSAGE FOR PYTEST for {listener_name}")
in_list.insert(0, app.name + listener_name) in_list.insert(0, app.name + listener_name)
return _listener return _listener
@@ -118,25 +121,30 @@ def test_create_server_trigger_events(app):
app.listener("after_server_stop")(after_stop) app.listener("after_server_stop")(after_stop)
loop = asyncio.get_event_loop() loop = asyncio.get_event_loop()
serv_coro = app.create_server(return_asyncio_server=True)
serv_task = asyncio.ensure_future(serv_coro, loop=loop)
server = loop.run_until_complete(serv_task)
server.after_start()
try:
    loop.run_forever()
except KeyboardInterrupt as e:
    loop.stop()
finally:
    # Run the on_stop function if provided
    server.before_stop()
    # Wait for server to close
    close_task = server.close()
    loop.run_until_complete(close_task)
    # Complete all tasks on the loop
    signal.stopped = True
    for connection in server.connections:
        connection.close_if_idle()
    server.after_stop()
assert flag1 and flag2 and flag3
# Use random port for tests
with closing(socket()) as sock:
    sock.bind(("127.0.0.1", 0))
    serv_coro = app.create_server(return_asyncio_server=True, sock=sock)
    serv_task = asyncio.ensure_future(serv_coro, loop=loop)
    server = loop.run_until_complete(serv_task)
    server.after_start()
    try:
        loop.run_forever()
    except KeyboardInterrupt as e:
        loop.stop()
    finally:
        # Run the on_stop function if provided
        server.before_stop()
        # Wait for server to close
        close_task = server.close()
        loop.run_until_complete(close_task)
        # Complete all tasks on the loop
        signal.stopped = True
        for connection in server.connections:
            connection.close_if_idle()
        server.after_stop()
assert flag1 and flag2 and flag3


@@ -1,8 +1,13 @@
import asyncio import asyncio
import os
import signal
from queue import Queue from queue import Queue
from unittest.mock import MagicMock from unittest.mock import MagicMock
import pytest
from sanic.compat import ctrlc_workaround_for_windows
from sanic.response import HTTPResponse from sanic.response import HTTPResponse
from sanic.testing import HOST, PORT from sanic.testing import HOST, PORT
@@ -16,13 +21,19 @@ calledq = Queue()
def set_loop(app, loop): def set_loop(app, loop):
loop.add_signal_handler = MagicMock()
global mock
mock = MagicMock()
if os.name == "nt":
    signal.signal = mock
else:
    loop.add_signal_handler = mock
def after(app, loop): def after(app, loop):
calledq.put(loop.add_signal_handler.called) calledq.put(mock.called)
@pytest.mark.skipif(os.name == "nt", reason="May hang CI on py38/windows")
def test_register_system_signals(app): def test_register_system_signals(app):
"""Test if sanic register system signals""" """Test if sanic register system signals"""
@@ -38,6 +49,7 @@ def test_register_system_signals(app):
assert calledq.get() is True assert calledq.get() is True
@pytest.mark.skipif(os.name == "nt", reason="May hang CI on py38/windows")
def test_dont_register_system_signals(app): def test_dont_register_system_signals(app):
"""Test if sanic don't register system signals""" """Test if sanic don't register system signals"""
@@ -51,3 +63,47 @@ def test_dont_register_system_signals(app):
app.run(HOST, PORT, register_sys_signals=False) app.run(HOST, PORT, register_sys_signals=False)
assert calledq.get() is False assert calledq.get() is False
@pytest.mark.skipif(os.name == "nt", reason="windows cannot SIGINT processes")
def test_windows_workaround():
"""Test Windows workaround (on any other OS)"""
# At least some code coverage, even though this test doesn't work on
# Windows...
class MockApp:
def __init__(self):
self.is_stopping = False
def stop(self):
assert not self.is_stopping
self.is_stopping = True
def add_task(self, func):
loop = asyncio.get_event_loop()
self.stay_active_task = loop.create_task(func(self))
async def atest(stop_first):
app = MockApp()
ctrlc_workaround_for_windows(app)
await asyncio.sleep(0.05)
if stop_first:
app.stop()
await asyncio.sleep(0.2)
assert app.is_stopping == stop_first
# First Ctrl+C: should call app.stop() within 0.1 seconds
os.kill(os.getpid(), signal.SIGINT)
await asyncio.sleep(0.2)
assert app.is_stopping
assert app.stay_active_task.result() == None
# Second Ctrl+C should raise
with pytest.raises(KeyboardInterrupt):
os.kill(os.getpid(), signal.SIGINT)
return "OK"
# Run in our private loop
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
res = loop.run_until_complete(atest(False))
assert res == "OK"
res = loop.run_until_complete(atest(True))
assert res == "OK"


@@ -97,9 +97,7 @@ def test_static_file_content_type(app, static_file_directory, file_name):
def test_static_directory(app, file_name, base_uri, static_file_directory): def test_static_directory(app, file_name, base_uri, static_file_directory):
app.static(base_uri, static_file_directory) app.static(base_uri, static_file_directory)
request, response = app.test_client.get(
    uri="{}/{}".format(base_uri, file_name)
)
request, response = app.test_client.get(uri=f"{base_uri}/{file_name}")
assert response.status == 200 assert response.status == 200
assert response.body == get_file_content(static_file_directory, file_name) assert response.body == get_file_content(static_file_directory, file_name)
@@ -234,11 +232,11 @@ def test_static_content_range_invalid_unit(
) )
unit = "bit" unit = "bit"
headers = {"Range": "{}=1-0".format(unit)} headers = {"Range": f"{unit}=1-0"}
request, response = app.test_client.get("/testing.file", headers=headers) request, response = app.test_client.get("/testing.file", headers=headers)
assert response.status == 416 assert response.status == 416
assert response.text == "Error: {} is not a valid Range Type".format(unit) assert f"{unit} is not a valid Range Type" in response.text
@pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"]) @pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"])
@@ -252,13 +250,11 @@ def test_static_content_range_invalid_start(
) )
start = "start" start = "start"
headers = {"Range": "bytes={}-0".format(start)} headers = {"Range": f"bytes={start}-0"}
request, response = app.test_client.get("/testing.file", headers=headers) request, response = app.test_client.get("/testing.file", headers=headers)
assert response.status == 416 assert response.status == 416
assert response.text == "Error: '{}' is invalid for Content Range".format(
    start
)
assert f"'{start}' is invalid for Content Range" in response.text
@pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"]) @pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"])
@@ -272,13 +268,11 @@ def test_static_content_range_invalid_end(
) )
end = "end" end = "end"
headers = {"Range": "bytes=1-{}".format(end)} headers = {"Range": f"bytes=1-{end}"}
request, response = app.test_client.get("/testing.file", headers=headers) request, response = app.test_client.get("/testing.file", headers=headers)
assert response.status == 416 assert response.status == 416
assert response.text == "Error: '{}' is invalid for Content Range".format(
    end
)
assert f"'{end}' is invalid for Content Range" in response.text
@pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"]) @pytest.mark.parametrize("file_name", ["test.file", "decode me.txt"])
@@ -295,7 +289,7 @@ def test_static_content_range_invalid_parameters(
request, response = app.test_client.get("/testing.file", headers=headers) request, response = app.test_client.get("/testing.file", headers=headers)
assert response.status == 416 assert response.status == 416
assert response.text == "Error: Invalid for Content Range parameters" assert "Invalid for Content Range parameters" in response.text
@pytest.mark.parametrize( @pytest.mark.parametrize(
@@ -369,7 +363,7 @@ def test_file_not_found(app, static_file_directory):
request, response = app.test_client.get("/static/not_found") request, response = app.test_client.get("/static/not_found")
assert response.status == 404 assert response.status == 404
assert response.text == "Error: File not found" assert "File not found" in response.text
@pytest.mark.parametrize("static_name", ["_static_name", "static"]) @pytest.mark.parametrize("static_name", ["_static_name", "static"])
@@ -377,20 +371,6 @@ def test_file_not_found(app, static_file_directory):
def test_static_name(app, static_file_directory, static_name, file_name): def test_static_name(app, static_file_directory, static_name, file_name):
app.static("/static", static_file_directory, name=static_name) app.static("/static", static_file_directory, name=static_name)
request, response = app.test_client.get("/static/{}".format(file_name)) request, response = app.test_client.get(f"/static/{file_name}")
assert response.status == 200 assert response.status == 200
@pytest.mark.parametrize("file_name", ["test.file"])
def test_static_remove_route(app, static_file_directory, file_name):
app.static(
"/testing.file", get_file_path(static_file_directory, file_name)
)
request, response = app.test_client.get("/testing.file")
assert response.status == 200
app.remove_route("/testing.file")
request, response = app.test_client.get("/testing.file")
assert response.status == 404


@@ -1,5 +1,3 @@
import socket
from sanic.response import json, text from sanic.response import json, text
from sanic.testing import PORT, SanicTestClient from sanic.testing import PORT, SanicTestClient
@@ -29,7 +27,8 @@ def test_test_client_port_default(app):
return json(request.transport.get_extra_info("sockname")[1]) return json(request.transport.get_extra_info("sockname")[1])
test_client = SanicTestClient(app) test_client = SanicTestClient(app)
assert test_client.port == PORT assert test_client.port == PORT # Can be None before request
request, response = test_client.get("/get") request, response = test_client.get("/get")
assert response.json == PORT
assert test_client.port > 0
assert response.json == test_client.port
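The change above reflects that PORT may now be None, in which case SanicTestClient binds an OS-assigned port that is only known after the first request. A brief sketch of that flow, mirroring the test (the handler and route are illustrative):

from sanic import Sanic
from sanic.response import json
from sanic.testing import SanicTestClient

app = Sanic(name="port_example")

@app.get("/get")
def handler(request):
    # Echo back the port the server actually bound
    return json(request.transport.get_extra_info("sockname")[1])

test_client = SanicTestClient(app)           # port may still be None here
request, response = test_client.get("/get")
assert test_client.port > 0                  # resolved once a request is made
assert response.json == test_client.port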

tests/test_unix_socket.py (new file)

@@ -0,0 +1,238 @@
import asyncio
import logging
import os
import subprocess
import sys
import httpcore
import httpx
import pytest
from sanic import Sanic
from sanic.response import text
pytestmark = pytest.mark.skipif(os.name != "posix", reason="UNIX only")
SOCKPATH = "/tmp/sanictest.sock"
SOCKPATH2 = "/tmp/sanictest2.sock"
@pytest.fixture(autouse=True)
def socket_cleanup():
try:
os.unlink(SOCKPATH)
except FileNotFoundError:
pass
try:
os.unlink(SOCKPATH2)
except FileNotFoundError:
pass
# Run test function
yield
try:
os.unlink(SOCKPATH2)
except FileNotFoundError:
pass
try:
os.unlink(SOCKPATH)
except FileNotFoundError:
pass
def test_unix_socket_creation(caplog):
from socket import AF_UNIX, socket
with socket(AF_UNIX) as sock:
sock.bind(SOCKPATH)
assert os.path.exists(SOCKPATH)
ino = os.stat(SOCKPATH).st_ino
app = Sanic(name=__name__)
@app.listener("after_server_start")
def running(app, loop):
assert os.path.exists(SOCKPATH)
assert ino != os.stat(SOCKPATH).st_ino
app.stop()
with caplog.at_level(logging.INFO):
app.run(unix=SOCKPATH)
assert (
"sanic.root",
logging.INFO,
f"Goin' Fast @ {SOCKPATH} http://...",
) in caplog.record_tuples
assert not os.path.exists(SOCKPATH)
def test_invalid_paths():
app = Sanic(name=__name__)
with pytest.raises(FileExistsError):
app.run(unix=".")
with pytest.raises(FileNotFoundError):
app.run(unix="no-such-directory/sanictest.sock")
def test_dont_replace_file():
with open(SOCKPATH, "w") as f:
f.write("File, not socket")
app = Sanic(name=__name__)
@app.listener("after_server_start")
def stop(app, loop):
app.stop()
with pytest.raises(FileExistsError):
app.run(unix=SOCKPATH)
def test_dont_follow_symlink():
from socket import AF_UNIX, socket
with socket(AF_UNIX) as sock:
sock.bind(SOCKPATH2)
os.symlink(SOCKPATH2, SOCKPATH)
app = Sanic(name=__name__)
@app.listener("after_server_start")
def stop(app, loop):
app.stop()
with pytest.raises(FileExistsError):
app.run(unix=SOCKPATH)
def test_socket_deleted_while_running():
app = Sanic(name=__name__)
@app.listener("after_server_start")
async def hack(app, loop):
os.unlink(SOCKPATH)
app.stop()
app.run(host="myhost.invalid", unix=SOCKPATH)
def test_socket_replaced_with_file():
app = Sanic(name=__name__)
@app.listener("after_server_start")
async def hack(app, loop):
os.unlink(SOCKPATH)
with open(SOCKPATH, "w") as f:
f.write("Not a socket")
app.stop()
app.run(host="myhost.invalid", unix=SOCKPATH)
def test_unix_connection():
app = Sanic(name=__name__)
@app.get("/")
def handler(request):
return text(f"{request.conn_info.server}")
@app.listener("after_server_start")
async def client(app, loop):
transport = httpcore.AsyncConnectionPool(uds=SOCKPATH)
try:
async with httpx.AsyncClient(transport=transport) as client:
r = await client.get("http://myhost.invalid/")
assert r.status_code == 200
assert r.text == os.path.abspath(SOCKPATH)
finally:
app.stop()
app.run(host="myhost.invalid", unix=SOCKPATH)
app_multi = Sanic(name=__name__)
def handler(request):
return text(f"{request.conn_info.server}")
async def client(app, loop):
try:
async with httpx.AsyncClient(uds=SOCKPATH) as client:
r = await client.get("http://myhost.invalid/")
assert r.status_code == 200
assert r.text == os.path.abspath(SOCKPATH)
finally:
app.stop()
def test_unix_connection_multiple_workers():
app_multi.get("/")(handler)
app_multi.listener("after_server_start")(client)
app_multi.run(host="myhost.invalid", unix=SOCKPATH, workers=2)
async def test_zero_downtime():
"""Graceful server termination and socket replacement on restarts"""
from signal import SIGINT
from time import monotonic as current_time
async def client():
transport = httpcore.AsyncConnectionPool(uds=SOCKPATH)
for _ in range(40):
async with httpx.AsyncClient(transport=transport) as client:
r = await client.get("http://localhost/sleep/0.1")
assert r.status_code == 200
assert r.text == f"Slept 0.1 seconds.\n"
def spawn():
command = [
sys.executable,
"-m",
"sanic",
"--unix",
SOCKPATH,
"examples.delayed_response.app",
]
DN = subprocess.DEVNULL
return subprocess.Popen(
command, stdin=DN, stdout=DN, stderr=subprocess.PIPE
)
try:
processes = [spawn()]
while not os.path.exists(SOCKPATH):
if processes[0].poll() is not None:
raise Exception("Worker did not start properly")
await asyncio.sleep(0.0001)
ino = os.stat(SOCKPATH).st_ino
task = asyncio.get_event_loop().create_task(client())
start_time = current_time()
while current_time() < start_time + 4:
# Start a new one and wait until the socket is replaced
processes.append(spawn())
while ino == os.stat(SOCKPATH).st_ino:
await asyncio.sleep(0.001)
ino = os.stat(SOCKPATH).st_ino
# Graceful termination of the previous one
processes[-2].send_signal(SIGINT)
# Wait until client has completed all requests
await task
processes[-1].send_signal(SIGINT)
for worker in processes:
try:
worker.wait(1.0)
except subprocess.TimeoutExpired:
raise Exception(
f"Worker would not terminate:\n{worker.stderr}"
)
finally:
for worker in processes:
worker.kill()
# Test for clean run and termination
assert len(processes) > 5
assert [worker.poll() for worker in processes] == len(processes) * [0]
assert not os.path.exists(SOCKPATH)
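tests/test_unix_socket.py introduces the unix= argument to app.run() and drives it with UDS-aware HTTP clients. A condensed sketch of the same pattern (the socket path and app name are illustrative, and the httpcore/httpx transport calls copy the ones used in these tests; newer httpx releases expose a different transport API):

import httpcore
import httpx
from sanic import Sanic
from sanic.response import text

SOCKPATH = "/tmp/demo.sock"  # illustrative path
app = Sanic(name="uds_example")

@app.get("/")
def handler(request):
    return text("hello over a unix socket")

@app.listener("after_server_start")
async def probe(app, loop):
    # Connect over the unix domain socket rather than TCP
    transport = httpcore.AsyncConnectionPool(uds=SOCKPATH)
    try:
        async with httpx.AsyncClient(transport=transport) as client:
            r = await client.get("http://myhost.invalid/")
            assert r.status_code == 200
    finally:
        app.stop()

app.run(host="myhost.invalid", unix=SOCKPATH)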


@@ -0,0 +1,36 @@
from pathlib import Path
import pytest
_test_setting_as_dict = {"TEST_SETTING_VALUE": 1}
_test_setting_as_class = type("C", (), {"TEST_SETTING_VALUE": 1})
_test_setting_as_module = str(
Path(__file__).parent / "static/app_test_config.py"
)
@pytest.mark.parametrize(
"conf_object",
[
_test_setting_as_dict,
_test_setting_as_class,
pytest.param(
_test_setting_as_module,
marks=pytest.mark.dependency(
depends=["test_load_module_from_file_location"],
scope="session",
),
),
],
ids=["from_dict", "from_class", "from_file"],
)
def test_update(app, conf_object):
app.update_config(conf_object)
assert app.config["TEST_SETTING_VALUE"] == 1
def test_update_from_lowercase_key(app):
d = {"test_setting_value": 1}
app.update_config(d)
assert "test_setting_value" not in app.config
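The new config tests above cover app.update_config(), which accepts a plain dict, a class with uppercase attributes, or a path to a Python file, and skips lowercase keys. A short sketch of those call shapes (names are illustrative):

from sanic import Sanic

app = Sanic(name="config_example")

# From a dict
app.update_config({"TEST_SETTING_VALUE": 1})

# From a class with uppercase attributes
class MySettings:
    TEST_SETTING_VALUE = 2

app.update_config(MySettings)
assert app.config["TEST_SETTING_VALUE"] == 2

# Lowercase keys are ignored, as the last test asserts
app.update_config({"test_setting_value": 3})
assert "test_setting_value" not in app.config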


@@ -20,30 +20,24 @@ URL_FOR_ARGS3 = dict(
arg1="v1", arg1="v1",
_anchor="anchor", _anchor="anchor",
_scheme="http", _scheme="http",
_server="{}:{}".format(test_host, test_port), _server=f"{test_host}:{test_port}",
_external=True, _external=True,
) )
URL_FOR_VALUE3 = "http://{}:{}/myurl?arg1=v1#anchor".format(
    test_host, test_port
)
URL_FOR_VALUE3 = f"http://{test_host}:{test_port}/myurl?arg1=v1#anchor"
URL_FOR_ARGS4 = dict( URL_FOR_ARGS4 = dict(
arg1="v1", arg1="v1",
_anchor="anchor", _anchor="anchor",
_external=True, _external=True,
_server="http://{}:{}".format(test_host, test_port), _server=f"http://{test_host}:{test_port}",
)
URL_FOR_VALUE4 = "http://{}:{}/myurl?arg1=v1#anchor".format(
    test_host, test_port
)
URL_FOR_VALUE4 = f"http://{test_host}:{test_port}/myurl?arg1=v1#anchor"
def _generate_handlers_from_names(app, l): def _generate_handlers_from_names(app, l):
for name in l: for name in l:
# this is the easiest way to generate functions with dynamic names # this is the easiest way to generate functions with dynamic names
exec( exec(
'@app.route(name)\ndef {}(request):\n\treturn text("{}")'.format(
    name, name
)
f'@app.route(name)\ndef {name}(request):\n\treturn text("{name}")'
) )
@@ -60,7 +54,7 @@ def test_simple_url_for_getting(simple_app):
for letter in string.ascii_letters: for letter in string.ascii_letters:
url = simple_app.url_for(letter) url = simple_app.url_for(letter)
assert url == "/{}".format(letter) assert url == f"/{letter}"
request, response = simple_app.test_client.get(url) request, response = simple_app.test_client.get(url)
assert response.status == 200 assert response.status == 200
assert response.text == letter assert response.text == letter
@@ -88,7 +82,7 @@ def test_simple_url_for_getting_with_more_params(app, args, url):
def test_url_for_with_server_name(app): def test_url_for_with_server_name(app):
server_name = "{}:{}".format(test_host, test_port) server_name = f"{test_host}:{test_port}"
app.config.update({"SERVER_NAME": server_name}) app.config.update({"SERVER_NAME": server_name})
path = "/myurl" path = "/myurl"
@@ -96,7 +90,7 @@ def test_url_for_with_server_name(app):
def passes(request): def passes(request):
return text("this should pass") return text("this should pass")
url = "http://{}{}".format(server_name, path) url = f"http://{server_name}{path}"
assert url == app.url_for("passes", _server=None, _external=True) assert url == app.url_for("passes", _server=None, _external=True)
request, response = app.test_client.get(url) request, response = app.test_client.get(url)
assert response.status == 200 assert response.status == 200
@@ -118,7 +112,7 @@ def test_fails_url_build_if_param_not_passed(app):
url = "/" url = "/"
for letter in string.ascii_letters: for letter in string.ascii_letters:
url += "<{}>/".format(letter) url += f"<{letter}>/"
@app.route(url) @app.route(url)
def fail(request): def fail(request):
@@ -182,7 +176,7 @@ def test_passes_with_negative_int_message(app):
@app.route("path/<possibly_neg:int>/another-word") @app.route("path/<possibly_neg:int>/another-word")
def good(request, possibly_neg): def good(request, possibly_neg):
assert isinstance(possibly_neg, int) assert isinstance(possibly_neg, int)
return text("this should pass with `{}`".format(possibly_neg)) return text(f"this should pass with `{possibly_neg}`")
u_plus_3 = app.url_for("good", possibly_neg=3) u_plus_3 = app.url_for("good", possibly_neg=3)
assert u_plus_3 == "/path/3/another-word", u_plus_3 assert u_plus_3 == "/path/3/another-word", u_plus_3
@@ -237,13 +231,13 @@ def test_passes_with_negative_number_message(app, number):
@app.route("path/<possibly_neg:number>/another-word") @app.route("path/<possibly_neg:number>/another-word")
def good(request, possibly_neg): def good(request, possibly_neg):
assert isinstance(possibly_neg, (int, float)) assert isinstance(possibly_neg, (int, float))
return text("this should pass with `{}`".format(possibly_neg)) return text(f"this should pass with `{possibly_neg}`")
u = app.url_for("good", possibly_neg=number) u = app.url_for("good", possibly_neg=number)
assert u == "/path/{}/another-word".format(number), u assert u == f"/path/{number}/another-word", u
request, response = app.test_client.get(u) request, response = app.test_client.get(u)
# For ``number``, it has been cast to a float - so a ``3`` becomes a ``3.0`` # For ``number``, it has been cast to a float - so a ``3`` becomes a ``3.0``
assert response.text == "this should pass with `{}`".format(float(number)) assert response.text == f"this should pass with `{float(number)}`"
def test_adds_other_supplied_values_as_query_string(app): def test_adds_other_supplied_values_as_query_string(app):
@@ -275,7 +269,7 @@ def blueprint_app(app):
@first_print.route("/foo/<param>") @first_print.route("/foo/<param>")
def foo_with_param(request, param): def foo_with_param(request, param):
return text("foo from first : {}".format(param)) return text(f"foo from first : {param}")
@second_print.route("/foo") # noqa @second_print.route("/foo") # noqa
def foo(request): def foo(request):
@@ -283,7 +277,7 @@ def blueprint_app(app):
@second_print.route("/foo/<param>") # noqa @second_print.route("/foo/<param>") # noqa
def foo_with_param(request, param): def foo_with_param(request, param):
return text("foo from second : {}".format(param)) return text(f"foo from second : {param}")
app.blueprint(first_print) app.blueprint(first_print)
app.blueprint(second_print) app.blueprint(second_print)

tests/test_url_for.py (new file)

@@ -0,0 +1,63 @@
import asyncio
from sanic.blueprints import Blueprint
def test_routes_with_host(app):
@app.route("/")
@app.route("/", name="hostindex", host="example.com")
@app.route("/path", name="hostpath", host="path.example.com")
def index(request):
pass
assert app.url_for("index") == "/"
assert app.url_for("hostindex") == "/"
assert app.url_for("hostpath") == "/path"
assert app.url_for("hostindex", _external=True) == "http://example.com/"
assert (
app.url_for("hostpath", _external=True)
== "http://path.example.com/path"
)
def test_websocket_bp_route_name(app):
"""Tests that blueprint websocket route is named."""
event = asyncio.Event()
bp = Blueprint("test_bp", url_prefix="/bp")
@bp.get("/main")
async def main(request):
...
@bp.websocket("/route")
async def test_route(request, ws):
event.set()
@bp.websocket("/route2")
async def test_route2(request, ws):
event.set()
@bp.websocket("/route3", name="foobar_3")
async def test_route3(request, ws):
event.set()
app.blueprint(bp)
uri = app.url_for("test_bp.main")
assert uri == "/bp/main"
uri = app.url_for("test_bp.test_route")
assert uri == "/bp/route"
request, response = app.test_client.websocket(uri)
assert response.opened is True
assert event.is_set()
event.clear()
uri = app.url_for("test_bp.test_route2")
assert uri == "/bp/route2"
request, response = app.test_client.websocket(uri)
assert response.opened is True
assert event.is_set()
uri = app.url_for("test_bp.foobar_3")
assert uri == "/bp/route3"


@@ -118,7 +118,7 @@ def test_static_directory(app, file_name, base_uri, static_file_directory):
app.static(base_uri2, static_file_directory, name="uploads") app.static(base_uri2, static_file_directory, name="uploads")
uri = app.url_for("static", name="static", filename=file_name) uri = app.url_for("static", name="static", filename=file_name)
assert uri == "{}/{}".format(base_uri, file_name) assert uri == f"{base_uri}/{file_name}"
request, response = app.test_client.get(uri) request, response = app.test_client.get(uri)
assert response.status == 200 assert response.status == 200
@@ -134,7 +134,7 @@ def test_static_directory(app, file_name, base_uri, static_file_directory):
assert uri2 == uri3 assert uri2 == uri3
assert uri3 == uri4 assert uri3 == uri4
assert uri5 == "{}/{}".format(base_uri2, file_name) assert uri5 == f"{base_uri2}/{file_name}"
assert uri5 == uri6 assert uri5 == uri6
bp = Blueprint("test_bp_static", url_prefix="/bp") bp = Blueprint("test_bp_static", url_prefix="/bp")
@@ -157,10 +157,10 @@ def test_static_directory(app, file_name, base_uri, static_file_directory):
"static", name="test_bp_static.uploads", filename="/" + file_name "static", name="test_bp_static.uploads", filename="/" + file_name
) )
assert uri == "/bp{}/{}".format(base_uri, file_name) assert uri == f"/bp{base_uri}/{file_name}"
assert uri == uri2 assert uri == uri2
assert uri4 == "/bp{}/{}".format(base_uri2, file_name) assert uri4 == f"/bp{base_uri2}/{file_name}"
assert uri4 == uri5 assert uri4 == uri5
request, response = app.test_client.get(uri) request, response = app.test_client.get(uri)
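The assertions above resolve named static routes through url_for. A compact sketch of the pattern (the directory and names are illustrative):

from sanic import Sanic

app = Sanic(name="static_example")
app.static("/static", "./static_dir")                    # default name "static"
app.static("/uploads", "./static_dir", name="uploads")   # custom name

uri = app.url_for("static", name="static", filename="test.file")
assert uri == "/static/test.file"
uri = app.url_for("static", name="uploads", filename="test.file")
assert uri == "/uploads/test.file"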


@@ -48,17 +48,3 @@ def test_vhosts_with_defaults(app):
request, response = app.test_client.get("/") request, response = app.test_client.get("/")
assert response.text == "default" assert response.text == "default"
def test_remove_vhost_route(app):
@app.route("/", host="example.com")
async def handler1(request):
return text("You're at example.com!")
headers = {"Host": "example.com"}
request, response = app.test_client.get("/", headers=headers)
assert response.status == 200
app.remove_route("/", host="example.com")
request, response = app.test_client.get("/", headers=headers)
assert response.status == 404

Some files were not shown because too many files have changed in this diff.