Compare commits

...

923 Commits
0.4.1 ... 0.8.2

Author SHA1 Message Date
Channel Cat
d38fc17191 Update version to test pypi 2018-09-13 01:50:32 -07:00
Channel Cat
7ae0eb0dc3 Transfer ownership 2018-09-13 01:39:24 -07:00
Channel Cat
9082eb56a7 Update version to circumvent pypi upload errors 2018-09-06 13:51:31 -07:00
Ashley Sommer
30e6a310f1 Pausable response streams (#1179)
* This commit adds handlers for the asyncio/uvloop protocol callbacks for pause_writing and resume_writing.
These are needed for the correct functioning of built-in tcp flow-control provided by uvloop and asyncio.
This is somewhat of a breaking change, because the `write` function in user streaming callbacks now must be `await`ed.
This is necessary because it is possible now that the http protocol may be paused, and any calls to write may need to wait on an async event to be called to become unpaused.

Updated examples and tests to reflect this change.

This change does not apply to websocket connections. A change to websocket connections may be required to match this change.

* Fix a couple of PEP8 errors caused by previous rebase.

* update docs

add await syntax to response.write in response-streaming docs.

* remove commented out code from a test file
2018-08-18 18:12:13 -07:00
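A minimal sketch of what the change above means for a streaming handler, assuming the `stream` response helper and handler signature of that release (the route and names here are illustrative, not taken from the PR itself):

```
from sanic import Sanic
from sanic.response import stream

app = Sanic(__name__)

@app.route("/csv")
async def streaming_handler(request):
    async def sample_streaming_fn(response):
        # write() must now be awaited so the coroutine can block while
        # the transport is paused by TCP flow control
        await response.write("foo,")
        await response.write("bar")
    return stream(sample_streaming_fn, content_type="text/csv")
```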
Eli Uriegas
a87934d434 Merge pull request #1292 from seemethere/increment_080
Increment to 0.8.0
2018-08-17 11:52:47 -07:00
Eli Uriegas
b398c1fe72 Increment to 0.8.0
Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2018-08-17 11:43:15 -07:00
Eli Uriegas
6f813f940e Merge pull request #1278 from ashleysommer/graceful_cancel
Gracefully handle when the request_handler_task is cancelled.
2018-08-17 11:41:39 -07:00
Eli Uriegas
d52498b787 Merge pull request #1284 from ashleysommer/aiohttp_update
Fix broken tests when aiohttp >= 3.3.0
2018-08-17 11:40:49 -07:00
Ashley Sommer
79e35bbdf6 Fix auto_reload in Linux (#1286)
* Fix two problems with the auto_reloader in Linux.
1) Change 'posix' to 'linux' in sys.platform check, because 'posix' is an invalid value and 'linux' is the correct value to use here.
2) In kill_process_children, don't just kill the 2nd level procs, also kill the 1st level procs.
   Also in kill_process_children, catch and ignore errors in the case that the child proc is already killed.

* Fix flake8 formatting on PR
2018-08-16 23:30:03 -07:00
Innokenty Lebedev
1814ff05f4 Add sse extension (#1288) 2018-08-16 11:59:58 -07:00
Ashley Sommer
ec226e33cb Pin aiohttp <= 3.2.1 in requirements-dev.txt (fixes errors for new contributors checking out the code and setting up a dev environment)
Future-proof some test cases so they work with aiohttp >= 3.3.0, in case we bump the aiohttp version in the future.
2018-08-16 15:00:23 +10:00
hqy
6abdf9f9c1 fixed #1143 (#1276)
* fixed #1143

* fixed build failure where the create_serve call to _helper failed
2018-08-15 10:23:04 -07:00
abuckenheimer
212da1029e disabled auto_reload by default in windows (#1280) 2018-08-07 11:48:18 -07:00
Ashley Sommer
afea15e4a7 Add a test for the graceful CancelledError handling. The user app should _never_ see a CancelledError bubble up, nor should they be able to catch it, because the response is already sent at that point. 2018-08-06 15:02:12 +10:00
Ashley Sommer
39ff02b6e4 Modifications the handle_request function to detect and gracefully handle the case that the request_handler Task is canceled by the sanic server while it is handling the request. One common occurrence of this is when the server issues a ResponseTimeout error, it also cancels the response_handler Task.
The Canceled exception handler purposely sets `response` to `None` to drop references to the handler coroutine, in an attempt to preemptively release resources.
This commit also fixes a possible reference-before-assignment of the `response` variable in the `handle_request` function.
Finally, another byproduct of this change is that ResponseMiddleware will no longer run if the `response` is `None`.
2018-08-06 14:12:30 +10:00
Cosmo Borsky
b238be54a4 Add content_type flag to Sanic.static (#1267)
* Add content_type flag to Sanic.static

Fixes #1266

* Fix flake8 error in travis

Add line to document `content_type` arg

* Fix content_type for file streams

Update tests

herp derp

* Remove content_type as an arg to HTTPResponse

`response.HTTPResponse` will default to `headers['Content-Type']` instead of `content_type`
https://github.com/channelcat/sanic/pull/1267#discussion_r204190913
2018-07-20 22:31:15 -07:00
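A hedged sketch of how the new flag might be used, assuming `content_type` simply overrides the guessed MIME type of the served file:

```
from sanic import Sanic

app = Sanic(__name__)

# content_type (added by this PR) is assumed to override the guessed
# MIME type of the file being served
app.static("/sitemap.xml", "./static/sitemap.xml", content_type="application/xml")
```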
Cosmo Borsky
377c9890a3 Support status code for file response (#1269)
Fixes #1268
2018-07-20 13:39:10 -07:00
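A small sketch under the assumption that the new status argument is forwarded to the underlying file response:

```
from sanic import Sanic
from sanic.response import file

app = Sanic(__name__)

@app.route("/missing")
async def missing(request):
    # status (added by this PR) is assumed to set the HTTP status code
    # of the file response instead of the default 200
    return await file("./static/404.html", status=404)
```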
ciscorn
599834b0e1 Add subprotocols param to add_websocket_route (#1261) 2018-07-16 12:20:26 -07:00
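A sketch of the new parameter, assuming it mirrors the `subprotocols` argument already accepted by the `@app.websocket` decorator:

```
from sanic import Sanic

app = Sanic(__name__)

async def feed(request, ws):
    while True:
        data = await ws.recv()
        await ws.send(data)

# subprotocols is assumed to be offered during the websocket handshake
app.add_websocket_route(feed, "/feed", subprotocols=["graphql-ws"])
```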
John Doe
a39a7ca9d5 Add url_bytes to Request (#1258)
We need to have access to the raw unparsed URL.
2018-07-16 12:13:27 -07:00
Ave
cd22745e6b Sanitize the URL before redirecting (#1260)
* URL Quote the URL before redirecting

* Use safe url instead of unsafe one

* Fix query params

* fix build

* Whitelist all reserved characters from rfc3986

* Add tests for redirect url sanitizing

* Remove check for resulting URL on header injection test

The behavior the tests check for can be implemented in other
ways that don't redirect to exactly the same address, but any such implementation
will still have to satisfy the remaining parts of the test to succeed.
2018-07-12 21:31:33 -07:00
7
334649dfd4 Fix response ci header (#1244)
* add unit tests, which should fail

* fix CIDict

* moving CIDict to avoid circular imports

* fix unit tests

* use multidict for headers

* fix cookie

* add version constraint for multidict

* omit test coverage for __main__.py

* make flake8 happy

* consolidate check in for loop

* travisci retry build
2018-07-11 01:44:21 -07:00
fanjindong
becbc5f9ef fix one example and add one example (#1257) 2018-07-11 01:42:34 -07:00
7
f9b29fd7e7 py37 (#1256)
* add py37 to travisci

* use dist:xenial for py37

* sudo: true in .travici

* bump websockets version for py37 support and fix unit tests
2018-07-03 22:07:08 -07:00
Arnulfo Solís
9092ee9f0e HTTP Entity Headers (#1127)
* introduced basic entity and hop-by-hop header identification

* removed entity headers

* coding style fixes

* remove unneeded header check

* moved from bytes to unicode in headers

* changed list to tuple in empty response statuses
2018-06-26 22:25:25 -07:00
GaryO
01257f65a6 Make auto reloader work on Mac (#1249) 2018-06-18 15:16:10 -07:00
Volodymyr Maksymiv
5ff481952d add UUID support (#1241) 2018-06-09 01:16:17 -07:00
7
baa689ad43 Fix failed build and add websockets version specifier (#1239)
* add websockets version constraint

* fix failed build
2018-06-07 10:07:26 -07:00
Philip Xu
2f30f4f69f Fixed #1231 - release resource no matter what (#1232) 2018-06-06 14:43:57 -07:00
Raphael Deem
202a4c6525 make request truthy if has transport (#1222) 2018-05-16 14:12:12 -07:00
Adam Hopkins
e1c9020268 Update extensions.md (#1205)
Changing the description of [Sanic JWT](https://github.com/ahopkins/sanic-jwt) to include permission scoping
2018-04-29 18:41:17 -07:00
Philip Xu
04a12b436e Added Sanic-Auth, Sanic-CookieSession and Sanic-WTF to Extensions doc (#1210) 2018-04-29 18:40:18 -07:00
Fantix King
818a8c2196 Added GINO to Extensions doc (#1200) 2018-04-21 21:02:49 -07:00
Arnulfo Solís
b6715464fd added init docs (#1167) 2018-04-01 20:53:08 -07:00
Raphael Deem
8f2d543d9f default to auto_reload in debug mode (#1159)
* default to auto_reload in debug mode

* disable auto-reload in testing client
2018-04-01 20:52:56 -07:00
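Under this change, enabling debug is assumed to be enough to get automatic reloading; a minimal sketch:

```
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/")
async def index(request):
    return text("hello")

if __name__ == "__main__":
    # debug=True is assumed to now imply auto_reload=True unless
    # auto_reload is passed explicitly
    app.run(host="127.0.0.1", port=8000, debug=True)
```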
Raphael Deem
6cf320bedb Merge pull request #1181 from kot83/patch-1
rename function in examples to post_json
2018-03-29 20:13:48 -07:00
kot83
a850ce5086 rename function to something else
function already defined
2018-03-29 15:57:10 -07:00
Raphael Deem
ef3bdf5408 Merge pull request #1180 from ashleysommer/fix_aiohttp_breakages
Fix failing tests when aiohttp>=3.1.0
2018-03-29 01:05:50 -07:00
Ashley Sommer
94b9bc7950 Some of the tests in Sanic (test_request_timout, test_response_timeout, test_keep_alive_timeout) use a custom SanicClient with modified methods. This relies on overriding internal aiohttp Client classes.
In aiohttp 3.1.0 there were some breaking changes that caused the custom methods to be no longer compatible with latest upstream aiohttp Client class.
See: 903073283f
and: b42e0ced46

This commit adds aiohttp version checks to adapt to these changes.
2018-03-29 11:54:59 +10:00
Raphael Deem
8a07463a67 Merge pull request #1175 from PyManiacGR/patch-1
Fix try_everything example.
2018-03-28 00:41:07 -07:00
PyManiac
2995b23929 Update try_everything.py 2018-03-24 15:55:15 +02:00
TheRubyDoggy
eb4276373b Fix try_everything example. 2018-03-24 15:34:41 +02:00
Raphael Deem
79df52e519 Merge pull request #1169 from charlax/patch-1
Clarify arguments to request/response middleware
2018-03-21 10:46:09 -07:00
Charles-Axel Dein
3dfb31b1b9 Clarify arguments to request/response middleware 2018-03-21 12:07:26 +01:00
Raphael Deem
c4c4ed70d9 Merge pull request #1163 from vopankov/master
Add __weakref__ to Request slots
2018-03-17 14:52:25 -07:00
Raphael Deem
45422df1b7 Merge pull request #1162 from yunstanford/fix-hang-build
Fix hang build and failed builds
2018-03-16 11:14:44 -07:00
Yun Xu
e0b7624414 fix hang build 2018-03-15 22:06:58 -07:00
Yun Xu
b0ecb3170f fix hang build 2018-03-15 22:03:36 -07:00
Yun Xu
fc8b5f378a migrate to trusty 2018-03-15 21:39:21 -07:00
Yun Xu
d42cb7ddb3 fix hang build 2018-03-15 21:28:52 -07:00
Панков Василий
6454ac0944 Add __weakref__ to Request slots 2018-03-14 13:37:15 +03:00
7
31cf83f10b Merge pull request #19 from channelcat/master
merge upstream master branch
2018-03-13 22:11:40 -07:00
Raphael Deem
cc84005593 Merge pull request #1157 from kinware/feature/add-route-streams
Allow streaming handlers in app.add_route()
2018-03-13 00:08:25 -07:00
Kinware
915d2732a1 Allow streaming handlers in add_route 2018-03-12 20:21:59 +01:00
Raphael Deem
44bc47361e Merge pull request #1149 from channelcat/travis-retry
use travis_retry on tox
2018-03-06 15:54:19 -08:00
Raphael Deem
3619b07843 Merge pull request #1146 from yunstanford/upgrade-test-client
Upgrade test client
2018-03-01 23:18:20 -08:00
Raphael Deem
ad3f588c79 use travis_retry on tox 2018-03-01 23:16:49 -08:00
Yun Xu
a2fc37121b migrating all to async syntax 2018-03-01 22:35:58 -08:00
Raphael Deem
7f36d20123 Merge pull request #1145 from yingshaoxo/patch-1
add a necessary import for better understanding
2018-02-28 01:19:29 -08:00
Yun Xu
d1a8e8b042 fixed unit tests 2018-02-27 22:25:38 -08:00
Yun Xu
c39ddd00d3 workaround fix for an issue in aiohttp.Client 2018-02-27 21:42:41 -08:00
Yun Xu
d55e453bd5 cleaning up 2018-02-27 20:26:49 -08:00
Raphael Deem
bffed27bdb Merge pull request #1142 from clarksun/patch-1
exception.md code sample missing 'async' prefix
2018-02-27 01:03:47 -08:00
7
fffcb158f1 Merge pull request #18 from channelcat/master
Merge upstream master branch
2018-02-26 22:19:30 -08:00
Yun Xu
eca98a54eb fixed all unit tests 2018-02-26 22:18:21 -08:00
Yun Xu
46ed2c5270 upgrade aiohttp for test_client 2018-02-26 22:08:05 -08:00
Sun Wei
23ea0b7ec9 exception.md code sample missing 'async' prefix 2018-02-26 16:09:26 +08:00
yingshaoxo
ef26cb283b add a necessary import for better understanding
add `from sanic.response import redirect`
2018-02-26 11:24:54 +08:00
Eli Uriegas
b8bb77eff6 Merge pull request #1137 from Julien00859/1136
sanic.handlers.ErrorHandler.response handler call was too restrictive
2018-02-21 09:55:52 -06:00
Julien00859
9c75ad3de1 close #1136 2018-02-21 00:50:27 +01:00
Eli Uriegas
0b38dea613 Merge pull request #1117 from abuckenheimer/env_dependent_ujson_uvloop
only install ujson and uvloop with CPython on non-Windows machines
2018-02-20 13:13:21 -06:00
Raphael Deem
7e4a9e3bc2 Merge pull request #1047 from Yaser-Amiri/master
Add auto reloading.
2018-02-16 11:11:49 -08:00
Raphael Deem
36f12c822f Merge pull request #1122 from knowsuchagency/master
add app.register_listener method
2018-02-15 16:58:27 -08:00
Raphael Deem
0cbea0f5d3 Merge pull request #1129 from cloudship/patch-1
raw requires a bytes-like object
2018-02-14 20:44:40 -08:00
panxb
e735fe54c3 raw requires a bytes-like object
raw requires a bytes-like object, or an object that implements __bytes__, not 'str'
2018-02-15 00:11:37 +08:00
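A short sketch of the documented constraint, assuming the standard `raw` response helper:

```
from sanic import Sanic
from sanic.response import raw

app = Sanic(__name__)

@app.route("/binary")
async def binary(request):
    # raw() expects bytes (or an object implementing __bytes__), not str
    return raw(b"\x00\x01\x02")
```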
Stephan Fitzpatrick
e911e2e1df updated doc 2018-02-13 23:58:03 -08:00
Stephan Fitzpatrick
1d75f6c2be changed docstring spacing 2018-02-13 10:15:16 -08:00
Raphael Deem
ad8a168469 Merge pull request #1121 from tandalf/issue-1120
Fixed bug when passing a list into route decorator's host argument #1120
2018-02-12 12:48:13 -08:00
Raphael Deem
74fc502089 Merge pull request #1124 from yunstanford/add-doc
Expose WebSocket Param and Add Doc
2018-02-11 02:02:13 -08:00
Yun Xu
dfc2166d8b add websocket.rst to index.rst 2018-02-10 12:21:23 -08:00
Yun Xu
2b70346db4 fix doc 2018-02-09 21:32:09 -08:00
Yun Xu
090df6f224 add websocket section in doc 2018-02-09 21:26:39 -08:00
Yun Xu
745a1d6e94 document websocket args 2018-02-09 21:03:21 -08:00
Yun Xu
0fe0796870 expose websocket protocol arguments 2018-02-09 20:44:02 -08:00
7
224b56bd3a Merge pull request #17 from channelcat/master
merge from upstream sanic
2018-02-09 20:12:29 -08:00
Stephan Fitzpatrick
571b5b544d added app.register_listener method w/test 2018-02-09 14:01:17 -08:00
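A hedged sketch of the new method, assuming it registers the same events as the existing `@app.listener` decorator:

```
from sanic import Sanic

app = Sanic(__name__)

async def notify_server_started(app, loop):
    print("server started")

# assumed to be equivalent to @app.listener("after_server_start")
app.register_listener(notify_server_started, "after_server_start")
```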
Timothy Ebiuwhe
220b40f7f4 Added regression tests for issue #1120 2018-02-09 22:33:34 +01:00
Timothy Ebiuwhe
60774c5a49 Fixed bug that occurs on calling @app.route or any of its variants
Setting strict_slashes to false when a route does not end with a slash
causes the route to be added twice: one without the slash, the other with the
slash. This is ok if the Router._add method runs linearly, but problematic
when it runs recursively. Unfortunately recursion is triggered when
the host param to the Router._add function is a list of hosts.
2018-02-09 22:27:20 +01:00
Raphael Deem
6d37ef7256 Merge pull request #1109 from DirkGuijt/master
fixed bug in multipart/form-data parser
2018-02-08 00:11:20 -08:00
Dirk Guijt
e083224df1 changed newline formatting 2018-02-07 09:29:44 +01:00
Alec Buckenheimer
5ef567405f fixed platform from windows to win32 2018-02-06 19:56:25 -05:00
Raphael Deem
ea2521f430 Merge pull request #1112 from boboldehampsink/extend_websocketprotocol_arguments
Extend WebSocketProtocol arguments
2018-02-06 15:05:58 -08:00
Alec Buckenheimer
82cb182fe7 added pip requirement to only install ujson and uvloop with CPython on non-Windows machines 2018-02-06 09:57:16 -05:00
Raphael Deem
3fe31ff551 Merge pull request #1104 from arnulfojr/minor/keep-alive-timeout-log-level
KeepAlive Timeout log level change to debug
2018-02-02 18:54:24 -08:00
Dirk Guijt
48d45f1ca4 sorry, style issue again 2018-02-03 03:14:04 +01:00
Dirk Guijt
ddf2a604d1 changed 'file' variable to 'form_file' to prevent overwriting the reserved word 2018-02-03 03:07:07 +01:00
Raphael Deem
8b920d9d56 Merge pull request #1113 from arnulfojr/bugfix/content-length-header-on-X04
Content Length header on 204 and 304 responses
2018-02-02 13:18:08 -08:00
Arnulfo Solis
f2c0489452 replaced comparison for in operator 2018-02-02 20:19:15 +01:00
Arnulfo Solis
f5a2d19199 touch commit 2018-02-02 14:13:14 +01:00
Arnulfo Solis
86fed12d91 less flake8 warnings in response test 2018-02-02 14:05:57 +01:00
Arnulfo Solis
7ca3ad5d4c no body and content length to 0 when 304 response is returned 2018-02-02 13:24:51 +01:00
Dirk Guijt
1eecffce97 fixed minor flake8 style problem 2018-02-02 09:57:06 +01:00
Dirk Guijt
5c341a2b00 made field name mandatory in multipart/form-data headers
A field name in the Content-Disposition header is required by the multipart/form-data spec. If one field/part does not have it, it will be omitted from the request. When this happens, we log it to DEBUG.
2018-02-02 09:43:42 +01:00
Arnulfo Solis
0ab64e9803 simplified logic when handling the body 2018-02-02 09:29:54 +01:00
Dirk Guijt
27108334f1 Merge branch 'master' of https://github.com/DirkGuijt/sanic 2018-02-02 00:55:58 +01:00
Dirk Guijt
788253cbe8 changes based on discussion on PR #1109 2018-02-02 00:55:51 +01:00
Arnulfo Solis
4b6e89a526 added one more test 2018-02-01 20:00:32 +01:00
Arnulfo Solis
68fd1b66b5 Response model now handles the 204 no content 2018-02-01 17:51:51 +01:00
Bob Olde Hampsink
5806666949 Extend WebSocketProtocol arguments to accept all arguments of websockets.protocol.WebSocketCommonProtocol 2018-02-01 16:23:10 +01:00
DirkGuijt
a76d8108fe small code style change
changed double quotes to single quotes to match the coding style
2018-02-01 11:55:30 +01:00
Arnulfo Solis
2135294e2e changed None to return empty string instead of null string 2018-02-01 11:52:55 +01:00
Dirk Guijt
ed1c563d1f fixed bug in multipart/form-data parser
Sanic automatically assumes that a form field is a file if it has a content-type header, even when the header is text/plain or application/json. This fixes that, taking the RFC 7578 specification's defaults into account.
2018-02-01 11:30:24 +01:00
Raphael Deem
74efa3a108 Merge pull request #1105 from manisenkov/upgrade-status-to-beta
Upgrade development status to beta
2018-01-31 14:36:06 -08:00
manisenkov
49c29e6862 Upgrade development status to beta 2018-01-31 23:25:50 +01:00
Raphael Deem
17d7f24825 Merge pull request #1108 from manisenkov/pin-pytest-version
Pin pytest version to 3.3.2
2018-01-31 14:24:04 -08:00
manisenkov
f23c3da4ff Pin pytest version to 3.3.2 2018-01-31 22:58:48 +01:00
Arnulfo Solis
cabcf50fbe KeepAlive Timeout log level change to debug 2018-01-30 11:26:15 +01:00
Raphael Deem
8b23b5d389 Merge pull request #1101 from SirEdvin/master
Provide information about sanic-oauth extension
2018-01-29 15:00:11 -08:00
SirEdvin
37eb2c1db6 Provide information about sanic-oauth extension 2018-01-27 10:28:53 +02:00
Eli Uriegas
9751a37343 Merge pull request #1098 from shahinism/refactor/docker
Install Python 3.5 and 3.6 on docker container
2018-01-26 14:01:21 -08:00
Eli Uriegas
ed3bdd3443 Merge pull request #1100 from NyanKiyoshi/master
No longer raising a missing parameter when value is null
2018-01-26 13:56:52 -08:00
NyanKiyoshi
285ad9bdc1 No longer raising a missing parameter when value is null
When passing a null value as a parameter (e.g. 0, None or False), Sanic said "Error: Required parameter `param` was not passed to url_for"

Example:

```
@app.route("/<idx>")
def route(rq, idx):
    pass
```

```
url_for("route", idx=0)
```

No longer raises: `Error: Required parameter `idx` was not passed to url_for`
2018-01-26 21:13:43 +01:00
Shahin
16f5914c90 Install Python 3.5 and 3.6 on docker container
To cover all supported versions of Python using Tox
2018-01-24 16:46:45 +03:30
Raphael Deem
72d56a89a2 Merge pull request #1094 from caitinggui/master
update class_based_views
2018-01-23 17:25:59 -08:00
Raphael Deem
1135c8c1b1 Merge pull request #1097 from howie6879/master
Add parameter check
2018-01-23 17:25:25 -08:00
caitinggui
ec4339bd47 update description 2018-01-24 09:02:07 +08:00
howie6879
6c0fbef843 Add parameter check 2018-01-24 08:17:55 +08:00
howie6879
040c85a43b Add parameter check 2018-01-24 08:11:47 +08:00
Raphael Deem
420554c737 Merge pull request #1096 from kyb3r/patch-1
Typo in readme?
2018-01-22 23:56:16 -08:00
howie6879
f20b854dd2 Add parameter check 2018-01-22 14:52:30 +08:00
howie6879
3844cec7a4 Add parameter check 2018-01-22 14:12:41 +08:00
howie.hu
0db49f7520 Merge pull request #4 from channelcat/master
Update
2018-01-22 13:36:48 +08:00
Kyber
f8dedcaa1e Typo in readme? 2018-01-22 15:55:29 +11:00
Yaser Amiri
f8b1122467 Revert "Change parsing cookies mechanism. (like Django instead of http.cookies.SimpleCookie)"
This reverts commit ba1dbacd35.
2018-01-21 09:10:15 +03:30
Raphael Deem
f3bf5e9a5c Merge pull request #1090 from yunstanford/patch-signal-handling
Patch signal handling
2018-01-20 14:03:23 -08:00
Yaser Amiri
ba1dbacd35 Change parsing cookies mechanism. (like Django instead of http.cookies.SimpleCookie) 2018-01-20 12:49:16 +03:30
caitinggui
4036f1c121 update class_based_views 2018-01-19 16:20:07 +08:00
Raphael Deem
22ad697d1f Merge pull request #1078 from eltrhn/master
Add support for blueprint groups and nesting
2018-01-18 17:26:52 -08:00
Eli
a10d7469cd Add blueprint groups + nesting 2018-01-18 17:20:51 -08:00
Raphael Deem
06c3153d22 Merge pull request #1092 from mattfox/patch-1
Add request.method to documentation
2018-01-17 16:10:42 -08:00
Matt Fox
9677158b75 Add request.method to documentation 2018-01-17 07:31:39 -08:00
Raphael Deem
226a73141b Merge pull request #1091 from yunstanford/patch-router-fix
Patch router fix
2018-01-17 00:15:33 -08:00
Raphael Deem
da3201bf35 Merge pull request #1088 from cosven/master
use single quote in readme.rst
2018-01-16 13:34:39 -08:00
Yun Xu
7daebc6aea fix Router.check_dynamic_route_exists 2018-01-15 17:53:37 -08:00
Yun Xu
d9002769cf fix a typo 2018-01-15 17:49:11 -08:00
Yun Xu
6d0b30953a add unit test which should fail on original code 2018-01-15 17:40:44 -08:00
Yun Xu
09d6452475 fixed unit test 2018-01-15 15:15:08 -08:00
Yun Xu
6a61fce84e worker process should ignore SIGINT when run_multiple 2018-01-15 11:53:15 -08:00
Yun Xu
11017902be signal handling 2018-01-15 11:23:49 -08:00
7
bd7333723e Merge pull request #16 from channelcat/master
Merge upstream master branch
2018-01-15 10:48:34 -08:00
cosven
6648250fb9 Merge pull request #1 from cosven/cosven-patch-1
use single quote in readme.rst
2018-01-15 14:56:52 +08:00
cosven
a94a2d46d0 use single quote in readme.rst
Since we use single quotes in the sanic package, perhaps we should use single quotes in the readme as well?
2018-01-15 14:55:36 +08:00
Raphael Deem
ab97018c78 Merge pull request #1082 from channelcat/1042
fix exception handling
2018-01-13 17:06:46 -08:00
Raphael Deem
be702b0924 Merge pull request #1087 from Stranger6667/fix-get_socket
Fix typo
2018-01-13 17:06:03 -08:00
Dmitry Dygalo
c5c10cfb50 Fix typo 2018-01-13 17:56:29 +01:00
howie.hu
5682d642a6 Merge pull request #3 from channelcat/master
Update
2018-01-10 09:23:43 +08:00
Raphael Deem
62c6d7274c Merge pull request #1083 from bow/fix/log_response_host
Fix log_response to correctly output request ip and port
2018-01-09 13:55:14 -08:00
bow
4f8633375d Fix log_response to correctly output request ip and port 2018-01-09 13:47:01 +01:00
Raphael Deem
9f559818e5 Merge pull request #1081 from howie6879/master
Fix: the Chinese URI
2018-01-08 15:54:46 -08:00
howie6879
5f329f72ee Update test_routes.py 2018-01-08 08:38:54 +08:00
howie6879
7303a06f83 Fix: the Chinese URI 2018-01-07 12:07:18 +08:00
howie6879
e34de96b24 Fix: the Chinese URI 2018-01-07 12:06:21 +08:00
howie6879
42cd424274 Fix: the Chinese URI 2018-01-07 10:59:12 +08:00
howie.hu
9426e94314 Merge pull request #2 from channelcat/master
Update
2018-01-07 09:57:51 +08:00
r0fls
7a1dab3319 fix exception handling 2018-01-05 14:12:22 -08:00
Raphael Deem
d63ec84745 Merge pull request #1037 from youknowone/xdist
Boost test speed by pytest-xdist
2018-01-05 12:10:27 -08:00
Raphael Deem
0e7e2f4e5b Merge pull request #1080 from channelcat/1079
fix timeout bug when self.transport is None
2018-01-04 15:01:40 -08:00
r0fls
46521240a9 fix timeout bug when self.transport is None 2018-01-03 23:33:22 -08:00
Jeong YunWon
a8827a5d95 Boost test speed by pytest-xdist 2018-01-03 18:52:25 +09:00
Raphael Deem
ca0bc1cb7d Merge pull request #1076 from channelcat/1074
fix strict_slashes bug when route has slash
2018-01-01 12:47:48 -08:00
r0fls
8c28ce7d79 fix strict_slashes bug when route has slash 2018-01-01 02:24:48 -08:00
r0fls
5b051f0891 add test 2018-01-01 02:14:55 -08:00
Raphael Deem
c93de9450a Merge pull request #1073 from Kurlov/master
remove uvloop for windows setup
2017-12-29 14:56:38 -08:00
Aleksandr Kurlov
96976fa892 remove uvloop for windows setup 2017-12-29 23:04:22 +05:00
Raphael Deem
c14e99cef0 Merge pull request #1071 from r0fls/1062-docs
update docs with add_task app injection
2017-12-28 02:01:23 -08:00
Raphael Deem
c91a806774 update docs with add_task app injection 2017-12-28 01:59:16 -08:00
Raphael Deem
2f0076f429 Merge pull request #1063 from r0fls/1062
try to inject the app in add_task method
2017-12-27 11:40:05 -08:00
Raphael Deem
a1ffc6d55b try to inject the app in add_task method 2017-12-27 01:06:43 -08:00
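A minimal sketch of the intended behavior, assuming the app is injected when the task callable accepts an argument:

```
from sanic import Sanic

app = Sanic(__name__)

async def notify(app):
    # the running app is assumed to be injected automatically
    print(app.name)

app.add_task(notify)
```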
Yaser Amiri
9bdf7a9980 Revert files that were fixed for flake problems. 2017-12-26 23:35:54 +03:30
Yaser Amiri
81494453b0 Remove dependency on requests library.
Change the auto reloader environment variable name to SANIC_SERVER_RUNNING
Fix some typos, flake8 incompatibilities, and similar problems.
Raise NotImplementedError for auto reloading on non-posix operating systems.
2017-12-26 19:17:13 +03:30
Raphael Deem
008cbe5ce7 Merge pull request #1069 from r0fls/1050
add samesite cookie to cookie keys
2017-12-24 02:50:37 -08:00
Raphael Deem
5ee35e7eeb add samesite cookie to cookie keys 2017-12-24 02:33:52 -08:00
Raphael Deem
1a98e70281 Merge pull request #1066 from r0fls/1065
allow add_task after server starts
2017-12-21 23:44:18 -08:00
Raphael Deem
8e3f3977bd allow add_task after server starts 2017-12-21 23:37:42 -08:00
Raphael Deem
04b04f094c Merge pull request #1064 from r0fls/1061
double quotes in unauthorized exception per rfc7230
2017-12-21 18:59:41 -08:00
Raphael Deem
9c02cdbad9 double quotes in unauthorized exception per rfc7230 2017-12-21 18:05:05 -08:00
Raphael Deem
c30e805623 Merge pull request #1060 from seemethere/fix_readthedocs_builds
Add sphinxcontrib-asyncio to environment.yml
2017-12-20 23:40:04 -08:00
Eli Uriegas
cd81538ce3 Add sphinxcontrib-asyncio to environment.yml
readthedocs builds were broken because they were missing this dependency

Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2017-12-19 22:55:50 -06:00
howie.hu
68cb280513 Merge pull request #1 from channelcat/master
Update
2017-12-20 09:08:52 +08:00
Raphael Deem
19466a15b4 Merge pull request #1055 from youknowone/cancel-timeout
Cancel request tasks when response timeout is triggered
2017-12-17 16:46:29 -08:00
Jeong YunWon
d54b406cba Cancel request tasks when response timeout is triggered
Before: Even after raising ResponseTimeout, server still processes
remaining tasks until it is done
After: Before raising ResponseTimeout, the server stops the working task.
2017-12-14 18:43:52 +09:00
Raphael Deem
72254a7af9 Merge pull request #1054 from r0fls/rfc7231
fix issues with method not allowed response
2017-12-13 23:42:37 -08:00
Raphael Deem
52feff266e fix edge case with methods as None 2017-12-13 23:23:04 -08:00
Raphael Deem
2c3f50e34a fix stream handling 2017-12-13 23:06:18 -08:00
Raphael Deem
2b0258c13a fix issues with method not allowed response 2017-12-11 20:12:26 -08:00
Raphael Deem
2585900692 Merge pull request #1053 from danpalmer/master
Fix ip and socket data format on V6
2017-12-11 20:24:52 -06:00
Dan Palmer
48c2dcb110 Fix ip and socket data format on V6 2017-12-11 22:16:03 +00:00
Dan Palmer
10a378bd46 Typo 2017-12-11 14:33:43 +00:00
Yaser Amiri
3fe3c2c79f Add test for auto reloading. 2017-12-07 20:19:40 +03:30
Yaser Amiri
52c2a8484e Add auto reloader. 2017-12-07 16:30:54 +03:30
Eli Uriegas
21435c1863 Merge pull request #1045 from seemethere/increment_070
Increment to 0.7.0
2017-12-05 19:27:11 -08:00
Eli Uriegas
1ea3ab7fe8 Increment to 0.7.0
Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2017-12-05 19:13:16 -08:00
Raphael Deem
1b0ad2c3cd Merge pull request #1035 from yunstanford/patch-N
Adopt new websockets interface
2017-12-02 01:27:09 -08:00
Raphael Deem
aa4821864a Merge pull request #1039 from lixxu/master
check request.ip before using it
2017-11-28 19:47:34 -08:00
lixxu
283762224c clean codes 2017-11-28 14:47:43 +08:00
lixxu
f50a37fc88 ignore error if request.ip is None 2017-11-28 14:44:32 +08:00
Yun Xu
076f0515ca Fix flake8 2017-11-25 21:14:18 -08:00
Yun Xu
049f12096d fix unit tests 2017-11-25 21:07:38 -08:00
Yun Xu
f09c0393ba adopt new websockets interface 2017-11-25 21:01:22 -08:00
7
472bbcf293 Merge pull request #15 from channelcat/master
Merge upstream master branch
2017-11-25 20:49:09 -08:00
Raphael Deem
7a3f9daccf Merge pull request #1025 from nkoshell/route-version-params
Route version params
2017-11-20 23:43:22 -08:00
Nikita Koshelev
76511d61e0 Added removal of a duplicate 'v' prefix for the Router.add() version parameter
Fix sanic/router.py:123:80: E501 line too long (80 > 79 characters)
2017-11-18 01:39:00 +03:00
Nikita Koshelev
8e7475ccf6 Added regex escaping for Router.add() version parameter 2017-11-18 01:22:42 +03:00
Raphael Deem
820d8c7bf5 Merge pull request #1021 from EdwardBetts/spelling
Correct spelling mistakes.
2017-11-16 16:26:40 -08:00
Edward Betts
cfc75b4f1a Correct spelling mistakes. 2017-11-15 15:46:39 +00:00
Raphael Deem
98567fe5a8 Merge pull request #1008 from youknowone/pytest-xdist
Let SanicTestClient have its own port
2017-11-10 10:50:01 -08:00
Raphael Deem
05bb812e2b Merge pull request #1010 from Yaser-Amiri/master
Change unit tests names with repeated names.
2017-11-08 20:25:45 -08:00
Yaser Amiri
c9876a6c88 Change unit tests names with repeated names. 2017-11-08 14:14:57 +03:30
Raphael Deem
979b5a52d3 Merge pull request #1005 from joar/feature/static-strict-slashes
Add strict_slashes to {app, blueprint}.static()
2017-11-07 07:49:32 -08:00
Joar Wandborg
e70535e8d7 Use .get instead of .pop 2017-11-07 10:34:17 +01:00
Jeong YunWon
ed8725bf6c Let SanicTestClient have its own port
For parallel test running, the servers must have different ports.
See examples/pytest_xdist.py for example.
2017-11-06 17:29:32 +09:00
Raphael Deem
098cd70e82 Merge pull request #1007 from furious-luke/master
Call connection_open after websocket handshake
2017-11-05 14:26:19 -08:00
Raphael Deem
969dac2033 Merge pull request #1004 from Stibbons/optionalize_log_config
Optionalize app.run dictConfig (fix #1000)
2017-11-04 12:35:38 -07:00
Gaetan Semet
49b1d667f1 Optionalize app.run dictConfig (fix #1000)
Signed-off-by: Gaetan Semet <gaetan@xeberon.net>
2017-11-04 15:58:27 +01:00
Luke Hodkinson
bca1e08411 Call connection_open after websocket handshake
It seems that due to (recent?) changes in the websocket library, we
now need to call "connection_open" to flag that the websocket is now
ready to use. I've added that call just after the call to
"connection_made".
2017-11-04 22:04:59 +11:00
Raphael Deem
bf6ed217c2 Merge pull request #1006 from r0fls/routing-fix
check if method is added in strict slash logic
2017-11-04 00:09:52 -07:00
Raphael Deem
bb8e9c6438 check if method is added in strict slash logic 2017-11-03 18:36:06 -07:00
Joar Wandborg
f128ed5b1f Set threshold to 1MiB instead of 0.97MiB
Reference: https://en.wikipedia.org/wiki/Mebibyte#Definition
2017-11-03 14:37:01 +01:00
Joar Wandborg
ff5786d61b pep8 2017-11-03 14:33:24 +01:00
Joar Wandborg
ca596c8ecd Add strict_slashes to {Sanic, Blueprint}().static() 2017-11-02 15:44:36 +01:00
Raphael Deem
c3bcafb514 Merge pull request #997 from ignatenkobrain/localhost
tests: do not assume that localhost == 127.0.0.1
2017-11-01 00:22:14 -07:00
Igor Gnatenko
a9c7d95e9b tests: do not assume that localhost == 127.0.0.1
Signed-off-by: Igor Gnatenko <ignatenkobrain@fedoraproject.org>
2017-10-31 09:39:09 +01:00
Raphael Deem
01042c1d98 Merge pull request #992 from r0fls/968
remove port from ip
2017-10-25 22:15:07 -07:00
Raphael Deem
5bf722c7ae remove bare exceptions 2017-10-25 21:58:31 -07:00
Raphael Deem
c2191153cf remove port from ip 2017-10-23 21:37:59 -07:00
davidtgq
5bcbc5a337 Replaced COMMON_STATUS_CODES with a simple 200 check for more fast (#982)
* Replaced COMMON_STATUS_CODES with a simple 200 check for more fast

* Added IPware algorithm

* Remove HTTP prefix from Django-style headers
Remove right_most_proxy because it's outside spec

* Remove obvious docstrings

* Revert "Replaced COMMON_STATUS_CODES with a simple 200 check for more fast"

This reverts commit 15b6980

* Revert "Added IPware algorithm"

This reverts commit bdf66cb

WTF HOW DO I GIT

* Revert "Revert "Replaced COMMON_STATUS_CODES with a simple 200 check for more fast""

This reverts commit d8df095

* Revert "Added IPware algorithm"

This reverts commit bdf66cb

* Delete ip.py
2017-10-19 16:43:07 -07:00
Eli Uriegas
f721f90add Merge pull request #820 from youknowone/worker-protocol
Protocol configurable gunicorn worker
2017-10-19 16:21:28 -07:00
Eli Uriegas
0e92d8ce2c Merge branch 'master' into worker-protocol 2017-10-19 16:21:18 -07:00
Eli Uriegas
727d6a1b61 Merge pull request #972 from lanfon72/patch-2
fix the condition check used in `log_response`
2017-10-19 16:16:57 -07:00
Raphael Deem
666c0847b7 Merge pull request #976 from ashleysommer/fix_websocket_timeout
Fix Websocket protocol timeouts after #939
2017-10-18 21:20:52 -07:00
Raphael Deem
0a411f9bba Merge pull request #985 from ashleysommer/ashleysommer-docs-spf
Add Sanic-Plugins-Framework library to Extensions doc
2017-10-18 21:20:13 -07:00
Ashley Sommer
49f3ba39f9 Add Sanic-Plugins-Framework library to Extensions doc
I made a new tool for devs to use for easily and quickly creating Sanic Plugins (extensions), and for application builders to easily use those plugins in their app.
2017-10-18 17:52:03 +10:00
Raphael Deem
794128a053 Merge pull request #981 from kszucs/manifest
Include LICENSE file in manifest
2017-10-17 10:31:19 -07:00
Krisztián Szűcs
e6be3b2313 include LICENSE file in manifest 2017-10-17 16:05:24 +02:00
Raphael Deem
c5cdcf0f95 Merge pull request #975 from ashleysommer/timeouts_documentation
Add documentation for new Timeout values, after #939
2017-10-16 09:13:49 -07:00
Ashley Sommer
ea5b07f636 Update websocket protocol to accommodate changes in HTTP protocol from https://github.com/channelcat/sanic/pull/939
Fixes https://github.com/channelcat/sanic/issues/969
2017-10-16 11:06:33 +10:00
Ashley Sommer
477e6b8663 Add documentation for REQUEST_TIMEOUT, RESPONSE_TIMEOUT and KEEP_ALIVE_TIMEOUT config values.
Fixed some inconsistent default values.
2017-10-16 10:53:45 +10:00
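A short sketch of the documented values, assuming they are plain config attributes measured in seconds (the numbers below are illustrative, not the documented defaults):

```
from sanic import Sanic

app = Sanic(__name__)

app.config.REQUEST_TIMEOUT = 60      # time allowed to receive the request
app.config.RESPONSE_TIMEOUT = 60     # time allowed to produce the response
app.config.KEEP_ALIVE_TIMEOUT = 5    # idle time before a keep-alive socket closes
```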
Raphael Deem
a0d8418b40 Merge pull request #965 from samael500/master
fix issue #959
2017-10-13 14:58:46 -07:00
Raphael Deem
006fb08024 Merge pull request #966 from yunstanford/patch-M
Sanic routes should not pass angled params with empty names
2017-10-13 02:18:20 -07:00
lanf0n
4578f6016b fix the condition check used in log_response
`request` class is derived from `dict`, so it will never be `True`.
2017-10-13 16:48:02 +08:00
Raphael Deem
5b06bcc57d Merge pull request #967 from samael500/custom_filename
Custom filename
2017-10-13 01:35:11 -07:00
Raphael Deem
d4bb14a511 Merge pull request #971 from pcinkh/socket_disconnects_speedup
Critical speedup websocket disconnects from O(N) to O(1)
2017-10-13 01:25:14 -07:00
pcinkh
6d2f5da506 Speedup websocket disconnects. 2017-10-11 14:02:26 +03:00
Yun Xu
c96df86111 make flake8 happy 2017-10-09 07:58:04 -07:00
Maks Skorokhod
86f87cf4ac 🔧 do not use f-strings 2017-10-09 17:55:35 +03:00
Yun Xu
770a8fb288 raise exception for invalid param syntax 2017-10-09 07:54:39 -07:00
Maks Skorokhod
c4e3a98ea7 add test for custom filename 2017-10-09 17:45:42 +03:00
Maks Skorokhod
07e95dba4f 🔁 customize filename in file response 2017-10-09 17:45:22 +03:00
7
9bc1abcd00 Merge pull request #14 from channelcat/master
merge upstream master branch
2017-10-09 07:19:57 -07:00
Maks Skorokhod
4d515b05f3 fix missed assertion 2017-10-09 17:18:04 +03:00
Maks Skorokhod
64edf7ad9c upd test for connection lost error 2017-10-09 16:00:32 +03:00
Maks Skorokhod
7610c0fb2e 🔧 log Connection lost only if debug 2017-10-09 15:50:36 +03:00
Raphael Deem
0189e4ed59 Merge pull request #962 from ProstoMaxim/fix_logs
Fix logs
2017-10-08 20:16:11 -07:00
Raphael Deem
8018c9b91d Merge pull request #961 from r0fls/fix-920
fix false cookie encoding and output
2017-10-06 23:57:30 -07:00
Max Murashov
4b3920daba Fix logs 2017-10-06 16:53:30 +03:00
Raphael Deem
d876e3ed5c fix false cookie encoding and output 2017-10-05 22:20:50 -07:00
Raphael Deem
086b5daa53 Merge pull request #960 from piotrbulinski/refactor_server_access_log
Refactor access log for server.HttpProtocol
2017-10-05 20:20:28 -07:00
Piotr Buliński
4b877e3f6b Update server.py 2017-10-05 09:28:13 +02:00
Piotr Buliński
8ce749e339 Update server.py 2017-10-05 09:27:18 +02:00
Piotr Buliński
752ddfa7fc Merge branch 'master' into refactor_server_access_log 2017-10-05 09:26:19 +02:00
Raphael Deem
8700c96c4d Merge pull request #942 from yunstanford/patch-logging-refactor
Patch logging refactor
2017-10-05 00:22:02 -07:00
Piotr Bulinski
e3852ceeca Refactor access log for server 2017-10-04 12:50:57 +02:00
Yun Xu
225ea49b6f resolve conflicts again 2017-10-01 01:22:27 -07:00
Raphael Deem
15fd49037f Merge pull request #939 from ashleysommer/keepalive_timeout
Split RequestTimeout, ResponseTimeout, and KeepAliveTimeout into different timeouts
2017-09-30 22:15:50 -07:00
Raphael Deem
2fb4697e12 Merge pull request #952 from ahopkins/patch-1
Update extensions.md
2017-09-29 18:33:30 -07:00
Eli Uriegas
1a9f770317 Merge pull request #957 from lanfon72/master
add sphinx extension to add asyncio-specific markups
2017-09-29 11:17:29 -07:00
lanf0n
62871ec9b3 add sphinx extension to add asyncio-specific markups 2017-09-30 01:16:26 +08:00
Raphael Deem
39c64214ee Merge pull request #953 from r0fls/949
support vhosts in static routes
2017-09-27 01:28:43 -07:00
Raphael Deem
9aec5febb8 support vhosts in static routes 2017-09-27 01:24:49 -07:00
Adam Hopkins
91b2167eba Update extensions.md
Add [Sanic JWT](https://github.com/ahopkins/sanic-jwt): an authentication extension for JSON Web Tokens (JWT).
2017-09-27 11:07:06 +03:00
Raphael Deem
00d40a35cd Merge pull request #951 from lixxu/master
fix bug and set scheme to http if not provided
2017-09-26 21:57:54 -07:00
lixxu
f96ab02767 set scheme to http if not provided 2017-09-27 09:59:49 +08:00
Raphael Deem
4ce699e57f Merge pull request #944 from blazehu/master
add __repr__ for sanic request
2017-09-25 13:58:09 -07:00
Raphael Deem
4ee042c330 Merge pull request #948 from chiuczek/json-dependency-injection
Use dependency injection to allow alternative json parser or encoder
2017-09-24 21:05:39 -07:00
Yun Xu
0b23f4ff81 resolve conflicts 2017-09-23 06:19:09 -07:00
Hugh McNamara
5cef1634ed use json_loads function in json property of request 2017-09-22 10:19:15 +01:00
Eli Uriegas
1b0286916e Merge pull request #947 from lanfon72/patch-1
to fix if platform is windows.
2017-09-19 10:15:35 -07:00
Hugh McNamara
a8f764c161 make method instead of property for alternative json decoding of request 2017-09-19 18:12:53 +01:00
Hugh McNamara
1d719252cb use dependency injection to allow alternative json parser or encoder 2017-09-19 14:58:49 +01:00
lanf0n
d8cebe1188 to fix if platform is windows. 2017-09-19 18:14:25 +08:00
Eli Uriegas
329ebf6a5d Merge pull request #946 from trthhrtz/patch-1
Update getting_started.md
2017-09-18 11:13:48 -07:00
Kuzma Leshakov
c836441a75 Update getting_started.md
The Hello World example in the main README file (https://github.com/channelcat/sanic/blob/master/README.rst) is different: it returns JSON, whereas here text is returned. In the following examples, such as Routing (http://sanic.readthedocs.io/en/latest/sanic/routing.html), JSON is used again. Therefore I suggest making the examples consistent, with JSON as the output.
2017-09-18 11:37:32 +03:00
huyuhan
074d36eeba add __repr__ for sanic request 2017-09-15 21:15:05 +08:00
huyuhan
f6eb35f67d add __repr__ for sanic request 2017-09-15 21:05:25 +08:00
huyuhan
77f70a0792 add __repr__ for sanic request 2017-09-15 20:56:44 +08:00
huyuhan
12dafd07b8 add __repr__ for sanic request 2017-09-15 18:34:56 +08:00
Eli Uriegas
9fb8bec715 Merge pull request #943 from crvv/master
fix #763, sanic can't decode latin1 encoded header value
2017-09-14 13:38:44 -07:00
Wèi Cōngruì
eb1146c6b6 fix #763, sanic can't decode latin1 encoded header value 2017-09-14 19:23:02 +08:00
Yun Xu
730f7c5e41 add doc for customizing logging config 2017-09-13 18:30:38 -07:00
Yun Xu
5cabc9cff2 update doc 2017-09-13 18:16:58 -07:00
Yun Xu
ddc039ed2e update doc 2017-09-13 18:14:46 -07:00
Eli Uriegas
a146ebd856 Merge pull request #941 from aiosin/master
add status codes and teapot example
2017-09-13 11:39:14 -07:00
Yun Xu
5ee7b6caeb fixing small issue 2017-09-13 10:35:34 -07:00
Yun Xu
9c4b0f7b15 fix flake8 2017-09-13 07:40:42 -07:00
aiosin
2e5d1ddff9 add status codes and teapot example 2017-09-13 14:08:29 +02:00
Yun Xu
24bdb1ce98 add unit tests/refactoring 2017-09-12 23:42:42 -07:00
Ashley Sommer
8eb59ad4dc Fixed error where the RequestTimeout test wasn't actually testing the correct behaviour
Fixed error where KeepAliveTimeout wasn't being triggered in the test suite, when using uvloop
Fixed test cases when using other asyncio loops such as uvloop
Fixed Flake8 linting errors
2017-09-13 10:18:36 +10:00
Raphael Deem
d8c8ccd180 Merge pull request #932 from lixxu/master
static files url building using url_for
2017-09-12 12:59:04 -07:00
Yun Xu
a46e004f07 apply new loggers 2017-09-11 22:12:49 -07:00
Ashley Sommer
173f94216a Fixed the delays, and expected responses, in the keepalive_timeout tests 2017-09-12 13:40:43 +10:00
Ashley Sommer
1a74accd65 finished the keepalive_timeout tests 2017-09-12 13:09:42 +10:00
Ashley Sommer
2979e03148 WIP - Split RequestTimeout, ResponseTimeout, and KeepAliveTimeout into different timeouts, with different callbacks. 2017-09-11 17:17:33 +10:00
Yun Xu
4bdb9a2c8e prototype 2017-09-10 23:19:09 -07:00
Yun Xu
8f6fa5e9ff old logging cleanup 2017-09-10 18:44:54 -07:00
Yun Xu
986135ff76 remove DefaultFilter 2017-09-10 18:39:42 -07:00
Yun Xu
c9cbc00e36 use access_log as param 2017-09-10 18:38:52 -07:00
Yun Xu
c9a40c180a remove some logging stuff 2017-09-10 11:11:16 -07:00
7
125cb17fcb Merge pull request #13 from channelcat/master
sync from upstream master branch
2017-09-09 16:52:13 -07:00
Raphael Deem
53a5bd2319 Merge pull request #936 from yunstanford/patch-debug-logging
Patch debug logging
2017-09-08 20:49:16 -07:00
Raphael Deem
4fd68f9af3 Merge pull request #935 from iad42/patch-1
Added information on request.token
2017-09-08 20:49:01 -07:00
Yun Xu
c4417b399b fixing debug logging 2017-09-08 17:47:05 -07:00
7
c2a3e42a53 Merge pull request #12 from channelcat/master
merge upstream master branch
2017-09-08 17:39:50 -07:00
Anatoly Ivanov
73c04f5a89 Added information on request.token
The manual lacked info about request.token, which keeps authorization data. See https://github.com/channelcat/sanic/blob/master/sanic/request.py#L84 for details
2017-09-08 14:21:49 +03:00
lixxu
195f707f14 missing '/' in doc 2017-09-06 19:19:59 +08:00
lixxu
bc20dc5c62 use url_for for url building for static files 2017-09-06 19:17:52 +08:00
Raphael Deem
8b4ca51805 Merge pull request #931 from Tim-Erwin/envvar_prefix
make the prefix for environment variables alterable
2017-09-05 11:22:56 -07:00
Tim Mundt
e2e25eb751 fixed flake convention 2017-09-05 11:05:31 +02:00
Tim Mundt
9572ecc5ea test for env var prefix 2017-09-05 10:58:48 +02:00
Tim Mundt
97d8b9e908 documentation for env var prefix; allow passing in the prefix through the app constructor 2017-09-05 10:41:55 +02:00
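A hedged sketch of the configurable prefix; the exact constructor argument name (`load_env`) is an assumption, not taken from the commit:

```
from sanic import Sanic

# environment variables such as MYAPP_REQUEST_TIMEOUT are assumed to be
# loaded into app.config when a custom prefix replaces the default SANIC_
app = Sanic(__name__, load_env="MYAPP_")
```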
Tim Mundt
c59a8a60eb make the prefix for environment variables alterable 2017-09-05 09:53:33 +02:00
Raphael Deem
158da0927a Merge pull request #901 from lixxu/master
add name option for route building
2017-08-31 15:29:03 -07:00
Eli Uriegas
78a7338346 Merge pull request #922 from timka/patch-1
Example logging X-Request-Id transparently
2017-08-31 10:35:48 -07:00
Eli Uriegas
90e5c8d39b Merge pull request #904 from jiaxiaolei/master
feat(examples): add `authorized_sanic.py`
2017-08-31 10:35:23 -07:00
Raphael Deem
7a6f2d8336 Merge pull request #926 from manisenkov/patch-1
Fix LICENSE date and name
2017-08-30 17:49:59 -07:00
Maksim Anisenkov
f49554aa57 Fix LICENSE date and name 2017-08-30 15:30:22 +02:00
Timur
0a72168f8f Example logging X-Request-Id transparently 2017-08-29 23:05:57 +03:00
Raphael Deem
5011bfef55 Merge pull request #917 from CharAct3/bugfix/fix_unauthorized
fix #914, change arguments of Unauthorized.__init__
2017-08-24 13:45:58 -07:00
Darren
6038813d03 fix #914, change arguments of Unauthorized.__init__ 2017-08-24 22:59:25 +08:00
Raphael Deem
fee9de96de Merge pull request #908 from Ezi4Zy/master
fix: error param
2017-08-23 16:22:13 -07:00
xmsun
35e028cd99 fix: error param 2017-08-22 16:40:42 +08:00
lixxu
145cdd5c1b Merge branch 'use-route-name-for-method' 2017-08-22 14:02:56 +08:00
lixxu
762b2782ee use name to define route name for different methods on same url 2017-08-22 14:02:38 +08:00
jiaxiaolei
91f031b661 feat(examples): add authorized_sanic.py
You can check whether the client of a request is authorized
to access a resource by using the `authorized` decorator
2017-08-21 22:40:07 +08:00
lixxu
eab809d410 add name option for route building 2017-08-21 18:05:34 +08:00
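A small sketch of the name option, assuming `url_for` resolves routes by that explicit name:

```
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/profile", methods=["GET"], name="get_profile")
async def profile(request):
    return text("profile")

# assumed: the explicit name can be used for URL building
print(app.url_for("get_profile"))  # -> /profile
```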
Raphael Deem
826f1b4713 Merge pull request #898 from jiaxiaolei/master
feat(examples): add `add_task_sanic.py`
2017-08-21 00:32:38 -07:00
Raphael Deem
fa1a95ae91 Merge pull request #900 from yunstanford/patch-default-strict-slashes
Patch default strict slashes
2017-08-21 00:31:42 -07:00
Yun Xu
63babae63d add doc 2017-08-21 00:28:01 -07:00
Raphael Deem
db9924a399 Merge pull request #899 from hatarist/patch-2
Add a line on headers in the "Request Data" docs
2017-08-20 23:51:07 -07:00
Yun Xu
5d23c7644b add unit tests 2017-08-20 23:37:22 -07:00
Yun Xu
ef81a9f547 make strict_slashes default value configurable 2017-08-20 23:11:38 -07:00
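A hedged sketch of the configurable default, assuming the constructor accepts a `strict_slashes` argument that routes inherit when they do not set their own:

```
from sanic import Sanic
from sanic.response import text

# strict_slashes here is assumed to become the default for all routes
app = Sanic(__name__, strict_slashes=True)

@app.route("/exact")  # assumed to inherit strict_slashes=True
async def exact(request):
    return text("no trailing slash accepted")
```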
7
747c21da70 Merge pull request #11 from channelcat/master
merge upstream master branch
2017-08-20 22:51:19 -07:00
Igor Hatarist
439ff11d13 Added a line on headers in the "Request Data" docs 2017-08-20 19:28:09 +03:00
jiaxiaolei
947364e15f feat(examples): add add_task_sanic.py 2017-08-20 11:11:14 +08:00
Eli Uriegas
750115b727 Merge pull request #894 from pkuphy/patch-1
fix typo
2017-08-18 10:44:35 -07:00
pkuphy
a55efc832d fix typo 2017-08-19 01:03:54 +08:00
Raphael Deem
c96bd21389 Merge pull request #892 from jiaxiaolei/master
docs(README): Make it clear and easy to read.
2017-08-18 02:09:19 -07:00
jiaxiaolei
dd241bd6fa docs(README): Make it clear and easy to read. 2017-08-18 17:00:34 +08:00
Raphael Deem
0dbde7400f Merge pull request #889 from dongweiming/doc
Fix blueprint doc
2017-08-16 00:19:58 -07:00
dongweiming
2587f6753d Fix blueprint doc 2017-08-15 22:04:25 +08:00
Raphael Deem
4155e76a81 Merge pull request #886 from yunstanford/fix-cov-report
Fix cov report
2017-08-14 16:21:50 -07:00
Yun Xu
756bd19181 do not fail if no files for coverage combine 2017-08-10 08:39:02 -07:00
Yun Xu
fbb2344895 fix cov report 2017-08-10 07:55:38 -07:00
7
bda6c85638 Merge pull request #10 from channelcat/master
merge upstream master branch
2017-08-10 07:47:45 -07:00
Raphael Deem
df4a149cd0 Merge pull request #885 from yunstanford/master
add triggers events when async create_server
2017-08-09 15:59:44 -07:00
Yun Xu
80f27b1db9 add unit tests and make flake8 happy 2017-08-08 22:21:40 -07:00
Yun Xu
d5d1d3b45a add trigger before_start events in create_server 2017-08-08 21:58:10 -07:00
Eli Uriegas
c797c3f22d Merge pull request #883 from miguelgrinberg/websocket-subprotocols
Websocket subprotocol negotiation
2017-08-08 11:49:21 -07:00
Miguel Grinberg
375ed23216 Websocket subprotocol negotiation
Fixes #874
2017-08-08 11:40:44 -07:00
Raphael Deem
7b66a56cad Merge pull request #870 from MichaelYusko/small-amendment
Made small changes for better readability
2017-08-03 18:39:26 -07:00
MichaelYusko
7216bf7835 merge master into local branch 2017-08-03 12:11:47 +03:00
Eli Uriegas
8b24c35ac7 Merge pull request #878 from seemethere/increment_060
Increment to 0.6.0
2017-08-02 19:13:39 -07:00
Eli Uriegas
f80a6ae228 Increment to 0.6.0 2017-08-02 19:11:53 -07:00
7
9b3fbe4593 fixed small doc issue (#877) 2017-08-02 10:15:18 -07:00
Yun Xu
f99a723627 fixed small doc issue 2017-08-02 09:05:33 -07:00
7
181ffb00a7 Merge pull request #9 from channelcat/master
merging upstream master branch
2017-08-02 09:04:31 -07:00
Eli Uriegas
222eca64d7 Merge pull request #875 from cctse/patch-1
add some example links for readme
2017-08-01 09:36:37 -07:00
akc
1b687f3feb add some example links 2017-08-01 16:32:15 +08:00
Eli Uriegas
2228104bff Merge pull request #862 from zyguan/revert-599fbce
revert 599fbce
2017-07-31 13:51:04 -07:00
Raphael Deem
402c3752c4 Merge pull request #871 from Frzk/unauthorized-exception
Simplified the Unauthorized exception __init__ signature.
2017-07-31 12:23:12 -07:00
François KUBLER
69a8bb5e1f Fixed a trailing white space in the docstring. 2017-07-28 22:29:45 +02:00
Raphael Deem
0d76f9e030 Merge pull request #873 from yunstanford/fix-timeout-issue
Fix timeout issue
2017-07-28 09:49:22 -07:00
Yun Xu
eb8f65c58b switch to use dist: precise 2017-07-27 22:21:19 -07:00
7
e0e27a671e Merge pull request #8 from channelcat/master
merge upstream master branch
2017-07-27 20:01:59 -07:00
François KUBLER
b65eb69d9f Simplified the Unauthorized exception __init__ signature.
(again).
Use of **kwargs makes it more straightforward and easier to use.
2017-07-27 23:00:27 +02:00
Raphael Deem
8118e542fb Merge pull request #866 from Nikamura/patch-1
Fix typo in documentation
2017-07-26 13:18:52 -07:00
Raphael Deem
c866759bd4 Merge pull request #868 from cclauss/patch-2
Comment: F821 undefined name is done on purpose
2017-07-26 13:18:29 -07:00
MichaelYusko
429f7377cb Made small changes for better readability 2017-07-26 19:32:23 +03:00
cclauss
40776e5324 Comment: F821 undefined name is done on purpose
Comment helps readers and `# noqa` silences linters
2017-07-26 12:44:30 +02:00
Karolis Mažukna
621343112d Fix typo in documentation 2017-07-25 13:29:17 +03:00
Raphael Deem
eb06e6ba51 Merge pull request #863 from zyguan/issue-760
handle keep-alive timeout gracefully
2017-07-25 00:54:43 -07:00
zyguan
da91b16244 add tests 2017-07-24 18:21:15 +08:00
zyguan
918e2ba8d0 Revert "fix #752"
This reverts commit 599fbcee6e.
2017-07-24 11:53:11 +08:00
zyguan
f50dc83829 handle keep-alive timeout gracefully 2017-07-24 01:37:36 +08:00
Raphael Deem
e8a9b4743b Merge pull request #861 from yunstanford/add-jinja-sanic
Add jinja2-sanic
2017-07-22 20:16:21 -07:00
Yun Xu
f34226425e add jinja2-sanic 2017-07-22 18:41:53 -07:00
7
e27c7ba36f Merge pull request #7 from channelcat/master
merge upstream master branch
2017-07-22 18:28:38 -07:00
Raphael Deem
1aad527956 Merge pull request #824 from Frzk/unauthorized-exception
Simplified the `Unauthorized.__init__` signature.
2017-07-21 23:50:36 -07:00
Raphael Deem
173c62acb6 Merge branch 'master' into unauthorized-exception 2017-07-21 01:54:45 -07:00
Raphael Deem
76605d7dfe Merge pull request #858 from mohd-akram/freebsd-syslog
Fix FreeBSD syslog path
2017-07-20 20:50:25 -07:00
Mohamed Akram
32be1a6496 Fix FreeBSD syslog path 2017-07-20 03:02:40 +04:00
Raphael Deem
c7d43aa544 Merge pull request #853 from yunstanford/patch-proxy-fix
proxy fix
2017-07-17 01:01:26 -07:00
Yun Xu
198bf55b7b flake8 fix 2017-07-14 17:23:18 -07:00
Yun Xu
75378d3567 add remote_addr property for proxy fix 2017-07-14 09:29:16 -07:00
7
55cb371569 Merge pull request #6 from channelcat/master
merge upstream master branch
2017-07-13 20:07:15 -07:00
Raphael Deem
5bb97d25d0 Merge pull request #839 from asvetlov/patch-1
Drop benchmarks from README
2017-07-13 14:41:57 -07:00
Andrew Svetlov
b2017cae77 Drop benchmarks from readme 2017-07-13 23:41:04 +02:00
Raphael Deem
35af903d4a Merge pull request #851 from zenixls2/issue-805
don't let the default LOGGING dictConfig influence already existing configs
2017-07-13 12:49:29 -07:00
zenix
426e00b6f4 don't let dictConfig influence already existing configs 2017-07-13 15:09:04 +09:00
Raphael Deem
8e62b3e438 Merge pull request #850 from r0fls/versioning
Versioning
2017-07-12 22:37:14 -07:00
Raphael Deem
4265ad5f23 add versioning 2017-07-12 22:19:42 -07:00
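A brief sketch of the versioning feature, assuming the `version` argument prefixes the route with `/v<version>`:

```
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

# assumed to be served at /v1/users
@app.route("/users", version=1)
async def users(request):
    return text("v1 users")
```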
Raphael Deem
c181eb0539 Merge branch 'master' of https://github.com/channelcat/sanic 2017-07-12 21:12:15 -07:00
Raphael Deem
e0f06753c6 Merge pull request #831 from yunstanford/auto-doc
Improve Documentation.
2017-07-12 20:20:51 -07:00
Raphael Deem
8aafd72ef0 Merge branch 'auto-doc' of https://github.com/yunstanford/sanic 2017-07-12 20:19:30 -07:00
Raphael Deem
48549ce97b Merge pull request #847 from youknowone/gunicorn
ensure loop.close() and sys.exit() in gunicorn worker
2017-07-12 18:19:39 -07:00
Jeong YunWon
47abf83960 Protocol configurable gunicorn worker 2017-07-12 22:30:13 +09:00
Jeong YunWon
be0f3731b4 ensure loop.close() and sys.exit() in gunicorn worker 2017-07-12 22:26:58 +09:00
Raphael Deem
b755431b93 Merge pull request #844 from sfermigier/patch-1
Add missing code block qualifier
2017-07-10 15:26:57 -07:00
Stefane Fermigier
04ff393875 Add missing code block qualifier 2017-07-10 22:11:12 +02:00
Raphael Deem
7841274300 Merge pull request #843 from yunstanford/case-insensitive-check
Case insensitive check
2017-07-10 12:44:25 -07:00
Yun Xu
235687d983 should call lower just once 2017-07-10 12:37:21 -07:00
Yun Xu
3d75e6ed95 case-insensitive check for header fields 2017-07-10 12:29:47 -07:00
Andrew Svetlov
eb9af8bceb Drop aiohttp from benchmark table
The reason is: aiohttp with disabled access log shows about 16,000 RPS on sanic's own benchmark.
It's pretty much faster than 3,000 RPS from the table.

I'm not a Sanic dev team member. You should not trust users to update this table but manage periodic updates yourself.
If you don't want to do it --- it's up to you.
Please just drop very incorrect and outdated numbers from README in this case.
2017-07-09 08:18:45 +02:00
7
39ea434513 Merge pull request #5 from channelcat/master
merge upstream master branch
2017-07-08 14:23:00 -07:00
Raphael Deem
f0a956467c Merge pull request #815 from yunstanford/master
add graceful timeout when shutdown
2017-07-08 11:31:37 -07:00
Yun Xu
e48bd08095 make flake8 happy 2017-07-02 10:05:33 -07:00
Yun Xu
5d00717f39 improve doc and remove warnings 2017-07-02 10:02:04 -07:00
Yun Xu
3fff685c44 add auto-doc support 2017-07-01 23:46:34 -07:00
Raphael Deem
1e75265eed Merge pull request #756 from qwesda/master
fixes #755 fragmented headers
2017-06-30 18:24:51 -07:00
Eli Uriegas
b6ac3ef445 Merge pull request #826 from yunstanford/pytest-sanic
Pytest sanic
2017-06-30 18:17:15 -07:00
Raphael Deem
421f78f3e6 Merge pull request #814 from Frzk/forbidden-exception
Added a Forbidden exception
2017-06-30 18:11:23 -07:00
Eli Uriegas
b71fdcfc20 Merge pull request #829 from Frzk/config_doc
Fixed an error: `Sanic.__init__` doesn't have a `load_vars` parameter.
2017-06-30 10:06:30 -07:00
François KUBLER
021e9b228a Fixed a small error: Sanic.__init__ doesn't have a load_vars parameter.
It is `load_env`.
2017-06-30 16:24:41 +02:00
Raphael Deem
00d4533022 Merge pull request #821 from Frzk/bearer-support
Inverted the order of prefixes in Request.token property.
2017-06-29 09:43:34 -07:00
Yun Xu
fd5faeb5dd add an example 2017-06-29 09:14:21 -07:00
Yun Xu
e7c8035ed7 add pytest-sanic 2017-06-29 09:06:17 -07:00
François KUBLER
e427e38da8 Simplified the Unauthorized.__init__ signature.
It doesn't really make sense to have a `realm` parameter in the method signature.
Instead, one can simply set the realm in the `challenge` dict if necessary.

Also fixed the tests accordingly (and added a new one for "Bearer" auth-scheme).
2017-06-29 12:34:52 +02:00
François KUBLER
1f24abc3d2 Fixed support for "Bearer" and "Token" auth-schemes.
Removed the test for "Authentication: Bearer Token <TOKEN>" which was not supposed to exist (see https://github.com/channelcat/sanic/pull/821)
Also added a call to `split` when retrieving the token value to handle cases where there are leading or trailing spaces.
2017-06-29 10:23:49 +02:00
François
76e62779ba Merge branch 'master' into forbidden-exception 2017-06-28 17:25:40 +02:00
Eli Uriegas
1af343ef50 Merge pull request #823 from ojii/cookies-warning
Added a warning to the cookies documentation about security
2017-06-27 19:35:35 -07:00
Jonas Obrist
412ffd1592 Added a warning to the cookies documentation about security 2017-06-28 11:05:59 +09:00
Daniel Schwarz
b141fec573 Merge remote-tracking branch 'upstream/master'
# Conflicts:
#	sanic/server.py
2017-06-27 13:32:49 +02:00
François KUBLER
d2e14abfd5 Inverted the order of prefixes in Request.token property.
As suggested by @allan-simon
See: https://github.com/channelcat/sanic/pull/811#pullrequestreview-46144327
2017-06-27 12:57:47 +02:00
Raphael Deem
d4abca0480 Merge pull request #818 from youknowone/debug
Introduce debug mode for HTTP protocol
2017-06-26 22:02:37 -07:00
Raphael Deem
529f5822ee Merge pull request #819 from r0fls/817
convert environment vars to int if digits
2017-06-26 21:54:54 -07:00
Raphael Deem
395d85a12f use try/except 2017-06-26 21:35:01 -07:00
Raphael Deem
4379a4b067 float logic 2017-06-26 20:59:59 -07:00
Raphael Deem
ad8e1cbf62 convert environment vars to int if digits 2017-06-26 20:49:41 -07:00
Jeong YunWon
dc5a70b0de Introduce debug mode for HTTP protocol 2017-06-26 21:13:13 +09:00
Yun Xu
b5d1f52ea4 make flake8 happy 2017-06-25 10:22:40 -07:00
Yun Xu
221cf235b5 fix a unit test 2017-06-25 01:03:28 -07:00
Yun Xu
7720e31a31 add unit test 2017-06-25 00:51:59 -07:00
Yun Xu
d812affef0 add graceful_shutdown_timeout to gunicorn worker 2017-06-25 00:51:14 -07:00
Yun Xu
5c19eb34bf add graceful_shutdown_timeout 2017-06-24 19:00:33 -07:00
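A short sketch of how the setting added above could be used, assuming it is exposed as a config key named `GRACEFUL_SHUTDOWN_TIMEOUT` (the name is inferred from the commit message):

```python
from sanic import Sanic

app = Sanic(__name__)

# Assumed config key matching the commit above: seconds to wait for
# in-flight requests to finish before connections are force-closed.
app.config.GRACEFUL_SHUTDOWN_TIMEOUT = 15.0

app.run(host="0.0.0.0", port=8000)
```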
7
e18ebaee3d Merge pull request #4 from channelcat/master
merge upstream master branch
2017-06-24 18:21:13 -07:00
Eli Uriegas
dbcbf12456 Merge pull request #811 from Frzk/bearer-support
Added support for 'Authorization: Bearer <TOKEN>' header...
2017-06-23 10:32:21 -07:00
Eli Uriegas
c04b44057c Merge pull request #813 from Frzk/unauthorized-exception
Added an Unauthorized exception
2017-06-23 10:30:51 -07:00
François KUBLER
60aa60f48e Fixed the test for the new Unauthorized exception. 2017-06-23 17:16:31 +02:00
François KUBLER
2848d7c80e Added a Forbidden exception
Also added a small test.
2017-06-23 16:44:57 +02:00
François KUBLER
9fcdacb624 Modified the name of an argument. 2017-06-23 16:29:04 +02:00
François KUBLER
cf1713b085 Added an Unauthorized exception.
Also added a few tests related to this new exception.
2017-06-23 16:12:15 +02:00
7
f049a4ca67 Recycling gunicorn worker (#800)
* add recycling feature to gunicorn worker

* add unit tests

* add more unit tests, and remove redundant trigger_events call

* fixed up unit tests

* make flake8 happy

* address feedbacks

* make flake8 happy

* add doc
2017-06-22 13:26:50 -07:00
François KUBLER
55f860da2f Added support for 'Authorization: Bearer <TOKEN>' header in Request.token property.
Also added a test case for that kind of header.
2017-06-22 18:11:23 +02:00
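A small usage sketch of the behavior described in the commit above; the handler simply echoes whatever `request.token` parsed out of the `Authorization` header (the route name is illustrative):

```python
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route("/whoami")
async def whoami(request):
    # With this change both "Authorization: Token <TOKEN>" and
    # "Authorization: Bearer <TOKEN>" yield the bare token here.
    return json({"token": request.token})
```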
Eli Uriegas
b5369e611c Merge pull request #764 from stopspazzing/master
Clean up of examples.
2017-06-20 14:53:37 -07:00
Jeremy Zimmerman
3d1dd1c6ac re-add extensions.md to fix merge conflict. 2017-06-20 14:49:12 -07:00
Eli Uriegas
10a363b275 Merge pull request #809 from jrocketfingers/feature/allow-textual-responses
Allow textual responses when using test_client and aiohttp 2
2017-06-20 10:30:26 -07:00
Nikola Kolevski
d865c5e2b6 Conform to pep8 2017-06-20 13:22:28 +02:00
Nikola Kolevski
9fac37588c Allow textual responses when using test_client and aiohttp 2 2017-06-20 13:15:30 +02:00
Jeremy Zimmerman
aac0d58417 Delete extensions.md
non-core content moved to wiki
2017-06-19 15:18:32 -07:00
Eli Uriegas
b37e6187d4 Merge pull request #802 from yunstanford/add-match-info
Add match_info property to request class
2017-06-18 10:15:46 -07:00
Yun Xu
20138ee85f add match_info to request 2017-06-17 09:47:58 -07:00
Raphael Deem
6dc569cde5 Merge pull request #795 from ekampf/patch-1
Prevent `run` from overriding logging config set in constructor
2017-06-15 23:39:09 -07:00
Eran Kampf
77cf0b678a Fix has_log value 2017-06-15 11:21:08 -07:00
Eran Kampf
2dfb061063 Prevent run from overriding logging config set in constructor
When creating the `Sanic` instance I provide it with a customized `log_config`.
Calling `run` overrides these settings unless I provide it *again* with the same `log_config`.
This is confusing and error-prone. `run` shouldn't override configuration set in the `Sanic` constructor...
2017-06-15 10:39:00 -07:00
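A sketch of the usage pattern the commit message describes: the customized `log_config` is passed once, to the constructor, and after this fix `run()` no longer overrides it. The config dict below is a hypothetical placeholder, not Sanic's default:

```python
from sanic import Sanic

# Hypothetical customized logging config (logging.config.dictConfig format).
MY_LOG_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
}

app = Sanic(__name__, log_config=MY_LOG_CONFIG)

# Before the fix, this call silently replaced MY_LOG_CONFIG with the
# defaults unless log_config was passed a second time; now it is kept.
app.run(host="0.0.0.0", port=8000)
```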
7
e4669e2581 Merge pull request #3 from channelcat/master
merge upstream master branch
2017-06-12 22:15:31 -07:00
Eli Uriegas
df47cf72d3 Merge pull request #791 from seemethere/add_docs_requirements
Add docs requirements
2017-06-12 10:55:45 -07:00
Eli Uriegas
ba1b34e375 Add docs requirements
Closes channelcat/sanic#787
2017-06-12 10:29:38 -07:00
Eli Uriegas
950b5ee529 Merge pull request #789 from yunstanford/coverage-report
Coverage report
2017-06-11 23:38:04 -07:00
Raphael Deem
041c48de19 Merge pull request #790 from r0fls/752
fix #752
2017-06-11 23:24:32 -07:00
Raphael Deem
599fbcee6e fix #752 2017-06-11 23:20:04 -07:00
Yun Xu
ce2df8030c quick fix for test_gunicorn_worker test 2017-06-11 09:06:48 -07:00
Yun Xu
47e761bbe2 add coverage report 2017-06-11 08:49:35 -07:00
7
0646baa18d Merge pull request #2 from channelcat/master
remote-tracking with upstream
2017-06-10 11:54:11 -07:00
Eli Uriegas
38997c1b47 Merge pull request #786 from yunstanford/handle_stream_404
Handle stream 404
2017-06-10 10:03:54 -07:00
Yun Xu
acaafabc23 retry build 2017-06-10 09:57:32 -07:00
Yun Xu
6a80bdafa6 add unit tests 2017-06-10 09:48:30 -07:00
Yun Xu
cf30ed745c also should handle InvalidUsage exception 2017-06-10 09:42:48 -07:00
Eli Uriegas
a399fb4044 Merge pull request #785 from youknowone/gunicorn
Gunicorn worker hints the app whether it is being terminated
2017-06-09 11:00:12 -07:00
Yun Xu
24b946e850 make flake8 happy 2017-06-09 08:43:23 -07:00
Yun Xu
236daf48ff add unit tests 2017-06-09 08:42:48 -07:00
Yun Xu
4942af27dc handle NotFound 2017-06-09 08:33:34 -07:00
7
3adb90071b Merge pull request #1 from channelcat/master
merge upstream project
2017-06-09 08:23:14 -07:00
Jeong YunWon
29b4a2a08c Gunicorn worker hints the app whether it is being terminated
For now, `Sanic.is_running` is set when the worker is started but not
unset when it is about to be stopped. Setting the flag on the quit signal
will not affect in-flight requests, but the `Sanic.is_running` flag can
still be used to support graceful termination.
2017-06-09 14:51:15 +09:00
Eli Uriegas
e1331fc0a2 Merge pull request #783 from yunstanford/master
add content_type property in request
2017-06-08 17:30:07 -07:00
Yun Xu
3802f8ff65 unit tests 2017-06-08 17:25:22 -07:00
Eli Uriegas
4b0abdbe7c Merge pull request #782 from yunstanford/sanic-transmute-doc
add sanic-transmute
2017-06-08 14:27:29 -07:00
Yun Xu
81889fd7a3 add unit tests 2017-06-07 20:48:07 -07:00
Yun Xu
aac99c45c0 add content_type property in request 2017-06-07 20:46:48 -07:00
Yun Xu
566a6369a5 add sanic-transmute 2017-06-07 20:27:54 -07:00
Eli Uriegas
4fdf340d04 Merge pull request #773 from mbatchkarov/await-json-too
Testing: store JSON response in local request
2017-06-07 12:05:30 -07:00
Miroslav Batchkarov
ddd7145153 check json is None if body is not JSON 2017-06-07 10:03:27 +01:00
Miroslav Batchkarov
3f22b644b6 wrap call to json in try-except to make tests pass 2017-06-07 09:57:07 +01:00
Eli Uriegas
639c9f579d Merge pull request #774 from algtmatt/feature/logging_doc_fixes
Small logging docs fixes
2017-06-05 10:56:00 -07:00
Matthew Snyder
735b8665f1 Small logging docs fixes 2017-06-05 10:42:17 -07:00
Miroslav Batchkarov
199fa50a9d also store json result in local request 2017-06-05 16:24:23 +01:00
Jeremy Zimmerman
aac5ad8504 Merge remote-tracking branch 'origin/master' 2017-06-01 16:53:36 -07:00
Jeremy Zimmerman
349c108ebc re-added request_stream example. 2017-06-01 16:52:56 -07:00
Eli Uriegas
3b464782ef Merge pull request #765 from ttopholm/master
Added content_type to be set for json response
2017-06-01 16:29:08 -05:00
Tue Topholm
3d97fd8d2a Removed whitespace 2017-06-01 23:09:37 +02:00
Tue Topholm
c102e76146 Fixed line width 2017-06-01 23:01:27 +02:00
Jeremy Zimmerman
beee7b68bf reverted back to default 0.0.0.0 host 2017-06-01 14:01:13 -07:00
Tue Topholm
f47e571d92 Added content_type to be set for json response 2017-06-01 22:53:56 +02:00
Jeremy Zimmerman
4b5320a8f0 Clean up of examples. Removes non-core examples, optimizes and restyles remaining to strictly follow PEP 8 styling guidelines. Non-Core examples will be moved to Wiki. 2017-06-01 11:53:05 -07:00
Daniel Schwarz
30c2c89c6b fix partial url parsing 2017-05-30 16:13:49 +02:00
Raphael Deem
4a1d1a0dc1 Merge pull request #750 from xenu256/patch-1
aiomysql has DictCursor
2017-05-29 22:54:09 -07:00
Raphael Deem
360adc9130 Merge pull request #757 from monobot/asyncOrmV020
update asyncorm version example to 0.2.0
2017-05-29 22:51:46 -07:00
Raphael Deem
cc21abe843 Merge pull request #751 from ak04nv/patch-1
Update jinja_example.py
2017-05-28 23:50:16 -07:00
monobot
9a27555763 update asyncorm version example to 0.2.0 2017-05-29 00:01:56 +01:00
Daniel Schwarz
aaef2fbd01 fix flake8 errors 2017-05-28 18:46:07 +02:00
Anton Kochnev
5bb640ca17 Update jinja_example.py
Added python version check for enabling async mode.
2017-05-28 14:37:41 +08:00
Daniel Schwarz
0e5c7a62cb remove debug messages 2017-05-27 22:36:08 +02:00
Daniel Schwarz
1b33e05f74 fix debug log messages 2017-05-27 16:32:39 +02:00
Daniel Schwarz
53a04309ff add header_fragment handling 2017-05-27 16:28:57 +02:00
Daniel Schwarz
dc411651b6 add check for header and value 2017-05-27 15:36:57 +02:00
Daniel Schwarz
514540b90b add debug for header values 2017-05-27 15:32:37 +02:00
Tadas Talaikis
a5249d1f5d aiomysql has DictCursor 2017-05-27 11:06:45 +03:00
Eli Uriegas
21aa3f6578 Merge pull request #748 from messense/feature/websocket-config
Add websocket max_size and max_queue configuration
2017-05-26 10:44:49 -05:00
messense
0024edbbb9 Add websocket max_size and max_queue configuration 2017-05-26 11:15:28 +08:00
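Assuming the two settings in the commit above land as plain config keys (names inferred from the PR title), usage would look roughly like this:

```python
from sanic import Sanic

app = Sanic(__name__)

# Assumed keys per the PR title: cap the byte size of incoming websocket
# messages and the number of messages queued per connection.
app.config.WEBSOCKET_MAX_SIZE = 2 ** 20
app.config.WEBSOCKET_MAX_QUEUE = 32
```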
Raphael Deem
23cb39b557 Merge pull request #744 from algtmatt/feature/from_file_doc_fix
Update config file loader docs
2017-05-24 18:16:10 -07:00
Eli Uriegas
48de321869 Merge pull request #697 from 38elements/stream
Add Request.stream
2017-05-24 16:22:52 -07:00
Raphael Deem
c6d68009d2 Merge pull request #745 from messense/feature/gunicorn-worker-test-case
Add a simple integration test for Gunicorn worker
2017-05-23 20:53:01 -07:00
Raphael Deem
2cab267405 Merge pull request #734 from ashleysommer/static_large_file_stream
Add option to static helper to use streaming for large files.
2017-05-23 20:52:12 -07:00
messense
6bdc0d2e5e Fix Gunicorn worker 2017-05-23 11:28:12 +08:00
messense
3eed81c1eb Add a simple integration test for Gunicorn worker 2017-05-23 11:04:27 +08:00
Raphael Deem
b447807b36 Merge pull request #742 from r0fls/700
changes required for unix socket support
2017-05-22 19:29:32 -07:00
Eli Uriegas
2771c8c32e Merge pull request #738 from messense/feature/conduct
Add Code of Conduct
2017-05-22 15:34:01 -07:00
Eli Uriegas
21a88bc2d3 Add maintainers email address 2017-05-22 15:31:05 -07:00
Matthew Snyder
57c1838f68 Update config file loader docs 2017-05-22 14:10:08 -07:00
Raphael Deem
52b0254ec6 unix socket support; fixes #700 2017-05-21 03:15:06 -07:00
Raphael Deem
49631542ce Merge pull request #732 from jrocketfingers/feature/explicit-register-middleware
Extract register_middleware into a method.
2017-05-21 03:06:12 -07:00
Raphael Deem
4b80ffb9eb Merge pull request #740 from r0fls/739
add abort function
2017-05-21 02:20:02 -07:00
Raphael Deem
9efa7c116d remove redundant code; decode response 2017-05-20 23:27:00 -07:00
Raphael Deem
5d9c8d59a0 add abort() test 2017-05-20 14:43:57 -07:00
Raphael Deem
1a60201f68 flake8 spacing 2017-05-20 10:44:09 -07:00
Raphael Deem
f3186abf09 SANIC_EXCEPTIONS -> _sanic_exceptions 2017-05-20 10:29:00 -07:00
Raphael Deem
6bcc0d3c7f Merge remote-tracking branch 'upstream/master' 2017-05-20 02:17:26 -07:00
Raphael Deem
57b9a57dde Update README.rst 2017-05-20 02:17:12 -07:00
Raphael Deem
28994f4b64 update todo 2017-05-20 02:15:45 -07:00
Raphael Deem
588b4712bf add exception decorator 2017-05-20 01:24:34 -07:00
Raphael Deem
d3b6208057 add abort function 2017-05-19 18:52:19 -07:00
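A minimal sketch of the helper added above, assuming it is importable from `sanic.exceptions` and raises the exception registered for the given status code (route and condition are illustrative):

```python
from sanic import Sanic
from sanic.exceptions import abort
from sanic.response import text

app = Sanic(__name__)

@app.route("/admin")
async def admin(request):
    if not request.token:
        abort(403)  # raises the exception registered for HTTP 403
    return text("welcome")
```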
Ashley Sommer
ef80953b1b Fix flake8 line length error. 2017-05-20 09:56:05 +10:00
ashleysommer
72db1188c7 Add an option to the static() helper to switch on streaming for large files.
By default it uses a 1M threshold,
i.e. if the static file to serve is >= 1M it will be streamed.
This threshold value is configurable by passing an int instead of a bool to the `stream_large_files` parameter of `static()`.
2017-05-20 09:56:05 +10:00
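A short usage sketch of the option described in the commit above (paths and the override value are illustrative):

```python
from sanic import Sanic

app = Sanic(__name__)

# True switches on streaming for files at or above the default 1M threshold.
app.static("/static", "./static", stream_large_files=True)

# Passing an int instead of a bool overrides the threshold (here ~100 KB).
app.static("/media", "./media", stream_large_files=100 * 1024)
```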
Raphael Deem
0858d3c544 Merge pull request #733 from ashleysommer/file_stream
Add file_stream response handler
2017-05-19 16:48:12 -07:00
Ashley Sommer
5c5656f981 Moved file_stream tests to test_responses.py 2017-05-20 09:41:36 +10:00
Raphael Deem
58a9c92d75 fix 739 2017-05-19 13:35:04 -07:00
Eli Uriegas
a6dc4646db Merge pull request #737 from 38elements/deploying
Fix Running via Gunicorn in deploying.md
2017-05-19 12:10:36 -07:00
messense
8ff553e926 Add Code of Conduct 2017-05-19 23:09:44 +08:00
38elements
848a5c61f0 Fix Running via Gunicorn in deploying.md 2017-05-19 23:22:57 +09:00
Raphael Deem
d49000e9f4 Merge pull request #736 from fanjindong/examples_read
debug 'Blueprint names must be unique'
2017-05-19 01:45:06 -07:00
fanjindong
a82145c4e6 debug 'Blueprint names must be unique' 2017-05-19 16:26:56 +08:00
Ashley Sommer
181edb7235 Test file() and file_stream() response helpers.
Added test for `file()` response helper and `file_stream()` response helper.
2017-05-19 13:01:21 +10:00
Ashley Sommer
ff2ae11ac8 Remove exception print(e) statement. 2017-05-19 13:00:01 +10:00
Johnny Rocketfingers
3f841f3b21 Switch to non-hardcoded register_middleware. 2017-05-18 22:08:44 +02:00
ashleysommer
181977ad4e Added brief documentation with an example for file_stream
Added test to ensure `file_stream()` works in the test suite.
2017-05-18 18:12:26 +10:00
ashleysommer
e155fe403d Add file_stream response handler
For streaming large static files
Like `file()` but breaks the file into chunks and sends it with a `StreamingHTTPResponse`
Chunk size is configurable, but defaults to 4k; this seemed to be the sweet spot in my testing.
Also supports ContentRange same as `file()` does.
2017-05-18 18:04:28 +10:00
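A sketch of a handler using the new helper described above, assuming it is importable from `sanic.response` and awaitable like `file()` (the path is a placeholder):

```python
from sanic import Sanic
from sanic.response import file_stream

app = Sanic(__name__)

@app.route("/download")
async def download(request):
    # Sends the file as a StreamingHTTPResponse in chunks; chunk_size is
    # configurable and, per the commit above, defaults to 4k.
    return await file_stream("/path/to/large_file.bin", chunk_size=4096)
```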
Johnny Rocketfingers
bf5438d573 Extract register_middleware into a method. 2017-05-18 06:36:11 +02:00
Raphael Deem
0e4aaf8856 Merge pull request #731 from jrocketfingers/fix/token-missing-auth-headers
Check that the Authorization headers are actually provided.
2017-05-17 13:10:12 -07:00
Raphael Deem
5c44ce1637 Merge pull request #719 from messense/feature/worker-uvloop
Gunicorn worker should not require uvloop
2017-05-17 12:47:19 -07:00
Raphael Deem
974fe25a11 Merge pull request #722 from messense/feature/ci-without-ext
Add py3*-no-ext test env
2017-05-17 12:47:05 -07:00
Johnny
58bae83558 Add a regression test. 2017-05-17 11:15:45 +02:00
Johnny
5d309af86f Check that the headers are actually provided. 2017-05-17 11:08:50 +02:00
messense
ec857d1c53 Drop tox-travis 2017-05-17 12:21:56 +08:00
messense
2f84cdd708 Fix websocket handler bug on Python3.5 with no uvloop 2017-05-17 12:12:25 +08:00
messense
7cc02e84ed Fix json loads bug on Python 3.5 2017-05-17 12:12:25 +08:00
Raphael Deem
87c2a5bc97 Merge pull request #724 from suoning/doc-logger
update logging doc
2017-05-16 21:10:57 -07:00
Eli Uriegas
826c2e0f4e Merge pull request #725 from 38elements/contributing
Add rule in CONTRIBUTING.md
2017-05-15 14:45:57 -07:00
Eli Uriegas
b5e25e13b7 Merge pull request #727 from argaen/update_aiocache_example
Update aiocache example to latest version
2017-05-15 11:39:02 -07:00
argaen
f9653114d1 Update aiocache example to latest version 2017-05-15 20:30:52 +02:00
Eli Uriegas
6b7e19891b Get rid of un-needed s, Fix some formatting. 2017-05-15 10:54:47 -07:00
38elements
a677f14423 Add rule in CONTRIBUTING.md 2017-05-15 21:28:35 +09:00
suoning
dddce3f30d update logging, Remove the comments 2017-05-15 13:59:03 +08:00
Raphael Deem
be93d670a3 Merge pull request #717 from jrocketfingers/fix/ipv6-access-log
Fix "TypeError: not all arguments converted during string formatting"
2017-05-14 20:28:07 -07:00
suoning
68d4bb6ffe update logging doc 2017-05-15 10:54:30 +08:00
suoning
a27471178a update logger doc 2017-05-15 10:25:19 +08:00
messense
66fcb0cc8f Add py3*-no-ext test env 2017-05-15 10:10:50 +08:00
messense
05d0ddc281 Gunicorn worker should not require uvloop 2017-05-15 00:01:51 +08:00
Johnny Rocketfingers
b1890f50b6 Conform to pep8 2017-05-14 10:15:11 +02:00
Johnny Rocketfingers
b44c707e94 Prevent incorrect tuple size on get_extra_info errors
According to https://docs.python.org/3/library/asyncio-protocol.html#asyncio.BaseTransport.get_extra_info,
get_extra_info fails by returning None. This is an attempt at
normalizing the return value in the cases of AF_INET, AF_INET6, and
erroneous return values.
2017-05-14 09:56:56 +02:00
Johnny
4c7675939a Fix "TypeError: not all arguments converted during string formatting"
socket.getpeername() returns an AF_INET6 address-family four-tuple that
also carries flowinfo and scope_id.

In the server's write_response, an exception is raised when an IPv6 client
connects because the four-tuple has two unused elements (flowinfo and
scope_id) that the log string formatting does not expect.

This makes sure that only the first two (host and port) are used in log
string formatting.
2017-05-13 17:35:04 +02:00
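The gist of the two commits above, as a standalone sketch rather than the actual server code: only the first two elements of the peername are used, and a `None` return from `get_extra_info` is tolerated.

```python
def client_address(transport):
    """Return (host, port) for logging, regardless of address family.

    getpeername() gives a 2-tuple for AF_INET but a 4-tuple
    (host, port, flowinfo, scope_id) for AF_INET6, and
    get_extra_info() returns None when the info is unavailable.
    """
    peername = transport.get_extra_info("peername")
    if peername is None:
        return "unknown", 0
    host, port = peername[:2]  # drop flowinfo/scope_id for IPv6 clients
    return host, port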
Eli Uriegas
fa1b7de52a Merge pull request #706 from messense/feature/remove-log-file
Remove timedRotatingFile log config
2017-05-12 10:56:19 -07:00
Eli Uriegas
666f8c8d3c Merge pull request #712 from stopspazzing/master
Fixed plotly_example, now works
2017-05-12 10:39:57 -07:00
Jeremy Zimmerman
996c0b3280 Fixed with a working example
Remember:
K.I.S.S
2017-05-11 13:40:16 -07:00
Eli Uriegas
f9d428de8b Merge pull request #711 from stopspazzing/master
Use of register_blueprint will be deprecated, why not upgrade?
2017-05-11 12:53:43 -07:00
Jeremy Zimmerman
a17b3f1b84 Use of register_blueprint will be deprecated, why not upgrade? 2017-05-11 12:33:57 -07:00
Jeremy Zimmerman
f39512aa63 double if statement (#707)
* Migrated `%` string formatting

* double if statement

combined double 'if' to a single 'if' with 'and'

* Revert "Fix "Prefer `format()` over string interpolation operator" issue"
2017-05-11 11:49:32 -07:00
Raphael Deem
23ee9f64d4 Merge pull request #710 from monobot/asyncorm_update
modify the asyncorm example, with the new lazy querysets
2017-05-11 10:35:52 -07:00
monobot
ad68739df7 modify the asyncorm example, for the new lazy querysets 2017-05-11 18:15:04 +01:00
38elements
a50d8421b8 Add heading in streaming.md 2017-05-11 19:18:58 +09:00
messense
3ea1b07906 Revert "Add access.log and error.log to .gitignore"
This reverts commit fc0d69616c.
2017-05-11 11:19:03 +08:00
messense
c3683662c2 Remove timedRotatingFile log config 2017-05-11 11:18:59 +08:00
Eli Uriegas
861865807a Merge pull request #705 from stopspazzing/patch-2
spelling mistake
2017-05-10 10:52:29 -07:00
Jeremy Zimmerman
18930082e2 spelling mistake
fixed incorrect spelling
2017-05-10 09:38:57 -07:00
38elements
6a14e49479 Replace stream decorator to stream parameter 2017-05-09 22:31:15 +09:00
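After the change above, request streaming is requested with a route parameter rather than a decorator. A rough sketch of a streaming handler under that API, assuming body chunks are read from `request.stream` until `None` signals the end:

```python
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/upload", methods=["POST"], stream=True)
async def upload(request):
    result = ""
    while True:
        body = await request.stream.get()
        if body is None:  # None marks the end of the request body
            break
        result += body.decode("utf-8")
    return text(result)
```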
Raphael Deem
c530d5f016 Merge pull request #704 from seemethere/fix_camel_case
Fix camel case module
2017-05-08 20:50:39 -07:00
Eli Uriegas
ac5e1a6ebd Fix import 2017-05-08 20:47:20 -07:00
Eli Uriegas
bb6de53f28 Fix docs 2017-05-08 20:44:19 -07:00
Eli Uriegas
bfcd499cc2 Remove default_filter module, put into logging 2017-05-08 20:41:34 -07:00
Eli Uriegas
3e88ec18e2 Actually add file >.> 2017-05-08 20:37:44 -07:00
Eli Uriegas
1fb640c313 Fix camel case module 2017-05-08 20:36:57 -07:00
Eli Uriegas
bece3d2bcf Merge pull request #702 from seemethere/increment_054
Increment to 0.5.4
2017-05-08 17:44:50 -07:00
Eli Uriegas
307d866bb6 Increment to 0.5.4 2017-05-08 17:44:23 -07:00
Eli Uriegas
95c9514a44 Merge pull request #701 from seemethere/increment_053
Increment to 0.5.3
2017-05-08 17:42:54 -07:00
Eli Uriegas
768be433d6 Increment to 0.5.3 2017-05-08 17:42:26 -07:00
38elements
4d4f38fb35 is_request_stream for CompositionView and HTTPMethodView 2017-05-09 01:04:03 +09:00
38elements
15ad07f03d Fix streaming.md 2017-05-08 00:10:36 +09:00
38elements
0b53c413a7 Add stream decorator for HTTPMethodView 2017-05-07 21:33:15 +09:00
38elements
931397c7e1 Add stream for CompositionView 2017-05-07 18:38:48 +09:00
38elements
ef2cc7ebf5 Add Request.stream 2017-05-07 18:38:48 +09:00
Eli Uriegas
8d3fd75ec2 Merge pull request #695 from kszucs/aiopeewee
aiopeewee example
2017-05-05 11:31:37 -07:00
Szucs Krisztian
a42b254c33 use async version of model_to_dict 2017-05-05 10:51:12 +02:00
Szucs Krisztian
d24e1ae110 removed lines from distributed example 2017-05-05 08:25:18 +02:00
Szucs Krisztian
2e7badab4e aiopeewee example 2017-05-05 08:19:09 +02:00
Raphael Deem
7cf3d49f00 Merge pull request #690 from 38elements/sanic_endpoint_test
Remove utils.py
2017-05-03 23:56:13 -07:00
38elements
25037006bf Remove utils.py 2017-05-04 15:52:18 +09:00
Eli Uriegas
f611eb2c2b Merge pull request #686 from 38elements/loop
Remove loop argument in run() and create_server()
2017-05-03 15:27:16 -07:00
38elements
e12c10b087 Remove loop argument in run() and create_server() 2017-05-03 17:52:19 +09:00
Raphael Deem
9527e5ded8 Merge pull request #685 from r0fls/document-create-server
document create_server method
2017-05-02 22:45:51 -07:00
Raphael Deem
23b4b20b4f document create_server method 2017-05-02 22:44:42 -07:00
Eli Uriegas
7f9ecd659c Merge pull request #682 from graingert/fix-readme.rst-encoding
fix README.rst -> long_description encoding
2017-05-02 10:48:45 -07:00
Eli Uriegas
bb31d465f2 Add distribution types 2017-05-02 10:47:55 -07:00
Thomas Grainger
834468e8e7 fix README.rst -> long_description encoding 2017-05-02 17:37:17 +01:00
Eli Uriegas
4720513672 Merge pull request #679 from graingert/verify-readme-rst-pypi
verify readme for PyPI
2017-05-02 09:31:01 -07:00
Eli Uriegas
a480110d43 Merge pull request #681 from 38elements/deprecation
Remove before_start, before_stop, after_start and after_stop
2017-05-02 09:28:33 -07:00
38elements
0a2c95cc10 Remove before_start, before_stop, after_start and after_stop 2017-05-02 23:07:09 +09:00
Thomas Grainger
9d2e32902d check readme in travis 2017-05-02 10:17:24 +01:00
Thomas Grainger
77b6413526 validate readme for PyPI 2017-05-02 10:05:06 +01:00
Thomas Grainger
9e502099e0 add readme to package directly 2017-05-02 10:05:05 +01:00
Eli Uriegas
c6a7e44ae7 Merge pull request #678 from stopspazzing/patch-1
misspelling
2017-05-01 17:27:31 -07:00
Jeremy Zimmerman
1bf06312b8 misspelling
acutal -> actual
2017-05-01 17:07:38 -07:00
Eli Uriegas
c35721abbd Merge pull request #670 from ticosax/set-exit-code-on-error
In case of error when starting sanic
2017-05-01 15:16:02 -07:00
Eli Uriegas
7f3c417078 Merge pull request #677 from abuckenheimer/master
added exception chain rendering in debug #675
2017-05-01 15:15:34 -07:00
Alec Buckenheimer
69511c2783 added exception chain rendering in debug #675 2017-05-01 12:56:33 -04:00
Eli Uriegas
158a94d34c Merge pull request #674 from 38elements/test-for-uri-template
Add test for uri_template
2017-04-30 20:22:51 -07:00
Eli Uriegas
db58bd68f5 Merge pull request #673 from 38elements/routing
Fix docs/sanic/routing.md
2017-04-30 20:22:30 -07:00
38elements
e65f08a2c8 Fix docs/sanic/routing.md 2017-04-30 22:03:16 +09:00
38elements
ab8f616385 Add test for uri_template 2017-04-30 21:57:32 +09:00
Eli Uriegas
6dc6f9bbb5 Merge pull request #671 from banteg/uri-template
Expose matched request uri template
2017-04-28 14:22:59 -07:00
banteg
7754bb995b expose matched request uri template 2017-04-29 02:39:56 +07:00
Nicolas Delaby
d1fefce61c fixup! In case of error when starting sanic 2017-04-28 20:06:44 +02:00
Nicolas Delaby
c3abdab9c4 The main process that spawns subprocesses doesn't run any loop,
so let's not try to stop one
2017-04-28 19:57:49 +02:00
Nicolas Delaby
8b13e103fd In case of error when starting sanic
Don't exit with code 0
2017-04-28 19:08:51 +02:00
Eli Uriegas
5f94f65f4f Merge pull request #669 from 38elements/log
Add access.log and error.log to .gitignore
2017-04-28 07:40:42 -07:00
38elements
fc0d69616c Add access.log and error.log to .gitignore 2017-04-28 18:52:15 +09:00
Raphael Deem
656f5b93d6 Merge pull request #668 from seemethere/stop_workers_on_sigint
Add the killing of children
2017-04-27 22:39:48 -07:00
Eli Uriegas
436d37c079 Add the killing of children
Kills child processes when the parent process receives a signal to
shut down.

Solves for #594
2017-04-27 17:47:08 -07:00
Eli Uriegas
ed0081fcf7 Merge pull request #625 from zenixls2/master
based on issue #608, create access log
2017-04-27 14:31:53 -07:00
Eli Uriegas
140062f8a3 Merge pull request #662 from rsrdesarrollo/master
invariant: body after request is processed must be binary
2017-04-27 14:31:28 -07:00
Raphael Deem
75f5fa7c06 Merge pull request #666 from pyx/issue-665
Fix #665 - ImportError not preserved in __main__.py
2017-04-27 00:32:09 -07:00
Philip Xu
9152a1a266 Improved on wording 2017-04-27 03:16:38 -04:00
Philip Xu
ade89ab795 Fix #665 - ImportError not preserved in __main__.py 2017-04-26 22:57:19 -04:00
zenix
95cfdee8b8 Merge branch 'master' of https://github.com/channelcat/sanic 2017-04-26 14:50:42 +09:00
zenix
63a27cc5e2 add document on logging 2017-04-26 14:50:21 +09:00
Eli Uriegas
472face796 Add link to issue tracking sanic projects! 2017-04-25 21:50:49 -07:00
Raphael Deem
8d537a6d0b Merge pull request #663 from mmaybeno/fix_jinja_example_typo
Fix typo for jinja example and converted to dir
2017-04-25 20:29:12 -07:00
Matt Maybeno
b3101d339e Fix typo for jinja example and converted to dir 2017-04-25 20:10:46 -07:00
Raúl Sampedro
85acddddba invariant: body after request is processed must be binary 2017-04-25 12:00:27 +02:00
zenix
c9d747d97f fix merge error 2017-04-25 11:46:13 +09:00
zenix
0bba267808 Merge branch 'master' of https://github.com/channelcat/sanic 2017-04-25 11:07:40 +09:00
Eli Uriegas
1036242064 Merge pull request #661 from seemethere/increment_052
Increment to 0.5.2
2017-04-24 13:39:39 -05:00
Eli Uriegas
5fd62098bd Increment to 0.5.2 2017-04-24 11:37:04 -07:00
Raphael Deem
74cc7be922 Merge branch 'master' into master 2017-04-24 00:47:01 -07:00
Raphael Deem
b3814ca89a Merge pull request #646 from r0fls/637
allow disabling keep alive
2017-04-24 00:43:16 -07:00
Raphael Deem
b75a321e4a Merge pull request #644 from kszucs/master
Example to use dask distributed
2017-04-22 01:14:34 -07:00
Raphael Deem
9caa4fec4a Merge pull request #656 from r0fls/token-rework
update token attribute
2017-04-21 22:44:42 -07:00
Raphael Deem
a0cba1aee1 accept token directly in auth header 2017-04-21 22:36:45 -07:00
Raphael Deem
97018ad62f Merge pull request #655 from seemethere/fix_gunicorn_worker
Fix duplicate signal settings for gunicorn worker
2017-04-21 22:16:43 -07:00
Eli Uriegas
a7d17fae44 Fix duplicate signal settings for gunicorn worker 2017-04-21 17:06:52 -05:00
Raphael Deem
6ce0050979 Merge pull request #652 from 38elements/signal-stopped
Fix `this.signal.stopped` is `True` #639
2017-04-20 13:06:28 -07:00
38elements
bc035fca78 Remove unnecessary variables 2017-04-20 18:27:28 +09:00
38elements
df914a92e4 Fix this.signal.stopped is True #639 2017-04-19 11:19:01 +09:00
Szucs Krisztian
1b939a6823 work with distributed 1.16.1 2017-04-17 11:05:19 +02:00
Raphael Deem
81b6d988ec NO_KEEP_ALIVE -> KEEP_ALIVE 2017-04-16 22:43:49 -07:00
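Together with the "allow disabling keep alive" change further down this list, the end result is a single boolean config flag; a minimal sketch using the key name from the commit above:

```python
from sanic import Sanic

app = Sanic(__name__)

# After the rename there is a single flag (assumed default: True).
app.config.KEEP_ALIVE = False
```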
Raphael Deem
7e9b65feca Merge pull request #647 from r0fls/645
use absolute path in static root
2017-04-16 22:06:40 -07:00
Raphael Deem
6f098b3d21 add no_keep_alive setting to docs 2017-04-16 22:05:34 -07:00
Raphael Deem
5ddb0488f2 allow disabling keep alive 2017-04-16 22:03:20 -07:00
Raphael Deem
3e87314adf use absolute path in static root 2017-04-16 21:58:10 -07:00
Raphael Deem
f6d4a06661 Merge pull request #643 from aryeh/allow_unknown_status_codes
Allow unknown status codes
2017-04-16 18:42:59 -07:00
Raphael Deem
ff17fc95e6 Merge pull request #632 from messense/feature/path-route
Add path type for router
2017-04-16 18:39:10 -07:00
Eli Uriegas
c5a46f1cea Merge pull request #638 from seemethere/add_contributing_rules
Add new contributing rules
2017-04-16 13:15:01 -05:00
Eli Uriegas
0b072189c4 Merge pull request #641 from TomIsPrettyCool/master
Use render_async and a template env with the Jinja2 example.
2017-04-16 13:14:35 -05:00
Szucs Krisztian
5b22d1486a fix syntax error in comment 2017-04-16 18:13:00 +02:00
Szucs Krisztian
9eb48c2b0d dask distributed example 2017-04-16 18:11:24 +02:00
aryeh
ff0632001c prevent crash for unknown response codes
set text for unknown status codes; otherwise an exception occurs when None is used.
2017-04-16 10:07:29 -04:00
aryeh
28bd09a2ea Merge pull request #1 from channelcat/master
pull master
2017-04-16 09:48:59 -04:00
Tom Haines
c6aaa9b09c Use render_async and a template env with jinja2 2017-04-16 13:50:07 +01:00
Eli Uriegas
20d9ec1fd2 Fix the numbered list 2017-04-14 14:38:45 -05:00
Eli Uriegas
2c45c2d3c0 Add new contributing rules 2017-04-14 14:35:28 -05:00
Eli Uriegas
18829e648a Merge pull request #635 from yeahx/master
fix directory traversal flaw
2017-04-14 14:01:29 -05:00
Eli Uriegas
a64c636a33 Merge pull request #627 from Sniedes722/master
Updating examples for 0.5.0
2017-04-14 13:21:43 -05:00
Shawn Niederriter
5796f211c1 Added detailed plotly example project 2017-04-14 17:17:23 +00:00
lazydog
ae09dec05e fixed UnboundLocalError 2017-04-14 03:38:55 +08:00
lazydog
afd51e0823 fix directory traversal flaw 2017-04-14 02:55:39 +08:00
zenix
0bbf826b21 fix typo 2017-04-13 16:52:40 +09:00
zenix
02d1900e2f try to fix container error 2017-04-13 16:49:36 +09:00
zenix
73da11b04c switch to use streaming for access log and error log 2017-04-13 16:26:13 +09:00
zenix
4af07e3731 change naming of default log config 2017-04-13 13:49:45 +09:00
zenix
7f60f85cd4 Merge branch 'master' of https://github.com/channelcat/sanic 2017-04-13 13:35:37 +09:00
messense
4c66cb1854 Fix static files router 2017-04-13 12:11:38 +08:00
messense
35b92e1511 Add path type for router 2017-04-13 11:34:35 +08:00
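A sketch of the new parameter type added above, which (unlike the default string type) is assumed to match the remainder of the URL including slashes:

```python
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/files/<rel_path:path>")
async def files(request, rel_path):
    # /files/a/b/c.txt -> rel_path == "a/b/c.txt"
    return text("requested: " + rel_path)
```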
Raphael Deem
e5d3fe52c5 Merge pull request #630 from 38elements/keep_alive
Refactor keep_alive
2017-04-12 19:46:33 -07:00
Raphael Deem
63fe7c0a86 Merge pull request #631 from adamserafini/fix-installation
Fix installation on Ubuntu 16.10 #629
2017-04-12 18:53:23 -07:00
zenix
c5f137c715 fix original code logic 2017-04-12 18:52:01 +09:00
zenix
66923bc0e3 remove unused param 2017-04-12 18:48:16 +09:00
zenix
8bf7b5a323 remove unused dependency 2017-04-12 18:45:38 +09:00
zenix
36d4d85849 change to use default python config code 2017-04-12 18:44:47 +09:00
zenix
5f0e05f3bf fix flake8 2017-04-12 18:08:06 +09:00
adam.serafini
235e5511eb Bump version 2017-04-12 11:02:13 +02:00
zenix
6fb60ae0b1 Merge branch 'master' of https://github.com/channelcat/sanic 2017-04-12 18:01:12 +09:00
adam.serafini
6b2883074b Fix installation on Ubuntu 16.10
Fixes issue #629. Printing a unicode string at the end of the
setup.py script is asking for trouble. It's also redundant:
the pip tool itself tells the user whether the installation was
successful or not.
2017-04-12 10:59:03 +02:00
38elements
7fe418d1b7 Refactor keep_alive 2017-04-12 17:55:22 +09:00
zenix
f872ceb0d9 fix bug in access logging when error happens 2017-04-12 17:39:17 +09:00
Shawn Niederriter
0f10a36b40 Added url_for example 2017-04-12 06:20:35 +00:00
Shawn Niederriter
3c45c9170f Fixed to merge with #626 2017-04-11 21:55:45 +00:00
Shawn Niederriter
a0730aeb44 Merge https://github.com/channelcat/sanic
Version 0.0.5
2017-04-11 21:47:08 +00:00
Eli Uriegas
e5fdc7fdd0 Merge pull request #626 from seemethere/increment_050
Increment to 0.5.0
2017-04-11 16:05:26 -05:00
Eli Uriegas
015c87b5e1 Add traceback for better debugging 2017-04-11 16:02:57 -05:00
Eli Uriegas
d20a49e500 Lock chardet for now... 2017-04-11 16:02:49 -05:00
Shawn Niederriter
adb7331670 Updated examples for 0.5.0 2017-04-11 20:34:55 +00:00
Eli Uriegas
084f0d27a3 Increment to 0.5.0 2017-04-11 15:19:00 -05:00
Eli Uriegas
522a0beec0 Merge pull request #622 from aryeh/provide_request_object
Allow a custom Request class to be passed in to Sanic
2017-04-11 15:13:05 -05:00
zenix
bf46bcf376 Merge branch 'logging' 2017-04-11 19:03:35 +09:00
zenix
f330c3f8c5 add logging based on issue #608, add default config 2017-04-11 18:59:07 +09:00
Raphael Deem
77a51c1e05 Merge pull request #617 from Sniedes722/master
Update Examples for 0.4.2
2017-04-10 19:07:29 -07:00
Raphael Deem
144f215705 Merge pull request #623 from qwIvan/master
Fixed #615
2017-04-10 18:58:31 -07:00
ivan
51b01b6b44 Merge branch 'master' of github.com:qwIvan/sanic 2017-04-10 18:33:43 +08:00
ivan
09885534c6 fixed #615 2017-04-10 18:31:28 +08:00
aryeh
b9dfec38c2 Break long line (> 80 chars) into 2 lines 2017-04-09 13:38:36 -04:00
aryeh
2ef8120073 Allow a custom Request class to be passed in to Sanic
Allowing a custom Request class to be defined would enable either a different Request class or a subclass of Request to be used, providing more flexibility.
2017-04-09 13:29:21 -04:00
Raphael Deem
52ff2e0e63 Merge pull request #621 from r0fls/620
fix python -m method of running
2017-04-08 13:32:17 -07:00
Raphael Deem
8cf7dce33f fix python -m method of running 2017-04-08 13:31:17 -07:00
Shawn Niederriter
9d3bb4a37a Updated examples in-line with response docs 2017-04-06 19:47:25 +00:00
Shawn Niederriter
c30437448b Updated aiohttp & run_async examples, added redirect 2017-04-06 19:42:05 +00:00
Raphael Deem
7e3496f8aa Merge pull request #614 from dkruchinin/middleware
Response middleware should be called even if server replies with an error
2017-04-06 11:49:44 -07:00
Shawn Niederriter
46ac79f4dc Update run_async demo 2017-04-06 14:39:54 -04:00
Shawn Niederriter
833b14e353 Updated aiohttp example. 2017-04-06 13:33:29 -04:00
Eli Uriegas
e9eca25792 Merge pull request #616 from r0fls/610
use socket.set_inheritable instead of os version
2017-04-04 15:17:31 -05:00
Raphael Deem
1854ad133c use socket.set_inheritable instead of os version 2017-04-04 13:13:52 -07:00
Eli Uriegas
2b5e723ea5 Merge pull request #519 from youknowone/recall-379
Add #379 again and make related test rework
2017-04-04 15:08:57 -05:00
Raphael Deem
9a18906edd Merge pull request #612 from dkruchinin/prom
Add prometheus extension
2017-04-04 11:06:43 -07:00
Raphael Deem
93cb7582c2 Merge branch 'master' into prom 2017-04-04 11:06:32 -07:00
Raphael Deem
b4529639f6 Merge pull request #611 from ashleysommer/master
Add Sanic-RestPlus to extensions.
2017-04-04 11:06:02 -07:00
Dan Kruchinin
f0a59fccf8 flake8-related fixes 2017-04-04 17:19:45 +01:00
Dan Kruchinin
46dbaf95a6 Response middleware should be called even if server replies with error 2017-04-04 15:55:43 +01:00
Dan Kruchinin
d418b03708 Add prometheus extension 2017-04-04 14:22:31 +01:00
Ashley Sommer
765e90ecfa update extensions
Add Sanic-RestPlus!
2017-04-04 13:40:59 +10:00
Ashley Sommer
ff1e88dde6 Merge pull request #2 from channelcat/master
update to latest sanic
2017-04-04 13:39:21 +10:00
Eli Uriegas
62ebcba647 Add graphql integration extension
Closes #579
2017-04-03 14:45:18 -05:00
Jeong YunWon
429e90183b Add #379 again and make related test rework
Original PR: https://github.com/channelcat/sanic/pull/379
2017-04-03 18:56:39 +09:00
Raphael Deem
875790e862 Merge pull request #606 from monobot/master
Fix URL Parse Error #599
2017-04-02 12:16:24 -07:00
monobot
25edbe6805 update docs 2017-04-02 02:28:16 +01:00
monobot
e148b50d6a Merge remote-tracking branch 'upstream/master' 2017-04-02 00:02:49 +01:00
Raphael Deem
06d46d56cd Merge pull request #602 from jkbbwr/fix/env-install
Flake8 cleanup. Setup environmental variables.
2017-03-31 10:42:21 -07:00
Jakob Bowyer
edd8770c67 Restored tests to upstream/master 2017-03-31 08:53:46 +01:00
Jakob Bowyer
daedda8547 Checked out original tests 2017-03-31 08:51:12 +01:00
Raphael Deem
df9d897e75 Merge pull request #607 from nosahama/hotfix/docs-cookie-typo-fix
Typo Fix in docs/sanic/cookies.md
2017-03-30 16:24:58 -07:00
nosaevb
fcd8e5e5ad Typo Fix in docs/sanic/cookies.md 2017-03-30 23:02:46 +01:00
monobot
6c003f71f4 Merge remote-tracking branch 'upstream/master' 2017-03-29 23:54:11 +01:00
monobot
5b704478d9 raw_args for request objects 2017-03-29 22:06:54 +01:00
Eli Uriegas
60eb528d68 Merge pull request #491 from r0fls/remove-stop-event
remove stop_event
2017-03-29 07:08:55 -05:00
Jakob Bowyer
1cf730d957 Added usage documentation for optional installs 2017-03-29 10:12:24 +01:00
Raphael Deem
171110b445 Merge pull request #604 from seemethere/add_docker_unittest_support
Fixing the unittests
2017-03-29 02:03:54 -07:00
Jakob Bowyer
22699db855 Moved skips to separate pull request 2017-03-29 09:16:53 +01:00
Eli Uriegas
18405b3908 There was a line missing here? 2017-03-28 22:57:58 -05:00
Eli Uriegas
f0a55b5cbb Fix line length again... 2017-03-28 22:51:23 -05:00
Eli Uriegas
04a0774ee5 Fix line length 2017-03-28 22:51:23 -05:00
Eli Uriegas
3a8cfb1f45 Make these tests not so far apart 2017-03-28 22:51:23 -05:00
Eli Uriegas
dcc19d17d4 Lock to aiohttp 1.3.5 for now 2017-03-28 22:51:23 -05:00
Eli Uriegas
1ef69adc6f Simplify this as well, it replicated effort 2017-03-28 22:51:23 -05:00
Eli Uriegas
75a4df0f32 Simplify this, it had a lot of fluff 2017-03-28 22:51:23 -05:00
Eli Uriegas
8ba1b5fc35 Add docker support for local unit testing
Addresses consistency across different OS's by making it very similar to
the base Travis image.
2017-03-28 22:51:23 -05:00
Eli Uriegas
a09471ac6c Merge pull request #600 from monobot/master
added asyncorm example
2017-03-28 22:50:20 -05:00
Eli Uriegas
a916eea684 Merge pull request #601 from SakuraSound/master
Detailed example with logging, database access, environment variables, and basic middleware
2017-03-28 22:49:26 -05:00
Eli Uriegas
511998d8e1 Merge pull request #573 from r0fls/env-config
allow setting config from individual env variables
2017-03-28 22:24:37 -05:00
Joir-dan Gumbs
e3cf50f791 Changed out redis middleware for redis listeners (open/close). Fleshed out the payloads of both endpoints. Added comment about required packages. 2017-03-28 15:00:23 -07:00
Jakob Bowyer
42ba5298a7 Flake8 cleanup. Setup environmental variables.
Skipping broken tests unrelated.
2017-03-28 10:50:09 +01:00
Joir-dan Gumbs
ee79750a22 Cleaned up functions. Added extra middleware function to log endpoint being called. Added documentation to make easier to understand. 2017-03-28 01:22:36 -07:00
Raphael Deem
1787f8617f Merge pull request #592 from weargoggles/patch-1
Document synchronous response.write in streaming
2017-03-27 20:14:20 -07:00
Joir-dan Gumbs
748ca28185 Created detailed example of using sanic. Adds configurations based on various environment variables, handles database access (using aioredis), uses middleware to check for db object and attach it to request object, and logs events to a logfile (which is set using environment variables). 2017-03-27 15:42:13 -07:00
monobot
9c68d713ba added asyncorm example 2017-03-27 22:47:35 +01:00
Raphael Deem
fc69678206 Merge remote-tracking branch 'upstream/master' into remove-stop-event 2017-03-26 15:59:31 -07:00
Raphael Deem
aebd717039 fix merge conflict 2017-03-26 15:49:58 -07:00
Raphael Deem
1ddb01ac44 remove stop_event 2017-03-26 15:48:41 -07:00
Raphael Deem
3e279cd670 Merge pull request #593 from itielshwartz/master
add sanic-nginx-docker-example to extensions.md
2017-03-26 11:50:45 -07:00
Raphael Deem
724c03630a Update extensions.md 2017-03-26 11:49:15 -07:00
itiel
b00b2561e5 add sanic-nginx-docker-example to extensions.md 2017-03-26 21:16:03 +03:00
Raphael Deem
c5b50fe3cf allow setting config from individual env variables 2017-03-25 17:45:55 -07:00
Raphael Deem
df9884de3c Merge pull request #576 from matuusu/master
fix http status code not propagating in response
2017-03-24 19:23:20 -07:00
Pete Wildsmith
65ae7669f9 Document synchronous response.write in streaming
The Streaming section of the docs was updated to make clear that a synchronous write should be used in the callback, but this section was not updated.
2017-03-24 10:11:30 +00:00
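For context, a response-streaming handler of this era looks roughly like the sketch below; the point of the docs change above is that `response.write` is called synchronously inside the callback in this version of the API (route and rows are illustrative):

```python
from sanic import Sanic
from sanic.response import stream

app = Sanic(__name__)

@app.route("/csv")
async def csv_handler(request):
    async def write_rows(response):
        # Synchronous writes, matching the streaming docs at this point.
        response.write("foo,bar\r\n")
        response.write("baz,qux\r\n")

    return stream(write_rows, content_type="text/csv")
```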
Raphael Deem
179606feb1 Merge pull request #590 from r0fls/blueprint-strict-slash
add blueprint strict_slashes
2017-03-23 18:42:21 -07:00
Raphael Deem
536140340e Merge pull request #585 from subyraman/listener-docs
add docs for server lifecycle listeners and `add_task`
2017-03-23 18:38:48 -07:00
Raphael Deem
5d293df64b add blueprint strict_slashes 2017-03-23 18:37:06 -07:00
Suby Raman
6188891a53 add-listeners-docs 2017-03-23 15:49:23 -04:00
Raphael Deem
9774661cfe Merge pull request #580 from skytoup/master
Fix testing not support binary file
2017-03-23 12:22:08 -07:00
Raphael Deem
563bc34fb5 Merge pull request #578 from messense/feature/gunicorn-deploy-doc
Add documentation for Gunicorn worker
2017-03-23 12:20:00 -07:00
Raphael Deem
9c95ab3a28 Merge pull request #584 from subyraman/add-decorators-info
add decorator docs
2017-03-23 12:19:23 -07:00
Suby Raman
b776c37b36 add decorator docs 2017-03-23 15:16:46 -04:00
Eli Uriegas
5b3f92b70f Merge pull request #577 from seemethere/fix_tests_after_aiohttp_2
Hotfixes tests failing from URL object change
2017-03-23 13:56:16 -05:00
ivan
1562b81522 add arg load_body in testing 2017-03-23 20:48:57 +08:00
ivan
be1016ace6 merge upstream 2017-03-23 20:08:19 +08:00
ivan
ee27c689e1 commented aiohttp load response body in testing 2017-03-23 20:06:39 +08:00
skytoup
fdbf452ced 1. try...catch aiohttp encode response body to text in test_client
2. add tests static binary file
2017-03-23 15:22:00 +08:00
messense
3d9927dee0 Add documentation for Gunicorn worker 2017-03-23 09:12:23 +08:00
Raphael Deem
1456b128d2 Merge pull request #572 from sourcepirate/master
Removed raw string in cookies documentation
2017-03-22 14:55:20 -07:00
Eli Uriegas
166f77cb86 Merge pull request #545 from messense/feature/gunicorn-worker
Addition of a gunicorn worker
2017-03-22 16:27:06 -05:00
Eli Uriegas
5577838905 Hotfixes tests failing from URL object change
aiohttp decided to use yarl for their new URL objects so that they
aren't plain strings anymore which means that this single test fails.
Not a huge change but this should fix the testing suite.
2017-03-22 16:21:35 -05:00
matuusu
9c15982299 Update response.py
fix status code not propagating from response.stream to response.StreamingHTTPResponse
2017-03-22 12:40:40 +01:00
sourcepirate
63c24122db Removed raw string in cookies documentation 2017-03-22 06:39:23 +05:30
messense
1396ca903d Fix before_stop event 2017-03-20 14:27:02 +08:00
messense
d1fb5bdc30 Fix async before_server_start hook bug 2017-03-20 14:25:32 +08:00
messense
e27812bf3e Set signal.stopped = True on closing 2017-03-20 14:25:32 +08:00
messense
11a3cf9b99 Add signal handling 2017-03-20 14:25:32 +08:00
messense
a90d70feae Check connections is not None 2017-03-20 14:25:32 +08:00
messense
466b34735c Set app.is_running to True 2017-03-20 14:25:32 +08:00
messense
7ca9116e37 Trigger before_stop before closing server, after_stop before closing
loop
2017-03-20 14:25:31 +08:00
messense
decd3e737c Flake8 fix 2017-03-20 14:25:31 +08:00
messense
f35442ad1b Fix RuntimeError: this event loop is already running 2017-03-20 14:25:31 +08:00
messense
2b296435b3 Trigger events 2017-03-20 14:25:31 +08:00
messense
19ee1dfecc Gunicorn worker 2017-03-20 14:25:31 +08:00
Raphael Deem
7da4596ef8 Merge pull request #567 from Sniedes722/master
Added Sanic-OAuth to extensions.
2017-03-19 14:20:19 -07:00
Shawn Niederriter
a379ef6781 Added Sanic-OAuth to extensions. 2017-03-18 23:56:11 -04:00
Raphael Deem
7beb065be3 Merge pull request #562 from lixxu/master
update function name as it does not actually halt the request
2017-03-18 13:17:24 -07:00
Raphael Deem
38b9091513 Merge pull request #443 from sourcepirate/master
Fixed sanic peewee example.
2017-03-17 13:12:39 -07:00
Raphael Deem
96db3c9601 Merge pull request #564 from messense/feature/aioredis-example
Add an aioredis example
2017-03-17 13:11:19 -07:00
Raphael Deem
43c4fc8e33 Merge pull request #559 from r0fls/557
accept strict_slash routes
2017-03-17 13:10:26 -07:00
messense
986ff101ce Add an aioredis example 2017-03-17 14:16:13 +08:00
lixxu
94c83c445f fix broken table 2017-03-17 14:01:54 +08:00
lixxu
625865412f update function name as it does not actually halt the request 2017-03-17 13:12:17 +08:00
Raphael Deem
46677e69ce accept strict_slash routes 2017-03-16 11:46:07 -07:00
Eli Uriegas
5fbca5b823 Merge pull request #561 from kdelwat/master
Fix ReadTheDocs build errors
2017-03-16 09:02:39 -05:00
Eli Uriegas
879fab120f Merge pull request #560 from miguelgrinberg/cancel-websocket-tasks
cancel websocket tasks if server is stopped
2017-03-16 09:02:30 -05:00
Cadel Watson
391b24bc17 Add websockets dependency to ReadTheDocs environment 2017-03-16 17:01:49 +11:00
Cadel Watson
d713533d26 Fix docstring formatting errors 2017-03-16 16:52:18 +11:00
Cadel Watson
24f745a334 Fix formatting errors in RST files 2017-03-16 16:51:02 +11:00
Cadel Watson
86f3101861 Add autodoc extension to Sphinx configuration 2017-03-16 16:50:33 +11:00
Miguel Grinberg
fd823c63ab cancel websocket tasks if server is stopped 2017-03-15 22:45:19 -07:00
Raphael Deem
fa69892f70 Merge pull request #558 from Zheaoli/master
Add a new example by using aiomysql
2017-03-15 20:05:37 -07:00
lizheao
cfc53d0d26 Change some code in sanic aiomysql code 2017-03-15 14:42:22 +08:00
lizheao
97c2056e4a Change some code in sanic aiomysql code 2017-03-15 14:41:54 +08:00
lizheao
0ad0164171 Change some code in sanic aiomysql code 2017-03-15 14:41:38 +08:00
lizheao
df0e285b6f Add a new example 2017-03-15 14:12:37 +08:00
Raphael Deem
e92f1b8c28 Merge pull request #556 from AntonDnepr/fix-414
Small changes to the docs and tests for the #414
2017-03-14 21:07:55 -07:00
Anton Zhyrnyi
410f86c960 fix for docs&tests 2017-03-14 20:53:58 +02:00
Raphael Deem
85f27320e7 Merge pull request #553 from ashleysommer/ashleysommer-add-dispatch-extension
Add Dispatcher Extension to Extensions page in Docs
2017-03-13 17:00:46 -07:00
Raphael Deem
9a3fac90e1 Merge pull request #551 from messense/feature/documentation-links
Add hyperlinks in response documentation
2017-03-13 17:00:00 -07:00
Raphael Deem
6984f6eec4 Merge pull request #549 from ai0/websocket-scheme
add websocket scheme in request
2017-03-13 16:59:47 -07:00
Ashley Sommer
d05f502fc8 Add Dispatcher Extension
Adds a link to the extension
https://github.com/ashleysommer/sanic-dispatcher
And a very short description
2017-03-14 08:37:53 +10:00
Jing Su
ba41ab8f67 fix typo 2017-03-13 18:36:22 +08:00
Jing Su
250bb7e29d add websocket secure scheme in request @messense 2017-03-13 18:34:43 +08:00
messense
48a26fd5df Add hyperlinks in response documentation 2017-03-13 16:20:12 +08:00
Jing Su
3af26540ec add websocket scheme in request 2017-03-13 13:28:35 +08:00
Raphael Deem
7d9de068d9 Merge pull request #544 from messense/feature/document-response
Add response documentation
2017-03-11 22:49:53 -08:00
messense
d174917a07 Add response documentation 2017-03-12 14:31:51 +08:00
Eli Uriegas
af398fc4c4 Merge pull request #543 from r0fls/windows-setup
windows setup
2017-03-11 21:57:15 -08:00
Raphael Deem
878ef446a2 refactor redundant print logic 2017-03-11 21:54:07 -08:00
Raphael Deem
668f6477bb fix spacing 2017-03-11 21:46:31 -08:00
Raphael Deem
01a770cbca windows setup 2017-03-11 19:32:38 -08:00
Raphael Deem
23a1174aa2 Merge pull request #542 from nszceta/master
Faster asyncpg queries via a global connection pool
2017-03-11 16:07:16 -08:00
Raphael Deem
414020e75b Merge pull request #541 from r0fls/speedup
remove default host attribute in router
2017-03-11 16:07:05 -08:00
Adam Gradzki
ed74bccad6 Faster asyncpg queries via a global connection pool
In my benchmarks I was able to obtain a 17% performance
improvement over the current asyncpg demo code with a shared
connection pool.

Resolves: #540
See also: #531
2017-03-11 17:55:57 -06:00
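A sketch of the shared-pool pattern the commit above describes, with a placeholder DSN and table: the pool is created once per worker in a `before_server_start` listener instead of opening a connection per request.

```python
import asyncpg
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.listener("before_server_start")
async def setup_pool(app, loop):
    # Placeholder DSN; one pool shared by every request in this worker.
    app.pool = await asyncpg.create_pool(
        dsn="postgres://user:secret@localhost/example"
    )

@app.route("/users")
async def users(request):
    async with app.pool.acquire() as conn:
        rows = await conn.fetch("SELECT id, name FROM users")
    return json([dict(r) for r in rows])
```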
Raphael Deem
0eedde445c remove default host attribute in router 2017-03-11 15:39:48 -08:00
Raphael Deem
88bf78213f Merge pull request #512 from subyraman/fix-url-building
Fix `request.url` and other url properties
2017-03-10 00:38:16 -08:00
Raphael Deem
d342461a51 Merge pull request #535 from ai0/master
Update blueprints example
2017-03-10 00:36:02 -08:00
Raphael Deem
dffaaf8751 Merge pull request #533 from 38elements/patch-1
Fix bail_out()
2017-03-10 00:35:54 -08:00
Raphael Deem
313edadf47 Merge pull request #528 from r0fls/523
allow running with SSL via commandline
2017-03-10 00:35:46 -08:00
Raphael Deem
c9ce33dfe6 Merge pull request #524 from r0fls/exception-list
allow exceptions to be a list
2017-03-10 00:35:38 -08:00
Raphael Deem
0f50ac7205 Merge pull request #517 from r0fls/empty-json
return valid json in request.json
2017-03-10 00:35:26 -08:00
Raphael Deem
e807c08275 Merge pull request #536 from lixxu/master
add sanic-babel extension
2017-03-09 12:28:22 -08:00
Lix Xu
893977365c Update extensions.md
add babel extension
2017-03-09 23:58:02 +08:00
Jing Su
0cac45809f add blueprints websocket example 2017-03-09 18:33:34 +08:00
Jing Su
489ca3c207 use blueprint method instead of deprecated register_blueprint 2017-03-09 18:31:19 +08:00
38elements
313535c599 Fix bail_out() 2017-03-09 13:36:01 +09:00
Raphael Deem
7f1e0557c9 Merge pull request #532 from r0fls/example-asyncpg-close
close connection in asyncpg example
2017-03-08 18:51:44 -08:00
Raphael Deem
0860f84a39 close connection in asyncpg example 2017-03-08 18:50:41 -08:00
Raphael Deem
2fe9e78b6d Merge pull request #525 from r0fls/518
rename TestClient
2017-03-08 18:24:13 -08:00
Raphael Deem
2ba30f2022 allow running with SSL via commandline 2017-03-07 19:57:10 -08:00
Raphael Deem
b3b27cab34 Merge pull request #527 from r0fls/asyncpg-example
update asyncpg example
2017-03-07 19:34:10 -08:00
Raphael Deem
694207a86d update asyncpg example 2017-03-07 19:32:26 -08:00
Raphael Deem
90138c4bae return valid json in request.json 2017-03-07 18:03:45 -08:00
Raphael Deem
58a833e987 rename TestClient 2017-03-07 17:13:23 -08:00
Raphael Deem
86c5a569d5 allow exceptions to be a list 2017-03-07 16:22:23 -08:00
Eli Uriegas
19592e8eea Merge pull request #473 from subyraman/explore-streams-v2
Add `stream` method for streaming content, add docs and examples
2017-03-05 17:51:44 -08:00
Eli Uriegas
8e6678d526 Merge pull request #469 from miguelgrinberg/websocket-support
websocket support
2017-03-05 17:42:49 -08:00
Suby Raman
e792a1e030 add host test 2017-03-03 14:51:13 -05:00
Suby Raman
f0e818a28c add host test 2017-03-03 13:32:32 -05:00
Suby Raman
69bd63b742 add docs 2017-03-03 11:59:33 -05:00
Suby Raman
b40f30f2e5 fix tests 2017-03-03 11:49:35 -05:00
Suby Raman
1fbde87ec2 initial commit 2017-03-03 11:44:50 -05:00
Eli Uriegas
f9dc34c8fa Merge pull request #510 from jamesstidard/patch-2
Sanic EnvConfig
2017-03-02 22:21:16 -06:00
James Stidard
f7186f5331 Sanic EnvConfig
Services like Heroku force your hand into using environment variables. Made this to help.
2017-03-02 21:22:07 +00:00
Eli Uriegas
6a680e4db0 Merge pull request #509 from kcy1019/master
Fix exception_monitoring example
2017-03-02 10:48:05 -06:00
고창영(Chang-young Koh)
f6b69f412f Fix exception_monitoring example 2017-03-03 00:55:08 +09:00
Eli Uriegas
5aed18862d Merge pull request #506 from zenixls2/bugfix/bind-listener
special handling when sock is provided and number of workers > 1
2017-03-02 09:08:59 -06:00
(Zenix) Han-Sheng Huang
62bf213a6e pass flake8 2017-03-02 13:42:55 +09:00
(Zenix) Han-Sheng Huang
e5c32e9b48 special handling when sock is provided and number of workers > 1 2017-03-02 13:27:52 +09:00
Raphael Deem
b87dc37fbb Merge pull request #500 from r0fls/revert-498
Revert "add reuse_port to create_server"
2017-02-28 19:29:29 -08:00
Raphael Deem
002d4cb37c Revert "add reuse_port to create_server"
This reverts commit c3386dec84.
2017-02-28 19:27:42 -08:00
Raphael Deem
ff321fc355 Merge pull request #499 from r0fls/498
add reuse_port to create_server
2017-02-28 19:24:35 -08:00
Raphael Deem
c3386dec84 add reuse_port to create_server 2017-02-28 19:21:53 -08:00
Eli Uriegas
927d2761f7 Merge pull request #496 from bohea/master
rate limiting for sanic
2017-02-28 09:03:03 -06:00
bohea
3289e8403a Update extensions.md 2017-02-28 17:34:40 +08:00
James Stidard
104a7c7d05 Added app to websocket request (#1) 2017-02-27 22:35:28 -08:00
Miguel Grinberg
7560660ec7 handle timeouts and disconnects properly 2017-02-27 22:35:28 -08:00
Miguel Grinberg
40ccb4a0dd websocket documentation 2017-02-27 22:35:28 -08:00
Miguel Grinberg
f90288f5dc websocket routes in blueprints 2017-02-27 22:35:28 -08:00
Miguel Grinberg
3bf79898d9 websocket unit test 2017-02-27 22:35:28 -08:00
Miguel Grinberg
1d6e11ca10 addressed feedback 2017-02-27 22:35:28 -08:00
Miguel Grinberg
6e903ee7d5 websocket support, using websockets package 2017-02-27 22:35:28 -08:00
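A minimal echo handler under the websocket support introduced in the commits above (route and messages are illustrative):

```python
from sanic import Sanic

app = Sanic(__name__)

@app.websocket("/feed")
async def feed(request, ws):
    # ws comes from the websockets package: recv() awaits the next
    # client message, send() pushes one back.
    while True:
        data = await ws.recv()
        await ws.send("echo: " + data)

app.run(host="0.0.0.0", port=8000)
```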
Raphael Deem
2dca53a696 remove stop_event 2017-02-26 16:37:48 -08:00
Suby Raman
d8a6d7e02f response.write should be synchronous for performance reasons 2017-02-22 10:42:16 -05:00
Suby Raman
1a8961587c more info in docs 2017-02-21 11:38:57 -05:00
Suby Raman
fa13ad8849 clean up imports 2017-02-21 11:36:45 -05:00
Suby Raman
8b23dec322 improve performance 2017-02-21 11:28:45 -05:00
Suby Raman
4e8aac4b41 rebase 2017-02-21 11:05:06 -05:00
sourcepirate
e6a828572a Changed method name from create to new 2017-02-16 11:07:49 +05:30
plasmashadow
6ea43d8e6d Fixed sanic_peewee error and added a simple interface to query and create 2017-02-16 10:58:52 +05:30
134 changed files with 9239 additions and 1576 deletions

.coveragerc

@@ -1,7 +1,7 @@
[run]
branch = True
source = sanic
omit = site-packages, sanic/utils.py
omit = site-packages, sanic/utils.py, sanic/__main__.py
[html]
directory = coverage

.gitignore (vendored, 1 line changed)

@@ -14,3 +14,4 @@ settings.py
docs/_build/
docs/_api/
build/*
.DS_Store

.travis.yml

@@ -1,10 +1,32 @@
sudo: false
language: python
python:
- '3.5'
- '3.6'
install: pip install tox-travis
script: tox
cache:
directories:
- $HOME/.cache/pip
matrix:
include:
- env: TOX_ENV=py35
python: 3.5
- env: TOX_ENV=py35-no-ext
python: 3.5
- env: TOX_ENV=py36
python: 3.6
- env: TOX_ENV=py36-no-ext
python: 3.6
- env: TOX_ENV=py37
python: 3.7
dist: xenial
sudo: true
- env: TOX_ENV=py37-no-ext
python: 3.7
dist: xenial
sudo: true
- env: TOX_ENV=flake8
python: 3.6
- env: TOX_ENV=check
python: 3.6
install: pip install -U tox
script: travis_retry tox -e $TOX_ENV
deploy:
provider: pypi
user: channelcat
@@ -12,3 +34,4 @@ deploy:
secure: OgADRQH3+dTL5swGzXkeRJDNbLpFzwqYnXB4iLD0Npvzj9QnKyQVvkbaeq6VmV9dpEFb5ULaAKYQq19CrXYDm28yanUSn6jdJ4SukaHusi7xt07U6H7pmoX/uZ2WZYqCSLM8cSp8TXY/3oV3rY5Jfj/AibE5XTbim5/lrhsvW6NR+ALzxc0URRPAHDZEPpojTCjSTjpY0aDsaKWg4mXVRMFfY3O68j6KaIoukIZLuoHfePLKrbZxaPG5VxNhMHEaICdxVxE/dO+7pQmQxXuIsEOHK1QiVJ9YrSGcNqgEqhN36kYP8dqMeVB07sv8Xa6o/Uax2/wXS2HEJvuwP1YD6WkoZuo9ZB85bcMdg7BV9jJDbVFVPJwc75BnTLHrMa3Q1KrRlKRDBUXBUsQivPuWhFNwUgvEayq2qSI3aRQR4Z0O+DfboEhXYojSoD64/EWBTZ7vhgbvOTGEdukUQSYrKj9P8jc1s8exomTsAiqdFxTUpzfiammUSL+M93lP4urtahl1jjXFX7gd3DzdEEb0NsGkx5lm/qdsty8/TeAvKUmC+RVU6T856W6MqN0P+yGbpWUARcSE7fwztC3SPxwAuxvIN3BHmRhOUHoORPNG2VpfbnscIzBKJR4v0JKzbpi0IDa66K+tCGsCEvQuL4cxVOtoUySPWNSUAyUWWUrGM2k=
on:
tags: true
distributions: "sdist bdist_wheel"

CONDUCT.md (new file, 74 lines)

@@ -0,0 +1,74 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, gender identity and expression, level of experience,
nationality, personal appearance, race, religion, or sexual identity and
orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at sanic-maintainers@googlegroups.com. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at [http://contributor-covenant.org/version/1/4][version]
[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/

72
CONTRIBUTING.md Normal file

@@ -0,0 +1,72 @@
# Contributing
Thank you for your interest! Sanic is always looking for contributors. If you
don't feel comfortable contributing code, adding docstrings to the source files
is very appreciated.
We are committed to providing a friendly, safe and welcoming environment for all,
regardless of gender, sexual orientation, disability, ethnicity, religion,
or similar personal characteristic.
Our [code of conduct](./CONDUCT.md) sets the standards for behavior.
## Installation
To develop on sanic (and mainly to just run the tests) it is highly recommended to
install from sources.
Assuming you have already cloned the repo and are in the working directory with
a virtual environment already set up, run:
```bash
python setup.py develop && pip install -r requirements-dev.txt
```
## Running tests
To run the tests for sanic it is recommended to use tox like so:
```bash
tox
```
See, it's that simple!
## Pull requests!
So the pull request approval rules are pretty simple:
1. All pull requests must pass unit tests.
2. All pull requests must be reviewed and approved by at least
one current collaborator on the project.
3. All pull requests must pass flake8 checks.
4. All pull requests must be consistent with the existing code.
5. If you decide to remove/change anything from any common interface
a deprecation message should accompany it.
6. If you implement a new feature you should have at least one unit
test to accompany it.
7. An example must be one of the following:
* Example of how to use Sanic
* Example of how to use Sanic extensions
* Example of how to use Sanic with an asynchronous library
## Documentation
Sanic's documentation is built
using [sphinx](http://www.sphinx-doc.org/en/1.5.1/). Guides are written in
Markdown and can be found in the `docs` folder, while the module reference is
automatically generated using `sphinx-apidoc`.
To generate the documentation from scratch:
```bash
sphinx-apidoc -fo docs/_api/ sanic
sphinx-build -b html docs docs/_build
```
The HTML documentation will be created in the `docs/_build` folder.
## Warning
One of the main goals of Sanic is speed. Code that lowers the performance of
Sanic without significant gains in usability, security, or features may not be
merged. Please don't let this intimidate you! If you have any concerns about an
idea, open an issue for discussion and help.


@@ -1,6 +1,6 @@
MIT License
Copyright (c) [year] [fullname]
Copyright (c) 2016-present Channel Cat
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
@@ -18,4 +18,4 @@ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
SOFTWARE.

7
MANIFEST.in Normal file

@@ -0,0 +1,7 @@
include README.rst
include MANIFEST.in
include LICENSE
include setup.py
recursive-exclude * __pycache__
recursive-exclude * *.py[co]

4
Makefile Normal file

@@ -0,0 +1,4 @@
test:
find . -name "*.pyc" -delete
docker build -t sanic/test-image -f docker/Dockerfile .
docker run -t sanic/test-image tox


@@ -1,5 +1,5 @@
Sanic
=================================
=====
|Join the chat at https://gitter.im/sanic-python/Lobby| |Build Status| |PyPI| |PyPI version|
@@ -7,35 +7,9 @@ Sanic is a Flask-like Python 3.5+ web server that's written to go fast. It's ba
On top of being Flask-like, Sanic supports async request handlers. This means you can use the new shiny async/await syntax from Python 3.5, making your code non-blocking and speedy.
Sanic is developed `on GitHub <https://github.com/channelcat/sanic/>`_. Contributions are welcome!
Sanic is developed `on GitHub <https://github.com/huge-success/sanic/>`_. Contributions are welcome!
Benchmarks
----------
All tests were run on an AWS medium instance running ubuntu, using 1
process. Each script delivered a small JSON response and was tested with
wrk using 100 connections. Pypy was tested for Falcon and Flask but did
not speed up requests.
+-----------+-----------------------+----------------+---------------+
| Server | Implementation | Requests/sec | Avg Latency |
+===========+=======================+================+===============+
| Sanic | Python 3.5 + uvloop | 33,342 | 2.96ms |
+-----------+-----------------------+----------------+---------------+
| Wheezy | gunicorn + meinheld | 20,244 | 4.94ms |
+-----------+-----------------------+----------------+---------------+
| Falcon | gunicorn + meinheld | 18,972 | 5.27ms |
+-----------+-----------------------+----------------+---------------+
| Bottle | gunicorn + meinheld | 13,596 | 7.36ms |
+-----------+-----------------------+----------------+---------------+
| Flask | gunicorn + meinheld | 4,988 | 20.08ms |
+-----------+-----------------------+----------------+---------------+
| Kyoukai | Python 3.5 + uvloop | 3,889 | 27.44ms |
+-----------+-----------------------+----------------+---------------+
| Aiohttp | Python 3.5 + uvloop | 2,979 | 33.42ms |
+-----------+-----------------------+----------------+---------------+
| Tornado | Python 3.5 | 2,138 | 46.66ms |
+-----------+-----------------------+----------------+---------------+
If you have a project that utilizes Sanic make sure to comment on the `issue <https://github.com/huge-success/sanic/issues/396>`_ that we use to track those projects!
Hello World Example
-------------------
@@ -47,17 +21,24 @@ Hello World Example
app = Sanic()
@app.route("/")
@app.route('/')
async def test(request):
return json({"hello": "world"})
return json({'hello': 'world'})
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)
Installation
------------
- ``python -m pip install sanic``
- ``pip install sanic``
To install sanic without uvloop or ujson, you can provide either or both of these environment variables
using any truthy string like `'y', 'yes', 't', 'true', 'on', '1'`; setting the NO_X variable to true will skip that feature's
installation.
- ``SANIC_NO_UVLOOP=true SANIC_NO_UJSON=true pip install sanic``
Documentation
-------------
@@ -66,19 +47,29 @@ Documentation
.. |Join the chat at https://gitter.im/sanic-python/Lobby| image:: https://badges.gitter.im/sanic-python/Lobby.svg
:target: https://gitter.im/sanic-python/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
.. |Build Status| image:: https://travis-ci.org/channelcat/sanic.svg?branch=master
:target: https://travis-ci.org/channelcat/sanic
.. |Build Status| image:: https://travis-ci.org/huge-success/sanic.svg?branch=master
:target: https://travis-ci.org/huge-success/sanic
.. |Documentation| image:: https://readthedocs.org/projects/sanic/badge/?version=latest
:target: http://sanic.readthedocs.io/en/latest/?badge=latest
.. |PyPI| image:: https://img.shields.io/pypi/v/sanic.svg
:target: https://pypi.python.org/pypi/sanic/
.. |PyPI version| image:: https://img.shields.io/pypi/pyversions/sanic.svg
:target: https://pypi.python.org/pypi/sanic/
Examples
--------
`Non-Core examples <https://github.com/huge-success/sanic/wiki/Examples/>`_. Examples of plugins and Sanic that are outside the scope of Sanic core.
`Extensions <https://github.com/huge-success/sanic/wiki/Extensions/>`_. Sanic extensions created by the community.
`Projects <https://github.com/huge-success/sanic/wiki/Projects/>`_. Sanic in production use.
TODO
----
* Streamed file processing
* http2
* http2
Limitations
-----------
* No wheels for uvloop and httptools on Windows :(

28
docker/Dockerfile Normal file

@@ -0,0 +1,28 @@
FROM alpine:3.7
RUN apk add --no-cache --update \
curl \
bash \
build-base \
ca-certificates \
git \
bzip2-dev \
linux-headers \
ncurses-dev \
openssl \
openssl-dev \
readline-dev \
sqlite-dev
RUN update-ca-certificates
RUN rm -rf /var/cache/apk/*
ENV PYENV_ROOT="/root/.pyenv"
ENV PATH="$PYENV_ROOT/bin:$PATH"
ADD . /app
WORKDIR /app
RUN /app/docker/bin/install_python.sh 3.5.4 3.6.4
ENTRYPOINT ["./docker/bin/entrypoint.sh"]

11
docker/bin/entrypoint.sh Executable file

@@ -0,0 +1,11 @@
#!/bin/bash
set -e
eval "$(pyenv init -)"
eval "$(pyenv virtualenv-init -)"
source /root/.pyenv/completions/pyenv.bash
pip install tox
exec $@

17
docker/bin/install_python.sh Executable file

@@ -0,0 +1,17 @@
#!/bin/bash
set -e
export CFLAGS='-O2'
export EXTRA_CFLAGS="-DTHREAD_STACK_SIZE=0x100000"
curl -L https://raw.githubusercontent.com/pyenv/pyenv-installer/master/bin/pyenv-installer | bash
eval "$(pyenv init -)"
for ver in $@
do
pyenv install $ver
done
pyenv global $@
pip install --upgrade pip
pyenv rehash


@@ -1,20 +1,225 @@
# Minimal makefile for Sphinx documentation
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SPHINXPROJ = Sanic
SOURCEDIR = .
PAPER =
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " applehelp to make an Apple Help Book"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " epub3 to make an epub3"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " coverage to run coverage check of the documentation (if enabled)"
@echo " dummy to check syntax errors of document sources"
.PHONY: help Makefile
.PHONY: clean
clean:
rm -rf $(BUILDDIR)/*
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: html
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
.PHONY: dirhtml
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
.PHONY: singlehtml
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
.PHONY: pickle
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
.PHONY: json
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
.PHONY: htmlhelp
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
.PHONY: qthelp
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/aiographite.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/aiographite.qhc"
.PHONY: applehelp
applehelp:
$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
@echo
@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
@echo "N.B. You won't be able to view it unless you put it in" \
"~/Library/Documentation/Help or install it in your application" \
"bundle."
.PHONY: devhelp
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/aiographite"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/aiographite"
@echo "# devhelp"
.PHONY: epub
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
.PHONY: epub3
epub3:
$(SPHINXBUILD) -b epub3 $(ALLSPHINXOPTS) $(BUILDDIR)/epub3
@echo
@echo "Build finished. The epub3 file is in $(BUILDDIR)/epub3."
.PHONY: latex
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
.PHONY: latexpdf
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: latexpdfja
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: text
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
.PHONY: man
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
.PHONY: texinfo
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
.PHONY: info
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
.PHONY: gettext
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
.PHONY: changes
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
.PHONY: linkcheck
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
.PHONY: doctest
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
.PHONY: coverage
coverage:
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
@echo "Testing of coverage in the sources finished, look at the " \
"results in $(BUILDDIR)/coverage/python.txt."
.PHONY: xml
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
.PHONY: pseudoxml
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
.PHONY: dummy
dummy:
$(SPHINXBUILD) -b dummy $(ALLSPHINXOPTS) $(BUILDDIR)/dummy
@echo
@echo "Build finished. Dummy builder generates no files."


@@ -13,6 +13,9 @@ import sys
# Add support for Markdown documentation using Recommonmark
from recommonmark.parser import CommonMarkParser
# Add support for auto-doc
from recommonmark.transform import AutoStructify
# Ensure that sanic is present in the path, to allow sphinx-apidoc to
# autogenerate documentation from docstrings
root_directory = os.path.dirname(os.getcwd())
@@ -22,7 +25,7 @@ import sanic
# -- General configuration ------------------------------------------------
extensions = []
extensions = ['sphinx.ext.autodoc', 'sphinxcontrib.asyncio']
templates_path = ['_templates']
@@ -68,7 +71,6 @@ pygments_style = 'sphinx'
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
@@ -80,13 +82,11 @@ html_theme = 'sphinx_rtd_theme'
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# -- Options for HTMLHelp output ------------------------------------------
# Output file base name for HTML help builder.
htmlhelp_basename = 'Sanicdoc'
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
@@ -110,21 +110,14 @@ latex_elements = {
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'Sanic.tex', 'Sanic Documentation',
'Sanic contributors', 'manual'),
]
latex_documents = [(master_doc, 'Sanic.tex', 'Sanic Documentation',
'Sanic contributors', 'manual'), ]
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'sanic', 'Sanic Documentation',
[author], 1)
]
man_pages = [(master_doc, 'sanic', 'Sanic Documentation', [author], 1)]
# -- Options for Texinfo output -------------------------------------------
@@ -132,13 +125,10 @@ man_pages = [
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'Sanic', 'Sanic Documentation',
author, 'Sanic', 'One line description of project.',
'Miscellaneous'),
(master_doc, 'Sanic', 'Sanic Documentation', author, 'Sanic',
'One line description of project.', 'Miscellaneous'),
]
# -- Options for Epub output ----------------------------------------------
# Bibliographic Dublin Core info.
@@ -150,8 +140,15 @@ epub_copyright = copyright
# A list of files that should not be packed into the epub file.
epub_exclude_files = ['search.html']
# -- Custom Settings -------------------------------------------------------
suppress_warnings = ['image.nonlocal_uri']
# app setup hook
def setup(app):
app.add_config_value('recommonmark_config', {
'enable_eval_rst': True,
'enable_auto_doc_ref': True,
}, True)
app.add_transform(AutoStructify)


@@ -1,78 +0,0 @@
# Configuration
Any reasonably complex application will need configuration that is not baked into the actual code. Settings might be different for different environments or installations.
## Basics
Sanic holds the configuration in the `config` attribute of the application object. The configuration object is merely an object that can be modified either using dot-notation or like a dictionary:
```
app = Sanic('myapp')
app.config.DB_NAME = 'appdb'
app.config.DB_USER = 'appuser'
```
Since the config object actually is a dictionary, you can use its `update` method in order to set several values at once:
```
db_settings = {
'DB_HOST': 'localhost',
'DB_NAME': 'appdb',
'DB_USER': 'appuser'
}
app.config.update(db_settings)
```
In general the convention is to only have UPPERCASE configuration parameters. The methods described below for loading configuration only look for such uppercase parameters.
## Loading Configuration
There are several ways to load configuration.
### From an Object
If there are a lot of configuration values and they have sensible defaults it might be helpful to put them into a module:
```
import myapp.default_settings
app = Sanic('myapp')
app.config.from_object(myapp.default_settings)
```
You could use a class or any other object as well.
### From a File
Usually you will want to load configuration from a file that is not part of the distributed application. You can load configuration from a file using `from_file(/path/to/config_file)`. However, that requires the program to know the path to the config file. So instead you can specify the location of the config file in an environment variable and tell Sanic to use that to find the config file:
```
app = Sanic('myapp')
app.config.from_envvar('MYAPP_SETTINGS')
```
Then you can run your application with the `MYAPP_SETTINGS` environment variable set:
```
$ MYAPP_SETTINGS=/path/to/config_file python3 myapp.py
INFO: Goin' Fast @ http://0.0.0.0:8000
```
The config files are regular Python files which are executed in order to load them. This allows you to use arbitrary logic for constructing the right configuration. Only uppercase variables are added to the configuration. Most commonly the configuration consists of simple key value pairs:
```
# config_file
DB_HOST = 'localhost'
DB_NAME = 'appdb'
DB_USER = 'appuser'
```
## Builtin Configuration Values
Out of the box there are just a few predefined values which can be overwritten when creating the application.
| Variable | Default | Description |
| ----------------- | --------- | --------------------------------- |
| REQUEST_MAX_SIZE | 100000000 | How big a request may be (bytes) |
| REQUEST_TIMEOUT | 60 | How long a request can take (sec) |


@@ -9,19 +9,25 @@ Guides
sanic/getting_started
sanic/routing
sanic/request_data
sanic/response
sanic/static_files
sanic/exceptions
sanic/middleware
sanic/blueprints
sanic/websocket
sanic/config
sanic/cookies
sanic/decorators
sanic/streaming
sanic/class_based_views
sanic/custom_protocol
sanic/ssl
sanic/logging
sanic/testing
sanic/deploying
sanic/extensions
sanic/contributing
sanic/api_reference
Module Documentation
@@ -30,4 +36,5 @@ Module Documentation
.. toctree::
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`


@@ -1,19 +1,64 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
set SPHINXPROJ=Sanic
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
set I18NSPHINXOPTS=%SPHINXOPTS% .
if NOT "%PAPER%" == "" (
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
)
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if "%1" == "help" (
:help
echo.Please use `make ^<target^>` where ^<target^> is one of
echo. html to make standalone HTML files
echo. dirhtml to make HTML files named index.html in directories
echo. singlehtml to make a single large HTML file
echo. pickle to make pickle files
echo. json to make JSON files
echo. htmlhelp to make HTML files and a HTML help project
echo. qthelp to make HTML files and a qthelp project
echo. devhelp to make HTML files and a Devhelp project
echo. epub to make an epub
echo. epub3 to make an epub3
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
echo. text to make text files
echo. man to make manual pages
echo. texinfo to make Texinfo files
echo. gettext to make PO message catalogs
echo. changes to make an overview over all changed/added/deprecated items
echo. xml to make Docutils-native XML files
echo. pseudoxml to make pseudoxml-XML files for display purposes
echo. linkcheck to check all external links for integrity
echo. doctest to run all doctests embedded in the documentation if enabled
echo. coverage to run coverage check of the documentation if enabled
echo. dummy to check syntax errors of document sources
goto end
)
if "%1" == "clean" (
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
del /q /s %BUILDDIR%\*
goto end
)
REM Check if sphinx-build is available and fallback to Python version if any
%SPHINXBUILD% 1>NUL 2>NUL
if errorlevel 9009 goto sphinx_python
goto sphinx_ok
:sphinx_python
set SPHINXBUILD=python -m sphinx.__init__
%SPHINXBUILD% 2> nul
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
@@ -26,11 +71,211 @@ if errorlevel 9009 (
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
goto end
:sphinx_ok
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
if "%1" == "html" (
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
goto end
)
if "%1" == "dirhtml" (
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
goto end
)
if "%1" == "singlehtml" (
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
goto end
)
if "%1" == "pickle" (
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the pickle files.
goto end
)
if "%1" == "json" (
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the JSON files.
goto end
)
if "%1" == "htmlhelp" (
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
goto end
)
if "%1" == "qthelp" (
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\aiographite.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\aiographite.ghc
goto end
)
if "%1" == "devhelp" (
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished.
goto end
)
if "%1" == "epub" (
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub file is in %BUILDDIR%/epub.
goto end
)
if "%1" == "epub3" (
%SPHINXBUILD% -b epub3 %ALLSPHINXOPTS% %BUILDDIR%/epub3
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub3 file is in %BUILDDIR%/epub3.
goto end
)
if "%1" == "latex" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
if errorlevel 1 exit /b 1
echo.
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdf" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf
cd %~dp0
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdfja" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf-ja
cd %~dp0
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "text" (
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The text files are in %BUILDDIR%/text.
goto end
)
if "%1" == "man" (
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The manual pages are in %BUILDDIR%/man.
goto end
)
if "%1" == "texinfo" (
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
goto end
)
if "%1" == "gettext" (
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
goto end
)
if "%1" == "changes" (
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
if errorlevel 1 exit /b 1
echo.
echo.The overview file is in %BUILDDIR%/changes.
goto end
)
if "%1" == "linkcheck" (
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
if errorlevel 1 exit /b 1
echo.
echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
goto end
)
if "%1" == "doctest" (
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
if errorlevel 1 exit /b 1
echo.
echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
goto end
)
if "%1" == "coverage" (
%SPHINXBUILD% -b coverage %ALLSPHINXOPTS% %BUILDDIR%/coverage
if errorlevel 1 exit /b 1
echo.
echo.Testing of coverage in the sources finished, look at the ^
results in %BUILDDIR%/coverage/python.txt.
goto end
)
if "%1" == "xml" (
%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The XML files are in %BUILDDIR%/xml.
goto end
)
if "%1" == "pseudoxml" (
%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
goto end
)
if "%1" == "dummy" (
%SPHINXBUILD% -b dummy %ALLSPHINXOPTS% %BUILDDIR%/dummy
if errorlevel 1 exit /b 1
echo.
echo.Build finished. Dummy builder generates no files.
goto end
)
:end
popd


@@ -0,0 +1,150 @@
API Reference
=============
Submodules
----------
sanic.app module
----------------
.. automodule:: sanic.app
:members:
:undoc-members:
:show-inheritance:
sanic.blueprints module
-----------------------
.. automodule:: sanic.blueprints
:members:
:undoc-members:
:show-inheritance:
sanic.config module
-------------------
.. automodule:: sanic.config
:members:
:undoc-members:
:show-inheritance:
sanic.constants module
----------------------
.. automodule:: sanic.constants
:members:
:undoc-members:
:show-inheritance:
sanic.cookies module
--------------------
.. automodule:: sanic.cookies
:members:
:undoc-members:
:show-inheritance:
sanic.exceptions module
-----------------------
.. automodule:: sanic.exceptions
:members:
:undoc-members:
:show-inheritance:
sanic.handlers module
---------------------
.. automodule:: sanic.handlers
:members:
:undoc-members:
:show-inheritance:
sanic.log module
----------------
.. automodule:: sanic.log
:members:
:undoc-members:
:show-inheritance:
sanic.request module
--------------------
.. automodule:: sanic.request
:members:
:undoc-members:
:show-inheritance:
sanic.response module
---------------------
.. automodule:: sanic.response
:members:
:undoc-members:
:show-inheritance:
sanic.router module
-------------------
.. automodule:: sanic.router
:members:
:undoc-members:
:show-inheritance:
sanic.server module
-------------------
.. automodule:: sanic.server
:members:
:undoc-members:
:show-inheritance:
sanic.static module
-------------------
.. automodule:: sanic.static
:members:
:undoc-members:
:show-inheritance:
sanic.testing module
--------------------
.. automodule:: sanic.testing
:members:
:undoc-members:
:show-inheritance:
sanic.views module
------------------
.. automodule:: sanic.views
:members:
:undoc-members:
:show-inheritance:
sanic.websocket module
----------------------
.. automodule:: sanic.websocket
:members:
:undoc-members:
:show-inheritance:
sanic.worker module
-------------------
.. automodule:: sanic.worker
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: sanic
:members:
:undoc-members:
:show-inheritance:


@@ -51,17 +51,89 @@ will look like:
[Route(handler=<function bp_root at 0x7f908382f9d8>, methods=None, pattern=re.compile('^/$'), parameters=[])]
```
## Blueprint groups and nesting
Blueprints may also be registered as part of a list or tuple, where the registrar will recursively cycle through any sub-sequences of blueprints and register them accordingly. The `Blueprint.group` method is provided to simplify this process, allowing a 'mock' backend directory structure mimicking what's seen from the front end. Consider this (quite contrived) example:
```
api/
├──content/
│ ├──authors.py
│ ├──static.py
│ └──__init__.py
├──info.py
└──__init__.py
app.py
```
Initialization of this app's blueprint hierarchy could go as follows:
```python
# api/content/authors.py
from sanic import Blueprint
authors = Blueprint('content_authors', url_prefix='/authors')
```
```python
# api/content/static.py
from sanic import Blueprint
static = Blueprint('content_static', url_prefix='/static')
```
```python
# api/content/__init__.py
from sanic import Blueprint
from .static import static
from .authors import authors
content = Blueprint.group(static, authors, url_prefix='/content')
```
```python
# api/info.py
from sanic import Blueprint
info = Blueprint('info', url_prefix='/info')
```
```python
# api/__init__.py
from sanic import Blueprint
from .content import content
from .info import info
api = Blueprint.group(content, info, url_prefix='/api')
```
And registering these blueprints in `app.py` can now be done like so:
```python
# app.py
from sanic import Sanic
from .api import api
app = Sanic(__name__)
app.blueprint(api)
```
## Using blueprints
Blueprints have much the same functionality as an application instance.
### WebSocket routes
WebSocket handlers can be registered on a blueprint using the `@bp.websocket`
decorator or `bp.add_websocket_route` method.
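For illustration, a minimal sketch of both approaches (the route paths and the echo handler are hypothetical):

```python
from sanic import Blueprint

bp = Blueprint('realtime', url_prefix='/realtime')

@bp.websocket('/feed')
async def feed(request, ws):
    # echo every message back to the client
    while True:
        data = await ws.recv()
        await ws.send(data)

# equivalent registration without the decorator
bp.add_websocket_route(feed, '/feed2')
```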
### Middleware
Using blueprints allows you to also register middleware globally.
```python
@bp.middleware
async def halt_request(request):
async def print_on_request(request):
print("I am a spy")
@bp.middleware('request')
@@ -88,7 +160,14 @@ def ignore_404s(request, exception):
Static files can be served globally, under the blueprint prefix.
```python
bp.static('/folder/to/serve', '/web/path')
# suppose bp.name == 'bp'
bp.static('/web/path', '/folder/to/serve')
# also you can pass name parameter to it for url_for
bp.static('/web/path', '/folder/to/serve', name='uploads')
app.url_for('static', name='bp.uploads', filename='file.txt') == '/bp/web/path/file.txt'
```
## Start and stop
@@ -111,7 +190,7 @@ bp = Blueprint('my_blueprint')
async def setup_connection(app, loop):
global database
database = mysql.connect(host='127.0.0.1'...)
@bp.listener('after_server_stop')
async def close_connection(app, loop):
await database.close()
@@ -137,7 +216,7 @@ blueprint_v2 = Blueprint('v2', url_prefix='/v2')
@blueprint_v1.route('/')
async def api_v1_root(request):
return text('Welcome to version 1 of our documentation')
@blueprint_v2.route('/')
async def api_v2_root(request):
return text('Welcome to version 2 of our documentation')
@@ -164,10 +243,10 @@ app.run(host='0.0.0.0', port=8000, debug=True)
If you wish to generate a URL for a route inside of a blueprint, remember that the endpoint name
takes the format `<blueprint_name>.<handler_name>`. For example:
```
```python
@blueprint_v1.route('/')
async def root(request):
url = app.url_for('v1.post_handler', post_id=5) # --> '/v1/post/5'
url = request.app.url_for('v1.post_handler', post_id=5) # --> '/v1/post/5'
return redirect(url)


@@ -48,6 +48,24 @@ app.add_route(SimpleView.as_view(), '/')
```
You can also use `async` syntax.
```python
from sanic import Sanic
from sanic.views import HTTPMethodView
from sanic.response import text
app = Sanic('some_name')
class SimpleAsyncView(HTTPMethodView):
async def get(self, request):
return text('I am async get method')
app.add_route(SimpleAsyncView.as_view(), '/')
```
## URL parameters
If you need any URL parameters, as discussed in the routing guide, include them
@@ -74,10 +92,27 @@ class ViewWithDecorator(HTTPMethodView):
def get(self, request, name):
return text('Hello I have a decorator')
def post(self, request, name):
return text("Hello I also have a decorator")
app.add_route(ViewWithDecorator.as_view(), '/url')
```
#### URL Building
But if you just want to decorate some functions and not all functions, you can do as follows:
```python
class ViewWithSomeDecorator(HTTPMethodView):
@staticmethod
@some_decorator_here
def get(request, name):
return text("Hello I have a decorator")
def post(self, request, name):
return text("Hello I don't have any decorators")
```
## URL Building
If you wish to build a URL for an HTTPMethodView, remember that the class name will be the endpoint
that you will pass into `url_for`. For example:
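A minimal sketch (the view class and route below are hypothetical):

```python
from sanic import Sanic
from sanic.views import HTTPMethodView
from sanic.response import text

app = Sanic('some_name')

class NameView(HTTPMethodView):
    def get(self, request):
        # the endpoint name is the class name, 'NameView'
        url = app.url_for('NameView')
        return text('URL for this view: {}'.format(url))

app.add_route(NameView.as_view(), '/name')
```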
@@ -128,4 +163,4 @@ view.add(['POST', 'PUT'], lambda request: text('I am a post/put method'))
app.add_route(view, '/')
```
Note: currently you cannot build a URL for a CompositionView using `url_for`.
Note: currently you cannot build a URL for a CompositionView using `url_for`.

119
docs/sanic/config.md Normal file

@@ -0,0 +1,119 @@
# Configuration
Any reasonably complex application will need configuration that is not baked into the actual code. Settings might be different for different environments or installations.
## Basics
Sanic holds the configuration in the `config` attribute of the application object. The configuration object is merely an object that can be modified either using dot-notation or like a dictionary:
```
app = Sanic('myapp')
app.config.DB_NAME = 'appdb'
app.config.DB_USER = 'appuser'
```
Since the config object actually is a dictionary, you can use its `update` method in order to set several values at once:
```
db_settings = {
'DB_HOST': 'localhost',
'DB_NAME': 'appdb',
'DB_USER': 'appuser'
}
app.config.update(db_settings)
```
In general the convention is to only have UPPERCASE configuration parameters. The methods described below for loading configuration only look for such uppercase parameters.
## Loading Configuration
There are several ways to load configuration.
### From Environment Variables
Any variables defined with the `SANIC_` prefix will be applied to the sanic config. For example, setting `SANIC_REQUEST_TIMEOUT` will be loaded by the application automatically and fed into the `REQUEST_TIMEOUT` config variable. You can pass a different prefix to Sanic:
```python
app = Sanic(load_env='MYAPP_')
```
Then the above variable would be `MYAPP_REQUEST_TIMEOUT`. If you want to disable loading from environment variables you can set it to `False` instead:
```python
app = Sanic(load_env=False)
```
### From an Object
If there are a lot of configuration values and they have sensible defaults it might be helpful to put them into a module:
```
import myapp.default_settings
app = Sanic('myapp')
app.config.from_object(myapp.default_settings)
```
You could use a class or any other object as well.
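As a short illustration (the class and values below are hypothetical), only uppercase attributes are copied into the config:

```python
from sanic import Sanic

class MyConfig:
    DB_NAME = 'appdb'       # picked up (uppercase)
    db_password = 'secret'  # ignored by from_object (lowercase)

app = Sanic('myapp')
app.config.from_object(MyConfig)
```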
### From a File
Usually you will want to load configuration from a file that is not part of the distributed application. You can load configuration from a file using `from_pyfile(/path/to/config_file)`. However, that requires the program to know the path to the config file. So instead you can specify the location of the config file in an environment variable and tell Sanic to use that to find the config file:
```
app = Sanic('myapp')
app.config.from_envvar('MYAPP_SETTINGS')
```
Then you can run your application with the `MYAPP_SETTINGS` environment variable set:
```
$ MYAPP_SETTINGS=/path/to/config_file python3 myapp.py
INFO: Goin' Fast @ http://0.0.0.0:8000
```
The config files are regular Python files which are executed in order to load them. This allows you to use arbitrary logic for constructing the right configuration. Only uppercase variables are added to the configuration. Most commonly the configuration consists of simple key value pairs:
```
# config_file
DB_HOST = 'localhost'
DB_NAME = 'appdb'
DB_USER = 'appuser'
```
## Builtin Configuration Values
Out of the box there are just a few predefined values which can be overwritten when creating the application.
| Variable | Default | Description |
| ------------------ | --------- | --------------------------------------------- |
| REQUEST_MAX_SIZE | 100000000 | How big a request may be (bytes) |
| REQUEST_TIMEOUT | 60 | How long a request can take to arrive (sec) |
| RESPONSE_TIMEOUT | 60 | How long a response can take to process (sec) |
| KEEP_ALIVE | True | Disables keep-alive when False |
| KEEP_ALIVE_TIMEOUT | 5 | How long to hold a TCP connection open (sec) |
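As a quick illustration (the values below are arbitrary, not recommendations), any of these can be overridden on the config object after the app is created:

```python
from sanic import Sanic

app = Sanic('myapp')
app.config.REQUEST_MAX_SIZE = 200000000   # allow larger request bodies (bytes)
app.config.REQUEST_TIMEOUT = 120          # give slow clients more time (sec)
app.config.KEEP_ALIVE_TIMEOUT = 15        # hold idle connections open longer (sec)
```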
### The different Timeout variables:
A request timeout measures the duration of time between the instant when a new open TCP connection is passed to the Sanic backend server, and the instant when the whole HTTP request is received. If the time taken exceeds the `REQUEST_TIMEOUT` value (in seconds), this is considered a Client Error, so Sanic generates an HTTP 408 response and sends that to the client. Adjust this value higher if your clients routinely pass very large request payloads or upload requests very slowly.
A response timeout measures the duration of time between the instant the Sanic server passes the HTTP request to the Sanic App, and the instant an HTTP response is sent to the client. If the time taken exceeds the `RESPONSE_TIMEOUT` value (in seconds), this is considered a Server Error, so Sanic generates an HTTP 503 response and sends that to the client. Adjust this value higher if your application is likely to have long-running processes that delay the generation of a response.
### What is Keep Alive? And what does the Keep Alive Timeout value do?
Keep-Alive is an HTTP feature introduced in HTTP 1.1. When sending an HTTP request, the client (usually a web browser application) can set a Keep-Alive header to indicate that the HTTP server (Sanic) should not close the TCP connection after it has sent the response. This allows the client to reuse the existing TCP connection to send subsequent HTTP requests, and ensures more efficient network traffic for both the client and the server.
The `KEEP_ALIVE` config variable is set to `True` in Sanic by default. If you don't need this feature in your application, set it to `False` to cause all client connections to close immediately after a response is sent, regardless of the Keep-Alive header on the request.
The amount of time the server holds the TCP connection open is decided by the server itself. In Sanic, that value is configured using the `KEEP_ALIVE_TIMEOUT` value. By default, it is set to 5 seconds; this is the same default setting as the Apache HTTP server and is a good balance between allowing enough time for the client to send a new request, and not holding open too many connections at once. Do not exceed 75 seconds unless you know your clients are using a browser which supports TCP connections held open for that long.
For reference:
```
Apache httpd server default keepalive timeout = 5 seconds
Nginx server default keepalive timeout = 75 seconds
Nginx performance tuning guidelines uses keepalive = 15 seconds
IE (5-9) client hard keepalive limit = 60 seconds
Firefox client hard keepalive limit = 115 seconds
Opera 11 client hard keepalive limit = 120 seconds
Chrome 13+ client keepalive limit > 300+ seconds
```


@@ -4,10 +4,39 @@ Thank you for your interest! Sanic is always looking for contributors. If you
don't feel comfortable contributing code, adding docstrings to the source files
is very appreciated.
## Installation
To develop on sanic (and mainly to just run the tests) it is highly recommended to
install from sources.
Assuming you have already cloned the repo and are in the working directory with
a virtual environment already set up, run:
```bash
python setup.py develop && pip install -r requirements-dev.txt
```
## Running tests
* `python -m pip install pytest`
* `python -m pytest tests`
To run the tests for sanic it is recommended to use tox like so:
```bash
tox
```
See, it's that simple!
## Pull requests!
So the pull request approval rules are pretty simple:
1. All pull requests must pass unit tests
* All pull requests must be reviewed and approved by at least
one current collaborator on the project
* All pull requests must pass flake8 checks
* If you decide to remove/change anything from any common interface
a deprecation message should accompany it.
* If you implement a new feature you should have at least one unit
test to accompany it.
## Documentation


@@ -1,75 +0,0 @@
# Cookies
Cookies are pieces of data which persist inside a user's browser. Sanic can
both read and write cookies, which are stored as key-value pairs.
## Reading cookies
A user's cookies can be accessed via the `Request` object's `cookies` dictionary.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
test_cookie = request.cookies.get('test')
return text("Test cookie set to: {}".format(test_cookie))
```
## Writing cookies
When returning a response, cookies can be set on the `Response` object.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("There's a cookie up in this response")
response.cookies['test'] = 'It worked!'
response.cookies['test']['domain'] = '.gotta-go-fast.com'
response.cookies['test']['httponly'] = True
return response
```
## Deleting cookies
Cookies can be removed semantically or explicitly.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("Time to eat some cookies muahaha")
# This cookie will be set to expire in 0 seconds
del response.cookies['kill_me']
# This cookie will self destruct in 5 seconds
response.cookies['short_life'] = 'Glad to be here'
response.cookies['short_life']['max-age'] = 5
del response.cookies['favorite_color']
# This cookie will remain unchanged
response.cookies['favorite_color'] = 'blue'
response.cookies['favorite_color'] = 'pink'
del response.cookies['favorite_color']
return response
```
Response cookies can be set like dictionary values and have the following
parameters available:
- `expires` (datetime): The time for the cookie to expire on the
client's browser.
- `path` (string): The subset of URLs to which this cookie applies. Defaults to /.
- `comment` (string): A comment (metadata).
- `domain` (string): Specifies the domain for which the cookie is valid. An
explicitly specified domain must always start with a dot.
- `max-age` (number): Number of seconds the cookie should live for.
- `secure` (boolean): Specifies whether the cookie will only be sent via
HTTPS.
- `httponly` (boolean): Specifies whether the cookie cannot be read by
Javascript.

87
docs/sanic/cookies.rst Normal file

@@ -0,0 +1,87 @@
Cookies
=======
Cookies are pieces of data which persist inside a user's browser. Sanic can
both read and write cookies, which are stored as key-value pairs.
.. warning::
Cookies can be freely altered by the client. Therefore you cannot just store
data such as login information in cookies as-is. To ensure data you store in
cookies is not forged or tampered with by the client, use something like
`itsdangerous`_ to cryptographically sign the data.
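A minimal sketch of that idea, assuming the ``itsdangerous`` package and a hypothetical secret key:
.. code-block:: python
from itsdangerous import Signer
signer = Signer('my-secret-key')        # hypothetical secret key
signed = signer.sign('session-id-42')   # value to store in the cookie
original = signer.unsign(signed)        # raises BadSignature if tampered with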
Reading cookies
---------------
A user's cookies can be accessed via the ``Request`` object's ``cookies`` dictionary.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
test_cookie = request.cookies.get('test')
return text("Test cookie set to: {}".format(test_cookie))
Writing cookies
---------------
When returning a response, cookies can be set on the ``Response`` object.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("There's a cookie up in this response")
response.cookies['test'] = 'It worked!'
response.cookies['test']['domain'] = '.gotta-go-fast.com'
response.cookies['test']['httponly'] = True
return response
Deleting cookies
----------------
Cookies can be removed semantically or explicitly.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("Time to eat some cookies muahaha")
# This cookie will be set to expire in 0 seconds
del response.cookies['kill_me']
# This cookie will self destruct in 5 seconds
response.cookies['short_life'] = 'Glad to be here'
response.cookies['short_life']['max-age'] = 5
del response.cookies['favorite_color']
# This cookie will remain unchanged
response.cookies['favorite_color'] = 'blue'
response.cookies['favorite_color'] = 'pink'
del response.cookies['favorite_color']
return response
Response cookies can be set like dictionary values and have the following
parameters available:
- ``expires`` (datetime): The time for the cookie to expire on the client's browser.
- ``path`` (string): The subset of URLs to which this cookie applies. Defaults to /.
- ``comment`` (string): A comment (metadata).
- ``domain`` (string): Specifies the domain for which the cookie is valid. An
explicitly specified domain must always start with a dot.
- ``max-age`` (number): Number of seconds the cookie should live for.
- ``secure`` (boolean): Specifies whether the cookie will only be sent via HTTPS.
- ``httponly`` (boolean): Specifies whether the cookie cannot be read by Javascript.
.. _itsdangerous: https://pythonhosted.org/itsdangerous/

53
docs/sanic/debug_mode.rst Normal file

@@ -0,0 +1,53 @@
Debug Mode
=============
When enabling Sanic's debug mode, Sanic will provide a more verbose logging output
and by default will enable the Auto Reload feature.
.. warning::
Sanic's debug mode will slow down the server's performance,
so it is advised to enable it only in development environments.
Setting the debug mode
----------------------
By setting the ``debug`` mode, more verbose output from Sanic will be produced
and the Automatic Reloader will be activated.
.. code-block:: python
from sanic import Sanic
from sanic.response import json
app = Sanic()
@app.route('/')
async def hello_world(request):
return json({"hello": "world"})
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)
Manually setting auto reload
----------------------------
Sanic offers a way to enable or disable the Automatic Reloader manually;
the ``auto_reload`` argument will activate or deactivate the Automatic Reloader.
.. code-block:: python
from sanic import Sanic
from sanic.response import json
app = Sanic()
@app.route('/')
async def hello_world(request):
return json({"hello": "world"})
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, auto_reload=True)

39
docs/sanic/decorators.md Normal file

@@ -0,0 +1,39 @@
# Handler Decorators
Since Sanic handlers are simple Python functions, you can apply decorators to them in a similar manner to Flask. A typical use case is when you want some code to run before a handler's code is executed.
## Authorization Decorator
Let's say you want to check that a user is authorized to access a particular endpoint. You can create a decorator that wraps a handler function, checks whether the client making the request is authorized to access a resource, and sends the appropriate response.
```python
from functools import wraps
from sanic.response import json
def authorized():
def decorator(f):
@wraps(f)
async def decorated_function(request, *args, **kwargs):
# run some method that checks the request
# for the client's authorization status
is_authorized = check_request_for_authorization_status(request)
if is_authorized:
# the user is authorized.
# run the handler method and return the response
response = await f(request, *args, **kwargs)
return response
else:
# the user is not authorized.
return json({'status': 'not_authorized'}, 403)
return decorated_function
return decorator
@app.route("/")
@authorized()
async def test(request):
return json({'status': 'authorized'})
```


@@ -44,3 +44,35 @@ directly run by the interpreter.
if __name__ == '__main__':
app.run(host='0.0.0.0', port=1337, workers=4)
```
## Running via Gunicorn
[Gunicorn](http://gunicorn.org/) Green Unicorn is a WSGI HTTP Server for UNIX.
It's a pre-fork worker model ported from Ruby's Unicorn project.
In order to run a Sanic application with Gunicorn, you need to use the special `sanic.worker.GunicornWorker`
for Gunicorn `worker-class` argument:
```
gunicorn myapp:app --bind 0.0.0.0:1337 --worker-class sanic.worker.GunicornWorker
```
If your application suffers from memory leaks, you can configure Gunicorn to gracefully restart a worker
after it has processed a given number of requests. This can be a convenient way to help limit the effects
of the memory leak.
See the [Gunicorn Docs](http://docs.gunicorn.org/en/latest/settings.html#max-requests) for more information.
## Asynchronous support
This is suitable if you *need* to share the sanic process with other applications, in particular the `loop`.
However, be advised that this method does not support using multiple processes, and is not the preferred way
to run the app in general.
Here is an incomplete example (please see `run_async.py` in examples for something more practical):
```python
server = app.create_server(host="0.0.0.0", port=8000)
loop = asyncio.get_event_loop()
task = asyncio.ensure_future(server)
loop.run_forever()
```


@@ -13,10 +13,23 @@ To throw an exception, simply `raise` the relevant exception from the
from sanic.exceptions import ServerError
@app.route('/killme')
def i_am_ready_to_die(request):
async def i_am_ready_to_die(request):
raise ServerError("Something bad happened", status_code=500)
```
You can also use the `abort` function with the appropriate status code:
```python
from sanic.exceptions import abort
from sanic.response import text
@app.route('/youshallnotpass')
async def no_no(request):
abort(401)
# this won't happen
text("OK")
```
## Handling exceptions
To override Sanic's default handling of an exception, the `@app.exception`
@@ -30,7 +43,7 @@ from sanic.response import text
from sanic.exceptions import NotFound
@app.exception(NotFound)
def ignore_404s(request, exception):
async def ignore_404s(request, exception):
return text("Yep, I totally found the page: {}".format(request.url))
```


@@ -1,14 +1,34 @@
# Extensions
A list of Sanic extensions created by the community.
- [Sanic-Plugins-Framework](https://github.com/ashleysommer/sanicpluginsframework): Library for easily creating and using Sanic plugins.
- [Sessions](https://github.com/subyraman/sanic_session): Support for sessions.
Allows using redis, memcache or an in memory store.
- [CORS](https://github.com/ashleysommer/sanic-cors): A port of flask-cors.
- [Compress](https://github.com/subyraman/sanic_compress): Allows you to easily gzip Sanic responses. A port of Flask-Compress.
- [Jinja2](https://github.com/lixxu/sanic-jinja2): Support for Jinja2 template.
- [Sanic JWT](https://github.com/ahopkins/sanic-jwt): Authentication, JWT, and permission scoping for Sanic.
- [OpenAPI/Swagger](https://github.com/channelcat/sanic-openapi): OpenAPI support, plus a Swagger UI.
- [Pagination](https://github.com/lixxu/python-paginate): Simple pagination support.
- [Motor](https://github.com/lixxu/sanic-motor): Simple motor wrapper.
- [Sanic CRUD](https://github.com/Typhon66/sanic_crud): CRUD REST API generation with peewee models.
- [UserAgent](https://github.com/lixxu/sanic-useragent): Add `user_agent` to request
- [Limiter](https://github.com/bohea/sanic-limiter): Rate limiting for sanic.
- [Sanic EnvConfig](https://github.com/jamesstidard/sanic-envconfig): Pull environment variables into your sanic config.
- [Babel](https://github.com/lixxu/sanic-babel): Adds i18n/l10n support to Sanic applications with the help of the
`Babel` library
- [Dispatch](https://github.com/ashleysommer/sanic-dispatcher): A dispatcher inspired by `DispatcherMiddleware` in werkzeug. Can act as a Sanic-to-WSGI adapter.
- [Sanic-OAuth](https://github.com/Sniedes722/Sanic-OAuth): OAuth Library for connecting to & creating your own token providers.
- [sanic-oauth](https://gitlab.com/SirEdvin/sanic-oauth): OAuth Library with many provider and OAuth1/OAuth2 support.
- [Sanic-nginx-docker-example](https://github.com/itielshwartz/sanic-nginx-docker-example): Simple and easy to use example of Sanic behind nginx using docker-compose.
- [sanic-graphql](https://github.com/graphql-python/sanic-graphql): GraphQL integration with Sanic
- [sanic-prometheus](https://github.com/dkruchinin/sanic-prometheus): Prometheus metrics for Sanic
- [Sanic-RestPlus](https://github.com/ashleysommer/sanic-restplus): A port of Flask-RestPlus for Sanic. Full-featured REST API with SwaggerUI generation.
- [sanic-transmute](https://github.com/yunstanford/sanic-transmute): A Sanic extension that generates APIs from python functions and classes, and also generates Swagger UI/documentation automatically.
- [pytest-sanic](https://github.com/yunstanford/pytest-sanic): A pytest plugin for Sanic. It helps you to test your code asynchronously.
- [jinja2-sanic](https://github.com/yunstanford/jinja2-sanic): a jinja2 template renderer for Sanic.([Documentation](http://jinja2-sanic.readthedocs.io/en/latest/))
- [GINO](https://github.com/fantix/gino): An asyncio ORM on top of SQLAlchemy core, delivered with a Sanic extension. ([Documentation](https://python-gino.readthedocs.io/))
- [Sanic-Auth](https://github.com/pyx/sanic-auth): A minimal backend agnostic session-based user authentication mechanism for Sanic.
- [Sanic-CookieSession](https://github.com/pyx/sanic-cookiesession): A client-side only, cookie-based session, similar to the built-in session in Flask.
- [Sanic-WTF](https://github.com/pyx/sanic-wtf): Sanic-WTF makes using WTForms with Sanic and CSRF (Cross-Site Request Forgery) protection a little bit easier.
- [sanic-sse](https://github.com/inn0kenty/sanic_sse): [Server-Sent Events](https://en.wikipedia.org/wiki/Server-sent_events) implementation for Sanic.

View File

@@ -9,15 +9,16 @@ syntax, so earlier versions of python won't work.
```python
from sanic import Sanic
from sanic.response import text
from sanic.response import json
app = Sanic(__name__)
app = Sanic()
@app.route("/")
async def test(request):
return text('Hello world!')
return json({"hello": "world"})
app.run(host="0.0.0.0", port=8000, debug=True)
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)
```
3. Run the server: `python3 main.py`

View File

@@ -7,8 +7,8 @@ On top of being Flask-like, Sanic supports async request handlers. This means y
Sanic is developed `on GitHub <https://github.com/channelcat/sanic/>`_. Contributions are welcome!
Sanic aspires to be simple:
-------------------
Sanic aspires to be simple
---------------------------
.. code:: python

85
docs/sanic/logging.md Normal file
View File

@@ -0,0 +1,85 @@
# Logging
Sanic allows you to do different types of logging (access log, error log) on the requests based on the [python3 logging API](https://docs.python.org/3/howto/logging.html). You should have some basic knowledge of Python 3 logging if you want to create a new configuration.
### Quick Start
A simple example using default settings would be like this:
```python
from sanic import Sanic
from sanic import response
app = Sanic('test')
@app.route('/')
async def test(request):
return response.text('Hello World!')
if __name__ == "__main__":
app.run(debug=True, access_log=True)
```
To use your own logging config, simply use `logging.config.dictConfig`, or
pass `log_config` when you initialize the `Sanic` app:
```python
app = Sanic('test', log_config=LOGGING_CONFIG)
```
To turn off access logging, simply set `access_log=False`:
```python
if __name__ == "__main__":
app.run(access_log=False)
```
This skips calling the logging functions when handling requests.
You can go even further in production to gain extra speed:
```python
if __name__ == "__main__":
# disable debug messages
app.run(debug=False, access_log=False)
```
### Configuration
By default, the `log_config` parameter is set to use the `sanic.log.LOGGING_CONFIG_DEFAULTS` dictionary for configuration.
There are three loggers used in Sanic which **must be defined if you want to create your own logging configuration** (a minimal sketch follows the list below):
- root:<br>
Used to log internal messages.
- sanic.error:<br>
Used to log error logs.
- sanic.access:<br>
Used to log access logs.
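As a rough sketch (not the full default configuration), a custom `log_config` that defines all three loggers could look like the following; the handler and formatter names here are illustrative:
```python
from sanic import Sanic
from sanic import response

# Illustrative config: all three required loggers are defined.
LOGGING_CONFIG = dict(
    version=1,
    disable_existing_loggers=False,
    loggers={
        "root": {"level": "INFO", "handlers": ["console"]},
        "sanic.error": {"level": "INFO", "handlers": ["console"], "propagate": True},
        "sanic.access": {"level": "INFO", "handlers": ["access_console"], "propagate": True},
    },
    handlers={
        "console": {"class": "logging.StreamHandler", "formatter": "generic"},
        "access_console": {"class": "logging.StreamHandler", "formatter": "access"},
    },
    formatters={
        "generic": {"format": "%(asctime)s [%(levelname)s] %(message)s"},
        "access": {
            # host, request, status and byte are filled in by Sanic's access logger
            "format": "%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: "
                      "%(request)s %(message)s %(status)d %(byte)d",
        },
    },
)

app = Sanic('test', log_config=LOGGING_CONFIG)

@app.route('/')
async def index(request):
    return response.text('Hello World!')

if __name__ == "__main__":
    app.run(access_log=True)
```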
#### Log format:
In addition to the default parameters provided by Python (asctime, levelname, message),
Sanic provides the following additional parameters for the access logger:
- host (str)<br>
request.ip
- request (str)<br>
request.method + " " + request.url
- status (int)<br>
response.status
- byte (int)<br>
len(response.body)
The default access log format is
```python
%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: %(request)s %(message)s %(status)d %(byte)d
```
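To change only that format, one option (a sketch, assuming the default config keeps its access formatter under `formatters['access']`) is to copy the defaults and edit them before passing them in:
```python
from copy import deepcopy

from sanic import Sanic
from sanic.log import LOGGING_CONFIG_DEFAULTS

# Assumption: the default dict exposes the access formatter as formatters['access']
log_config = deepcopy(LOGGING_CONFIG_DEFAULTS)
log_config["formatters"]["access"]["format"] = (
    "%(asctime)s %(host)s %(request)s -> %(status)d (%(byte)d bytes)"
)

app = Sanic('test', log_config=log_config)
```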

View File

@@ -1,14 +1,19 @@
# Middleware
# Middleware And Listeners
Middleware are functions which are executed before or after requests to the
server. They can be used to modify the *request to* or *response from*
user-defined handler functions.
Additionally, Sanic provides listeners which allow you to run code at various points of your application's lifecycle.
## Middleware
There are two types of middleware: request and response. Both are declared
using the `@app.middleware` decorator, with the decorator's parameter being a
string representing its type: `'request'` or `'response'`. Response middleware
receives both the request and the response as arguments.
string representing its type: `'request'` or `'response'`.
* Request middleware receives only the `request` as argument.
* Response middleware receives both the `request` and `response`.
The simplest middleware doesn't modify the request or response at all:
@@ -64,3 +69,79 @@ async def halt_request(request):
async def halt_response(request, response):
return text('I halted the response')
```
## Listeners
If you want to execute startup/teardown code as your server starts or closes, you can use the following listeners:
- `before_server_start`
- `after_server_start`
- `before_server_stop`
- `after_server_stop`
These listeners are implemented as decorators on functions which accept the app object as well as the asyncio loop.
For example:
```python
@app.listener('before_server_start')
async def setup_db(app, loop):
app.db = await db_setup()
@app.listener('after_server_start')
async def notify_server_started(app, loop):
print('Server successfully started!')
@app.listener('before_server_stop')
async def notify_server_stopping(app, loop):
print('Server shutting down!')
@app.listener('after_server_stop')
async def close_db(app, loop):
await app.db.close()
```
It's also possible to register a listener using the `register_listener` method.
This may be useful if you define your listeners in another module besides
the one you instantiate your app in.
```python
app = Sanic()
async def setup_db(app, loop):
app.db = await db_setup()
app.register_listener(setup_db, 'before_server_start')
```
If you want to schedule a background task to run after the loop has started,
Sanic provides the `add_task` method to easily do so.
```python
async def notify_server_started_after_five_seconds():
await asyncio.sleep(5)
print('Server successfully started!')
app.add_task(notify_server_started_after_five_seconds())
```
Sanic will attempt to automatically inject the app, passing it as an argument to the task:
```python
async def notify_server_started_after_five_seconds(app):
await asyncio.sleep(5)
print(app.name)
app.add_task(notify_server_started_after_five_seconds)
```
Or you can pass the app explicitly for the same effect:
```python
async def notify_server_started_after_five_seconds(app):
await asyncio.sleep(5)
print(app.name)
app.add_task(notify_server_started_after_five_seconds(app))
```

View File

@@ -9,30 +9,34 @@ The following variables are accessible as properties on `Request` objects:
```python
from sanic.response import json
@app.route("/json")
def post_json(request):
return json({ "received": True, "message": request.json })
```
- `args` (dict) - Query string variables. A query string is the section of a
URL that resembles `?key1=value1&key2=value2`. If that URL were to be parsed,
the `args` dictionary would look like `{'key1': 'value1', 'key2': 'value2'}`.
the `args` dictionary would look like `{'key1': ['value1'], 'key2': ['value2']}`.
The request's `query_string` variable holds the unparsed string value.
```python
from sanic.response import json
@app.route("/query_string")
def query_string(request):
return json({ "parsed": True, "args": request.args, "url": request.url, "query_string": request.query_string })
```
- `raw_args` (dict) - In many cases you will need to access the URL arguments in
a less packed dictionary. For the same previous URL `?key1=value1&key2=value2`, the
`raw_args` dictionary would look like `{'key1': 'value1', 'key2': 'value2'}`.
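For illustration, a sketch mirroring the `args` example above (the handler name is illustrative):
```python
from sanic.response import json

@app.route("/raw_args")
def raw_args_handler(request):
    # for ?key1=value1&key2=value2 this yields single values rather than lists
    return json({ "parsed": True, "raw_args": request.raw_args, "query_string": request.query_string })
```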
- `files` (dictionary of `File` objects) - List of files that have a name, body, and type
```python
from sanic.response import json
@app.route("/files")
def post_json(request):
test_file = request.files.get('test')
@@ -50,7 +54,7 @@ The following variables are accessible as properties on `Request` objects:
```python
from sanic.response import json
@app.route("/form")
def post_json(request):
return json({ "received": True, "form_data": request.form, "test": request.form.get('test') })
@@ -58,17 +62,25 @@ The following variables are accessible as properties on `Request` objects:
- `body` (bytes) - Posted raw body. This property allows retrieval of the
request's raw data, regardless of content type.
```python
from sanic.response import text
@app.route("/users", methods=["POST",])
def create_user(request):
return text("You are trying to create a user with the following POST: %s" % request.body)
```
- `headers` (dict) - A case-insensitive dictionary that contains the request headers.
- `method` (str) - HTTP method of the request (ie `GET`, `POST`).
- `ip` (str) - IP address of the requester.
- `port` (str) - Port address of the requester.
- `socket` (tuple) - (IP, port) of the requester.
- `app` - a reference to the Sanic application object that is handling this request. This is useful when inside blueprints or other handlers in modules that do not have access to the global `app` object.
```python
@@ -85,6 +97,14 @@ The following variables are accessible as properties on `Request` objects:
return json({'status': 'production'})
```
- `url`: The full URL of the request, ie: `http://localhost:8000/posts/1/?foo=bar`
- `scheme`: The URL scheme associated with the request: `http` or `https`
- `host`: The host associated with the request: `localhost:8080`
- `path`: The path of the request: `/posts/1/`
- `query_string`: The query string of the request: `foo=bar` or a blank string `''`
- `uri_template`: Template for matching route handler: `/posts/<id>/`
- `token`: The value of Authorization header: `Basic YWRtaW46YWRtaW4=`
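A small sketch pulling several of these URL-related properties together (the handler name is illustrative):
```python
from sanic.response import json

@app.route("/url_parts")
def url_parts_handler(request):
    # e.g. for http://localhost:8000/url_parts?foo=bar
    return json({
        "url": request.url,
        "scheme": request.scheme,
        "host": request.host,
        "path": request.path,
        "query_string": request.query_string,
    })
```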
## Accessing values using `get` and `getlist`

112
docs/sanic/response.md Normal file
View File

@@ -0,0 +1,112 @@
# Response
Use functions in `sanic.response` module to create responses.
## Plain Text
```python
from sanic import response
@app.route('/text')
def handle_request(request):
return response.text('Hello world!')
```
## HTML
```python
from sanic import response
@app.route('/html')
def handle_request(request):
return response.html('<p>Hello world!</p>')
```
## JSON
```python
from sanic import response
@app.route('/json')
def handle_request(request):
return response.json({'message': 'Hello world!'})
```
## File
```python
from sanic import response
@app.route('/file')
async def handle_request(request):
return await response.file('/srv/www/whatever.png')
```
## Streaming
```python
from sanic import response
@app.route("/streaming")
async def index(request):
async def streaming_fn(response):
await response.write('foo')
await response.write('bar')
return response.stream(streaming_fn, content_type='text/plain')
```
## File Streaming
For large files, a combination of File and Streaming above
```python
from sanic import response
@app.route('/big_file.png')
async def handle_request(request):
return await response.file_stream('/srv/www/whatever.png')
```
## Redirect
```python
from sanic import response
@app.route('/redirect')
def handle_request(request):
return response.redirect('/json')
```
## Raw
Response without encoding the body
```python
from sanic import response
@app.route('/raw')
def handle_request(request):
return response.raw(b'raw data')
```
## Modify headers or status
To modify headers or status code, pass the `headers` or `status` argument to those functions:
```python
from sanic import response
@app.route('/json')
def handle_request(request):
return response.json(
{'message': 'Hello world!'},
headers={'X-Served-By': 'sanic'},
status=200
)
```

View File

@@ -52,7 +52,7 @@ async def integer_handler(request, integer_arg):
async def number_handler(request, number_arg):
return text('Number - {}'.format(number_arg))
@app.route('/person/<name:[A-z]>')
@app.route('/person/<name:[A-z]+>')
async def person_handler(request, name):
return text('Person - {}'.format(name))
@@ -138,13 +138,14 @@ app.add_route(person_handler2, '/person/<name:[A-z]>', methods=['GET'])
Sanic provides a `url_for` method to generate URLs based on the handler method name. This is useful if you want to avoid hardcoding url paths into your app; instead, you can just reference the handler name. For example:
```python
from sanic.response import redirect
@app.route('/')
async def index(request):
# generate a URL for the endpoint `post_handler`
url = app.url_for('post_handler', post_id=5)
# the URL is `/posts/5`, redirect to it
return redirect(url)
return redirect(url)
@app.route('/posts/<post_id>')
async def post_handler(request, post_id):
@@ -181,3 +182,154 @@ url = app.url_for('post_handler', post_id=5, arg_one=['one', 'two'], arg_two=2,
# http://another_server:8888/posts/5?arg_one=one&arg_one=two&arg_two=2#anchor
```
- All valid parameters must be passed to `url_for` to build a URL. If a parameter is not supplied, or if a parameter does not match the specified type, a `URLBuildError` will be thrown.
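A minimal sketch of handling that error, assuming the `post_handler` route defined earlier:
```python
from sanic.exceptions import URLBuildError

try:
    # post_handler requires post_id, so omitting it raises URLBuildError
    url = app.url_for('post_handler')
except URLBuildError as e:
    print('Could not build URL: {}'.format(e))
```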
## WebSocket routes
Routes for the WebSocket protocol can be defined with the `@app.websocket`
decorator:
```python
@app.websocket('/feed')
async def feed(request, ws):
while True:
data = 'hello!'
print('Sending: ' + data)
await ws.send(data)
data = await ws.recv()
print('Received: ' + data)
```
Alternatively, the `app.add_websocket_route` method can be used instead of the
decorator:
```python
async def feed(request, ws):
pass
app.add_websocket_route(feed, '/feed')
```
Handlers for a WebSocket route are passed the request as first argument, and a
WebSocket protocol object as second argument. The protocol object has `send`
and `recv` methods to send and receive data respectively.
WebSocket support requires the [websockets](https://github.com/aaugustin/websockets)
package by Aymeric Augustin.
## About `strict_slashes`
You can make routes strict about trailing slashes, or not; it's configurable.
```python
# provide default strict_slashes value for all routes
app = Sanic('test_route_strict_slash', strict_slashes=True)
# you can also overwrite strict_slashes value for specific route
@app.get('/get', strict_slashes=False)
def handler(request):
return text('OK')
# It also works for blueprints
bp = Blueprint('test_bp_strict_slash', strict_slashes=True)
@bp.get('/bp/get', strict_slashes=False)
def handler(request):
return text('OK')
app.blueprint(bp)
```
## User defined route name
You can pass `name` to change the route name to avoid using the default name (`handler.__name__`).
```python
app = Sanic('test_named_route')
@app.get('/get', name='get_handler')
def handler(request):
return text('OK')
# then you need to use `app.url_for('get_handler')`
# instead of `app.url_for('handler')`
# It also works for blueprints
bp = Blueprint('test_named_bp')
@bp.get('/bp/get', name='get_handler')
def handler(request):
return text('OK')
app.blueprint(bp)
# then you need to use `app.url_for('test_named_bp.get_handler')`
# instead of `app.url_for('test_named_bp.handler')`
# different names can be used for the same url with different methods
@app.get('/test', name='route_test')
def handler(request):
return text('OK')
@app.post('/test', name='route_post')
def handler2(request):
return text('OK POST')
@app.put('/test', name='route_put')
def handler3(request):
return text('OK PUT')
# the urls below are the same, you can use any of them
# '/test'
app.url_for('route_test')
# app.url_for('route_post')
# app.url_for('route_put')
# for the same handler name with different methods
# you need to specify the name (it's a url_for limitation)
@app.get('/get')
def handler(request):
return text('OK')
@app.post('/post', name='post_handler')
def handler(request):
return text('OK')
# then
# app.url_for('handler') == '/get'
# app.url_for('post_handler') == '/post'
```
## Build URL for static files
You can now use `url_for` to build URLs for static files.
If the route serves a file directly, `filename` can be omitted.
```python
app = Sanic('test_static')
app.static('/static', './static')
app.static('/uploads', './uploads', name='uploads')
app.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
bp = Blueprint('bp', url_prefix='/bp')
bp.static('/static', './static')
bp.static('/uploads', './uploads', name='uploads')
bp.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
app.blueprint(bp)
# then build the url
app.url_for('static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='uploads', filename='file.txt') == '/uploads/file.txt'
app.url_for('static', name='best_png') == '/the_best.png'
# blueprint url building
app.url_for('static', name='bp.static', filename='file.txt') == '/bp/static/file.txt'
app.url_for('static', name='bp.uploads', filename='file.txt') == '/bp/uploads/file.txt'
app.url_for('static', name='bp.best_png') == '/bp/the_best.png'
```

View File

@@ -9,4 +9,12 @@ Optionally pass in an SSLContext:
context = ssl.create_default_context(purpose=ssl.Purpose.CLIENT_AUTH)
context.load_cert_chain("/path/to/cert", keyfile="/path/to/keyfile")
app.run(host="0.0.0.0", port=8443, ssl=context)
app.run(host="0.0.0.0", port=8443, ssl=context)
You can also pass in the locations of a certificate and key as a dictionary:
.. code:: python
ssl = {'cert': "/path/to/cert", 'key': "/path/to/keyfile"}
app.run(host="0.0.0.0", port=8443, ssl=ssl)

View File

@@ -6,16 +6,40 @@ filename. The file specified will then be accessible via the given endpoint.
```python
from sanic import Sanic
from sanic.blueprints import Blueprint
app = Sanic(__name__)
# Serves files from the static folder to the URL /static
app.static('/static', './static')
# use url_for to build the url, name defaults to 'static' and can be ignored
app.url_for('static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='static', filename='file.txt') == '/static/file.txt'
# Serves the file /home/ubuntu/test.png when the URL /the_best.png
# is requested
app.static('/the_best.png', '/home/ubuntu/test.png')
app.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
# you can use url_for to build the static file url
# you can omit the name and filename parameters if you don't define them
app.url_for('static', name='best_png') == '/the_best.png'
app.url_for('static', name='best_png', filename='any') == '/the_best.png'
# you need to define the name for other static files
app.static('/another.png', '/home/ubuntu/another.png', name='another')
app.url_for('static', name='another') == '/another.png'
app.url_for('static', name='another', filename='any') == '/another.png'
# also, you can use static for blueprint
bp = Blueprint('bp', url_prefix='/bp')
bp.static('/static', './static')
# serves the file directly
bp.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
app.blueprint(bp)
app.url_for('static', name='bp.static', filename='file.txt') == '/bp/static/file.txt'
app.url_for('static', name='bp.best_png') == '/bp/the_best.png'
app.run(host="0.0.0.0", port=8000)
```
Note: currently you cannot build a URL for a static file using `url_for`.

106
docs/sanic/streaming.md Normal file
View File

@@ -0,0 +1,106 @@
# Streaming
## Request Streaming
Sanic allows you to get request data as a stream, as shown below. When the request ends, `request.stream.get()` returns `None`. Only the post, put and patch decorators accept the `stream` argument.
```python
from sanic import Sanic
from sanic.views import CompositionView
from sanic.views import HTTPMethodView
from sanic.views import stream as stream_decorator
from sanic.blueprints import Blueprint
from sanic.response import stream, text
bp = Blueprint('blueprint_request_stream')
app = Sanic('request_stream')
class SimpleView(HTTPMethodView):
@stream_decorator
async def post(self, request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
@app.post('/stream', stream=True)
async def handler(request):
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
body = body.decode('utf-8').replace('1', 'A')
await response.write(body)
return stream(streaming)
@bp.put('/bp_stream', stream=True)
async def bp_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
async def post_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
app.blueprint(bp)
app.add_route(SimpleView.as_view(), '/method_view')
view = CompositionView()
view.add(['POST'], post_handler, stream=True)
app.add_route(view, '/composition_view')
if __name__ == '__main__':
app.run(host='127.0.0.1', port=8000)
```
## Response Streaming
Sanic allows you to stream content to the client with the `stream` method. This method accepts a coroutine callback which is passed a `StreamingHTTPResponse` object to write to. A simple example follows:
```python
from sanic import Sanic
from sanic.response import stream
app = Sanic(__name__)
@app.route("/")
async def test(request):
async def sample_streaming_fn(response):
await response.write('foo,')
await response.write('bar')
return stream(sample_streaming_fn, content_type='text/csv')
```
This is useful in situations where you want to stream content to the client that originates in an external service, like a database. For example, you can stream database records to the client with the asynchronous cursor that `asyncpg` provides:
```python
@app.route("/")
async def index(request):
async def stream_from_db(response):
conn = await asyncpg.connect(database='test')
async with conn.transaction():
async for record in conn.cursor('SELECT generate_series(0, 10)'):
await response.write(record[0])
return stream(stream_from_db)
```

View File

@@ -20,7 +20,7 @@ def test_index_put_not_allowed():
assert response.status == 405
```
Internally, each time you call one of the `test_client` methods, the Sanic app is run at `127.0.01:42101` and
Internally, each time you call one of the `test_client` methods, the Sanic app is run at `127.0.0.1:42101` and
your test request is executed against your application, using `aiohttp`.
The `test_client` methods accept the following arguments and keyword arguments:
@@ -59,15 +59,69 @@ the available arguments to aiohttp can be found
[in the documentation for ClientSession](https://aiohttp.readthedocs.io/en/stable/client_reference.html#client-session).
### Deprecated: `sanic_endpoint_test`
## pytest-sanic
Prior to version 0.3.2, testing was provided through the `sanic_endpoint_test` method. This method will be deprecated in the next major version after 0.4.0; please use the `test_client` instead.
[pytest-sanic](https://github.com/yunstanford/pytest-sanic) is a pytest plugin; it helps you test your code asynchronously.
Just write tests like:
```
from sanic.utils import sanic_endpoint_test
def test_index_returns_200():
request, response = sanic_endpoint_test(app)
assert response.status == 200
```python
async def test_sanic_db_find_by_id(app):
"""
Let's assume that, in db we have,
{
"id": "123",
"name": "Kobe Bryant",
"team": "Lakers",
}
"""
doc = await app.db["players"].find_by_id("123")
assert doc.name == "Kobe Bryant"
assert doc.team == "Lakers"
```
[pytest-sanic](https://github.com/yunstanford/pytest-sanic) also provides some useful fixtures, like loop, unused_port,
test_server, test_client.
```python
@pytest.yield_fixture
def app():
app = Sanic("test_sanic_app")
@app.route("/test_get", methods=['GET'])
async def test_get(request):
return response.json({"GET": True})
@app.route("/test_post", methods=['POST'])
async def test_post(request):
return response.json({"POST": True})
yield app
@pytest.fixture
def test_cli(loop, app, test_client):
return loop.run_until_complete(test_client(app, protocol=WebSocketProtocol))
#########
# Tests #
#########
async def test_fixture_test_client_get(test_cli):
"""
GET request
"""
resp = await test_cli.get('/test_get')
assert resp.status == 200
resp_json = await resp.json()
assert resp_json == {"GET": True}
async def test_fixture_test_client_post(test_cli):
"""
POST request
"""
resp = await test_cli.post('/test_post')
assert resp.status == 200
resp_json = await resp.json()
assert resp_json == {"POST": True}
```

50
docs/sanic/versioning.md Normal file
View File

@@ -0,0 +1,50 @@
# Versioning
You can pass the `version` keyword to the route decorators, or to a blueprint initializer. It will result in the `v{version}` url prefix where `{version}` is the version number.
## Per route
You can pass a version number to the routes directly.
```python
from sanic import response
@app.route('/text', version=1)
def handle_request(request):
return response.text('Hello world! Version 1')
@app.route('/text', version=2)
def handle_request(request):
return response.text('Hello world! Version 2')
app.run(port=80)
```
Then with curl:
```bash
curl localhost/v1/text
curl localhost/v2/text
```
## Global blueprint version
You can also pass a version number to the blueprint, which will apply to all routes.
```python
from sanic import response
from sanic.blueprints import Blueprint
bp = Blueprint('test', version=1)
@bp.route('/html')
def handle_request(request):
return response.html('<p>Hello world!</p>')
```
Then with curl:
```bash
curl localhost/v1/html
```
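Note that the blueprint still has to be registered on an application for the versioned routes to be served; a minimal sketch:
```python
from sanic import Sanic

app = Sanic()
app.blueprint(bp)
app.run(port=80)
```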

51
docs/sanic/websocket.rst Normal file
View File

@@ -0,0 +1,51 @@
WebSocket
=========
Sanic supports websockets. To set up a WebSocket:
.. code:: python
from sanic import Sanic
from sanic.response import json
from sanic.websocket import WebSocketProtocol
app = Sanic()
@app.websocket('/feed')
async def feed(request, ws):
while True:
data = 'hello!'
print('Sending: ' + data)
await ws.send(data)
data = await ws.recv()
print('Received: ' + data)
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000, protocol=WebSocketProtocol)
Alternatively, the ``app.add_websocket_route`` method can be used instead of the
decorator:
.. code:: python
async def feed(request, ws):
pass
app.add_websocket_route(feed, '/feed')
Handlers for a WebSocket route are passed the request as first argument, and a
WebSocket protocol object as second argument. The protocol object has ``send``
and ``recv`` methods to send and receive data respectively.
You can set up your own WebSocket configuration through ``app.config``, like:
.. code:: python
app.config.WEBSOCKET_MAX_SIZE = 2 ** 20
app.config.WEBSOCKET_MAX_QUEUE = 32
app.config.WEBSOCKET_READ_LIMIT = 2 ** 16
app.config.WEBSOCKET_WRITE_LIMIT = 2 ** 16
Find more in the ``Configuration`` section.

View File

@@ -15,4 +15,6 @@ dependencies:
- httptools>=0.0.9
- ujson>=1.35
- aiofiles>=0.3.0
- https://github.com/channelcat/docutils-fork/zipball/master
- websockets>=3.2
- sphinxcontrib-asyncio>=0.2.0
- https://github.com/channelcat/docutils-fork/zipball/master

View File

@@ -0,0 +1,17 @@
# -*- coding: utf-8 -*-
import asyncio
from sanic import Sanic
app = Sanic()
async def notify_server_started_after_five_seconds():
await asyncio.sleep(5)
print('Server successfully started!')
app.add_task(notify_server_started_after_five_seconds())
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,28 +0,0 @@
from sanic import Sanic
from sanic.response import json
import aiohttp
app = Sanic(__name__)
async def fetch(session, url):
"""
Use session object to perform 'get' request on url
"""
async with session.get(url) as response:
return await response.json()
@app.route("/")
async def test(request):
"""
Download and serve example JSON
"""
url = "https://api.github.com/repos/channelcat/sanic"
async with aiohttp.ClientSession() as session:
response = await fetch(session, url)
return json(response)
app.run(host="0.0.0.0", port=8000, workers=2)

View File

@@ -0,0 +1,42 @@
# -*- coding: utf-8 -*-
from sanic import Sanic
from functools import wraps
from sanic.response import json
app = Sanic()
def check_request_for_authorization_status(request):
# Note: Define your check, for instance cookie, session.
flag = True
return flag
def authorized():
def decorator(f):
@wraps(f)
async def decorated_function(request, *args, **kwargs):
# run some method that checks the request
# for the client's authorization status
is_authorized = check_request_for_authorization_status(request)
if is_authorized:
# the user is authorized.
# run the handler method and return the response
response = await f(request, *args, **kwargs)
return response
else:
# the user is not authorized.
return json({'status': 'not_authorized'}, 403)
return decorated_function
return decorator
@app.route("/")
@authorized()
async def test(request):
return json({'status': 'authorized'})
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,11 +1,10 @@
from sanic import Sanic
from sanic import Blueprint
from sanic.response import json, text
from sanic import Blueprint, Sanic
from sanic.response import file, json
app = Sanic(__name__)
blueprint = Blueprint('name', url_prefix='/my_blueprint')
blueprint2 = Blueprint('name', url_prefix='/my_blueprint2')
blueprint2 = Blueprint('name2', url_prefix='/my_blueprint2')
blueprint3 = Blueprint('name3', url_prefix='/my_blueprint3')
@blueprint.route('/foo')
@@ -18,7 +17,22 @@ async def foo2(request):
return json({'msg': 'hi from blueprint2'})
app.register_blueprint(blueprint)
app.register_blueprint(blueprint2)
@blueprint3.route('/foo')
async def index(request):
return await file('websocket.html')
@app.websocket('/feed')
async def foo3(request, ws):
while True:
data = 'hello!'
print('Sending: ' + data)
await ws.send(data)
data = await ws.recv()
print('Received: ' + data)
app.blueprint(blueprint)
app.blueprint(blueprint2)
app.blueprint(blueprint3)
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -1,46 +0,0 @@
"""
Example of caching using aiocache package. To run it you will need a Redis
instance running in localhost:6379. You can also try with SimpleMemoryCache.
Running this example you will see that the first call lasts 3 seconds and
the rest are instant because the value is retrieved from the Redis.
If you want more info about the package check
https://github.com/argaen/aiocache
"""
import asyncio
import aiocache
from sanic import Sanic
from sanic.response import json
from sanic.log import log
from aiocache import cached
from aiocache.serializers import JsonSerializer
app = Sanic(__name__)
@app.listener('before_server_start')
def init_cache(sanic, loop):
aiocache.settings.set_defaults(
class_="aiocache.RedisCache",
# class_="aiocache.SimpleMemoryCache",
loop=loop
)
@cached(key="my_custom_key", serializer=JsonSerializer())
async def expensive_call():
log.info("Expensive has been called")
await asyncio.sleep(3)
return {"test": True}
@app.route("/")
async def test(request):
log.info("Received GET /")
return json(await expensive_call())
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,22 +1,21 @@
"""
Example intercepting uncaught exceptions using Sanic's error handler framework.
This may be useful for developers wishing to use Sentry, Airbrake, etc.
or a custom system to log and monitor unexpected errors in production.
First we create our own class inheriting from Handler in sanic.exceptions,
and pass in an instance of it when we create our Sanic instance. Inside this
class' default handler, we can do anything including sending exceptions to
an external service.
"""
from sanic.exceptions import Handler, SanicException
from sanic.handlers import ErrorHandler
from sanic.exceptions import SanicException
"""
Imports and code relevant for our CustomHandler class
(Ordinarily this would be in a separate file)
"""
class CustomHandler(Handler):
class CustomHandler(ErrorHandler):
def default(self, request, exception):
# Here, we have access to the exception object
@@ -38,11 +37,10 @@ server's error_handler to an instance of our CustomHandler
"""
from sanic import Sanic
from sanic.response import json
app = Sanic(__name__)
handler = CustomHandler(sanic=app)
handler = CustomHandler()
app.error_handler = handler
@@ -50,8 +48,7 @@ app.error_handler = handler
async def test(request):
# Here, something occurs which causes an unexpected exception
# This exception will flow to our custom handler.
1 / 0
return json({"test": True})
raise SanicException('You Broke It!')
app.run(host="0.0.0.0", port=8000, debug=True)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -1,18 +0,0 @@
## To use this example:
# curl -d '{"name": "John Doe"}' localhost:8000
from sanic import Sanic
from sanic.response import html
from jinja2 import Template
template = Template('Hello {{ name }}!')
app = Sanic(__name__)
@app.route('/')
async def test(request):
data = request.json
return html(template.render(**data))
app.run(host="0.0.0.0", port=8000)

View File

@@ -8,11 +8,12 @@ app = Sanic(__name__)
sem = None
@app.listener('before_server_start')
def init(sanic, loop):
global sem
CONCURRENCY_PER_WORKER = 4
sem = asyncio.Semaphore(CONCURRENCY_PER_WORKER, loop=loop)
concurrency_per_worker = 4
sem = asyncio.Semaphore(concurrency_per_worker, loop=loop)
async def bounded_fetch(session, url):
"""

View File

@@ -0,0 +1,86 @@
'''
Based on example from https://github.com/Skyscanner/aiotask-context
and `examples/{override_logging,run_async}.py`.
Needs https://github.com/Skyscanner/aiotask-context/tree/52efbc21e2e1def2d52abb9a8e951f3ce5e6f690 or newer
$ pip install git+https://github.com/Skyscanner/aiotask-context.git
'''
import asyncio
import uuid
import logging
from signal import signal, SIGINT
from sanic import Sanic
from sanic import response
import uvloop
import aiotask_context as context
log = logging.getLogger(__name__)
class RequestIdFilter(logging.Filter):
def filter(self, record):
record.request_id = context.get('X-Request-ID')
return True
LOG_SETTINGS = {
'version': 1,
'disable_existing_loggers': False,
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'level': 'DEBUG',
'formatter': 'default',
'filters': ['requestid'],
},
},
'filters': {
'requestid': {
'()': RequestIdFilter,
},
},
'formatters': {
'default': {
'format': '%(asctime)s %(levelname)s %(name)s:%(lineno)d %(request_id)s | %(message)s',
},
},
'loggers': {
'': {
'level': 'DEBUG',
'handlers': ['console'],
'propagate': True
},
}
}
app = Sanic(__name__, log_config=LOG_SETTINGS)
@app.middleware('request')
async def set_request_id(request):
request_id = request.headers.get('X-Request-ID') or str(uuid.uuid4())
context.set("X-Request-ID", request_id)
@app.route("/")
async def test(request):
log.debug('X-Request-ID: %s', context.get('X-Request-ID'))
log.info('Hello from test!')
return response.json({"test": True})
if __name__ == '__main__':
asyncio.set_event_loop(uvloop.new_event_loop())
server = app.create_server(host="0.0.0.0", port=8000)
loop = asyncio.get_event_loop()
loop.set_task_factory(context.task_factory)
task = asyncio.ensure_future(server)
try:
loop.run_forever()
except:
loop.stop()

View File

@@ -0,0 +1,28 @@
"""
Modify header or status in response
"""
from sanic import Sanic
from sanic import response
app = Sanic(__name__)
@app.route('/')
def handle_request(request):
return response.json(
{'message': 'Hello world!'},
headers={'X-Served-By': 'sanic'},
status=200
)
@app.route('/unauthorized')
def handle_request(request):
return response.json(
{'message': 'You are not authorized'},
headers={'X-Served-By': 'sanic'},
status=404
)
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -1,6 +1,5 @@
from sanic import Sanic
from sanic.response import text
import json
from sanic import response
import logging
logging_format = "[%(asctime)s] %(process)d-%(levelname)s "
@@ -15,9 +14,11 @@ log = logging.getLogger()
# Set logger to override default basicConfig
sanic = Sanic()
@sanic.route("/")
def test(request):
log.info("received request; responding with 'hey'")
return text("hey")
return response.text("hey")
sanic.run(host="0.0.0.0", port=8000)

49
examples/pytest_xdist.py Normal file
View File

@@ -0,0 +1,49 @@
"""pytest-xdist example for sanic server
Install testing tools:
$ pip install pytest pytest-xdist
Run with xdist params:
$ pytest examples/pytest_xdist.py -n 8 # 8 workers
"""
import re
from sanic import Sanic
from sanic.response import text
from sanic.testing import PORT as PORT_BASE, SanicTestClient
import pytest
@pytest.fixture(scope="session")
def test_port(worker_id):
m = re.search(r'[0-9]+', worker_id)
if m:
num_id = m.group(0)
else:
num_id = 0
port = PORT_BASE + int(num_id)
return port
@pytest.fixture(scope="session")
def app():
app = Sanic()
@app.route('/')
async def index(request):
return text('OK')
return app
@pytest.fixture(scope="session")
def client(app, test_port):
return SanicTestClient(app, test_port)
@pytest.mark.parametrize('run_id', range(100))
def test_index(client, run_id):
request, response = client._sanic_endpoint_test('get', '/')
assert response.status == 200
assert response.text == 'OK'

View File

@@ -0,0 +1,18 @@
from sanic import Sanic
from sanic import response
app = Sanic(__name__)
@app.route('/')
def handle_request(request):
return response.redirect('/redirect')
@app.route('/redirect')
async def test(request):
return response.json({"Redirected": True})
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)

View File

@@ -0,0 +1,10 @@
import requests
# Warning: This is a heavy process.
data = ""
for i in range(1, 250000):
data += str(i)
r = requests.post('http://0.0.0.0:8000/stream', data=data)
print(r.text)

View File

@@ -0,0 +1,65 @@
from sanic import Sanic
from sanic.views import CompositionView
from sanic.views import HTTPMethodView
from sanic.views import stream as stream_decorator
from sanic.blueprints import Blueprint
from sanic.response import stream, text
bp = Blueprint('blueprint_request_stream')
app = Sanic('request_stream')
class SimpleView(HTTPMethodView):
@stream_decorator
async def post(self, request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
@app.post('/stream', stream=True)
async def handler(request):
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
body = body.decode('utf-8').replace('1', 'A')
await response.write(body)
return stream(streaming)
@bp.put('/bp_stream', stream=True)
async def bp_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
async def post_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
app.blueprint(bp)
app.add_route(SimpleView.as_view(), '/method_view')
view = CompositionView()
view.add(['POST'], post_handler, stream=True)
app.add_route(view, '/composition_view')
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)

View File

@@ -1,6 +1,6 @@
from sanic import Sanic
import asyncio
from sanic.response import text
from sanic import Sanic
from sanic import response
from sanic.config import Config
from sanic.exceptions import RequestTimeout
@@ -11,11 +11,11 @@ app = Sanic(__name__)
@app.route('/')
async def test(request):
await asyncio.sleep(3)
return text('Hello, world!')
return response.text('Hello, world!')
@app.exception(RequestTimeout)
def timeout(request, exception):
return text('RequestTimeout from error_handler.', 408)
return response.text('RequestTimeout from error_handler.', 408)
app.run(host='0.0.0.0', port=8000)

View File

@@ -1,18 +1,18 @@
from sanic import Sanic
from sanic.response import json
from multiprocessing import Event
from sanic import response
from signal import signal, SIGINT
import asyncio
import uvloop
app = Sanic(__name__)
@app.route("/")
async def test(request):
return json({"answer": "42"})
return response.json({"answer": "42"})
asyncio.set_event_loop(uvloop.new_event_loop())
server = app.create_server(host="0.0.0.0", port=8001)
server = app.create_server(host="0.0.0.0", port=8000)
loop = asyncio.get_event_loop()
task = asyncio.ensure_future(server)
signal(SIGINT, lambda s, f: loop.stop())

View File

@@ -1,65 +0,0 @@
""" To run this example you need additional aiopg package
"""
import os
import asyncio
import uvloop
import aiopg
from sanic import Sanic
from sanic.response import json
database_name = os.environ['DATABASE_NAME']
database_host = os.environ['DATABASE_HOST']
database_user = os.environ['DATABASE_USER']
database_password = os.environ['DATABASE_PASSWORD']
connection = 'postgres://{0}:{1}@{2}/{3}'.format(database_user,
database_password,
database_host,
database_name)
async def get_pool():
return await aiopg.create_pool(connection)
app = Sanic(name=__name__)
@app.listener('before_server_start')
async def prepare_db(app, loop):
"""
Let's create some table and add some data
"""
async with aiopg.create_pool(connection) as pool:
async with pool.acquire() as conn:
async with conn.cursor() as cur:
await cur.execute('DROP TABLE IF EXISTS sanic_polls')
await cur.execute("""CREATE TABLE sanic_polls (
id serial primary key,
question varchar(50),
pub_date timestamp
);""")
for i in range(0, 100):
await cur.execute("""INSERT INTO sanic_polls
(id, question, pub_date) VALUES ({}, {}, now())
""".format(i, i))
@app.route("/")
async def handle(request):
result = []
async def test_select():
async with aiopg.create_pool(connection) as pool:
async with pool.acquire() as conn:
async with conn.cursor() as cur:
await cur.execute("SELECT question, pub_date FROM sanic_polls")
async for row in cur:
result.append({"question": row[0], "pub_date": row[1]})
res = await test_select()
return json({'polls': result})
if __name__ == '__main__':
app.run(host='0.0.0.0',
port=8000,
debug=True)

View File

@@ -1,67 +0,0 @@
""" To run this example you need additional aiopg package
"""
import os
import asyncio
import datetime
import uvloop
from aiopg.sa import create_engine
import sqlalchemy as sa
from sanic import Sanic
from sanic.response import json
database_name = os.environ['DATABASE_NAME']
database_host = os.environ['DATABASE_HOST']
database_user = os.environ['DATABASE_USER']
database_password = os.environ['DATABASE_PASSWORD']
connection = 'postgres://{0}:{1}@{2}/{3}'.format(database_user,
database_password,
database_host,
database_name)
metadata = sa.MetaData()
polls = sa.Table('sanic_polls', metadata,
sa.Column('id', sa.Integer, primary_key=True),
sa.Column('question', sa.String(50)),
sa.Column("pub_date", sa.DateTime))
app = Sanic(name=__name__)
@app.listener('before_server_start')
async def prepare_db(app, loop):
""" Let's add some data
"""
async with create_engine(connection) as engine:
async with engine.acquire() as conn:
await conn.execute('DROP TABLE IF EXISTS sanic_polls')
await conn.execute("""CREATE TABLE sanic_polls (
id serial primary key,
question varchar(50),
pub_date timestamp
);""")
for i in range(0, 100):
await conn.execute(
polls.insert().values(question=i,
pub_date=datetime.datetime.now())
)
@app.route("/")
async def handle(request):
async with create_engine(connection) as engine:
async with engine.acquire() as conn:
result = []
async for row in conn.execute(polls.select()):
result.append({"question": row.question,
"pub_date": row.pub_date})
return json({"polls": result})
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)

View File

@@ -1,59 +0,0 @@
""" To run this example you need additional asyncpg package
"""
import os
import asyncio
import uvloop
from asyncpg import create_pool
from sanic import Sanic
from sanic.response import json
DB_CONFIG = {
'host': '<host>',
'user': '<username>',
'password': '<password>',
'port': '<port>',
'database': '<database>'
}
def jsonify(records):
"""
Parse asyncpg record response into JSON format
"""
return [{key: value for key, value in
zip(r.keys(), r.values())} for r in records]
app = Sanic(__name__)
@app.listener('before_server_start')
async def create_db(app, loop):
"""
Create some table and add some data
"""
async with create_pool(**DB_CONFIG) as pool:
async with pool.acquire() as connection:
async with connection.transaction():
await connection.execute('DROP TABLE IF EXISTS sanic_post')
await connection.execute("""CREATE TABLE sanic_post (
id serial primary key,
content varchar(50),
post_date timestamp
);""")
for i in range(0, 100):
await connection.execute(f"""INSERT INTO sanic_post
(id, content, post_date) VALUES ({i}, {i}, now())""")
@app.route("/")
async def handler(request):
async with create_pool(**DB_CONFIG) as pool:
async with pool.acquire() as connection:
async with connection.transaction():
results = await connection.fetch('SELECT * FROM sanic_post')
return json({'posts': jsonify(results)})
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)

View File

@@ -1,41 +0,0 @@
""" sanic motor (async driver for mongodb) example
Required packages:
pymongo==3.4.0
motor==1.1
sanic==0.2.0
"""
from sanic import Sanic
from sanic.response import json
app = Sanic('motor_mongodb')
def get_db():
from motor.motor_asyncio import AsyncIOMotorClient
mongo_uri = "mongodb://127.0.0.1:27017/test"
client = AsyncIOMotorClient(mongo_uri)
return client['test']
@app.route('/objects', methods=['GET'])
async def get(request):
db = get_db()
docs = await db.test_col.find().to_list(length=100)
for doc in docs:
doc['id'] = str(doc['_id'])
del doc['_id']
return json(docs)
@app.route('/post', methods=['POST'])
async def new(request):
doc = request.json
print(doc)
db = get_db()
object_id = await db.test_col.save(doc)
return json({'object_id': str(object_id)})
if __name__ == "__main__":
app.run(host='127.0.0.1', port=8000)

View File

@@ -1,79 +0,0 @@
## You need the following additional packages for this example
# aiopg
# peewee_async
# peewee
## sanic imports
from sanic import Sanic
from sanic.response import json
## peewee_async related imports
import peewee
from peewee_async import Manager, PostgresqlDatabase
# we instantiate a custom loop so we can pass it to our db manager
## from peewee_async docs:
# Also there's no need to connect and re-connect before executing async queries
# with manager! It's all automatic. But you can run Manager.connect() or
# Manager.close() when you need it.
# let's create a simple key value store:
class KeyValue(peewee.Model):
key = peewee.CharField(max_length=40, unique=True)
text = peewee.TextField(default='')
class Meta:
database = database
# create table synchronously
KeyValue.create_table(True)
# OPTIONAL: close synchronous connection
database.close()
# OPTIONAL: disable any future synchronous calls
objects.database.allow_sync = False # this will raise AssertionError on ANY sync call
app = Sanic('peewee_example')
@app.listener('before_server_start')
def setup(app, loop):
database = PostgresqlDatabase(database='test',
host='127.0.0.1',
user='postgres',
password='mysecretpassword')
objects = Manager(database, loop=loop)
@app.route('/post/<key>/<value>')
async def post(request, key, value):
"""
Save get parameters to database
"""
obj = await objects.create(KeyValue, key=key, text=value)
return json({'object_id': obj.id})
@app.route('/get')
async def get(request):
"""
Load all objects from database
"""
all_objects = await objects.execute(KeyValue.select())
serialized_obj = []
for obj in all_objects:
serialized_obj.append({
'id': obj.id,
'key': obj.key,
'value': obj.text}
)
return json({'objects': serialized_obj})
if __name__ == "__main__":
app.run(host='0.0.0.0', port=8000)

View File

@@ -0,0 +1,42 @@
from sanic import Sanic
from sanic.views import HTTPMethodView
from sanic.response import text
app = Sanic('some_name')
class SimpleView(HTTPMethodView):
def get(self, request):
return text('I am get method')
def post(self, request):
return text('I am post method')
def put(self, request):
return text('I am put method')
def patch(self, request):
return text('I am patch method')
def delete(self, request):
return text('I am delete method')
class SimpleAsyncView(HTTPMethodView):
async def get(self, request):
return text('I am async get method')
async def post(self, request):
return text('I am async post method')
async def put(self, request):
return text('I am async put method')
app.add_route(SimpleView.as_view(), '/')
app.add_route(SimpleAsyncView.as_view(), '/async')
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -1,12 +1,13 @@
from sanic import Sanic
from sanic.response import json
from sanic import response
app = Sanic(__name__)
@app.route("/")
async def test(request):
return json({"test": True})
return response.json({"test": True})
app.run(host="0.0.0.0", port=8000)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)

13
examples/teapot.py Normal file
View File

@@ -0,0 +1,13 @@
from sanic import Sanic
from sanic import response as res
app = Sanic(__name__)
@app.route("/")
async def test(req):
return res.text("I\'m a teapot", status=418)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,8 +1,8 @@
import os
from sanic import Sanic
from sanic.log import log
from sanic.response import json, text, file
from sanic.log import logger as log
from sanic import response
from sanic.exceptions import ServerError
app = Sanic(__name__)
@@ -10,41 +10,50 @@ app = Sanic(__name__)
@app.route("/")
async def test_async(request):
return json({"test": True})
return response.json({"test": True})
@app.route("/sync", methods=['GET', 'POST'])
def test_sync(request):
return json({"test": True})
return response.json({"test": True})
@app.route("/dynamic/<name>/<id:int>")
def test_params(request, name, id):
return text("yeehaww {} {}".format(name, id))
@app.route("/dynamic/<name>/<i:int>")
def test_params(request, name, i):
return response.text("yeehaww {} {}".format(name, i))
@app.route("/exception")
def exception(request):
raise ServerError("It's dead jim")
@app.route("/await")
async def test_await(request):
import asyncio
await asyncio.sleep(5)
return text("I'm feeling sleepy")
return response.text("I'm feeling sleepy")
@app.route("/file")
async def test_file(request):
return await file(os.path.abspath("setup.py"))
return await response.file(os.path.abspath("setup.py"))
@app.route("/file_stream")
async def test_file_stream(request):
return await response.file_stream(os.path.abspath("setup.py"),
chunk_size=1024)
# ----------------------------------------------- #
# Exceptions
# ----------------------------------------------- #
@app.exception(ServerError)
async def test(request, exception):
return json({"exception": "{}".format(exception), "status": exception.status_code}, status=exception.status_code)
return response.json({"exception": "{}".format(exception), "status": exception.status_code},
status=exception.status_code)
# ----------------------------------------------- #
@@ -53,23 +62,29 @@ async def test(request, exception):
@app.route("/json")
def post_json(request):
return json({"received": True, "message": request.json})
return response.json({"received": True, "message": request.json})
@app.route("/form")
def post_json(request):
return json({"received": True, "form_data": request.form, "test": request.form.get('test')})
def post_form_json(request):
return response.json({"received": True, "form_data": request.form, "test": request.form.get('test')})
@app.route("/query_string")
def query_string(request):
return json({"parsed": True, "args": request.args, "url": request.url, "query_string": request.query_string})
return response.json({"parsed": True, "args": request.args, "url": request.url,
"query_string": request.query_string})
# ----------------------------------------------- #
# Run Server
# ----------------------------------------------- #
@app.listener('before_server_start')
def before_start(app, loop):
log.info("SERVER STARTING")
@app.listener('after_server_start')
def after_start(app, loop):
log.info("OH OH OH OH OHHHHHHHH")
@@ -77,7 +92,13 @@ def after_start(app, loop):
@app.listener('before_server_stop')
def before_stop(app, loop):
log.info("SERVER STOPPING")
@app.listener('after_server_stop')
def after_stop(app, loop):
log.info("TRIED EVERYTHING")
app.run(host="0.0.0.0", port=8000, debug=True)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)

23
examples/unix_socket.py Normal file
View File

@@ -0,0 +1,23 @@
from sanic import Sanic
from sanic import response
import socket
import os
app = Sanic(__name__)
@app.route("/test")
async def test(request):
return response.text("OK")
if __name__ == '__main__':
server_address = './uds_socket'
# Make sure the socket does not already exist
try:
os.unlink(server_address)
except OSError:
if os.path.exists(server_address):
raise
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.bind(server_address)
app.run(sock=sock)

View File

@@ -0,0 +1,20 @@
from sanic import Sanic
from sanic import response
app = Sanic(__name__)
@app.route('/')
async def index(request):
# generate a URL for the endpoint `post_handler`
url = app.url_for('post_handler', post_id=5)
# the URL is `/posts/5`, redirect to it
return response.redirect(url)
@app.route('/posts/<post_id>')
async def post_handler(request, post_id):
return response.text('Post - {}'.format(post_id))
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -1,4 +1,4 @@
from sanic.response import text
from sanic import response
from sanic import Sanic
from sanic.blueprints import Blueprint
@@ -11,29 +11,29 @@ from sanic.blueprints import Blueprint
app = Sanic()
bp = Blueprint("bp", host="bp.example.com")
@app.route('/', host=["example.com",
"somethingelse.com",
"therestofyourdomains.com"])
async def hello(request):
return text("Some defaults")
return response.text("Some defaults")
@app.route('/', host="example.com")
async def hello(request):
return text("Answer")
@app.route('/', host="sub.example.com")
async def hello(request):
return text("42")
return response.text("42")
@bp.route("/question")
async def hello(request):
return text("What is the meaning of life?")
return response.text("What is the meaning of life?")
@bp.route("/answer")
async def hello(request):
return text("42")
return response.text("42")
app.register_blueprint(bp)
app.blueprint(bp)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)
app.run(host="0.0.0.0", port=8000)

29
examples/websocket.html Normal file
View File

@@ -0,0 +1,29 @@
<!DOCTYPE html>
<html>
<head>
<title>WebSocket demo</title>
</head>
<body>
<script>
var ws = new WebSocket('ws://' + document.domain + ':' + location.port + '/feed'),
messages = document.createElement('ul');
ws.onmessage = function (event) {
var messages = document.getElementsByTagName('ul')[0],
message = document.createElement('li'),
content = document.createTextNode('Received: ' + event.data);
message.appendChild(content);
messages.appendChild(message);
};
document.body.appendChild(messages);
window.setInterval(function() {
data = 'bye!'
ws.send(data);
var messages = document.getElementsByTagName('ul')[0],
message = document.createElement('li'),
content = document.createTextNode('Sent: ' + data);
message.appendChild(content);
messages.appendChild(message);
}, 1000);
</script>
</body>
</html>

24
examples/websocket.py Normal file
View File

@@ -0,0 +1,24 @@
from sanic import Sanic
from sanic.response import file
app = Sanic(__name__)
@app.route('/')
async def index(request):
return await file('websocket.html')
@app.websocket('/feed')
async def feed(request, ws):
while True:
data = 'hello!'
print('Sending: ' + data)
await ws.send(data)
data = await ws.recv()
print('Received: ' + data)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -1,18 +1,13 @@
aiocache
aiofiles
aiohttp
aiohttp>=2.3.0,<=3.2.1
chardet<=2.3.0
beautifulsoup4
bottle
coverage
falcon
gunicorn
httptools
kyoukai
pytest
recommonmark
sphinx
sphinx_rtd_theme
tornado
flake8
pytest==3.3.2
tox
ujson
uvloop
ujson; sys_platform != "win32" and implementation_name == "cpython"
uvloop; sys_platform != "win32" and implementation_name == "cpython"
gunicorn
multidict>=4.0,<5.0

4
requirements-docs.txt Normal file
View File

@@ -0,0 +1,4 @@
sphinx
sphinx_rtd_theme
recommonmark
sphinxcontrib-asyncio

View File

@@ -1,4 +1,6 @@
aiofiles
httptools
ujson
uvloop
ujson; sys_platform != "win32" and implementation_name == "cpython"
uvloop; sys_platform != "win32" and implementation_name == "cpython"
websockets>=5.0,<6.0
multidict>=4.0,<5.0

sanic/__init__.py

@@ -1,6 +1,6 @@
from sanic.app import Sanic
from sanic.blueprints import Blueprint
__version__ = '0.4.1'
__version__ = '0.8.2'
__all__ = ['Sanic', 'Blueprint']

sanic/__main__.py

@@ -1,13 +1,17 @@
from argparse import ArgumentParser
from importlib import import_module
from sanic.log import log
from sanic.log import logger
from sanic.app import Sanic
if __name__ == "__main__":
parser = ArgumentParser(prog='sanic')
parser.add_argument('--host', dest='host', type=str, default='127.0.0.1')
parser.add_argument('--port', dest='port', type=int, default=8000)
parser.add_argument('--cert', dest='cert', type=str,
help='location of certificate for SSL')
parser.add_argument('--key', dest='key', type=str,
help='location of keyfile for SSL.')
parser.add_argument('--workers', dest='workers', type=int, default=1, )
parser.add_argument('--debug', dest='debug', action="store_true")
parser.add_argument('module')
@@ -24,13 +28,17 @@ if __name__ == "__main__":
raise ValueError("Module is not a Sanic app, it is a {}. "
"Perhaps you meant {}.app?"
.format(type(app).__name__, args.module))
if args.cert is not None or args.key is not None:
ssl = {'cert': args.cert, 'key': args.key}
else:
ssl = None
app.run(host=args.host, port=args.port,
workers=args.workers, debug=args.debug)
except ImportError:
log.error("No module named {} found.\n"
" Example File: project/sanic_server.py -> app\n"
" Example Module: project.sanic_server.app"
.format(module_name))
workers=args.workers, debug=args.debug, ssl=ssl)
except ImportError as e:
logger.error("No module named {} found.\n"
" Example File: project/sanic_server.py -> app\n"
" Example Module: project.sanic_server.app"
.format(e.name))
except ValueError as e:
log.error("{}".format(e))
logger.error("{}".format(e))
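For reference, these flags let a module be served over TLS straight from the command line, e.g. "python -m sanic project.sanic_server.app --host 0.0.0.0 --port 8443 --cert ./fullchain.pem --key ./privkey.pem" (the module path and certificate locations here are placeholders); when either --cert or --key is supplied, the pair is folded into the ssl dict above and passed through to app.run().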

sanic/app.py

@@ -1,56 +1,64 @@
import os
import logging
import logging.config
import re
import warnings
from asyncio import get_event_loop
from asyncio import get_event_loop, ensure_future, CancelledError
from collections import deque, defaultdict
from functools import partial
from inspect import isawaitable, stack, getmodulename
from inspect import getmodulename, isawaitable, signature, stack
from traceback import format_exc
from urllib.parse import urlencode, urlunparse
from ssl import create_default_context, Purpose
from sanic.config import Config
from sanic.constants import HTTP_METHODS
from sanic.exceptions import ServerError, URLBuildError, SanicException
from sanic.handlers import ErrorHandler
from sanic.log import log
from sanic.response import HTTPResponse
from sanic.log import logger, error_logger, LOGGING_CONFIG_DEFAULTS
from sanic.response import HTTPResponse, StreamingHTTPResponse
from sanic.router import Router
from sanic.server import serve, serve_multiple, HttpProtocol
from sanic.server import serve, serve_multiple, HttpProtocol, Signal
from sanic.static import register as static_register
from sanic.testing import TestClient
from sanic.testing import SanicTestClient
from sanic.views import CompositionView
from sanic.websocket import WebSocketProtocol, ConnectionClosed
import sanic.reloader_helpers as reloader_helpers
class Sanic:
def __init__(self, name=None, router=None, error_handler=None):
# Only set up a default log handler if the
# end-user application didn't set anything up.
if not logging.root.handlers and log.level == logging.NOTSET:
formatter = logging.Formatter(
"%(asctime)s: %(levelname)s: %(message)s")
handler = logging.StreamHandler()
handler.setFormatter(formatter)
log.addHandler(handler)
log.setLevel(logging.INFO)
def __init__(self, name=None, router=None, error_handler=None,
load_env=True, request_class=None,
strict_slashes=False, log_config=None,
configure_logging=True):
# Get name from previous stack frame
if name is None:
frame_records = stack()[1]
name = getmodulename(frame_records[1])
# logging
if configure_logging:
logging.config.dictConfig(log_config or LOGGING_CONFIG_DEFAULTS)
self.name = name
self.router = router or Router()
self.request_class = request_class
self.error_handler = error_handler or ErrorHandler()
self.config = Config()
self.config = Config(load_env=load_env)
self.request_middleware = deque()
self.response_middleware = deque()
self.blueprints = {}
self._blueprint_order = []
self.configure_logging = configure_logging
self.debug = None
self.sock = None
self.strict_slashes = strict_slashes
self.listeners = defaultdict(list)
self.is_running = False
self.is_request_stream = False
self.websocket_enabled = False
self.websocket_tasks = set()
# Register alternative method names
self.go_fast = self.run
@@ -79,12 +87,24 @@ class Sanic:
:param task: future, coroutine or awaitable
"""
@self.listener('before_server_start')
def run(app, loop):
try:
if callable(task):
loop.create_task(task())
try:
self.loop.create_task(task(self))
except TypeError:
self.loop.create_task(task())
else:
loop.create_task(task)
self.loop.create_task(task)
except SanicException:
@self.listener('before_server_start')
def run(app, loop):
if callable(task):
try:
loop.create_task(task(self))
except TypeError:
loop.create_task(task())
else:
loop.create_task(task)
# Decorator
def listener(self, event):
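
As a usage sketch of the task-scheduling path above: the notify coroutine, its sleep interval and the route below are illustrative, not part of the diff.

from asyncio import sleep

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)


async def notify(app):
    # add_task passes the app in when the callable accepts an argument,
    # falling back to calling it with no arguments on TypeError
    await sleep(5)
    print("ready:", app.name)


app.add_task(notify)


@app.route('/')
async def handler(request):
    return text('ok')


if __name__ == '__main__':
    app.run(host="0.0.0.0", port=8000)
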
@@ -92,18 +112,38 @@ class Sanic:
:param event: event to listen to
"""
def decorator(listener):
self.listeners[event].append(listener)
return listener
return decorator
def register_listener(self, listener, event):
"""
Register the listener for a given event.
Args:
listener: callable i.e. setup_db(app, loop)
event: when to register listener i.e. 'before_server_start'
Returns: listener
"""
return self.listener(event)(listener)
# Decorator
def route(self, uri, methods=frozenset({'GET'}), host=None):
def route(self, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=None, stream=False, version=None, name=None):
"""Decorate a function to be registered as a route
:param uri: path of the URL
:param methods: list or tuple of methods allowed
:param host:
:param strict_slashes:
:param stream:
:param version:
:param name: user defined route name for url_for
:return: decorated function
"""
@@ -112,36 +152,75 @@ class Sanic:
if not uri.startswith('/'):
uri = '/' + uri
if stream:
self.is_request_stream = True
if strict_slashes is None:
strict_slashes = self.strict_slashes
def response(handler):
self.router.add(uri=uri, methods=methods, handler=handler,
host=host)
return handler
args = [key for key in signature(handler).parameters.keys()]
if args:
if stream:
handler.is_stream = stream
self.router.add(uri=uri, methods=methods, handler=handler,
host=host, strict_slashes=strict_slashes,
version=version, name=name)
return handler
else:
raise ValueError(
'Required parameter `request` missing'
'in the {0}() route?'.format(
handler.__name__))
return response
# Shorthand method decorators
def get(self, uri, host=None):
return self.route(uri, methods=frozenset({"GET"}), host=host)
def get(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=frozenset({"GET"}), host=host,
strict_slashes=strict_slashes, version=version,
name=name)
def post(self, uri, host=None):
return self.route(uri, methods=frozenset({"POST"}), host=host)
def post(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=frozenset({"POST"}), host=host,
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def put(self, uri, host=None):
return self.route(uri, methods=frozenset({"PUT"}), host=host)
def put(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=frozenset({"PUT"}), host=host,
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def head(self, uri, host=None):
return self.route(uri, methods=frozenset({"HEAD"}), host=host)
def head(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=frozenset({"HEAD"}), host=host,
strict_slashes=strict_slashes, version=version,
name=name)
def options(self, uri, host=None):
return self.route(uri, methods=frozenset({"OPTIONS"}), host=host)
def options(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=frozenset({"OPTIONS"}), host=host,
strict_slashes=strict_slashes, version=version,
name=name)
def patch(self, uri, host=None):
return self.route(uri, methods=frozenset({"PATCH"}), host=host)
def patch(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=frozenset({"PATCH"}), host=host,
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def delete(self, uri, host=None):
return self.route(uri, methods=frozenset({"DELETE"}), host=host)
def delete(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=frozenset({"DELETE"}), host=host,
strict_slashes=strict_slashes, version=version,
name=name)
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None):
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=None, version=None, name=None, stream=False):
"""A helper method to register class instance or
functions as a handler to the application url
routes.
@@ -151,6 +230,10 @@ class Sanic:
:param methods: list or tuple of methods allowed, these are overridden
if using a HTTPMethodView
:param host:
:param strict_slashes:
:param version:
:param name: user defined route name for url_for
:param stream: boolean specifying if the handler is a stream handler
:return: function or class instance
"""
# Handle HTTPMethodView differently
@@ -158,16 +241,104 @@ class Sanic:
methods = set()
for method in HTTP_METHODS:
if getattr(handler.view_class, method.lower(), None):
_handler = getattr(handler.view_class, method.lower(), None)
if _handler:
methods.add(method)
if hasattr(_handler, 'is_stream'):
stream = True
# handle composition view differently
if isinstance(handler, CompositionView):
methods = handler.handlers.keys()
for _handler in handler.handlers.values():
if hasattr(_handler, 'is_stream'):
stream = True
break
self.route(uri=uri, methods=methods, host=host)(handler)
if strict_slashes is None:
strict_slashes = self.strict_slashes
self.route(uri=uri, methods=methods, host=host,
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)(handler)
return handler
# Decorator
def websocket(self, uri, host=None, strict_slashes=None,
subprotocols=None, name=None):
"""Decorate a function to be registered as a websocket route
:param uri: path of the URL
:param subprotocols: optional list of strings with the supported
subprotocols
:param host:
:return: decorated function
"""
self.enable_websocket()
# Fix case where the user did not prefix the URL with a /
# and will probably get confused as to why it's not working
if not uri.startswith('/'):
uri = '/' + uri
if strict_slashes is None:
strict_slashes = self.strict_slashes
def response(handler):
async def websocket_handler(request, *args, **kwargs):
request.app = self
try:
protocol = request.transport.get_protocol()
except AttributeError:
# On Python3.5 the Transport classes in asyncio do not
# have a get_protocol() method as in uvloop
protocol = request.transport._protocol
ws = await protocol.websocket_handshake(request, subprotocols)
# schedule the application handler
# its future is kept in self.websocket_tasks in case it
# needs to be cancelled due to the server being stopped
fut = ensure_future(handler(request, ws, *args, **kwargs))
self.websocket_tasks.add(fut)
try:
await fut
except (CancelledError, ConnectionClosed):
pass
finally:
self.websocket_tasks.remove(fut)
await ws.close()
self.router.add(uri=uri, handler=websocket_handler,
methods=frozenset({'GET'}), host=host,
strict_slashes=strict_slashes, name=name)
return handler
return response
def add_websocket_route(self, handler, uri, host=None,
strict_slashes=None, subprotocols=None, name=None):
"""A helper method to register a function as a websocket route."""
if strict_slashes is None:
strict_slashes = self.strict_slashes
return self.websocket(uri, host=host, strict_slashes=strict_slashes,
subprotocols=subprotocols, name=name)(handler)
def enable_websocket(self, enable=True):
"""Enable or disable the support for websocket.
Websocket is enabled automatically if websocket routes are
added to the application.
"""
if not self.websocket_enabled:
# if the server is stopped, we want to cancel any ongoing
# websocket tasks, to allow the server to exit promptly
@self.listener('before_server_stop')
def cancel_websocket_tasks(app, loop):
for task in self.websocket_tasks:
task.cancel()
self.websocket_enabled = enable
def remove_route(self, uri, clean_cache=True, host=None):
self.router.remove(uri, clean_cache, host)
@@ -181,47 +352,60 @@ class Sanic:
def response(handler):
for exception in exceptions:
self.error_handler.add(exception, handler)
if isinstance(exception, (tuple, list)):
for e in exception:
self.error_handler.add(e, handler)
else:
self.error_handler.add(exception, handler)
return handler
return response
def register_middleware(self, middleware, attach_to='request'):
if attach_to == 'request':
self.request_middleware.append(middleware)
if attach_to == 'response':
self.response_middleware.appendleft(middleware)
return middleware
# Decorator
def middleware(self, middleware_or_request):
"""Decorate and register middleware to be called before a request.
Can either be called as @app.middleware or @app.middleware('request')
"""
def register_middleware(middleware, attach_to='request'):
if attach_to == 'request':
self.request_middleware.append(middleware)
if attach_to == 'response':
self.response_middleware.appendleft(middleware)
return middleware
# Detect which way this was called, @middleware or @middleware('AT')
if callable(middleware_or_request):
return register_middleware(middleware_or_request)
return self.register_middleware(middleware_or_request)
else:
return partial(register_middleware,
return partial(self.register_middleware,
attach_to=middleware_or_request)
# Static Files
def static(self, uri, file_or_directory, pattern='.+',
use_modified_since=True, use_content_range=False):
def static(self, uri, file_or_directory, pattern=r'/?.+',
use_modified_since=True, use_content_range=False,
stream_large_files=False, name='static', host=None,
strict_slashes=None, content_type=None):
"""Register a root to serve files from. The input can either be a
file or a directory. See
"""
static_register(self, uri, file_or_directory, pattern,
use_modified_since, use_content_range)
use_modified_since, use_content_range,
stream_large_files, name, host, strict_slashes,
content_type)
def blueprint(self, blueprint, **options):
"""Register a blueprint on the application.
:param blueprint: Blueprint object
:param blueprint: Blueprint object or (list, tuple) thereof
:param options: option dictionary with blueprint defaults
:return: Nothing
"""
if isinstance(blueprint, (list, tuple)):
for item in blueprint:
self.blueprint(item, **options)
return
if blueprint.name in self.blueprints:
assert self.blueprints[blueprint.name] is blueprint, \
'A blueprint with the name "%s" is already registered. ' \
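
By way of illustration, a short sketch combining the new register_middleware entry point with the list form of blueprint registration; the blueprint names, prefixes and the X-Served-By header are made up for the example.

from sanic import Sanic
from sanic.blueprints import Blueprint
from sanic.response import text

app = Sanic(__name__)


async def add_header(request, response):
    # response middleware may mutate the response in place; returning
    # nothing keeps the original response object
    response.headers['X-Served-By'] = 'sanic'


# equivalent to decorating with @app.middleware('response')
app.register_middleware(add_header, attach_to='response')

v1 = Blueprint('v1', url_prefix='/v1')
v2 = Blueprint('v2', url_prefix='/v2')


@v1.route('/ping')
async def ping_v1(request):
    return text('pong')


@v2.route('/ping')
async def ping_v2(request):
    return text('pong')


# blueprint() now also accepts a list or tuple of blueprints
app.blueprint([v1, v2])
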
@@ -254,7 +438,7 @@ class Sanic:
the output URL's query string.
:param view_name: string referencing the view name
:param **kwargs: keys and values that are used to build request
:param \*\*kwargs: keys and values that are used to build request
parameters and query string arguments.
:return: the built URL
@@ -263,12 +447,31 @@ class Sanic:
URLBuildError
"""
# find the route by the supplied view name
uri, route = self.router.find_route_by_view_name(view_name)
kw = {}
# special static files url_for
if view_name == 'static':
kw.update(name=kwargs.pop('name', 'static'))
elif view_name.endswith('.static'): # blueprint.static
kwargs.pop('name', None)
kw.update(name=view_name)
if not uri or not route:
raise URLBuildError(
'Endpoint with name `{}` was not found'.format(
view_name))
uri, route = self.router.find_route_by_view_name(view_name, **kw)
if not (uri and route):
raise URLBuildError('Endpoint with name `{}` was not found'.format(
view_name))
if view_name == 'static' or view_name.endswith('.static'):
filename = kwargs.pop('filename', None)
# it's static folder
if '<file_uri:' in uri:
folder_ = uri.split('<file_uri:', 1)[0]
if folder_.endswith('/'):
folder_ = folder_[:-1]
if filename.startswith('/'):
filename = filename[1:]
uri = '{}/{}'.format(folder_, filename)
if uri != '/' and uri.endswith('/'):
uri = uri[:-1]
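
A sketch of the static-aware url_for path above, assuming a static mount at /static backed by a local ./static directory and a hypothetical logo.png inside it.

from sanic import Sanic

app = Sanic(__name__)

# serve ./static at /static; the route keeps the default name 'static'
app.static('/static', './static')

# the '<file_uri:' folder prefix is joined with the requested filename
url = app.url_for('static', filename='logo.png')
# expected: '/static/logo.png'
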
@@ -292,6 +495,16 @@ class Sanic:
if netloc is None and external:
netloc = self.config.get('SERVER_NAME', '')
if external:
if not scheme:
if ':' in netloc[:8]:
scheme = netloc[:8].split(':', 1)[0]
else:
scheme = 'http'
if '://' in netloc[:8]:
netloc = netloc.split('://', 1)[-1]
for match in matched_params:
name, _type, pattern = self.router.parse_parameter_string(
match)
@@ -299,7 +512,7 @@ class Sanic:
specific_pattern = '^{}$'.format(pattern)
supplied_param = None
if kwargs.get(name):
if name in kwargs:
supplied_param = kwargs.get(name)
del kwargs[name]
else:
@@ -345,33 +558,30 @@ class Sanic:
def converted_response_type(self, response):
pass
async def handle_request(self, request, response_callback):
async def handle_request(self, request, write_callback, stream_callback):
"""Take a request from the HTTP Server and return a response object
to be sent back The HTTP Server only expects a response object, so
exception handling must be done here
:param request: HTTP Request object
:param response_callback: Response function to be called with the
response as the only argument
:param write_callback: Synchronous response function to be
called with the response as the only argument
:param stream_callback: Coroutine that handles streaming a
StreamingHTTPResponse if produced by the handler.
:return: Nothing
"""
# Define `response` var here to remove warnings about
# allocation before assignment below.
response = None
cancelled = False
try:
# -------------------------------------------- #
# Request Middleware
# -------------------------------------------- #
request.app = self
response = False
# The if improves speed. I don't know why
if self.request_middleware:
for middleware in self.request_middleware:
response = middleware(request)
if isawaitable(response):
response = await response
if response:
break
response = await self._run_request_middleware(request)
# No middleware results
if not response:
# -------------------------------------------- #
@@ -379,7 +589,9 @@ class Sanic:
# -------------------------------------------- #
# Fetch handler from router
handler, args, kwargs = self.router.get(request)
handler, args, kwargs, uri = self.router.get(request)
request.uri_template = uri
if handler is None:
raise ServerError(
("'None' was returned while requesting a "
@@ -389,20 +601,13 @@ class Sanic:
response = handler(request, *args, **kwargs)
if isawaitable(response):
response = await response
# -------------------------------------------- #
# Response Middleware
# -------------------------------------------- #
if self.response_middleware:
for middleware in self.response_middleware:
_response = middleware(request, response)
if isawaitable(_response):
_response = await _response
if _response:
response = _response
break
except CancelledError:
# If response handler times out, the server handles the error
# and cancels the handle_request job.
# In this case, the transport is already closed and we cannot
# issue a response.
response = None
cancelled = True
except Exception as e:
# -------------------------------------------- #
# Response Generation Failed
@@ -413,15 +618,43 @@ class Sanic:
if isawaitable(response):
response = await response
except Exception as e:
if self.debug:
if isinstance(e, SanicException):
response = self.error_handler.default(request=request,
exception=e)
elif self.debug:
response = HTTPResponse(
"Error while handling error: {}\nStack: {}".format(
e, format_exc()))
e, format_exc()), status=500)
else:
response = HTTPResponse(
"An error occurred while handling an error")
"An error occurred while handling an error",
status=500)
finally:
# -------------------------------------------- #
# Response Middleware
# -------------------------------------------- #
# Don't run response middleware if response is None
if response is not None:
try:
response = await self._run_response_middleware(request,
response)
except CancelledError:
# Response middleware can timeout too, as above.
response = None
cancelled = True
except BaseException:
error_logger.exception(
'Exception occurred in one of response '
'middleware handlers'
)
if cancelled:
raise CancelledError()
response_callback(response)
# pass the response to the correct callback
if isinstance(response, StreamingHTTPResponse):
await stream_callback(response)
else:
write_callback(response)
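
A minimal sketch of a handler that returns a StreamingHTTPResponse, which the new handle_request routes through stream_callback instead of write_callback; the sample_streaming_fn helper and the CSV content type are illustrative, and awaiting response.write assumes the revision in this range where streaming writes became coroutines.

from sanic import Sanic
from sanic.response import stream

app = Sanic(__name__)


@app.route('/csv')
async def csv_handler(request):
    async def sample_streaming_fn(response):
        # each write goes straight out to the client as it is produced
        await response.write('foo,')
        await response.write('bar\n')

    return stream(sample_streaming_fn, content_type='text/csv')
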
# -------------------------------------------------------------------- #
# Testing
@@ -429,108 +662,176 @@ class Sanic:
@property
def test_client(self):
return TestClient(self)
return SanicTestClient(self)
# -------------------------------------------------------------------- #
# Execution
# -------------------------------------------------------------------- #
def run(self, host="127.0.0.1", port=8000, debug=False, before_start=None,
after_start=None, before_stop=None, after_stop=None, ssl=None,
sock=None, workers=1, loop=None, protocol=HttpProtocol,
backlog=100, stop_event=None, register_sys_signals=True):
def run(self, host=None, port=None, debug=False, ssl=None,
sock=None, workers=1, protocol=None,
backlog=100, stop_event=None, register_sys_signals=True,
access_log=True, **kwargs):
"""Run the HTTP Server and listen until keyboard interrupt or term
signal. On termination, drain connections before closing.
:param host: Address to host on
:param port: Port to host on
:param debug: Enables debug output (slows server)
:param before_start: Functions to be executed before the server starts
accepting connections
:param after_start: Functions to be executed after the server starts
accepting connections
:param before_stop: Functions to be executed when a stop signal is
received before it is respected
:param after_stop: Functions to be executed when all requests are
complete
:param ssl: SSLContext for SSL encryption of worker(s)
:param ssl: SSLContext, or location of certificate and key
for SSL encryption of worker(s)
:param sock: Socket for the server to accept connections from
:param workers: Number of processes
received before it is respected
:param loop:
received before it is respected
:param backlog:
:param stop_event:
:param register_sys_signals:
:param protocol: Subclass of asyncio protocol class
:return: Nothing
"""
# Default auto_reload to false
auto_reload = False
# If debug is set, default it to true (unless on windows)
if debug and os.name == 'posix':
auto_reload = True
# Allow for overriding either of the defaults
auto_reload = kwargs.get("auto_reload", auto_reload)
if sock is None:
host, port = host or "127.0.0.1", port or 8000
if protocol is None:
protocol = (WebSocketProtocol if self.websocket_enabled
else HttpProtocol)
if stop_event is not None:
if debug:
warnings.simplefilter('default')
warnings.warn("stop_event will be removed from future versions.",
DeprecationWarning)
# compatibility old access_log params
self.config.ACCESS_LOG = access_log
server_settings = self._helper(
host=host, port=port, debug=debug, before_start=before_start,
after_start=after_start, before_stop=before_stop,
after_stop=after_stop, ssl=ssl, sock=sock, workers=workers,
loop=loop, protocol=protocol, backlog=backlog,
stop_event=stop_event, register_sys_signals=register_sys_signals)
host=host, port=port, debug=debug, ssl=ssl, sock=sock,
workers=workers, protocol=protocol, backlog=backlog,
register_sys_signals=register_sys_signals, auto_reload=auto_reload)
try:
self.is_running = True
if workers == 1:
serve(**server_settings)
if auto_reload and os.name != 'posix':
# This condition must be removed after implementing
# auto reloader for other operating systems.
raise NotImplementedError
if auto_reload and \
os.environ.get('SANIC_SERVER_RUNNING') != 'true':
reloader_helpers.watchdog(2)
else:
serve(**server_settings)
else:
serve_multiple(server_settings, workers, stop_event)
except:
log.exception(
serve_multiple(server_settings, workers)
except BaseException:
error_logger.exception(
'Experienced exception while trying to serve')
raise
finally:
self.is_running = False
log.info("Server Stopped")
logger.info("Server Stopped")
def stop(self):
"""This kills the Sanic"""
get_event_loop().stop()
async def create_server(self, host="127.0.0.1", port=8000, debug=False,
before_start=None, after_start=None,
before_stop=None, after_stop=None, ssl=None,
sock=None, loop=None, protocol=HttpProtocol,
backlog=100, stop_event=None):
def __call__(self):
"""gunicorn compatibility"""
return self
async def create_server(self, host=None, port=None, debug=False,
ssl=None, sock=None, protocol=None,
backlog=100, stop_event=None,
access_log=True):
"""Asynchronous version of `run`.
NOTE: This does not support multiprocessing and is not the preferred
way to run a Sanic application.
"""
if sock is None:
host, port = host or "127.0.0.1", port or 8000
if protocol is None:
protocol = (WebSocketProtocol if self.websocket_enabled
else HttpProtocol)
if stop_event is not None:
if debug:
warnings.simplefilter('default')
warnings.warn("stop_event will be removed from future versions.",
DeprecationWarning)
# compatibility old access_log params
self.config.ACCESS_LOG = access_log
server_settings = self._helper(
host=host, port=port, debug=debug, before_start=before_start,
after_start=after_start, before_stop=before_stop,
after_stop=after_stop, ssl=ssl, sock=sock,
loop=loop or get_event_loop(), protocol=protocol,
backlog=backlog, stop_event=stop_event,
run_async=True)
host=host, port=port, debug=debug, ssl=ssl, sock=sock,
loop=get_event_loop(), protocol=protocol,
backlog=backlog, run_async=True)
# Trigger before_start events
await self.trigger_events(
server_settings.get('before_start', []),
server_settings.get('loop')
)
return await serve(**server_settings)
def _helper(self, host="127.0.0.1", port=8000, debug=False,
before_start=None, after_start=None, before_stop=None,
after_stop=None, ssl=None, sock=None, workers=1, loop=None,
async def trigger_events(self, events, loop):
"""Trigger events (functions or async)
:param events: one or more sync or async functions to execute
:param loop: event loop
"""
for event in events:
result = event(loop)
if isawaitable(result):
await result
async def _run_request_middleware(self, request):
# The if improves speed. I don't know why
if self.request_middleware:
for middleware in self.request_middleware:
response = middleware(request)
if isawaitable(response):
response = await response
if response:
return response
return None
async def _run_response_middleware(self, request, response):
if self.response_middleware:
for middleware in self.response_middleware:
_response = middleware(request, response)
if isawaitable(_response):
_response = await _response
if _response:
response = _response
break
return response
def _helper(self, host=None, port=None, debug=False,
ssl=None, sock=None, workers=1, loop=None,
protocol=HttpProtocol, backlog=100, stop_event=None,
register_sys_signals=True, run_async=False):
register_sys_signals=True, run_async=False, auto_reload=False):
"""Helper function used by `run` and `create_server`."""
if loop is not None:
if isinstance(ssl, dict):
# try common aliases
cert = ssl.get('cert') or ssl.get('certificate')
key = ssl.get('key') or ssl.get('keyfile')
if cert is None or key is None:
raise ValueError("SSLContext or certificate and key required.")
context = create_default_context(purpose=Purpose.CLIENT_AUTH)
context.load_cert_chain(cert, keyfile=key)
ssl = context
if stop_event is not None:
if debug:
warnings.simplefilter('default')
warnings.warn("Passing a loop will be deprecated in version"
" 0.4.0 https://github.com/channelcat/sanic/"
"pull/335 has more information.",
DeprecationWarning)
# Deprecate this
if any(arg is not None for arg in (after_stop, after_start,
before_start, before_stop)):
if debug:
warnings.simplefilter('default')
warnings.warn("Passing a before_start, before_stop, after_start or"
"after_stop callback will be deprecated in next "
"major version after 0.4.0",
warnings.warn("stop_event will be removed from future versions.",
DeprecationWarning)
self.error_handler.debug = debug
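
For context, a sketch of the new dict form for ssl handled in _helper above; the certificate paths are placeholders, and 'certificate'/'keyfile' work as aliases per the hunk.

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)


@app.route('/')
async def handler(request):
    return text('secure')


if __name__ == '__main__':
    # _helper builds an SSLContext via create_default_context()
    # and load_cert_chain() from these two paths
    app.run(host="0.0.0.0", port=8443,
            ssl={'cert': '/path/to/fullchain.pem',
                 'key': '/path/to/privkey.pem'})
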
@@ -538,54 +839,65 @@ class Sanic:
server_settings = {
'protocol': protocol,
'request_class': self.request_class,
'is_request_stream': self.is_request_stream,
'router': self.router,
'host': host,
'port': port,
'sock': sock,
'ssl': ssl,
'signal': Signal(),
'debug': debug,
'request_handler': self.handle_request,
'error_handler': self.error_handler,
'request_timeout': self.config.REQUEST_TIMEOUT,
'response_timeout': self.config.RESPONSE_TIMEOUT,
'keep_alive_timeout': self.config.KEEP_ALIVE_TIMEOUT,
'request_max_size': self.config.REQUEST_MAX_SIZE,
'keep_alive': self.config.KEEP_ALIVE,
'loop': loop,
'register_sys_signals': register_sys_signals,
'backlog': backlog
'backlog': backlog,
'access_log': self.config.ACCESS_LOG,
'websocket_max_size': self.config.WEBSOCKET_MAX_SIZE,
'websocket_max_queue': self.config.WEBSOCKET_MAX_QUEUE,
'websocket_read_limit': self.config.WEBSOCKET_READ_LIMIT,
'websocket_write_limit': self.config.WEBSOCKET_WRITE_LIMIT,
'graceful_shutdown_timeout': self.config.GRACEFUL_SHUTDOWN_TIMEOUT
}
# -------------------------------------------- #
# Register start/stop events
# -------------------------------------------- #
for event_name, settings_name, reverse, args in (
("before_server_start", "before_start", False, before_start),
("after_server_start", "after_start", False, after_start),
("before_server_stop", "before_stop", True, before_stop),
("after_server_stop", "after_stop", True, after_stop),
for event_name, settings_name, reverse in (
("before_server_start", "before_start", False),
("after_server_start", "after_start", False),
("before_server_stop", "before_stop", True),
("after_server_stop", "after_stop", True),
):
listeners = self.listeners[event_name].copy()
if args:
if callable(args):
listeners.append(args)
else:
listeners.extend(args)
if reverse:
listeners.reverse()
# Prepend sanic to the arguments when listeners are triggered
listeners = [partial(listener, self) for listener in listeners]
server_settings[settings_name] = listeners
if debug:
log.setLevel(logging.DEBUG)
if self.config.LOGO is not None:
log.debug(self.config.LOGO)
if self.configure_logging and debug:
logger.setLevel(logging.DEBUG)
if self.config.LOGO is not None and \
os.environ.get('SANIC_SERVER_RUNNING') != 'true':
logger.debug(self.config.LOGO)
if run_async:
server_settings['run_async'] = True
# Serve
proto = "http"
if ssl is not None:
proto = "https"
log.info('Goin\' Fast @ {}://{}:{}'.format(proto, host, port))
if host and port and os.environ.get('SANIC_SERVER_RUNNING') != 'true':
proto = "http"
if ssl is not None:
proto = "https"
logger.info('Goin\' Fast @ {}://{}:{}'.format(proto, host, port))
return server_settings

sanic/blueprints.py

@@ -3,7 +3,9 @@ from collections import defaultdict, namedtuple
from sanic.constants import HTTP_METHODS
from sanic.views import CompositionView
FutureRoute = namedtuple('Route', ['handler', 'uri', 'methods', 'host'])
FutureRoute = namedtuple('Route',
['handler', 'uri', 'methods', 'host',
'strict_slashes', 'stream', 'version', 'name'])
FutureListener = namedtuple('Listener', ['handler', 'uri', 'methods', 'host'])
FutureMiddleware = namedtuple('Route', ['middleware', 'args', 'kwargs'])
FutureException = namedtuple('Route', ['handler', 'args', 'kwargs'])
@@ -12,21 +14,49 @@ FutureStatic = namedtuple('Route',
class Blueprint:
def __init__(self, name, url_prefix=None, host=None):
def __init__(self, name,
url_prefix=None,
host=None, version=None,
strict_slashes=False):
"""Create a new blueprint
:param name: unique name of the blueprint
:param url_prefix: URL to be prefixed before all route URLs
:param strict_slashes: strict to trailing slash
"""
self.name = name
self.url_prefix = url_prefix
self.host = host
self.routes = []
self.websocket_routes = []
self.exceptions = []
self.listeners = defaultdict(list)
self.middlewares = []
self.statics = []
self.version = version
self.strict_slashes = strict_slashes
@staticmethod
def group(*blueprints, url_prefix=''):
"""Create a list of blueprints, optionally
grouping them under a general URL prefix.
:param blueprints: blueprints to be registered as a group
:param url_prefix: URL route to be prepended to all sub-prefixes
"""
def chain(nested):
"""itertools.chain() but leaves strings untouched"""
for i in nested:
if isinstance(i, (list, tuple)):
yield from chain(i)
else:
yield i
bps = []
for bp in chain(blueprints):
bp.url_prefix = url_prefix + bp.url_prefix
bps.append(bp)
return bps
def register(self, app, options):
"""Register the blueprint to the sanic app."""
@@ -40,19 +70,38 @@ class Blueprint:
future.handler.__blueprintname__ = self.name
# Prepend the blueprint URI prefix if available
uri = url_prefix + future.uri if url_prefix else future.uri
app.route(
uri=uri[1:] if uri.startswith('//') else uri,
methods=future.methods,
host=future.host or self.host
)(future.handler)
version = future.version or self.version
app.route(uri=uri[1:] if uri.startswith('//') else uri,
methods=future.methods,
host=future.host or self.host,
strict_slashes=future.strict_slashes,
stream=future.stream,
version=version,
name=future.name,
)(future.handler)
for future in self.websocket_routes:
# attach the blueprint name to the handler so that it can be
# prefixed properly in the router
future.handler.__blueprintname__ = self.name
# Prepend the blueprint URI prefix if available
uri = url_prefix + future.uri if url_prefix else future.uri
app.websocket(uri=uri,
host=future.host or self.host,
strict_slashes=future.strict_slashes,
name=future.name,
)(future.handler)
# Middleware
for future in self.middlewares:
if future.args or future.kwargs:
app.middleware(*future.args,
**future.kwargs)(future.middleware)
app.register_middleware(future.middleware,
*future.args,
**future.kwargs)
else:
app.middleware(future.middleware)
app.register_middleware(future.middleware)
# Exceptions
for future in self.exceptions:
@@ -70,25 +119,36 @@ class Blueprint:
for listener in listeners:
app.listener(event)(listener)
def route(self, uri, methods=frozenset({'GET'}), host=None):
def route(self, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=None, stream=False, version=None, name=None):
"""Create a blueprint route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:param methods: list of acceptable HTTP methods.
"""
if strict_slashes is None:
strict_slashes = self.strict_slashes
def decorator(handler):
route = FutureRoute(handler, uri, methods, host)
route = FutureRoute(
handler, uri, methods, host, strict_slashes, stream, version,
name)
self.routes.append(route)
return handler
return decorator
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None):
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=None, version=None, name=None):
"""Create a blueprint route from a function.
:param handler: function for handling uri requests. Accepts function,
or class instance with a view_class method.
:param uri: endpoint at which the route will be accessible.
:param methods: list of acceptable HTTP methods.
:param host:
:param strict_slashes:
:param version:
:param name: user defined route name for url_for
:return: function or class instance
"""
# Handle HTTPMethodView differently
@@ -99,11 +159,44 @@ class Blueprint:
if getattr(handler.view_class, method.lower(), None):
methods.add(method)
if strict_slashes is None:
strict_slashes = self.strict_slashes
# handle composition view differently
if isinstance(handler, CompositionView):
methods = handler.handlers.keys()
self.route(uri=uri, methods=methods, host=host)(handler)
self.route(uri=uri, methods=methods, host=host,
strict_slashes=strict_slashes, version=version,
name=name)(handler)
return handler
def websocket(self, uri, host=None, strict_slashes=None, version=None,
name=None):
"""Create a blueprint websocket route from a decorated function.
:param uri: endpoint at which the route will be accessible.
"""
if strict_slashes is None:
strict_slashes = self.strict_slashes
def decorator(handler):
route = FutureRoute(handler, uri, [], host, strict_slashes,
False, version, name)
self.websocket_routes.append(route)
return handler
return decorator
def add_websocket_route(self, handler, uri, host=None, version=None,
name=None):
"""Create a blueprint websocket route from a function.
:param handler: function for handling uri requests. Accepts function,
or class instance with a view_class method.
:param uri: endpoint at which the route will be accessible.
:return: function or class instance
"""
self.websocket(uri=uri, host=host, version=version, name=name)(handler)
return handler
def listener(self, event):
@@ -145,27 +238,57 @@ class Blueprint:
:param uri: endpoint at which the route will be accessible.
:param file_or_directory: Static asset.
"""
name = kwargs.pop('name', 'static')
if not name.startswith(self.name + '.'):
name = '{}.{}'.format(self.name, name)
kwargs.update(name=name)
strict_slashes = kwargs.get('strict_slashes')
if strict_slashes is None and self.strict_slashes is not None:
kwargs.update(strict_slashes=self.strict_slashes)
static = FutureStatic(uri, file_or_directory, args, kwargs)
self.statics.append(static)
# Shorthand method decorators
def get(self, uri, host=None):
return self.route(uri, methods=["GET"], host=host)
def get(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=["GET"], host=host,
strict_slashes=strict_slashes, version=version,
name=name)
def post(self, uri, host=None):
return self.route(uri, methods=["POST"], host=host)
def post(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=["POST"], host=host,
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def put(self, uri, host=None):
return self.route(uri, methods=["PUT"], host=host)
def put(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=["PUT"], host=host,
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def head(self, uri, host=None):
return self.route(uri, methods=["HEAD"], host=host)
def head(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=["HEAD"], host=host,
strict_slashes=strict_slashes, version=version,
name=name)
def options(self, uri, host=None):
return self.route(uri, methods=["OPTIONS"], host=host)
def options(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=["OPTIONS"], host=host,
strict_slashes=strict_slashes, version=version,
name=name)
def patch(self, uri, host=None):
return self.route(uri, methods=["PATCH"], host=host)
def patch(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=["PATCH"], host=host,
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def delete(self, uri, host=None):
return self.route(uri, methods=["DELETE"], host=host)
def delete(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=["DELETE"], host=host,
strict_slashes=strict_slashes, version=version,
name=name)

sanic/config.py

@@ -2,8 +2,11 @@ import os
import types
SANIC_PREFIX = 'SANIC_'
class Config(dict):
def __init__(self, defaults=None):
def __init__(self, defaults=None, load_env=True, keep_alive=True):
super().__init__(defaults or {})
self.LOGO = """
▄▄▄▄▄
@@ -26,8 +29,21 @@ class Config(dict):
▌ ▐ ▀▀▄▄▄▀
▀▀▄▄▀
"""
self.REQUEST_MAX_SIZE = 100000000 # 100 megababies
self.REQUEST_MAX_SIZE = 100000000 # 100 megabytes
self.REQUEST_TIMEOUT = 60 # 60 seconds
self.RESPONSE_TIMEOUT = 60 # 60 seconds
self.KEEP_ALIVE = keep_alive
self.KEEP_ALIVE_TIMEOUT = 5 # 5 seconds
self.WEBSOCKET_MAX_SIZE = 2 ** 20 # 1 megabytes
self.WEBSOCKET_MAX_QUEUE = 32
self.WEBSOCKET_READ_LIMIT = 2 ** 16
self.WEBSOCKET_WRITE_LIMIT = 2 ** 16
self.GRACEFUL_SHUTDOWN_TIMEOUT = 15.0 # 15 sec
self.ACCESS_LOG = True
if load_env:
prefix = SANIC_PREFIX if load_env is True else load_env
self.load_environment_vars(prefix=prefix)
def __getattr__(self, attr):
try:
@@ -90,3 +106,19 @@ class Config(dict):
for key in dir(obj):
if key.isupper():
self[key] = getattr(obj, key)
def load_environment_vars(self, prefix=SANIC_PREFIX):
"""
Looks for prefixed environment variables and applies
them to the configuration if present.
"""
for k, v in os.environ.items():
if k.startswith(prefix):
_, config_key = k.split(prefix, 1)
try:
self[config_key] = int(v)
except ValueError:
try:
self[config_key] = float(v)
except ValueError:
self[config_key] = v
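A small sketch of the environment-variable loading added above; the SANIC_REQUEST_TIMEOUT value is illustrative and must be set before the Sanic() constructor runs.

import os

from sanic import Sanic

# values are coerced to int, then float, then kept as strings
os.environ['SANIC_REQUEST_TIMEOUT'] = '30'

app = Sanic(__name__)          # load_env=True by default
assert app.config.REQUEST_TIMEOUT == 30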

sanic/cookies.py

@@ -19,7 +19,7 @@ _Translator.update({
def _quote(str):
r"""Quote a string for use in a cookie header.
"""Quote a string for use in a cookie header.
If the string does not need to be double-quoted, then just return the
string. Otherwise, surround the string in doublequotes and quote
(with a \) special characters.
@@ -47,16 +47,15 @@ class CookieJar(dict):
super().__init__()
self.headers = headers
self.cookie_headers = {}
self.header_key = "Set-Cookie"
def __setitem__(self, key, value):
# If this cookie doesn't exist, add it to the header keys
cookie_header = self.cookie_headers.get(key)
if not cookie_header:
if not self.cookie_headers.get(key):
cookie = Cookie(key, value)
cookie['path'] = '/'
cookie_header = MultiHeader("Set-Cookie")
self.cookie_headers[key] = cookie_header
self.headers[cookie_header] = cookie
self.cookie_headers[key] = self.header_key
self.headers.add(self.header_key, cookie)
return super().__setitem__(key, cookie)
else:
self[key].value = value
@@ -67,7 +66,11 @@ class CookieJar(dict):
self[key]['max-age'] = 0
else:
cookie_header = self.cookie_headers[key]
del self.headers[cookie_header]
# remove it from header
cookies = self.headers.popall(cookie_header)
for cookie in cookies:
if cookie.key != key:
self.headers.add(cookie_header, cookie)
del self.cookie_headers[key]
return super().__delitem__(key)
@@ -83,6 +86,7 @@ class Cookie(dict):
"secure": "Secure",
"httponly": "HttpOnly",
"version": "Version",
"samesite": "SameSite",
}
_flags = {'secure', 'httponly'}
@@ -98,7 +102,8 @@ class Cookie(dict):
def __setitem__(self, key, value):
if key not in self._keys:
raise KeyError("Unknown cookie property")
return super().__setitem__(key, value)
if value is not False:
return super().__setitem__(key, value)
def encode(self, encoding):
output = ['%s=%s' % (self.key, _quote(self.value))]
@@ -116,25 +121,9 @@ class Cookie(dict):
))
except AttributeError:
output.append('%s=%s' % (self._keys[key], value))
elif key in self._flags:
if self[key]:
output.append(self._keys[key])
elif key in self._flags and self[key]:
output.append(self._keys[key])
else:
output.append('%s=%s' % (self._keys[key], value))
return "; ".join(output).encode(encoding)
# ------------------------------------------------------------ #
# Header Trickery
# ------------------------------------------------------------ #
class MultiHeader:
"""String-holding object which allow us to set a header within response
that has a unique key, but may contain duplicate header names
"""
def __init__(self, name):
self.name = name
def encode(self):
return self.name.encode()
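For context, a sketch of setting a cookie with the new SameSite key; the cookie name and value are illustrative.

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)


@app.route('/login')
async def login(request):
    response = text('ok')
    response.cookies['session'] = 'abc123'
    response.cookies['session']['httponly'] = True
    # 'samesite' is the new key in Cookie._keys; it is emitted as SameSite=Lax
    response.cookies['session']['samesite'] = 'Lax'
    return response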

sanic/exceptions.py

@@ -1,3 +1,5 @@
from sanic.http import STATUS_CODES
TRACEBACK_STYLE = '''
<style>
body {
@@ -47,6 +49,10 @@ TRACEBACK_STYLE = '''
padding: 5px 10px;
}
.tb-border {
padding-top: 20px;
}
.frame-descriptor {
background-color: #e2eafb;
}
@@ -63,20 +69,35 @@ TRACEBACK_WRAPPER_HTML = '''
{style}
</head>
<body>
<h1>{exc_name}</h1>
<h3><code>{exc_value}</code></h3>
<div class="tb-wrapper">
<p class="tb-header">Traceback (most recent call last):</p>
{frame_html}
<p class="summary">
{inner_html}
<div class="summary">
<p>
<b>{exc_name}: {exc_value}</b>
while handling uri <code>{uri}</code>
while handling path <code>{path}</code>
</p>
</div>
</body>
</html>
'''
TRACEBACK_WRAPPER_INNER_HTML = '''
<h1>{exc_name}</h1>
<h3><code>{exc_value}</code></h3>
<div class="tb-wrapper">
<p class="tb-header">Traceback (most recent call last):</p>
{frame_html}
</div>
'''
TRACEBACK_BORDER = '''
<div class="tb-border">
<b><i>
The above exception was the direct cause of the
following exception:
</i></b>
</div>
'''
TRACEBACK_LINE_HTML = '''
<div class="frame-line">
<p class="frame-descriptor">
@@ -96,6 +117,20 @@ INTERNAL_SERVER_ERROR_HTML = '''
'''
_sanic_exceptions = {}
def add_status_code(code):
"""
Decorator used for adding exceptions to _sanic_exceptions.
"""
def class_decorator(cls):
cls.status_code = code
_sanic_exceptions[code] = cls
return cls
return class_decorator
class SanicException(Exception):
def __init__(self, message, status_code=None):
@@ -105,46 +140,72 @@ class SanicException(Exception):
self.status_code = status_code
@add_status_code(404)
class NotFound(SanicException):
status_code = 404
pass
@add_status_code(400)
class InvalidUsage(SanicException):
status_code = 400
pass
@add_status_code(405)
class MethodNotSupported(SanicException):
def __init__(self, message, method, allowed_methods):
super().__init__(message)
self.headers = dict()
self.headers["Allow"] = ", ".join(allowed_methods)
if method in ['HEAD', 'PATCH', 'PUT', 'DELETE']:
self.headers['Content-Length'] = 0
@add_status_code(500)
class ServerError(SanicException):
status_code = 500
pass
class URLBuildError(SanicException):
status_code = 500
@add_status_code(503)
class ServiceUnavailable(SanicException):
"""The server is currently unavailable (because it is overloaded or
down for maintenance). Generally, this is a temporary state."""
pass
class URLBuildError(ServerError):
pass
class FileNotFound(NotFound):
status_code = 404
def __init__(self, message, path, relative_url):
super().__init__(message)
self.path = path
self.relative_url = relative_url
@add_status_code(408)
class RequestTimeout(SanicException):
status_code = 408
"""The Web server (running the Web site) thinks that there has been too
long an interval of time between 1) the establishment of an IP
connection (socket) between the client and the server and
2) the receipt of any data on that socket, so the server has dropped
the connection. The socket connection has actually been lost - the Web
server has 'timed out' on that particular socket connection.
"""
pass
@add_status_code(413)
class PayloadTooLarge(SanicException):
status_code = 413
pass
class HeaderNotFound(SanicException):
status_code = 400
class HeaderNotFound(InvalidUsage):
pass
@add_status_code(416)
class ContentRangeError(SanicException):
status_code = 416
def __init__(self, message, content_range):
super().__init__(message)
self.headers = {
@@ -153,5 +214,75 @@ class ContentRangeError(SanicException):
}
@add_status_code(403)
class Forbidden(SanicException):
pass
class InvalidRangeType(ContentRangeError):
pass
@add_status_code(401)
class Unauthorized(SanicException):
"""
Unauthorized exception (401 HTTP status code).
:param message: Message describing the exception.
:param status_code: HTTP Status code.
:param scheme: Name of the authentication scheme to be used.
When present, kwargs is used to complete the WWW-Authentication header.
Examples::
# With a Basic auth-scheme, realm MUST be present:
raise Unauthorized("Auth required.",
scheme="Basic",
realm="Restricted Area")
# With a Digest auth-scheme, things are a bit more complicated:
raise Unauthorized("Auth required.",
scheme="Digest",
realm="Restricted Area",
qop="auth, auth-int",
algorithm="MD5",
nonce="abcdef",
opaque="zyxwvu")
# With a Bearer auth-scheme, realm is optional so you can write:
raise Unauthorized("Auth required.", scheme="Bearer")
# or, if you want to specify the realm:
raise Unauthorized("Auth required.",
scheme="Bearer",
realm="Restricted Area")
"""
def __init__(self, message, status_code=None, scheme=None, **kwargs):
super().__init__(message, status_code)
# if auth-scheme is specified, set "WWW-Authenticate" header
if scheme is not None:
values = ['{!s}="{!s}"'.format(k, v) for k, v in kwargs.items()]
challenge = ', '.join(values)
self.headers = {
"WWW-Authenticate": "{} {}".format(scheme, challenge).rstrip()
}
def abort(status_code, message=None):
"""
Raise an exception based on SanicException. Returns the HTTP response
message appropriate for the given status code, unless provided.
:param status_code: The HTTP status code to return.
:param message: The HTTP response body. Defaults to the messages
in response.py for the given status code.
"""
if message is None:
message = STATUS_CODES.get(status_code)
# These are stored as bytes in the STATUS_CODES dict
message = message.decode('utf8')
sanic_exception = _sanic_exceptions.get(status_code, SanicException)
raise sanic_exception(message=message, status_code=status_code)
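A short sketch of abort(); the ITEMS lookup and the route are illustrative.

from sanic import Sanic
from sanic.exceptions import abort
from sanic.response import json

app = Sanic(__name__)

ITEMS = {1: 'first'}


@app.route('/items/<item_id:int>')
async def get_item(request, item_id):
    if item_id not in ITEMS:
        # looks up NotFound in the @add_status_code registry and raises it
        # with the default 'Not Found' body taken from sanic/http.py
        abort(404)
    return json({'item': ITEMS[item_id]})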

sanic/handlers.py

@@ -9,35 +9,63 @@ from sanic.exceptions import (
SanicException,
TRACEBACK_LINE_HTML,
TRACEBACK_STYLE,
TRACEBACK_WRAPPER_HTML)
from sanic.log import log
TRACEBACK_WRAPPER_HTML,
TRACEBACK_WRAPPER_INNER_HTML,
TRACEBACK_BORDER)
from sanic.log import logger
from sanic.response import text, html
class ErrorHandler:
handlers = None
cached_handlers = None
_missing = object()
def __init__(self):
self.handlers = {}
self.handlers = []
self.cached_handlers = {}
self.debug = False
def _render_traceback_html(self, exception, request):
exc_type, exc_value, tb = sys.exc_info()
frames = extract_tb(tb)
def _render_exception(self, exception):
frames = extract_tb(exception.__traceback__)
frame_html = []
for frame in frames:
frame_html.append(TRACEBACK_LINE_HTML.format(frame))
return TRACEBACK_WRAPPER_INNER_HTML.format(
exc_name=exception.__class__.__name__,
exc_value=exception,
frame_html=''.join(frame_html))
def _render_traceback_html(self, exception, request):
exc_type, exc_value, tb = sys.exc_info()
exceptions = []
while exc_value:
exceptions.append(self._render_exception(exc_value))
exc_value = exc_value.__cause__
return TRACEBACK_WRAPPER_HTML.format(
style=TRACEBACK_STYLE,
exc_name=exc_type.__name__,
exc_value=exc_value,
frame_html=''.join(frame_html),
uri=request.url)
exc_name=exception.__class__.__name__,
exc_value=exception,
inner_html=TRACEBACK_BORDER.join(reversed(exceptions)),
path=request.path)
def add(self, exception, handler):
self.handlers[exception] = handler
self.handlers.append((exception, handler))
def lookup(self, exception):
handler = self.cached_handlers.get(exception, self._missing)
if handler is self._missing:
for exception_class, handler in self.handlers:
if isinstance(exception, exception_class):
self.cached_handlers[type(exception)] = handler
return handler
self.cached_handlers[type(exception)] = None
handler = None
return handler
def response(self, request, exception):
"""Fetches and executes an exception handler and returns a response
@@ -47,19 +75,24 @@ class ErrorHandler:
:param exception: Exception to handle
:return: Response object
"""
handler = self.handlers.get(type(exception), self.default)
handler = self.lookup(exception)
response = None
try:
response = handler(request=request, exception=exception)
if handler:
response = handler(request, exception)
if response is None:
response = self.default(request, exception)
except Exception:
self.log(format_exc())
if self.debug:
url = getattr(request, 'url', 'unknown')
response_message = (
'Exception raised in exception handler "{}" '
'for uri: "{}"\n{}').format(
handler.__name__, url, format_exc())
log.error(response_message)
return text(response_message, 500)
response_message = ('Exception raised in exception handler '
'"%s" for uri: "%s"\n%s')
logger.error(response_message,
handler.__name__, url, format_exc())
return text(response_message % (
handler.__name__, url, format_exc()), 500)
else:
return text('An error occurred while handling an error', 500)
return response
@@ -69,7 +102,7 @@ class ErrorHandler:
Override this method in an ErrorHandler subclass to prevent
logging exceptions.
"""
getattr(log, level)(message)
getattr(logger, level)(message)
def default(self, request, exception):
self.log(format_exc())
@@ -82,10 +115,9 @@ class ErrorHandler:
elif self.debug:
html_output = self._render_traceback_html(exception, request)
response_message = (
'Exception occurred while handling uri: "{}"\n{}'.format(
request.url, format_exc()))
log.error(response_message)
response_message = ('Exception occurred while handling uri: '
'"%s"\n%s')
logger.error(response_message, request.url, format_exc())
return html(html_output, status=500)
else:
return html(INTERNAL_SERVER_ERROR_HTML, status=500)

sanic/http.py (new file, 128 lines)

@@ -0,0 +1,128 @@
"""Defines basics of HTTP standard."""
STATUS_CODES = {
100: b'Continue',
101: b'Switching Protocols',
102: b'Processing',
200: b'OK',
201: b'Created',
202: b'Accepted',
203: b'Non-Authoritative Information',
204: b'No Content',
205: b'Reset Content',
206: b'Partial Content',
207: b'Multi-Status',
208: b'Already Reported',
226: b'IM Used',
300: b'Multiple Choices',
301: b'Moved Permanently',
302: b'Found',
303: b'See Other',
304: b'Not Modified',
305: b'Use Proxy',
307: b'Temporary Redirect',
308: b'Permanent Redirect',
400: b'Bad Request',
401: b'Unauthorized',
402: b'Payment Required',
403: b'Forbidden',
404: b'Not Found',
405: b'Method Not Allowed',
406: b'Not Acceptable',
407: b'Proxy Authentication Required',
408: b'Request Timeout',
409: b'Conflict',
410: b'Gone',
411: b'Length Required',
412: b'Precondition Failed',
413: b'Request Entity Too Large',
414: b'Request-URI Too Long',
415: b'Unsupported Media Type',
416: b'Requested Range Not Satisfiable',
417: b'Expectation Failed',
418: b'I\'m a teapot',
422: b'Unprocessable Entity',
423: b'Locked',
424: b'Failed Dependency',
426: b'Upgrade Required',
428: b'Precondition Required',
429: b'Too Many Requests',
431: b'Request Header Fields Too Large',
451: b'Unavailable For Legal Reasons',
500: b'Internal Server Error',
501: b'Not Implemented',
502: b'Bad Gateway',
503: b'Service Unavailable',
504: b'Gateway Timeout',
505: b'HTTP Version Not Supported',
506: b'Variant Also Negotiates',
507: b'Insufficient Storage',
508: b'Loop Detected',
510: b'Not Extended',
511: b'Network Authentication Required'
}
# According to https://tools.ietf.org/html/rfc2616#section-7.1
_ENTITY_HEADERS = frozenset([
'allow',
'content-encoding',
'content-language',
'content-length',
'content-location',
'content-md5',
'content-range',
'content-type',
'expires',
'last-modified',
'extension-header'
])
# According to https://tools.ietf.org/html/rfc2616#section-13.5.1
_HOP_BY_HOP_HEADERS = frozenset([
'connection',
'keep-alive',
'proxy-authenticate',
'proxy-authorization',
'te',
'trailers',
'transfer-encoding',
'upgrade'
])
def has_message_body(status):
"""
According to the following RFC message body and length SHOULD NOT
be included in responses status 1XX, 204 and 304.
https://tools.ietf.org/html/rfc2616#section-4.4
https://tools.ietf.org/html/rfc2616#section-4.3
"""
return status not in (204, 304) and not (100 <= status < 200)
def is_entity_header(header):
"""Checks if the given header is an Entity Header"""
return header.lower() in _ENTITY_HEADERS
def is_hop_by_hop_header(header):
"""Checks if the given header is a Hop By Hop header"""
return header.lower() in _HOP_BY_HOP_HEADERS
def remove_entity_headers(headers,
allowed=('content-location', 'expires')):
"""
Removes all the entity headers present in the headers given.
According to RFC 2616 Section 10.3.5,
Content-Location and Expires are allowed as for the
"strong cache validator".
https://tools.ietf.org/html/rfc2616#section-10.3.5
returns the headers without the entity headers
"""
allowed = set([h.lower() for h in allowed])
headers = {header: value for header, value in headers.items()
if not is_entity_header(header)
and header.lower() not in allowed}
return headers
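A tiny sketch exercising the helpers defined above.

from sanic.http import (STATUS_CODES, has_message_body,
                        is_entity_header, is_hop_by_hop_header)

assert STATUS_CODES[404] == b'Not Found'
assert is_entity_header('Content-Length')
assert is_hop_by_hop_header('Transfer-Encoding')
assert not has_message_body(204)    # 1xx, 204 and 304 carry no body
assert has_message_body(200)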

sanic/log.py

@@ -1,3 +1,63 @@
import logging
import sys
log = logging.getLogger('sanic')
LOGGING_CONFIG_DEFAULTS = dict(
version=1,
disable_existing_loggers=False,
loggers={
"root": {
"level": "INFO",
"handlers": ["console"]
},
"sanic.error": {
"level": "INFO",
"handlers": ["error_console"],
"propagate": True,
"qualname": "sanic.error"
},
"sanic.access": {
"level": "INFO",
"handlers": ["access_console"],
"propagate": True,
"qualname": "sanic.access"
}
},
handlers={
"console": {
"class": "logging.StreamHandler",
"formatter": "generic",
"stream": sys.stdout
},
"error_console": {
"class": "logging.StreamHandler",
"formatter": "generic",
"stream": sys.stderr
},
"access_console": {
"class": "logging.StreamHandler",
"formatter": "access",
"stream": sys.stdout
},
},
formatters={
"generic": {
"format": "%(asctime)s [%(process)d] [%(levelname)s] %(message)s",
"datefmt": "[%Y-%m-%d %H:%M:%S %z]",
"class": "logging.Formatter"
},
"access": {
"format": "%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: " +
"%(request)s %(message)s %(status)d %(byte)d",
"datefmt": "[%Y-%m-%d %H:%M:%S %z]",
"class": "logging.Formatter"
},
}
)
logger = logging.getLogger('root')
error_logger = logging.getLogger('sanic.error')
access_logger = logging.getLogger('sanic.access')
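
A short sketch of how an application might hook into these loggers. It assumes the Sanic constructor in this release accepts a log_config mapping; the level override is only an illustration:

import copy

from sanic import Sanic
from sanic.log import LOGGING_CONFIG_DEFAULTS, logger
from sanic.response import text

# Assumption: Sanic(...) accepts `log_config`; here we only raise the
# access-log level while keeping the default handlers and formatters.
log_config = copy.deepcopy(LOGGING_CONFIG_DEFAULTS)
log_config['loggers']['sanic.access']['level'] = 'WARNING'

app = Sanic(__name__, log_config=log_config)

@app.route('/')
async def index(request):
    logger.info('handling %s', request.path)
    return text('ok')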

sanic/reloader_helpers.py (new file)

@@ -0,0 +1,151 @@
import os
import sys
import signal
import subprocess
from time import sleep
from multiprocessing import Process
def _iter_module_files():
"""This iterates over all relevant Python files.
It goes through all
loaded files from modules, all files in folders of already loaded modules
as well as all files reachable through a package.
"""
# The list call is necessary on Python 3 in case the module
# dictionary modifies during iteration.
for module in list(sys.modules.values()):
if module is None:
continue
filename = getattr(module, '__file__', None)
if filename:
old = None
while not os.path.isfile(filename):
old = filename
filename = os.path.dirname(filename)
if filename == old:
break
else:
if filename[-4:] in ('.pyc', '.pyo'):
filename = filename[:-1]
yield filename
def _get_args_for_reloading():
"""Returns the executable."""
rv = [sys.executable]
rv.extend(sys.argv)
return rv
def restart_with_reloader():
"""Create a new process and a subprocess in it with the same arguments as
this one.
"""
args = _get_args_for_reloading()
new_environ = os.environ.copy()
new_environ['SANIC_SERVER_RUNNING'] = 'true'
cmd = ' '.join(args)
worker_process = Process(
target=subprocess.call, args=(cmd,),
kwargs=dict(shell=True, env=new_environ))
worker_process.start()
return worker_process
def kill_process_children_unix(pid):
"""Find and kill child processes of a process (maximum two level).
:param pid: PID of parent process (process ID)
:return: Nothing
"""
root_process_path = "/proc/{pid}/task/{pid}/children".format(pid=pid)
if not os.path.isfile(root_process_path):
return
with open(root_process_path) as children_list_file:
children_list_pid = children_list_file.read().split()
for child_pid in children_list_pid:
children_proc_path = "/proc/%s/task/%s/children" % \
(child_pid, child_pid)
if not os.path.isfile(children_proc_path):
continue
with open(children_proc_path) as children_list_file_2:
children_list_pid_2 = children_list_file_2.read().split()
for _pid in children_list_pid_2:
try:
os.kill(int(_pid), signal.SIGTERM)
except ProcessLookupError:
continue
try:
os.kill(int(child_pid), signal.SIGTERM)
except ProcessLookupError:
continue
def kill_process_children_osx(pid):
"""Find and kill child processes of a process.
:param pid: PID of parent process (process ID)
:return: Nothing
"""
subprocess.run(['pkill', '-P', str(pid)])
def kill_process_children(pid):
"""Find and kill child processes of a process.
:param pid: PID of parent process (process ID)
:return: Nothing
"""
if sys.platform == 'darwin':
kill_process_children_osx(pid)
elif sys.platform == 'linux':
kill_process_children_unix(pid)
else:
pass # should signal error here
def kill_program_completly(proc):
"""Kill worker and it's child processes and exit.
:param proc: worker process (process ID)
:return: Nothing
"""
kill_process_children(proc.pid)
proc.terminate()
os._exit(0)
def watchdog(sleep_interval):
"""Watch project files, restart worker process if a change happened.
:param sleep_interval: interval in seconds.
:return: Nothing
"""
mtimes = {}
worker_process = restart_with_reloader()
signal.signal(
signal.SIGTERM, lambda *args: kill_program_completly(worker_process))
signal.signal(
signal.SIGINT, lambda *args: kill_program_completly(worker_process))
while True:
for filename in _iter_module_files():
try:
mtime = os.stat(filename).st_mtime
except OSError:
continue
old_time = mtimes.get(filename)
if old_time is None:
mtimes[filename] = mtime
continue
elif mtime > old_time:
kill_process_children(worker_process.pid)
worker_process.terminate()
worker_process = restart_with_reloader()
mtimes[filename] = mtime
break
sleep(sleep_interval)
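
A sketch of how this watchdog is typically reached from user code, assuming auto_reload is the run() keyword wired to it in this release; host and port are placeholders:

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/')
async def index(request):
    return text('hello')

if __name__ == '__main__':
    # Assumption: `auto_reload` enables the reloader watchdog above;
    # on supported platforms debug=True implies it as well.
    app.run(host='127.0.0.1', port=8000, debug=True, auto_reload=True)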


@@ -1,19 +1,28 @@
import sys
import json
import socket
from cgi import parse_header
from collections import namedtuple
from http.cookies import SimpleCookie
from httptools import parse_url
from urllib.parse import parse_qs
from urllib.parse import parse_qs, urlunparse
try:
from ujson import loads as json_loads
except ImportError:
from json import loads as json_loads
if sys.version_info[:2] == (3, 5):
def json_loads(data):
# on Python 3.5 json.loads only supports str not bytes
return json.loads(data.decode())
else:
json_loads = json.loads
from sanic.exceptions import InvalidUsage
from sanic.log import log
from sanic.log import error_logger, logger
DEFAULT_HTTP_CONTENT_TYPE = "application/octet-stream"
# HTTP/1.1: https://www.w3.org/Protocols/rfc2616/rfc2616-sec7.html#sec7.2.1
# > If the media type remains unknown, the recipient SHOULD treat it
# > as type "application/octet-stream"
@@ -36,24 +45,22 @@ class RequestParameters(dict):
class Request(dict):
"""Properties of an HTTP request such as URL, headers, etc."""
__slots__ = (
'app', 'url', 'headers', 'version', 'method', '_cookies', 'transport',
'query_string', 'body',
'parsed_json', 'parsed_args', 'parsed_form', 'parsed_files',
'_ip',
'app', 'headers', 'version', 'method', '_cookies', 'transport',
'body', 'parsed_json', 'parsed_args', 'parsed_form', 'parsed_files',
'_ip', '_parsed_url', 'uri_template', 'stream', '_remote_addr',
'_socket', '_port', '__weakref__', 'raw_url'
)
def __init__(self, url_bytes, headers, version, method, transport):
self.raw_url = url_bytes
# TODO: Content-Encoding detection
url_parsed = parse_url(url_bytes)
self._parsed_url = parse_url(url_bytes)
self.app = None
self.url = url_parsed.path.decode('utf-8')
self.headers = headers
self.version = version
self.method = method
self.transport = transport
self.query_string = None
if url_parsed.query:
self.query_string = url_parsed.query.decode('utf-8')
# Init but do not inhale
self.body = []
@@ -61,15 +68,36 @@ class Request(dict):
self.parsed_form = None
self.parsed_files = None
self.parsed_args = None
self.uri_template = None
self._cookies = None
self.stream = None
def __repr__(self):
if self.method is None or not self.path:
return '<{0}>'.format(self.__class__.__name__)
return '<{0}: {1} {2}>'.format(self.__class__.__name__,
self.method,
self.path)
def __bool__(self):
if self.transport:
return True
return False
@property
def json(self):
if self.parsed_json is None:
try:
self.parsed_json = json_loads(self.body)
except Exception:
raise InvalidUsage("Failed when parsing body as json")
self.load_json()
return self.parsed_json
def load_json(self, loads=json_loads):
try:
self.parsed_json = loads(self.body)
except Exception:
if not self.body:
return None
raise InvalidUsage("Failed when parsing body as json")
return self.parsed_json
@@ -79,9 +107,14 @@ class Request(dict):
:return: token related to request
"""
prefixes = ('Bearer', 'Token')
auth_header = self.headers.get('Authorization')
if auth_header is not None:
return auth_header.split()[1]
for prefix in prefixes:
if prefix in auth_header:
return auth_header.partition(prefix)[-1].strip()
return auth_header
@property
@@ -102,7 +135,7 @@ class Request(dict):
self.parsed_form, self.parsed_files = (
parse_multipart_form(self.body, boundary))
except Exception:
log.exception("Failed when parsing form")
error_logger.exception("Failed when parsing form")
return self.parsed_form
@@ -123,10 +156,14 @@ class Request(dict):
self.parsed_args = RequestParameters()
return self.parsed_args
@property
def raw_args(self):
return {k: v[0] for k, v in self.args.items()}
@property
def cookies(self):
if self._cookies is None:
cookie = self.headers.get('Cookie') or self.headers.get('cookie')
cookie = self.headers.get('Cookie')
if cookie is not None:
cookies = SimpleCookie()
cookies.load(cookie)
@@ -138,10 +175,104 @@ class Request(dict):
@property
def ip(self):
if not hasattr(self, '_ip'):
self._ip = self.transport.get_extra_info('peername')
if not hasattr(self, '_socket'):
self._get_address()
return self._ip
@property
def port(self):
if not hasattr(self, '_socket'):
self._get_address()
return self._port
@property
def socket(self):
if not hasattr(self, '_socket'):
self._get_address()
return self._socket
def _get_address(self):
sock = self.transport.get_extra_info('socket')
if sock.family == socket.AF_INET:
self._socket = (self.transport.get_extra_info('peername') or
(None, None))
self._ip, self._port = self._socket
elif sock.family == socket.AF_INET6:
self._socket = (self.transport.get_extra_info('peername') or
(None, None, None, None))
self._ip, self._port, *_ = self._socket
else:
self._ip, self._port = (None, None)
@property
def remote_addr(self):
"""Attempt to return the original client ip based on X-Forwarded-For.
:return: original client ip.
"""
if not hasattr(self, '_remote_addr'):
forwarded_for = self.headers.get('X-Forwarded-For', '').split(',')
remote_addrs = [
addr for addr in [
addr.strip() for addr in forwarded_for
] if addr
]
if len(remote_addrs) > 0:
self._remote_addr = remote_addrs[0]
else:
self._remote_addr = ''
return self._remote_addr
@property
def scheme(self):
if self.app.websocket_enabled \
and self.headers.get('upgrade') == 'websocket':
scheme = 'ws'
else:
scheme = 'http'
if self.transport.get_extra_info('sslcontext'):
scheme += 's'
return scheme
@property
def host(self):
# it appears that httptools doesn't return the host
# so pull it from the headers
return self.headers.get('Host', '')
@property
def content_type(self):
return self.headers.get('Content-Type', DEFAULT_HTTP_CONTENT_TYPE)
@property
def match_info(self):
"""return matched info after resolving route"""
return self.app.router.get(self)[2]
@property
def path(self):
return self._parsed_url.path.decode('utf-8')
@property
def query_string(self):
if self._parsed_url.query:
return self._parsed_url.query.decode('utf-8')
else:
return ''
@property
def url(self):
return urlunparse((
self.scheme,
self.host,
self.path,
None,
self.query_string,
None))
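
A sketch of a handler reading the properties introduced above (path, query_string, the rebuilt url, and the address helpers); the route and response shape are illustrative:

from sanic import Sanic
from sanic.response import json as json_response

app = Sanic(__name__)

@app.route('/whoami')
async def whoami(request):
    return json_response({
        'path': request.path,                 # from the parsed URL
        'query_string': request.query_string,
        'url': request.url,                   # scheme://host/path?query, rebuilt
        'ip': request.ip,
        'port': request.port,
        'remote_addr': request.remote_addr,   # first X-Forwarded-For entry, or ''
    })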
File = namedtuple('File', ['type', 'body', 'name'])
@@ -159,7 +290,8 @@ def parse_multipart_form(body, boundary):
form_parts = body.split(boundary)
for form_part in form_parts[1:-1]:
file_name = None
file_type = None
content_type = 'text/plain'
content_charset = 'utf-8'
field_name = None
line_index = 2
line_end_index = 0
@@ -172,29 +304,35 @@ def parse_multipart_form(body, boundary):
break
colon_index = form_line.index(':')
form_header_field = form_line[0:colon_index]
form_header_field = form_line[0:colon_index].lower()
form_header_value, form_parameters = parse_header(
form_line[colon_index + 2:])
if form_header_field == 'Content-Disposition':
if 'filename' in form_parameters:
file_name = form_parameters['filename']
if form_header_field == 'content-disposition':
file_name = form_parameters.get('filename')
field_name = form_parameters.get('name')
elif form_header_field == 'Content-Type':
file_type = form_header_value
elif form_header_field == 'content-type':
content_type = form_header_value
content_charset = form_parameters.get('charset', 'utf-8')
post_data = form_part[line_index:-4]
if file_name or file_type:
file = File(type=file_type, name=file_name, body=post_data)
if field_name in files:
files[field_name].append(file)
if field_name:
post_data = form_part[line_index:-4]
if file_name:
form_file = File(type=content_type,
name=file_name,
body=post_data)
if field_name in files:
files[field_name].append(form_file)
else:
files[field_name] = [form_file]
else:
files[field_name] = [file]
value = post_data.decode(content_charset)
if field_name in fields:
fields[field_name].append(value)
else:
fields[field_name] = [value]
else:
value = post_data.decode('utf-8')
if field_name in fields:
fields[field_name].append(value)
else:
fields[field_name] = [value]
logger.debug('Form-data field does not have a \'name\' parameter \
in the Content-Disposition header')
return fields, files
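
A sketch of a handler consuming the parsed multipart data, assuming the usual request.form / request.files accessors; the field and file names are made up:

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/upload', methods=['POST'])
async def upload(request):
    # Plain fields land in request.form, file parts in request.files;
    # both map a field name to a list of values.
    title = request.form.get('title')
    attachment = request.files.get('attachment')  # File(type, body, name)
    size = len(attachment.body) if attachment else 0
    return text('{}: {} bytes received'.format(title, size))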


@@ -1,109 +1,29 @@
from mimetypes import guess_type
from os import path
from ujson import dumps as json_dumps
from urllib.parse import quote_plus
try:
from ujson import dumps as json_dumps
except BaseException:
from json import dumps as json_dumps
from aiofiles import open as open_async
from multidict import CIMultiDict
from sanic import http
from sanic.cookies import CookieJar
COMMON_STATUS_CODES = {
200: b'OK',
400: b'Bad Request',
404: b'Not Found',
500: b'Internal Server Error',
}
ALL_STATUS_CODES = {
100: b'Continue',
101: b'Switching Protocols',
102: b'Processing',
200: b'OK',
201: b'Created',
202: b'Accepted',
203: b'Non-Authoritative Information',
204: b'No Content',
205: b'Reset Content',
206: b'Partial Content',
207: b'Multi-Status',
208: b'Already Reported',
226: b'IM Used',
300: b'Multiple Choices',
301: b'Moved Permanently',
302: b'Found',
303: b'See Other',
304: b'Not Modified',
305: b'Use Proxy',
307: b'Temporary Redirect',
308: b'Permanent Redirect',
400: b'Bad Request',
401: b'Unauthorized',
402: b'Payment Required',
403: b'Forbidden',
404: b'Not Found',
405: b'Method Not Allowed',
406: b'Not Acceptable',
407: b'Proxy Authentication Required',
408: b'Request Timeout',
409: b'Conflict',
410: b'Gone',
411: b'Length Required',
412: b'Precondition Failed',
413: b'Request Entity Too Large',
414: b'Request-URI Too Long',
415: b'Unsupported Media Type',
416: b'Requested Range Not Satisfiable',
417: b'Expectation Failed',
422: b'Unprocessable Entity',
423: b'Locked',
424: b'Failed Dependency',
426: b'Upgrade Required',
428: b'Precondition Required',
429: b'Too Many Requests',
431: b'Request Header Fields Too Large',
500: b'Internal Server Error',
501: b'Not Implemented',
502: b'Bad Gateway',
503: b'Service Unavailable',
504: b'Gateway Timeout',
505: b'HTTP Version Not Supported',
506: b'Variant Also Negotiates',
507: b'Insufficient Storage',
508: b'Loop Detected',
510: b'Not Extended',
511: b'Network Authentication Required'
}
class BaseHTTPResponse:
def _encode_body(self, data):
try:
# Try to encode it regularly
return data.encode()
except AttributeError:
# Convert it to a str if you can't
return str(data).encode()
class HTTPResponse:
__slots__ = ('body', 'status', 'content_type', 'headers', '_cookies')
def __init__(self, body=None, status=200, headers=None,
content_type='text/plain', body_bytes=b''):
self.content_type = content_type
if body is not None:
try:
# Try to encode it regularly
self.body = body.encode()
except AttributeError:
# Convert it to a str if you can't
self.body = str(body).encode()
else:
self.body = body_bytes
self.status = status
self.headers = headers or {}
self._cookies = None
def output(self, version="1.1", keep_alive=False, keep_alive_timeout=None):
# This is all returned in a kind-of funky way
# We tried to make this as fast as possible in pure python
timeout_header = b''
if keep_alive and keep_alive_timeout is not None:
timeout_header = b'Keep-Alive: %d\r\n' % keep_alive_timeout
self.headers['Content-Length'] = self.headers.get(
'Content-Length', len(self.body))
self.headers['Content-Type'] = self.headers.get(
'Content-Type', self.content_type)
def _parse_headers(self):
headers = b''
for name, value in self.headers.items():
try:
@@ -115,25 +35,7 @@ class HTTPResponse:
b'%b: %b\r\n' % (
str(name).encode(), str(value).encode('utf-8')))
# Try to pull from the common codes first
# Speeds up response rate 6% over pulling from all
status = COMMON_STATUS_CODES.get(self.status)
if not status:
status = ALL_STATUS_CODES.get(self.status)
return (b'HTTP/%b %d %b\r\n'
b'Connection: %b\r\n'
b'%b'
b'%b\r\n'
b'%b') % (
version.encode(),
self.status,
status,
b'keep-alive' if keep_alive else b'close',
timeout_header,
headers,
self.body
)
return headers
@property
def cookies(self):
@@ -142,41 +44,181 @@ class HTTPResponse:
return self._cookies
def json(body, status=200, headers=None, **kwargs):
class StreamingHTTPResponse(BaseHTTPResponse):
__slots__ = (
'protocol', 'streaming_fn', 'status',
'content_type', 'headers', '_cookies'
)
def __init__(self, streaming_fn, status=200, headers=None,
content_type='text/plain'):
self.content_type = content_type
self.streaming_fn = streaming_fn
self.status = status
self.headers = CIMultiDict(headers or {})
self._cookies = None
async def write(self, data):
"""Writes a chunk of data to the streaming response.
:param data: bytes-ish data to be written.
"""
if type(data) != bytes:
data = self._encode_body(data)
self.protocol.push_data(
b"%x\r\n%b\r\n" % (len(data), data))
await self.protocol.drain()
async def stream(
self, version="1.1", keep_alive=False, keep_alive_timeout=None):
"""Streams headers, runs the `streaming_fn` callback that writes
content to the response body, then finalizes the response body.
"""
headers = self.get_headers(
version, keep_alive=keep_alive,
keep_alive_timeout=keep_alive_timeout)
self.protocol.push_data(headers)
await self.protocol.drain()
await self.streaming_fn(self)
self.protocol.push_data(b'0\r\n\r\n')
# no need to await drain here after this write, because it is the
# very last thing we write and nothing needs to wait for it.
def get_headers(
self, version="1.1", keep_alive=False, keep_alive_timeout=None):
# This is all returned in a kind-of funky way
# We tried to make this as fast as possible in pure python
timeout_header = b''
if keep_alive and keep_alive_timeout is not None:
timeout_header = b'Keep-Alive: %d\r\n' % keep_alive_timeout
self.headers['Transfer-Encoding'] = 'chunked'
self.headers.pop('Content-Length', None)
self.headers['Content-Type'] = self.headers.get(
'Content-Type', self.content_type)
headers = self._parse_headers()
if self.status == 200:
status = b'OK'
else:
status = http.STATUS_CODES.get(self.status)
return (b'HTTP/%b %d %b\r\n'
b'%b'
b'%b\r\n') % (
version.encode(),
self.status,
status,
timeout_header,
headers
)
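
A sketch of a streaming handler driving StreamingHTTPResponse through the stream() helper defined later in this file; note that write() is now a coroutine and must be awaited:

from sanic import Sanic
from sanic.response import stream

app = Sanic(__name__)

@app.route('/numbers')
async def numbers(request):
    async def streaming_fn(response):
        # Each awaited write sends one chunk and respects flow control.
        for i in range(3):
            await response.write('line {}\n'.format(i))
    return stream(streaming_fn, content_type='text/plain')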
class HTTPResponse(BaseHTTPResponse):
__slots__ = ('body', 'status', 'content_type', 'headers', '_cookies')
def __init__(self, body=None, status=200, headers=None,
content_type='text/plain', body_bytes=b''):
self.content_type = content_type
if body is not None:
self.body = self._encode_body(body)
else:
self.body = body_bytes
self.status = status
self.headers = CIMultiDict(headers or {})
self._cookies = None
def output(
self, version="1.1", keep_alive=False, keep_alive_timeout=None):
# This is all returned in a kind-of funky way
# We tried to make this as fast as possible in pure python
timeout_header = b''
if keep_alive and keep_alive_timeout is not None:
timeout_header = b'Keep-Alive: %d\r\n' % keep_alive_timeout
body = b''
if http.has_message_body(self.status):
body = self.body
self.headers['Content-Length'] = self.headers.get(
'Content-Length', len(self.body))
self.headers['Content-Type'] = self.headers.get(
'Content-Type', self.content_type)
if self.status in (304, 412):
self.headers = http.remove_entity_headers(self.headers)
headers = self._parse_headers()
if self.status == 200:
status = b'OK'
else:
status = http.STATUS_CODES.get(self.status, b'UNKNOWN RESPONSE')
return (b'HTTP/%b %d %b\r\n'
b'Connection: %b\r\n'
b'%b'
b'%b\r\n'
b'%b') % (
version.encode(),
self.status,
status,
b'keep-alive' if keep_alive else b'close',
timeout_header,
headers,
body
)
@property
def cookies(self):
if self._cookies is None:
self._cookies = CookieJar(self.headers)
return self._cookies
def json(body, status=200, headers=None,
content_type="application/json", dumps=json_dumps,
**kwargs):
"""
Returns response object with body in json format.
:param body: Response data to be serialized.
:param status: Response code.
:param headers: Custom Headers.
:param kwargs: Remaining arguments that are passed to the json encoder.
"""
return HTTPResponse(json_dumps(body, **kwargs), headers=headers,
status=status, content_type="application/json")
return HTTPResponse(dumps(body, **kwargs), headers=headers,
status=status, content_type=content_type)
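
A sketch of the new dumps hook, swapping in a pretty-printing serializer built from the standard library; the handler and payload are illustrative:

from functools import partial
from json import dumps

from sanic import Sanic
from sanic.response import json as json_response

app = Sanic(__name__)

# Any callable with a dumps-like signature can be passed through `dumps`.
pretty_dumps = partial(dumps, indent=2, sort_keys=True)

@app.route('/config')
async def config(request):
    return json_response({'b': 1, 'a': 2}, dumps=pretty_dumps)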
def text(body, status=200, headers=None,
content_type="text/plain; charset=utf-8"):
"""
Returns response object with body in text format.
:param body: Response data to be encoded.
:param status: Response code.
:param headers: Custom Headers.
:param content_type:
the content type (string) of the response
:param content_type: the content type (string) of the response
"""
return HTTPResponse(body, status=status, headers=headers,
content_type=content_type)
return HTTPResponse(
body, status=status, headers=headers,
content_type=content_type)
def raw(body, status=200, headers=None,
content_type="application/octet-stream"):
"""
Returns response object without encoding the body.
:param body: Response data.
:param status: Response code.
:param headers: Custom Headers.
:param content_type:
the content type (string) of the response
:param content_type: the content type (string) of the response.
"""
return HTTPResponse(body_bytes=body, status=status, headers=headers,
content_type=content_type)
@@ -185,6 +227,7 @@ def raw(body, status=200, headers=None,
def html(body, status=200, headers=None):
"""
Returns response object with body in html format.
:param body: Response data to be encoded.
:param status: Response code.
:param headers: Custom Headers.
@@ -193,15 +236,22 @@ def html(body, status=200, headers=None):
content_type="text/html; charset=utf-8")
async def file(location, mime_type=None, headers=None, _range=None):
async def file(location, status=200, mime_type=None, headers=None,
filename=None, _range=None):
"""Return a response object with file data.
:param location: Location of file on system.
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param filename: Override filename.
:param _range:
"""
filename = path.split(location)[-1]
headers = headers or {}
if filename:
headers.setdefault(
'Content-Disposition',
'attachment; filename="{}"'.format(filename))
filename = filename or path.split(location)[-1]
async with open_async(location, mode='rb') as _file:
if _range:
@@ -213,13 +263,94 @@ async def file(location, mime_type=None, headers=None, _range=None):
out_stream = await _file.read()
mime_type = mime_type or guess_type(filename)[0] or 'text/plain'
return HTTPResponse(status=200,
return HTTPResponse(status=status,
headers=headers,
content_type=mime_type,
body_bytes=out_stream)
async def file_stream(location, status=200, chunk_size=4096, mime_type=None,
headers=None, filename=None, _range=None):
"""Return a streaming response object with file data.
:param location: Location of file on system.
:param chunk_size: The size of each chunk in the stream (in bytes)
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param filename: Override filename.
:param _range:
"""
headers = headers or {}
if filename:
headers.setdefault(
'Content-Disposition',
'attachment; filename="{}"'.format(filename))
filename = filename or path.split(location)[-1]
_file = await open_async(location, mode='rb')
async def _streaming_fn(response):
nonlocal _file, chunk_size
try:
if _range:
chunk_size = min((_range.size, chunk_size))
await _file.seek(_range.start)
to_send = _range.size
while to_send > 0:
content = await _file.read(chunk_size)
if len(content) < 1:
break
to_send -= len(content)
await response.write(content)
else:
while True:
content = await _file.read(chunk_size)
if len(content) < 1:
break
await response.write(content)
finally:
await _file.close()
return # Returning from this fn closes the stream
mime_type = mime_type or guess_type(filename)[0] or 'text/plain'
if _range:
headers['Content-Range'] = 'bytes %s-%s/%s' % (
_range.start, _range.end, _range.total)
return StreamingHTTPResponse(streaming_fn=_streaming_fn,
status=status,
headers=headers,
content_type=mime_type)
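
A sketch of serving a large file through file_stream; the path, filename, and chunk size are placeholders:

from sanic import Sanic
from sanic.response import file_stream

app = Sanic(__name__)

@app.route('/download')
async def download(request):
    # file_stream is a coroutine and returns a StreamingHTTPResponse.
    return await file_stream(
        '/var/data/archive.bin',
        chunk_size=8192,
        filename='archive.bin',   # sets Content-Disposition: attachment
        mime_type='application/octet-stream')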
def stream(
streaming_fn, status=200, headers=None,
content_type="text/plain; charset=utf-8"):
"""Accepts an coroutine `streaming_fn` which can be used to
write chunks to a streaming response. Returns a `StreamingHTTPResponse`.
Example usage::
@app.route("/")
async def index(request):
async def streaming_fn(response):
await response.write('foo')
await response.write('bar')
return stream(streaming_fn, content_type='text/plain')
:param streaming_fn: A coroutine that accepts a response and
writes content to that response.
:param content_type: the content type (string) of the response.
:param headers: Custom Headers.
"""
return StreamingHTTPResponse(
streaming_fn,
headers=headers,
content_type=content_type,
status=status
)
def redirect(to, headers=None, status=302,
content_type="text/html; charset=utf-8"):
"""Abort execution and cause a 302 redirect (by default).
@@ -232,8 +363,11 @@ def redirect(to, headers=None, status=302,
"""
headers = headers or {}
# URL Quote the URL before redirecting
safe_to = quote_plus(to, safe=":/#?&=@[]!$&'()*+,;")
# According to RFC 7231, a relative URI is now permitted.
headers['Location'] = to
headers['Location'] = safe_to
return HTTPResponse(
status=status,


@@ -1,14 +1,16 @@
import re
import uuid
from collections import defaultdict, namedtuple
from collections.abc import Iterable
from functools import lru_cache
from urllib.parse import unquote
from sanic.exceptions import NotFound, InvalidUsage
from sanic.exceptions import NotFound, MethodNotSupported
from sanic.views import CompositionView
Route = namedtuple(
'Route',
['handler', 'methods', 'pattern', 'parameters', 'name'])
['handler', 'methods', 'pattern', 'parameters', 'name', 'uri'])
Parameter = namedtuple('Parameter', ['name', 'cast'])
REGEX_TYPES = {
@@ -16,6 +18,9 @@ REGEX_TYPES = {
'int': (int, r'\d+'),
'number': (float, r'[0-9\\.]+'),
'alpha': (str, r'[A-Za-z]+'),
'path': (str, r'[^/].*?'),
'uuid': (uuid.UUID, r'[A-Fa-f0-9]{8}-[A-Fa-f0-9]{4}-'
r'[A-Fa-f0-9]{4}-[A-Fa-f0-9]{4}-[A-Fa-f0-9]{12}')
}
ROUTER_CACHE_SIZE = 1024
@@ -66,18 +71,22 @@ class Router:
def __init__(self):
self.routes_all = {}
self.routes_names = {}
self.routes_static_files = {}
self.routes_static = {}
self.routes_dynamic = defaultdict(list)
self.routes_always_check = []
self.hosts = set()
def parse_parameter_string(self, parameter_string):
@classmethod
def parse_parameter_string(cls, parameter_string):
"""Parse a parameter string into its constituent name, type, and
pattern
For example:
`parse_parameter_string('<param_one:[A-z]>')` ->
('param_one', str, '[A-z]')
For example::
parse_parameter_string('<param_one:[A-z]>')` ->
('param_one', str, '[A-z]')
:param parameter_string: String to parse
:return: tuple containing
@@ -88,6 +97,10 @@ class Router:
pattern = 'string'
if ':' in parameter_string:
name, pattern = parameter_string.split(':', 1)
if not name:
raise ValueError(
"Invalid parameter syntax: {}".format(parameter_string)
)
default = (str, pattern)
# Pull from pre-configured types
@@ -95,26 +108,8 @@ class Router:
return name, _type, pattern
def add(self, uri, methods, handler, host=None):
# add regular version
self._add(uri, methods, handler, host)
slash_is_missing = (
not uri[-1] == '/'
and not self.routes_all.get(uri + '/', False)
)
without_slash_is_missing = (
uri[-1] == '/'
and not self.routes_all.get(uri[:-1], False)
and not uri == '/'
)
# add version with trailing slash
if slash_is_missing:
self._add(uri + '/', methods, handler, host)
# add version without trailing slash
elif without_slash_is_missing:
self._add(uri[:-1], methods, handler, host)
def _add(self, uri, methods, handler, host=None):
def add(self, uri, methods, handler, host=None, strict_slashes=False,
version=None, name=None):
"""Add a handler to the route list
:param uri: path to match
@@ -122,6 +117,63 @@ class Router:
provided, any method is allowed
:param handler: request handler function.
When executed, it should provide a response object.
:param strict_slashes: strict to trailing slash
:param version: current version of the route or blueprint. See
docs for further details.
:return: Nothing
"""
if version is not None:
version = re.escape(str(version).strip('/').lstrip('v'))
uri = "/".join(["/v{}".format(version), uri.lstrip('/')])
# add regular version
self._add(uri, methods, handler, host, name)
if strict_slashes:
return
if not isinstance(host, str) and host is not None:
# we have gotten back to the top of the recursion tree where the
# host was originally a list. By now, we've processed the strict
# slashes logic on the leaf nodes (the individual host strings in
# the list of host)
return
# Add versions with and without trailing /
slashed_methods = self.routes_all.get(uri + '/', frozenset({}))
unslashed_methods = self.routes_all.get(uri[:-1], frozenset({}))
if isinstance(methods, Iterable):
_slash_is_missing = all(method in slashed_methods for
method in methods)
_without_slash_is_missing = all(method in unslashed_methods for
method in methods)
else:
_slash_is_missing = methods in slashed_methods
_without_slash_is_missing = methods in unslashed_methods
slash_is_missing = (
not uri[-1] == '/' and not _slash_is_missing
)
without_slash_is_missing = (
uri[-1] == '/' and not
_without_slash_is_missing and not
uri == '/'
)
# add version with trailing slash
if slash_is_missing:
self._add(uri + '/', methods, handler, host, name)
# add version without trailing slash
elif without_slash_is_missing:
self._add(uri[:-1], methods, handler, host, name)
def _add(self, uri, methods, handler, host=None, name=None):
"""Add a handler to the route list
:param uri: path to match
:param methods: sequence of accepted method names. If none are
provided, any method is allowed
:param handler: request handler function.
When executed, it should provide a response object.
:param name: user defined route name for url_for
:return: Nothing
"""
if host is not None:
@@ -135,11 +187,8 @@ class Router:
"host strings, not {!r}".format(host))
for host_ in host:
self.add(uri, methods, handler, host_)
self.add(uri, methods, handler, host_, name)
return
else:
# default host
self.hosts.add('*')
# Dict for faster lookups of if method allowed
if methods:
@@ -157,10 +206,10 @@ class Router:
parameters.append(parameter)
# Mark the whole route as unhashable if it has the hash key in it
if re.search('(^|[^^]){1}/', pattern):
if re.search(r'(^|[^^]){1}/', pattern):
properties['unhashable'] = True
# Mark the route as unhashable if it matches the hash key
elif re.search(pattern, '/'):
elif re.search(r'/', pattern):
properties['unhashable'] = True
return '({})'.format(pattern)
@@ -195,33 +244,49 @@ class Router:
if properties['unhashable']:
routes_to_check = self.routes_always_check
ndx, route = self.check_dynamic_route_exists(
pattern, routes_to_check)
pattern, routes_to_check, parameters)
else:
routes_to_check = self.routes_dynamic[url_hash(uri)]
ndx, route = self.check_dynamic_route_exists(
pattern, routes_to_check)
pattern, routes_to_check, parameters)
if ndx != -1:
# Pop the ndx of the route, no dups of the same route
routes_to_check.pop(ndx)
else:
route = self.routes_all.get(uri)
# prefix the handler name with the blueprint name
# if available
# special prefix for static files
is_static = False
if name and name.startswith('_static_'):
is_static = True
name = name.split('_static_', 1)[-1]
if hasattr(handler, '__blueprintname__'):
handler_name = '{}.{}'.format(
handler.__blueprintname__, name or handler.__name__)
else:
handler_name = name or getattr(handler, '__name__', None)
if route:
route = merge_route(route, methods, handler)
else:
# prefix the handler name with the blueprint name
# if available
if hasattr(handler, '__blueprintname__'):
handler_name = '{}.{}'.format(
handler.__blueprintname__, handler.__name__)
else:
handler_name = getattr(handler, '__name__', None)
route = Route(
handler=handler, methods=methods, pattern=pattern,
parameters=parameters, name=handler_name)
parameters=parameters, name=handler_name, uri=uri)
self.routes_all[uri] = route
if is_static:
pair = self.routes_static_files.get(handler_name)
if not (pair and (pair[0] + '/' == uri or uri + '/' == pair[0])):
self.routes_static_files[handler_name] = (uri, route)
else:
pair = self.routes_names.get(handler_name)
if not (pair and (pair[0] + '/' == uri or uri + '/' == pair[0])):
self.routes_names[handler_name] = (uri, route)
if properties['unhashable']:
self.routes_always_check.append(route)
elif parameters:
@@ -230,9 +295,9 @@ class Router:
self.routes_static[uri] = route
@staticmethod
def check_dynamic_route_exists(pattern, routes_to_check):
def check_dynamic_route_exists(pattern, routes_to_check, parameters):
for ndx, route in enumerate(routes_to_check):
if route.pattern == pattern:
if route.pattern == pattern and route.parameters == parameters:
return ndx, route
else:
return -1, None
@@ -242,6 +307,16 @@ class Router:
uri = host + uri
try:
route = self.routes_all.pop(uri)
for handler_name, pairs in self.routes_names.items():
if pairs[0] == uri:
self.routes_names.pop(handler_name)
break
for handler_name, pairs in self.routes_static_files.items():
if pairs[0] == uri:
self.routes_static_files.pop(handler_name)
break
except KeyError:
raise RouteDoesNotExist("Route was not registered: {}".format(uri))
@@ -257,20 +332,20 @@ class Router:
self._get.cache_clear()
@lru_cache(maxsize=ROUTER_CACHE_SIZE)
def find_route_by_view_name(self, view_name):
def find_route_by_view_name(self, view_name, name=None):
"""Find a route in the router based on the specified view name.
:param view_name: string of view name to search by
:param kwargs: additional params, usually for static files
:return: tuple containing (uri, Route)
"""
if not view_name:
return (None, None)
for uri, route in self.routes_all.items():
if route.name == view_name:
return uri, route
if view_name == 'static' or view_name.endswith('.static'):
return self.routes_static_files.get(name, (None, None))
return (None, None)
return self.routes_names.get(view_name, (None, None))
def get(self, request):
"""Get a request handler based on the URL of the request, or raises an
@@ -281,14 +356,24 @@ class Router:
"""
# No virtual hosts specified; default behavior
if not self.hosts:
return self._get(request.url, request.method, '')
return self._get(request.path, request.method, '')
# virtual hosts specified; try to match route to the host header
try:
return self._get(request.url, request.method,
return self._get(request.path, request.method,
request.headers.get("Host", ''))
# try default hosts
except NotFound:
return self._get(request.url, request.method, '')
return self._get(request.path, request.method, '')
def get_supported_methods(self, url):
"""Get a list of supported methods for a url and optional host.
:param url: URL string (including host)
:return: frozenset of supported methods
"""
route = self.routes_all.get(url)
# if methods are None then this logic will prevent an error
return getattr(route, 'methods', None) or frozenset()
@lru_cache(maxsize=ROUTER_CACHE_SIZE)
def _get(self, url, method, host):
@@ -299,12 +384,13 @@ class Router:
:param method: request method
:return: handler, arguments, keyword arguments
"""
url = host + url
url = unquote(host + url)
# Check against known static routes
route = self.routes_static.get(url)
method_not_supported = InvalidUsage(
'Method {} not allowed for URL {}'.format(
method, url), status_code=405)
method_not_supported = MethodNotSupported(
'Method {} not allowed for URL {}'.format(method, url),
method=method,
allowed_methods=self.get_supported_methods(url))
if route:
if route.methods and method not in route.methods:
raise method_not_supported
@@ -338,4 +424,18 @@ class Router:
route_handler = route.handler
if hasattr(route_handler, 'handlers'):
route_handler = route_handler.handlers[method]
return route_handler, [], kwargs
return route_handler, [], kwargs, route.uri
def is_stream_handler(self, request):
""" Handler for request is stream or not.
:param request: Request object
:return: bool
"""
try:
handler = self.get(request)[0]
except (NotFound, MethodNotSupported):
return False
if (hasattr(handler, 'view_class') and
hasattr(handler.view_class, request.method.lower())):
handler = getattr(handler.view_class, request.method.lower())
return hasattr(handler, 'is_stream')
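
A sketch of how the new version and strict_slashes parameters are typically reached through the application-level route decorator, assuming it forwards them to Router.add as in this release:

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

# Served at /v1/users; a trailing-slash variant is registered automatically
# because strict_slashes is not enabled here.
@app.route('/users', version=1)
async def users_v1(request):
    return text('v1 user list')

# Only /health (and not /health/) matches when strict_slashes=True.
@app.route('/health', strict_slashes=True)
async def health(request):
    return text('ok')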

View File

@@ -1,28 +1,37 @@
import asyncio
import os
import traceback
import warnings
from functools import partial
from inspect import isawaitable
from multiprocessing import Process, Event
from os import set_inheritable
from signal import SIGTERM, SIGINT
from signal import signal as signal_func
from socket import socket, SOL_SOCKET, SO_REUSEADDR
from multiprocessing import Process
from signal import (
SIGTERM, SIGINT, SIG_IGN,
signal as signal_func,
Signals
)
from socket import (
socket,
SOL_SOCKET,
SO_REUSEADDR,
)
from time import time
from httptools import HttpRequestParser
from httptools.parser.errors import HttpParserError
from multidict import CIMultiDict
try:
import uvloop as async_loop
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
async_loop = asyncio
pass
from sanic.log import log
from sanic.log import logger, access_logger
from sanic.response import HTTPResponse
from sanic.request import Request
from sanic.exceptions import (
RequestTimeout, PayloadTooLarge, InvalidUsage, ServerError)
RequestTimeout, PayloadTooLarge, InvalidUsage, ServerError,
ServiceUnavailable)
current_time = None
@@ -31,25 +40,6 @@ class Signal:
stopped = False
class CIDict(dict):
"""Case Insensitive dict where all keys are converted to lowercase
This does not maintain the inputted case when calling items() or keys()
in favor of speed, since headers are case insensitive
"""
def get(self, key, default=None):
return super().get(key.casefold(), default)
def __getitem__(self, key):
return super().__getitem__(key.casefold())
def __setitem__(self, key, value):
return super().__setitem__(key.casefold(), value)
def __contains__(self, key):
return super().__contains__(key.casefold())
class HttpProtocol(asyncio.Protocol):
__slots__ = (
# event loop, connection
@@ -57,29 +47,65 @@ class HttpProtocol(asyncio.Protocol):
# request params
'parser', 'request', 'url', 'headers',
# request config
'request_handler', 'request_timeout', 'request_max_size',
'request_handler', 'request_timeout', 'response_timeout',
'keep_alive_timeout', 'request_max_size', 'request_class',
'is_request_stream', 'router',
# enable or disable access log purpose
'access_log',
# connection management
'_total_request_size', '_timeout_handler', '_last_communication_time')
'_total_request_size', '_request_timeout_handler',
'_response_timeout_handler', '_keep_alive_timeout_handler',
'_last_request_time', '_last_response_time', '_is_stream_handler',
'_not_paused')
def __init__(self, *, loop, request_handler, error_handler,
signal=Signal(), connections=set(), request_timeout=60,
request_max_size=None):
response_timeout=60, keep_alive_timeout=5,
request_max_size=None, request_class=None, access_log=True,
keep_alive=True, is_request_stream=False, router=None,
state=None, debug=False, **kwargs):
self.loop = loop
self.transport = None
self.request = None
self.parser = None
self.url = None
self.headers = None
self.router = router
self.signal = signal
self.access_log = access_log
self.connections = connections
self.request_handler = request_handler
self.error_handler = error_handler
self.request_timeout = request_timeout
self.response_timeout = response_timeout
self.keep_alive_timeout = keep_alive_timeout
self.request_max_size = request_max_size
self.request_class = request_class or Request
self.is_request_stream = is_request_stream
self._is_stream_handler = False
self._not_paused = asyncio.Event(loop=loop)
self._total_request_size = 0
self._timeout_handler = None
self._request_timeout_handler = None
self._response_timeout_handler = None
self._keep_alive_timeout_handler = None
self._last_request_time = None
self._last_response_time = None
self._request_handler_task = None
self._request_stream_task = None
self._keep_alive = keep_alive
self._header_fragment = b''
self.state = state if state else {}
if 'requests_count' not in self.state:
self.state['requests_count'] = 0
self._debug = debug
self._not_paused.set()
@property
def keep_alive(self):
return (
self._keep_alive and
not self.signal.stopped and
self.parser.should_keep_alive())
# -------------------------------------------- #
# Connection
@@ -87,27 +113,82 @@ class HttpProtocol(asyncio.Protocol):
def connection_made(self, transport):
self.connections.add(self)
self._timeout_handler = self.loop.call_later(
self.request_timeout, self.connection_timeout)
self._request_timeout_handler = self.loop.call_later(
self.request_timeout, self.request_timeout_callback)
self.transport = transport
self._last_request_time = current_time
def connection_lost(self, exc):
self.connections.discard(self)
self._timeout_handler.cancel()
if self._request_timeout_handler:
self._request_timeout_handler.cancel()
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
if self._keep_alive_timeout_handler:
self._keep_alive_timeout_handler.cancel()
def connection_timeout(self):
# Check if
def pause_writing(self):
self._not_paused.clear()
def resume_writing(self):
self._not_paused.set()
def request_timeout_callback(self):
# See the docstring in the RequestTimeout exception, to see
# exactly what this timeout is checking for.
# Check if elapsed time since request initiated exceeds our
# configured maximum request timeout value
time_elapsed = current_time - self._last_request_time
if time_elapsed < self.request_timeout:
time_left = self.request_timeout - time_elapsed
self._timeout_handler = (
self.loop.call_later(time_left, self.connection_timeout))
self._request_timeout_handler = (
self.loop.call_later(time_left,
self.request_timeout_callback)
)
else:
if self._request_stream_task:
self._request_stream_task.cancel()
if self._request_handler_task:
self._request_handler_task.cancel()
exception = RequestTimeout('Request Timeout')
self.write_error(exception)
try:
raise RequestTimeout('Request Timeout')
except RequestTimeout as exception:
self.write_error(exception)
def response_timeout_callback(self):
# Check if elapsed time since response was initiated exceeds our
# configured maximum request timeout value
time_elapsed = current_time - self._last_request_time
if time_elapsed < self.response_timeout:
time_left = self.response_timeout - time_elapsed
self._response_timeout_handler = (
self.loop.call_later(time_left,
self.response_timeout_callback)
)
else:
if self._request_stream_task:
self._request_stream_task.cancel()
if self._request_handler_task:
self._request_handler_task.cancel()
try:
raise ServiceUnavailable('Response Timeout')
except ServiceUnavailable as exception:
self.write_error(exception)
def keep_alive_timeout_callback(self):
# Check if elapsed time since last response exceeds our configured
# maximum keep alive timeout value
time_elapsed = current_time - self._last_response_time
if time_elapsed < self.keep_alive_timeout:
time_left = self.keep_alive_timeout - time_elapsed
self._keep_alive_timeout_handler = (
self.loop.call_later(time_left,
self.keep_alive_timeout_callback)
)
else:
logger.debug('KeepAlive Timeout. Closing connection.')
self.transport.close()
self.transport = None
# -------------------------------------------- #
# Parsing
@@ -127,62 +208,143 @@ class HttpProtocol(asyncio.Protocol):
self.headers = []
self.parser = HttpRequestParser(self)
# requests count
self.state['requests_count'] = self.state['requests_count'] + 1
# Parse request chunk or close connection
try:
self.parser.feed_data(data)
except HttpParserError:
exception = InvalidUsage('Bad Request')
message = 'Bad Request'
if self._debug:
message += '\n' + traceback.format_exc()
exception = InvalidUsage(message)
self.write_error(exception)
def on_url(self, url):
self.url = url
if not self.url:
self.url = url
else:
self.url += url
def on_header(self, name, value):
if name == b'Content-Length' and int(value) > self.request_max_size:
exception = PayloadTooLarge('Payload Too Large')
self.write_error(exception)
self._header_fragment += name
self.headers.append((name.decode().casefold(), value.decode()))
if value is not None:
if self._header_fragment == b'Content-Length' \
and int(value) > self.request_max_size:
exception = PayloadTooLarge('Payload Too Large')
self.write_error(exception)
try:
value = value.decode()
except UnicodeDecodeError:
value = value.decode('latin_1')
self.headers.append(
(self._header_fragment.decode().casefold(), value))
self._header_fragment = b''
def on_headers_complete(self):
self.request = Request(
self.request = self.request_class(
url_bytes=self.url,
headers=CIDict(self.headers),
headers=CIMultiDict(self.headers),
version=self.parser.get_http_version(),
method=self.parser.get_method().decode(),
transport=self.transport
)
# Remove any existing KeepAlive handler here,
# It will be recreated if required on the new request.
if self._keep_alive_timeout_handler:
self._keep_alive_timeout_handler.cancel()
self._keep_alive_timeout_handler = None
if self.is_request_stream:
self._is_stream_handler = self.router.is_stream_handler(
self.request)
if self._is_stream_handler:
self.request.stream = asyncio.Queue()
self.execute_request_handler()
def on_body(self, body):
if self.is_request_stream and self._is_stream_handler:
self._request_stream_task = self.loop.create_task(
self.request.stream.put(body))
return
self.request.body.append(body)
def on_message_complete(self):
if self.request.body:
self.request.body = b''.join(self.request.body)
# Entire request (headers and whole body) is received.
# We can cancel and remove the request timeout handler now.
if self._request_timeout_handler:
self._request_timeout_handler.cancel()
self._request_timeout_handler = None
if self.is_request_stream and self._is_stream_handler:
self._request_stream_task = self.loop.create_task(
self.request.stream.put(None))
return
self.request.body = b''.join(self.request.body)
self.execute_request_handler()
def execute_request_handler(self):
self._response_timeout_handler = self.loop.call_later(
self.response_timeout, self.response_timeout_callback)
self._last_request_time = current_time
self._request_handler_task = self.loop.create_task(
self.request_handler(self.request, self.write_response))
self.request_handler(
self.request,
self.write_response,
self.stream_response))
# -------------------------------------------- #
# Responding
# -------------------------------------------- #
def log_response(self, response):
if self.access_log:
extra = {
'status': getattr(response, 'status', 0),
}
if isinstance(response, HTTPResponse):
extra['byte'] = len(response.body)
else:
extra['byte'] = -1
extra['host'] = 'UNKNOWN'
if self.request is not None:
if self.request.ip:
extra['host'] = '{0}:{1}'.format(self.request.ip,
self.request.port)
extra['request'] = '{0} {1}'.format(self.request.method,
self.request.url)
else:
extra['request'] = 'nil'
access_logger.info('', extra=extra)
def write_response(self, response):
keep_alive = (
self.parser.should_keep_alive() and not self.signal.stopped)
"""
Writes response content synchronously to the transport.
"""
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
try:
keep_alive = self.keep_alive
self.transport.write(
response.output(
self.request.version, keep_alive, self.request_timeout))
self.request.version, keep_alive,
self.keep_alive_timeout))
self.log_response(response)
except AttributeError:
log.error(
('Invalid response object for url {}, '
'Expected Type: HTTPResponse, Actual Type: {}').format(
self.url, type(response)))
logger.error('Invalid response object for url %s, '
'Expected Type: HTTPResponse, Actual Type: %s',
self.url, type(response))
self.write_error(ServerError('Invalid response type'))
except RuntimeError:
log.error(
'Connection lost before response written @ {}'.format(
self.request.ip))
if self._debug:
logger.error('Connection lost before response written @ %s',
self.request.ip)
keep_alive = False
except Exception as e:
self.bail_out(
"Writing response failed, connection closed {}".format(
@@ -190,47 +352,113 @@ class HttpProtocol(asyncio.Protocol):
finally:
if not keep_alive:
self.transport.close()
self.transport = None
else:
# Record that we received data
self._last_request_time = current_time
self._keep_alive_timeout_handler = self.loop.call_later(
self.keep_alive_timeout,
self.keep_alive_timeout_callback)
self._last_response_time = current_time
self.cleanup()
async def drain(self):
await self._not_paused.wait()
def push_data(self, data):
self.transport.write(data)
async def stream_response(self, response):
"""
Streams a response to the client asynchronously. Attaches
the transport to the response so the response consumer can
write to the response as needed.
"""
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
try:
keep_alive = self.keep_alive
response.protocol = self
await response.stream(
self.request.version, keep_alive, self.keep_alive_timeout)
self.log_response(response)
except AttributeError:
logger.error('Invalid response object for url %s, '
'Expected Type: HTTPResponse, Actual Type: %s',
self.url, type(response))
self.write_error(ServerError('Invalid response type'))
except RuntimeError:
if self._debug:
logger.error('Connection lost before response written @ %s',
self.request.ip)
keep_alive = False
except Exception as e:
self.bail_out(
"Writing response failed, connection closed {}".format(
repr(e)))
finally:
if not keep_alive:
self.transport.close()
self.transport = None
else:
self._keep_alive_timeout_handler = self.loop.call_later(
self.keep_alive_timeout,
self.keep_alive_timeout_callback)
self._last_response_time = current_time
self.cleanup()
def write_error(self, exception):
# An error _is_ a response.
# Don't throw a response timeout, when a response _is_ given.
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
response = None
try:
response = self.error_handler.response(self.request, exception)
version = self.request.version if self.request else '1.1'
self.transport.write(response.output(version))
except RuntimeError:
log.error(
'Connection lost before error written @ {}'.format(
self.request.ip if self.request else 'Unknown'))
if self._debug:
logger.error('Connection lost before error written @ %s',
self.request.ip if self.request else 'Unknown')
except Exception as e:
self.bail_out(
"Writing error failed, connection closed {}".format(repr(e)),
from_error=True)
"Writing error failed, connection closed {}".format(
repr(e)), from_error=True
)
finally:
self.transport.close()
if self.parser and (self.keep_alive
or getattr(response, 'status', 0) == 408):
self.log_response(response)
try:
self.transport.close()
except AttributeError as e:
logger.debug('Connection lost before server could close it.')
def bail_out(self, message, from_error=False):
if from_error and self.transport.is_closing():
log.error(
("Transport closed @ {} and exception "
"experienced during error handling").format(
self.transport.get_extra_info('peername')))
log.debug(
'Exception:\n{}'.format(traceback.format_exc()))
if from_error or self.transport.is_closing():
logger.error("Transport closed @ %s and exception "
"experienced during error handling",
self.transport.get_extra_info('peername'))
logger.debug('Exception:\n%s', traceback.format_exc())
else:
exception = ServerError(message)
self.write_error(exception)
log.error(message)
logger.error(message)
def cleanup(self):
"""This is called when KeepAlive feature is used,
it resets the connection in order for it to be able
to handle receiving another request on the same connection."""
self.parser = None
self.request = None
self.url = None
self.headers = None
self._request_handler_task = None
self._request_stream_task = None
self._total_request_size = 0
self._is_stream_handler = False
def close_if_idle(self):
"""Close the connection if a request is not being sent or received
@@ -242,6 +470,14 @@ class HttpProtocol(asyncio.Protocol):
return True
return False
def close(self):
"""
Force close the connection.
"""
if self.transport is not None:
self.transport.close()
self.transport = None
def update_current_time(loop):
"""Cache the current time, since it is needed at the end of every
@@ -269,9 +505,15 @@ def trigger_events(events, loop):
def serve(host, port, request_handler, error_handler, before_start=None,
after_start=None, before_stop=None, after_stop=None, debug=False,
request_timeout=60, ssl=None, sock=None, request_max_size=None,
reuse_port=False, loop=None, protocol=HttpProtocol, backlog=100,
register_sys_signals=True, run_async=False):
request_timeout=60, response_timeout=60, keep_alive_timeout=5,
ssl=None, sock=None, request_max_size=None, reuse_port=False,
loop=None, protocol=HttpProtocol, backlog=100,
register_sys_signals=True, run_multiple=False, run_async=False,
connections=None, signal=Signal(), request_class=None,
access_log=True, keep_alive=True, is_request_stream=False,
router=None, websocket_max_size=None, websocket_max_queue=None,
websocket_read_limit=2 ** 16, websocket_write_limit=2 ** 16,
state=None, graceful_shutdown_timeout=15.0):
"""Start asynchronous HTTP Server on an individual process.
:param host: Address to host on
@@ -287,28 +529,42 @@ def serve(host, port, request_handler, error_handler, before_start=None,
`app` instance and `loop`
:param after_stop: function to be executed when a stop signal is
received after it is respected. Takes arguments
`app` instance and `loop`
`app` instance and `loop`
:param debug: enables debug output (slows server)
:param request_timeout: time in seconds
:param response_timeout: time in seconds
:param keep_alive_timeout: time in seconds
:param ssl: SSLContext
:param sock: Socket for the server to accept connections from
:param request_max_size: size in bytes, `None` for no limit
:param reuse_port: `True` for multiple workers
:param loop: asyncio compatible event loop
:param protocol: subclass of asyncio protocol class
:param request_class: Request class to use
:param access_log: disable/enable access log
:param websocket_max_size: enforces the maximum size for
incoming messages in bytes.
:param websocket_max_queue: sets the maximum length of the queue
that holds incoming messages.
:param websocket_read_limit: sets the high-water limit of the buffer for
incoming bytes, the low-water limit is half
the high-water limit.
:param websocket_write_limit: sets the high-water limit of the buffer for
outgoing bytes, the low-water limit is a
quarter of the high-water limit.
:param is_request_stream: disable/enable Request.stream
:param router: Router object
:return: Nothing
"""
if not run_async:
loop = async_loop.new_event_loop()
# create new event_loop after fork
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
if debug:
loop.set_debug(debug)
trigger_events(before_start, loop)
connections = set()
signal = Signal()
connections = connections if connections is not None else set()
server = partial(
protocol,
loop=loop,
@@ -317,7 +573,20 @@ def serve(host, port, request_handler, error_handler, before_start=None,
request_handler=request_handler,
error_handler=error_handler,
request_timeout=request_timeout,
response_timeout=response_timeout,
keep_alive_timeout=keep_alive_timeout,
request_max_size=request_max_size,
request_class=request_class,
access_log=access_log,
keep_alive=keep_alive,
is_request_stream=is_request_stream,
router=router,
websocket_max_size=websocket_max_size,
websocket_max_queue=websocket_max_queue,
websocket_read_limit=websocket_read_limit,
websocket_write_limit=websocket_write_limit,
state=state,
debug=debug,
)
server_coroutine = loop.create_server(
@@ -329,6 +598,7 @@ def serve(host, port, request_handler, error_handler, before_start=None,
sock=sock,
backlog=backlog
)
# Instead of pulling time at the end of every request,
# pull it once per minute
loop.call_soon(partial(update_current_time, loop))
@@ -336,28 +606,35 @@ def serve(host, port, request_handler, error_handler, before_start=None,
if run_async:
return server_coroutine
trigger_events(before_start, loop)
try:
http_server = loop.run_until_complete(server_coroutine)
except:
log.exception("Unable to start server")
except BaseException:
logger.exception("Unable to start server")
return
trigger_events(after_start, loop)
# Ignore SIGINT when run_multiple
if run_multiple:
signal_func(SIGINT, SIG_IGN)
# Register signals for graceful termination
if register_sys_signals:
for _signal in (SIGINT, SIGTERM):
_singals = (SIGTERM,) if run_multiple else (SIGINT, SIGTERM)
for _signal in _singals:
try:
loop.add_signal_handler(_signal, loop.stop)
except NotImplementedError:
log.warn('Sanic tried to use loop.add_signal_handler but it is'
' not implemented on this platform.')
logger.warning('Sanic tried to use loop.add_signal_handler '
'but it is not implemented on this platform.')
pid = os.getpid()
try:
log.info('Starting worker [{}]'.format(pid))
logger.info('Starting worker [%s]', pid)
loop.run_forever()
finally:
log.info("Stopping worker [{}]".format(pid))
logger.info("Stopping worker [%s]", pid)
# Run the on_stop function if provided
trigger_events(before_stop, loop)
@@ -371,15 +648,35 @@ def serve(host, port, request_handler, error_handler, before_start=None,
for connection in connections:
connection.close_if_idle()
while connections:
# Graceful shutdown timeout.
# We should honor graceful_shutdown_timeout
# instead of letting connections hang forever.
# Roughly track the elapsed time.
start_shutdown = 0
while connections and (start_shutdown < graceful_shutdown_timeout):
loop.run_until_complete(asyncio.sleep(0.1))
start_shutdown = start_shutdown + 0.1
# Force close non-idle connection after waiting for
# graceful_shutdown_timeout
coros = []
for conn in connections:
if hasattr(conn, "websocket") and conn.websocket:
coros.append(
conn.websocket.close_connection()
)
else:
conn.close()
_shutdown = asyncio.gather(*coros, loop=loop)
loop.run_until_complete(_shutdown)
trigger_events(after_stop, loop)
loop.close()
def serve_multiple(server_settings, workers, stop_event=None):
def serve_multiple(server_settings, workers):
"""Start multiple server processes simultaneously. Stop on interrupt
and terminate signals, and drain connections when complete.
@@ -388,29 +685,29 @@ def serve_multiple(server_settings, workers, stop_event=None):
:param stop_event: if provided, is used as a stop signal
:return:
"""
if server_settings.get('loop', None) is not None:
if server_settings.get('debug', False):
warnings.simplefilter('default')
warnings.warn("Passing a loop will be deprecated in version 0.4.0"
" https://github.com/channelcat/sanic/pull/335"
" has more information.", DeprecationWarning)
server_settings['reuse_port'] = True
server_settings['run_multiple'] = True
sock = socket()
sock.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1)
sock.bind((server_settings['host'], server_settings['port']))
set_inheritable(sock.fileno(), True)
server_settings['sock'] = sock
server_settings['host'] = None
server_settings['port'] = None
# Handle the case where a custom socket is not provided.
if server_settings.get('sock') is None:
sock = socket()
sock.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1)
sock.bind((server_settings['host'], server_settings['port']))
sock.set_inheritable(True)
server_settings['sock'] = sock
server_settings['host'] = None
server_settings['port'] = None
if stop_event is None:
stop_event = Event()
def sig_handler(signal, frame):
logger.info("Received signal %s. Shutting down.", Signals(signal).name)
for process in processes:
os.kill(process.pid, SIGTERM)
signal_func(SIGINT, lambda s, f: stop_event.set())
signal_func(SIGTERM, lambda s, f: stop_event.set())
signal_func(SIGINT, lambda s, f: sig_handler(s, f))
signal_func(SIGTERM, lambda s, f: sig_handler(s, f))
processes = []
for _ in range(workers):
process = Process(target=serve, kwargs=server_settings)
process.daemon = True
@@ -423,6 +720,4 @@ def serve_multiple(server_settings, workers, stop_event=None):
# the above processes will block this until they're stopped
for process in processes:
process.terminate()
sock.close()
asyncio.get_event_loop().stop()
server_settings.get('sock').close()
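The shutdown hunks above bound the drain phase: idle connections are closed first, the loop then waits up to graceful_shutdown_timeout in 0.1-second steps, and anything still open is force-closed (websocket connections via close_connection(), plain ones via close()). A minimal standalone sketch of that pattern, using hypothetical connection objects rather than Sanic's own classes:

import asyncio

async def drain(connections, graceful_shutdown_timeout=15.0):
    # Hypothetical connection objects exposing close_if_idle()/close(),
    # mirroring the loop in the hunk above; closed connections are assumed
    # to remove themselves from the set.
    for conn in list(connections):
        conn.close_if_idle()

    # Poll in 0.1-second steps, up to the configured timeout.
    elapsed = 0.0
    while connections and elapsed < graceful_shutdown_timeout:
        await asyncio.sleep(0.1)
        elapsed += 0.1

    # Force-close whatever survived the grace period.
    for conn in list(connections):
        conn.close()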


@@ -13,11 +13,13 @@ from sanic.exceptions import (
InvalidUsage,
)
from sanic.handlers import ContentRangeHandler
from sanic.response import file, HTTPResponse
from sanic.response import file, file_stream, HTTPResponse
def register(app, uri, file_or_directory, pattern,
use_modified_since, use_content_range):
use_modified_since, use_content_range,
stream_large_files, name='static', host=None,
strict_slashes=None, content_type=None):
# TODO: Though sanic is not a file server, I feel like we should at least
# make a good effort here. Modified-since is nice, but we could
# also look into etags, expires, and caching
@@ -34,6 +36,12 @@ def register(app, uri, file_or_directory, pattern,
server's
:param use_content_range: If true, process header for range requests
and sends the file part that is requested
:param stream_large_files: If true, use the file_stream() handler rather
than the file() handler to send the file.
If this is an integer, it represents the
threshold size (in bytes) at which to switch to file_stream()
:param name: user defined name used for url_for
:param content_type: user defined content type for header
"""
# If we're not trying to match a file directly,
# serve from the folder
@@ -48,14 +56,18 @@ def register(app, uri, file_or_directory, pattern,
# Merge served directory and requested file if provided
# Strip all leading / from the URL to help prevent python
# from treating the uri as an absolute path
file_path = file_or_directory
root_path = file_path = file_or_directory
if file_uri:
file_path = path.join(
file_or_directory, sub('^[/]*', '', file_uri))
# URL-decode the path sent by the browser; otherwise we won't be able to
# match filenames which got encoded (filenames with spaces etc.)
file_path = unquote(file_path)
file_path = path.abspath(unquote(file_path))
if not file_path.startswith(path.abspath(unquote(root_path))):
raise FileNotFound('File not found',
path=file_or_directory,
relative_url=file_uri)
try:
headers = {}
# Check if the client has been sent this file before
@@ -84,11 +96,22 @@ def register(app, uri, file_or_directory, pattern,
del headers['Content-Length']
for key, value in _range.headers.items():
headers[key] = value
headers['Content-Type'] = content_type \
or guess_type(file_path)[0] or 'text/plain'
if request.method == 'HEAD':
return HTTPResponse(
headers=headers,
content_type=guess_type(file_path)[0] or 'text/plain')
return HTTPResponse(headers=headers)
else:
if stream_large_files:
if isinstance(stream_large_files, int):
threshold = stream_large_files
else:
threshold = 1024 * 1024
if not stats:
stats = await stat(file_path)
if stats.st_size >= threshold:
return await file_stream(file_path, headers=headers,
_range=_range)
return await file(file_path, headers=headers, _range=_range)
except ContentRangeError:
raise
@@ -97,4 +120,9 @@ def register(app, uri, file_or_directory, pattern,
path=file_or_directory,
relative_url=file_uri)
app.route(uri, methods=['GET', 'HEAD'])(_handler)
# special prefix for static files
if not name.startswith('_static_'):
name = '_static_{}'.format(name)
app.route(uri, methods=['GET', 'HEAD'], name=name, host=host,
strict_slashes=strict_slashes)(_handler)
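register() above is normally reached through app.static(); a short sketch of how the new stream_large_files argument would be used from application code, assuming app.static forwards these keyword arguments (that wiring is not shown in this hunk):

from sanic import Sanic

app = Sanic(__name__)

# Stream files at or above the default 1 MiB threshold instead of
# loading them fully into memory.
app.static('/downloads', './downloads', stream_large_files=True)

# Or pass an integer to choose the threshold (in bytes) explicitly.
app.static('/videos', './videos', stream_large_files=10 * 1024 * 1024)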


@@ -1,12 +1,18 @@
from sanic.log import log
import traceback
from json import JSONDecodeError
from sanic.log import logger
from sanic.exceptions import MethodNotSupported
from sanic.response import text
HOST = '127.0.0.1'
PORT = 42101
class TestClient:
def __init__(self, app):
class SanicTestClient:
def __init__(self, app, port=PORT):
self.app = app
self.port = port
async def _local_request(self, method, uri, cookies=None, *args, **kwargs):
import aiohttp
@@ -14,19 +20,32 @@ class TestClient:
url = uri
else:
url = 'http://{host}:{port}{uri}'.format(
host=HOST, port=PORT, uri=uri)
host=HOST, port=self.port, uri=uri)
log.info(url)
async with aiohttp.ClientSession(cookies=cookies) as session:
logger.info(url)
conn = aiohttp.TCPConnector(verify_ssl=False)
async with aiohttp.ClientSession(
cookies=cookies, connector=conn) as session:
async with getattr(
session, method.lower())(url, *args, **kwargs) as response:
response.text = await response.text()
try:
response.text = await response.text()
except UnicodeDecodeError as e:
response.text = None
try:
response.json = await response.json()
except (JSONDecodeError,
UnicodeDecodeError,
aiohttp.ClientResponseError):
response.json = None
response.body = await response.read()
return response
def _sanic_endpoint_test(
self, method='get', uri='/', gather_request=True,
debug=False, server_kwargs={},
debug=False, server_kwargs={"auto_reload": False},
*request_args, **request_kwargs):
results = [None, None]
exceptions = []
@@ -37,6 +56,15 @@ class TestClient:
results[0] = request
self.app.request_middleware.appendleft(_collect_request)
@self.app.exception(MethodNotSupported)
async def error_handler(request, exception):
if request.method in ['HEAD', 'PATCH', 'PUT', 'DELETE']:
return text(
'', exception.status_code, headers=exception.headers
)
else:
return self.app.error_handler.default(request, exception)
@self.app.listener('after_server_start')
async def _collect_response(sanic, loop):
try:
@@ -45,10 +73,12 @@ class TestClient:
**request_kwargs)
results[-1] = response
except Exception as e:
logger.error(
'Exception:\n{}'.format(traceback.format_exc()))
exceptions.append(e)
self.app.stop()
self.app.run(host=HOST, debug=debug, port=PORT, **server_kwargs)
self.app.run(host=HOST, debug=debug, port=self.port, **server_kwargs)
self.app.listeners['after_server_start'].pop()
if exceptions:
@@ -58,14 +88,14 @@ class TestClient:
try:
request, response = results
return request, response
except:
except BaseException:
raise ValueError(
"Request and response object expected, got ({})".format(
results))
else:
try:
return results[-1]
except:
except BaseException:
raise ValueError(
"Request object expected, got ({})".format(results))


@@ -1,17 +0,0 @@
import warnings
from sanic.testing import TestClient
def sanic_endpoint_test(app, method='get', uri='/', gather_request=True,
debug=False, server_kwargs={},
*request_args, **request_kwargs):
warnings.warn(
"Use of sanic_endpoint_test will be deprecated in"
"the next major version after 0.4.0. Please use the `test_client` "
"available on the app object.", DeprecationWarning)
test_client = TestClient(app)
return test_client._sanic_endpoint_test(
method, uri, gather_request, debug, server_kwargs,
*request_args, **request_kwargs)


@@ -64,6 +64,11 @@ class HTTPMethodView:
return view
def stream(func):
func.is_stream = True
return func
class CompositionView:
"""Simple method-function mapped view for the sanic.
You can add handler functions to methods (get, post, put, patch, delete)
@@ -83,7 +88,9 @@ class CompositionView:
def __init__(self):
self.handlers = {}
def add(self, methods, handler):
def add(self, methods, handler, stream=False):
if stream:
handler.is_stream = stream
for method in methods:
if method not in HTTP_METHODS:
raise InvalidUsage(
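The new stream() helper and the stream argument to CompositionView.add() both mark a handler with is_stream = True so the server feeds it the request body incrementally rather than buffering it. A sketch of the CompositionView spelling, with the route registration assumed to go through app.add_route():

from sanic import Sanic
from sanic.response import text
from sanic.views import CompositionView

app = Sanic(__name__)

async def plain_get(request):
    return text('ok')

async def streaming_post(request):
    # With stream=True the body is not pre-read; how the chunks are
    # consumed (e.g. via request.stream) is outside this hunk.
    return text('streamed')

view = CompositionView()
view.add(['GET'], plain_get)
view.add(['POST'], streaming_post, stream=True)
app.add_route(view, '/endpoint')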

103  sanic/websocket.py  Normal file

@@ -0,0 +1,103 @@
from sanic.exceptions import InvalidUsage
from sanic.server import HttpProtocol
from httptools import HttpParserUpgrade
from websockets import handshake, WebSocketCommonProtocol, InvalidHandshake
from websockets import ConnectionClosed # noqa
class WebSocketProtocol(HttpProtocol):
def __init__(self, *args, websocket_timeout=10,
websocket_max_size=None,
websocket_max_queue=None,
websocket_read_limit=2 ** 16,
websocket_write_limit=2 ** 16, **kwargs):
super().__init__(*args, **kwargs)
self.websocket = None
self.websocket_timeout = websocket_timeout
self.websocket_max_size = websocket_max_size
self.websocket_max_queue = websocket_max_queue
self.websocket_read_limit = websocket_read_limit
self.websocket_write_limit = websocket_write_limit
# timeouts make no sense for websocket routes
def request_timeout_callback(self):
if self.websocket is None:
super().request_timeout_callback()
def response_timeout_callback(self):
if self.websocket is None:
super().response_timeout_callback()
def keep_alive_timeout_callback(self):
if self.websocket is None:
super().keep_alive_timeout_callback()
def connection_lost(self, exc):
if self.websocket is not None:
self.websocket.connection_lost(exc)
super().connection_lost(exc)
def data_received(self, data):
if self.websocket is not None:
# pass the data to the websocket protocol
self.websocket.data_received(data)
else:
try:
super().data_received(data)
except HttpParserUpgrade:
# this is okay, it just indicates we've got an upgrade request
pass
def write_response(self, response):
if self.websocket is not None:
# websocket requests do not write a response
self.transport.close()
else:
super().write_response(response)
async def websocket_handshake(self, request, subprotocols=None):
# let the websockets package do the handshake with the client
headers = []
def get_header(k):
return request.headers.get(k, '')
def set_header(k, v):
headers.append((k, v))
try:
key = handshake.check_request(get_header)
handshake.build_response(set_header, key)
except InvalidHandshake:
raise InvalidUsage('Invalid websocket request')
subprotocol = None
if subprotocols and 'Sec-Websocket-Protocol' in request.headers:
# select a subprotocol
client_subprotocols = [p.strip() for p in request.headers[
'Sec-Websocket-Protocol'].split(',')]
for p in client_subprotocols:
if p in subprotocols:
subprotocol = p
set_header('Sec-Websocket-Protocol', subprotocol)
break
# write the 101 response back to the client
rv = b'HTTP/1.1 101 Switching Protocols\r\n'
for k, v in headers:
rv += k.encode('utf-8') + b': ' + v.encode('utf-8') + b'\r\n'
rv += b'\r\n'
request.transport.write(rv)
# hook up the websocket protocol
self.websocket = WebSocketCommonProtocol(
timeout=self.websocket_timeout,
max_size=self.websocket_max_size,
max_queue=self.websocket_max_queue,
read_limit=self.websocket_read_limit,
write_limit=self.websocket_write_limit
)
self.websocket.subprotocol = subprotocol
self.websocket.connection_made(request.transport)
self.websocket.connection_open()
return self.websocket
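WebSocketProtocol replaces HttpProtocol when the app serves websocket routes; a handler then talks to the WebSocketCommonProtocol instance returned by websocket_handshake(). A sketch of an echo endpoint, assuming the app-level @app.websocket decorator (defined elsewhere, not in this file):

from sanic import Sanic

app = Sanic(__name__)

@app.websocket('/echo')
async def echo(request, ws):
    # ws is the WebSocketCommonProtocol wired up by websocket_handshake()
    while True:
        message = await ws.recv()
        await ws.send(message)

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8000)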

210  sanic/worker.py  Normal file

@@ -0,0 +1,210 @@
import os
import sys
import signal
import asyncio
import logging
import traceback
try:
import ssl
except ImportError:
ssl = None
try:
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
pass
import gunicorn.workers.base as base
from sanic.server import trigger_events, serve, HttpProtocol, Signal
from sanic.websocket import WebSocketProtocol
class GunicornWorker(base.Worker):
http_protocol = HttpProtocol
websocket_protocol = WebSocketProtocol
def __init__(self, *args, **kw): # pragma: no cover
super().__init__(*args, **kw)
cfg = self.cfg
if cfg.is_ssl:
self.ssl_context = self._create_ssl_context(cfg)
else:
self.ssl_context = None
self.servers = {}
self.connections = set()
self.exit_code = 0
self.signal = Signal()
def init_process(self):
# create new event_loop after fork
asyncio.get_event_loop().close()
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
super().init_process()
def run(self):
is_debug = self.log.loglevel == logging.DEBUG
protocol = (
self.websocket_protocol if self.app.callable.websocket_enabled
else self.http_protocol)
self._server_settings = self.app.callable._helper(
loop=self.loop,
debug=is_debug,
protocol=protocol,
ssl=self.ssl_context,
run_async=True)
self._server_settings['signal'] = self.signal
self._server_settings.pop('sock')
trigger_events(self._server_settings.get('before_start', []),
self.loop)
self._server_settings['before_start'] = ()
self._runner = asyncio.ensure_future(self._run(), loop=self.loop)
try:
self.loop.run_until_complete(self._runner)
self.app.callable.is_running = True
trigger_events(self._server_settings.get('after_start', []),
self.loop)
self.loop.run_until_complete(self._check_alive())
trigger_events(self._server_settings.get('before_stop', []),
self.loop)
self.loop.run_until_complete(self.close())
except BaseException:
traceback.print_exc()
finally:
try:
trigger_events(self._server_settings.get('after_stop', []),
self.loop)
except BaseException:
traceback.print_exc()
finally:
self.loop.close()
sys.exit(self.exit_code)
async def close(self):
if self.servers:
# stop accepting connections
self.log.info("Stopping server: %s, connections: %s",
self.pid, len(self.connections))
for server in self.servers:
server.close()
await server.wait_closed()
self.servers.clear()
# prepare connections for closing
self.signal.stopped = True
for conn in self.connections:
conn.close_if_idle()
# graceful shutdown timeout
start_shutdown = 0
graceful_shutdown_timeout = self.cfg.graceful_timeout
while self.connections and \
(start_shutdown < graceful_shutdown_timeout):
await asyncio.sleep(0.1)
start_shutdown = start_shutdown + 0.1
# Force close non-idle connection after waiting for
# graceful_shutdown_timeout
coros = []
for conn in self.connections:
if hasattr(conn, "websocket") and conn.websocket:
coros.append(
conn.websocket.close_connection()
)
else:
conn.close()
_shutdown = asyncio.gather(*coros, loop=self.loop)
await _shutdown
async def _run(self):
for sock in self.sockets:
state = dict(requests_count=0)
self._server_settings["host"] = None
self._server_settings["port"] = None
server = await serve(
sock=sock,
connections=self.connections,
state=state,
**self._server_settings
)
self.servers[server] = state
async def _check_alive(self):
# If our parent changed then we shut down.
pid = os.getpid()
try:
while self.alive:
self.notify()
req_count = sum(
self.servers[srv]["requests_count"] for srv in self.servers
)
if self.max_requests and req_count > self.max_requests:
self.alive = False
self.log.info("Max requests exceeded, shutting down: %s",
self)
elif pid == os.getpid() and self.ppid != os.getppid():
self.alive = False
self.log.info("Parent changed, shutting down: %s", self)
else:
await asyncio.sleep(1.0, loop=self.loop)
except (Exception, BaseException, GeneratorExit, KeyboardInterrupt):
pass
@staticmethod
def _create_ssl_context(cfg):
""" Creates SSLContext instance for usage in asyncio.create_server.
See ssl.SSLSocket.__init__ for more details.
"""
ctx = ssl.SSLContext(cfg.ssl_version)
ctx.load_cert_chain(cfg.certfile, cfg.keyfile)
ctx.verify_mode = cfg.cert_reqs
if cfg.ca_certs:
ctx.load_verify_locations(cfg.ca_certs)
if cfg.ciphers:
ctx.set_ciphers(cfg.ciphers)
return ctx
def init_signals(self):
# Set up signals through the event loop API.
self.loop.add_signal_handler(signal.SIGQUIT, self.handle_quit,
signal.SIGQUIT, None)
self.loop.add_signal_handler(signal.SIGTERM, self.handle_exit,
signal.SIGTERM, None)
self.loop.add_signal_handler(signal.SIGINT, self.handle_quit,
signal.SIGINT, None)
self.loop.add_signal_handler(signal.SIGWINCH, self.handle_winch,
signal.SIGWINCH, None)
self.loop.add_signal_handler(signal.SIGUSR1, self.handle_usr1,
signal.SIGUSR1, None)
self.loop.add_signal_handler(signal.SIGABRT, self.handle_abort,
signal.SIGABRT, None)
# Don't let SIGTERM and SIGUSR1 disturb active requests
# by interrupting system calls
signal.siginterrupt(signal.SIGTERM, False)
signal.siginterrupt(signal.SIGUSR1, False)
def handle_quit(self, sig, frame):
self.alive = False
self.app.callable.is_running = False
self.cfg.worker_int(self)
def handle_abort(self, sig, frame):
self.alive = False
self.exit_code = 1
self.cfg.worker_abort(self)
sys.exit(1)
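GunicornWorker lets gunicorn supervise Sanic processes: it builds the server settings from the app via _helper(run_async=True), runs its own event loop, and reuses the graceful-drain logic in close(). A sketch of how an app module would typically be pointed at it (the command line below is illustrative):

# myapp.py -- hypothetical module, launched with something like:
#   gunicorn myapp:app --bind 0.0.0.0:8000 \
#       --worker-class sanic.worker.GunicornWorker
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route('/')
async def index(request):
    return json({'ok': True})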


@@ -4,43 +4,73 @@ Sanic
import codecs
import os
import re
from distutils.errors import DistutilsPlatformError
from distutils.util import strtobool
from setuptools import setup
with codecs.open(os.path.join(os.path.abspath(os.path.dirname(
__file__)), 'sanic', '__init__.py'), 'r', 'latin1') as fp:
def open_local(paths, mode='r', encoding='utf8'):
path = os.path.join(
os.path.abspath(os.path.dirname(__file__)),
*paths
)
return codecs.open(path, mode, encoding)
with open_local(['sanic', '__init__.py'], encoding='latin1') as fp:
try:
version = re.findall(r"^__version__ = '([^']+)'\r?$",
fp.read(), re.M)[0]
except IndexError:
raise RuntimeError('Unable to determine version.')
install_requires = [
'httptools>=0.0.9',
'ujson>=1.35',
'aiofiles>=0.3.0',
]
if os.name != 'nt':
install_requires.append('uvloop>=0.5.3')
with open_local(['README.rst']) as rm:
long_description = rm.read()
setup(
name='sanic',
version=version,
url='http://github.com/channelcat/sanic/',
license='MIT',
author='Channel Cat',
author_email='channelcat@gmail.com',
description=(
setup_kwargs = {
'name': 'sanic',
'version': version,
'url': 'http://github.com/channelcat/sanic/',
'license': 'MIT',
'author': 'Channel Cat',
'author_email': 'channelcat@gmail.com',
'description': (
'A microframework based on uvloop, httptools, and learnings of flask'),
packages=['sanic'],
platforms='any',
install_requires=install_requires,
classifiers=[
'Development Status :: 2 - Pre-Alpha',
'long_description': long_description,
'packages': ['sanic'],
'platforms': 'any',
'classifiers': [
'Development Status :: 4 - Beta',
'Environment :: Web Environment',
'License :: OSI Approved :: MIT License',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
],
)
}
env_dependency = '; sys_platform != "win32" and implementation_name == "cpython"'
ujson = 'ujson>=1.35' + env_dependency
uvloop = 'uvloop>=0.5.3' + env_dependency
requirements = [
'httptools>=0.0.9',
uvloop,
ujson,
'aiofiles>=0.3.0',
'websockets>=5.0,<6.0',
'multidict>=4.0,<5.0',
]
if strtobool(os.environ.get("SANIC_NO_UJSON", "no")):
print("Installing without uJSON")
requirements.remove(ujson)
# 'nt' means windows OS
if strtobool(os.environ.get("SANIC_NO_UVLOOP", "no")):
print("Installing without uvLoop")
requirements.remove(uvloop)
setup_kwargs['install_requires'] = requirements
setup(**setup_kwargs)
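The two environment switches are read with distutils.util.strtobool, so any of its truthy spellings set at install time (for example SANIC_NO_UJSON=yes or SANIC_NO_UVLOOP=1) drop the optional speedups from the requirements list. A quick illustration of the values strtobool accepts:

from distutils.util import strtobool

for value in ('yes', '1', 'true', 'no', '0', 'false'):
    # strtobool returns 1 for truthy spellings and 0 for falsy ones
    print(value, bool(strtobool(value)))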


@@ -0,0 +1,22 @@
-----BEGIN CERTIFICATE-----
MIIDtTCCAp2gAwIBAgIJAO6wb0FSc/rNMA0GCSqGSIb3DQEBCwUAMEUxCzAJBgNV
BAYTAlVTMRMwEQYDVQQIEwpTb21lLVN0YXRlMSEwHwYDVQQKExhJbnRlcm5ldCBX
aWRnaXRzIFB0eSBMdGQwHhcNMTcwMzAzMTUyODAzWhcNMTkxMTI4MTUyODAzWjBF
MQswCQYDVQQGEwJVUzETMBEGA1UECBMKU29tZS1TdGF0ZTEhMB8GA1UEChMYSW50
ZXJuZXQgV2lkZ2l0cyBQdHkgTHRkMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIB
CgKCAQEAsy7Zb3p4yCEnUtPLwqeJrwj9u/ZmcFCrMAktFBx9hG6rY2r7mdB6Bflh
V5cUJXxnsNiDpYcxGhA8kry7pEork1vZ05DyZC9ulVlvxBouVShBcLLwdpaoTGqE
vYtejv6x7ogwMXOjkWWb1WpOv4CVhpeXJ7O/d1uAiYgcUpTpPp4ONG49IAouBHq3
h+o4nVvNfB0J8gaCtTsTZqi1Wt8WYs3XjxGJaKh//ealfRe1kuv40CWQ8gjaC8/1
w9pHdom3Wi/RwfDM3+dVGV6M5lAbPXMB4RK17Hk9P3hlJxJOpKBdgcBJPXtNrTwf
qEWWxk2mB/YVyB84AxjkkNoYyi2ggQIDAQABo4GnMIGkMB0GA1UdDgQWBBRa46Ix
9s9tmMqu+Zz1mocHghm4NTB1BgNVHSMEbjBsgBRa46Ix9s9tmMqu+Zz1mocHghm4
NaFJpEcwRTELMAkGA1UEBhMCVVMxEzARBgNVBAgTClNvbWUtU3RhdGUxITAfBgNV
BAoTGEludGVybmV0IFdpZGdpdHMgUHR5IEx0ZIIJAO6wb0FSc/rNMAwGA1UdEwQF
MAMBAf8wDQYJKoZIhvcNAQELBQADggEBACdrnM8zb7abxAJsU5WLn1IR0f2+EFA7
ezBEJBM4bn0IZrXuP5ThZ2wieJlshG0C16XN9+zifavHci+AtQwWsB0f/ppHdvWQ
7wt7JN88w+j0DNIYEadRCjWxR3gRAXPgKu3sdyScKFq8MvB49A2EdXRmQSTIM6Fj
teRbE+poxewFT0mhurf3xrtGiSALmv7uAzhRDqpYUzcUlbOGgkyFLYAOOdvZvei+
mfXDi4HKYxgyv53JxBARMdajnCHXM7zQ6Tjc8j1HRtmDQ3XapUB559KfxfODGQq5
zmeoZWU4duxcNXJM0Eiz1CJ39JoWwi8sqaGi/oskuyAh7YKyVTn8xa8=
-----END CERTIFICATE-----

Some files were not shown because too many files have changed in this diff.