Compare commits

...

482 Commits
0.5.2 ... 0.7.0

Author SHA1 Message Date
Eli Uriegas
21435c1863 Merge pull request #1045 from seemethere/increment_070
Increment to 0.7.0
2017-12-05 19:27:11 -08:00
Eli Uriegas
1ea3ab7fe8 Increment to 0.7.0
Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2017-12-05 19:13:16 -08:00
Raphael Deem
1b0ad2c3cd Merge pull request #1035 from yunstanford/patch-N
Adopt new websockets interface
2017-12-02 01:27:09 -08:00
Raphael Deem
aa4821864a Merge pull request #1039 from lixxu/master
check request.ip before using it
2017-11-28 19:47:34 -08:00
lixxu
283762224c clean codes 2017-11-28 14:47:43 +08:00
lixxu
f50a37fc88 ignore error if request.ip is None 2017-11-28 14:44:32 +08:00
Yun Xu
076f0515ca Fix flake8 2017-11-25 21:14:18 -08:00
Yun Xu
049f12096d fix unit tests 2017-11-25 21:07:38 -08:00
Yun Xu
f09c0393ba adopt new websockets interface 2017-11-25 21:01:22 -08:00
7
472bbcf293 Merge pull request #15 from channelcat/master
Merge upstream master branch
2017-11-25 20:49:09 -08:00
Raphael Deem
7a3f9daccf Merge pull request #1025 from nkoshell/route-version-params
Route version params
2017-11-20 23:43:22 -08:00
Nikita Koshelev
76511d61e0 Added removal of duplicate 'v' for Router.add() version parameter
Fix sanic/router.py:123:80: E501 line too long (80 > 79 characters)
2017-11-18 01:39:00 +03:00
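A minimal sketch of the route versioning these commits refine, assuming the `version` keyword behaves as described in #1025 (a leading "v" in a string value is not doubled):

```python
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route("/status", version=1)
async def status_v1(request):
    return json({"version": 1})      # served at /v1/status

@app.route("/status", version="v2")  # per #1025 this maps to /v2/status, not /vv2/status
async def status_v2(request):
    return json({"version": 2})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```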
Nikita Koshelev
8e7475ccf6 Added regex escaping for Router.add() version parameter 2017-11-18 01:22:42 +03:00
Raphael Deem
820d8c7bf5 Merge pull request #1021 from EdwardBetts/spelling
Correct spelling mistakes.
2017-11-16 16:26:40 -08:00
Edward Betts
cfc75b4f1a Correct spelling mistakes. 2017-11-15 15:46:39 +00:00
Raphael Deem
98567fe5a8 Merge pull request #1008 from youknowone/pytest-xdist
Let SanicTestClient have its own port
2017-11-10 10:50:01 -08:00
Raphael Deem
05bb812e2b Merge pull request #1010 from Yaser-Amiri/master
Change unit tests names with repeated names.
2017-11-08 20:25:45 -08:00
Yaser Amiri
c9876a6c88 Change unit tests names with repeated names. 2017-11-08 14:14:57 +03:30
Raphael Deem
979b5a52d3 Merge pull request #1005 from joar/feature/static-strict-slashes
Add strict_slashes to {app, blueprint}.static()
2017-11-07 07:49:32 -08:00
Joar Wandborg
e70535e8d7 Use .get instead of .pop 2017-11-07 10:34:17 +01:00
Jeong YunWon
ed8725bf6c Let SanicTestClient have its own port
For parallel test running, the servers must have different ports.
See examples/pytest_xdist.py for example.
2017-11-06 17:29:32 +09:00
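A minimal sketch of the idea behind this change, assuming the test client lives in `sanic.testing` and accepts a `port` argument as the commit describes; the port number is arbitrary:

```python
from sanic import Sanic
from sanic.response import text
from sanic.testing import SanicTestClient  # module path assumed for this release line

app = Sanic(__name__)

@app.route("/")
async def handler(request):
    return text("ok")

def test_uses_private_port():
    # Each pytest-xdist worker can pick a different port to avoid collisions.
    client = SanicTestClient(app, port=42102)
    request, response = client.get("/")
    assert response.status == 200
```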
Raphael Deem
098cd70e82 Merge pull request #1007 from furious-luke/master
Call connection_open after websocket handshake
2017-11-05 14:26:19 -08:00
Raphael Deem
969dac2033 Merge pull request #1004 from Stibbons/optionalize_log_config
Optionalize app.run dictConfig (fix #1000)
2017-11-04 12:35:38 -07:00
Gaetan Semet
49b1d667f1 Optionalize app.run dictConfig (fix #1000)
Signed-off-by: Gaetan Semet <gaetan@xeberon.net>
2017-11-04 15:58:27 +01:00
Luke Hodkinson
bca1e08411 Call connection_open after websocket handshake
It seems that due to (recent?) changes in the websocket library, we
now need to call "connection_open" to flag that the websocket is now
ready to use. I've added that call just after the call to
"connection_made".
2017-11-04 22:04:59 +11:00
Raphael Deem
bf6ed217c2 Merge pull request #1006 from r0fls/routing-fix
check if method is added in strict slash logic
2017-11-04 00:09:52 -07:00
Raphael Deem
bb8e9c6438 check if method is added in strict slash logic 2017-11-03 18:36:06 -07:00
Joar Wandborg
f128ed5b1f Set threshold to 1MiB instead of 0.97MiB
Reference: https://en.wikipedia.org/wiki/Mebibyte#Definition
2017-11-03 14:37:01 +01:00
Joar Wandborg
ff5786d61b pep8 2017-11-03 14:33:24 +01:00
Joar Wandborg
ca596c8ecd Add strict_slashes to {Sanic, Blueprint}().static() 2017-11-02 15:44:36 +01:00
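A minimal sketch of the new keyword, assuming it mirrors the `strict_slashes` option already available on regular routes:

```python
from sanic import Sanic

app = Sanic(__name__)

# Register the static route with strict slash matching enabled.
app.static("/assets", "./public", strict_slashes=True)
```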
Raphael Deem
c3bcafb514 Merge pull request #997 from ignatenkobrain/localhost
tests: do not assume that localhost == 127.0.0.1
2017-11-01 00:22:14 -07:00
Igor Gnatenko
a9c7d95e9b tests: do not assume that localhost == 127.0.0.1
Signed-off-by: Igor Gnatenko <ignatenkobrain@fedoraproject.org>
2017-10-31 09:39:09 +01:00
Raphael Deem
01042c1d98 Merge pull request #992 from r0fls/968
remove port from ip
2017-10-25 22:15:07 -07:00
Raphael Deem
5bf722c7ae remove bare exceptions 2017-10-25 21:58:31 -07:00
Raphael Deem
c2191153cf remove port from ip 2017-10-23 21:37:59 -07:00
davidtgq
5bcbc5a337 Replaced COMMON_STATUS_CODES with a simple 200 check for more fast (#982)
* Replaced COMMON_STATUS_CODES with a simple 200 check for more fast

* Added IPware algorithm

* Remove HTTP prefix from Django-style headers
Remove right_most_proxy because it's outside spec

* Remove obvious docstrings

* Revert "Replaced COMMON_STATUS_CODES with a simple 200 check for more fast"

This reverts commit 15b6980

* Revert "Added IPware algorithm"

This reverts commit bdf66cb

WTF HOW DO I GIT

* Revert "Revert "Replaced COMMON_STATUS_CODES with a simple 200 check for more fast""

This reverts commit d8df095

* Revert "Added IPware algorithm"

This reverts commit bdf66cb

* Delete ip.py
2017-10-19 16:43:07 -07:00
Eli Uriegas
f721f90add Merge pull request #820 from youknowone/worker-protocol
Protocol configurable gunicorn worker
2017-10-19 16:21:28 -07:00
Eli Uriegas
0e92d8ce2c Merge branch 'master' into worker-protocol 2017-10-19 16:21:18 -07:00
Eli Uriegas
727d6a1b61 Merge pull request #972 from lanfon72/patch-2
fix the condition check used in `log_response`
2017-10-19 16:16:57 -07:00
Raphael Deem
666c0847b7 Merge pull request #976 from ashleysommer/fix_websocket_timeout
Fix Websocket protocol timeouts after #939
2017-10-18 21:20:52 -07:00
Raphael Deem
0a411f9bba Merge pull request #985 from ashleysommer/ashleysommer-docs-spf
Add Sanic-Plugins-Framework library to Extensions doc
2017-10-18 21:20:13 -07:00
Ashley Sommer
49f3ba39f9 Add Sanic-Plugins-Framework library to Extensions doc
I made a new tool that lets developers easily and quickly create Sanic plugins (extensions), and lets application builders easily use those plugins in their apps.
2017-10-18 17:52:03 +10:00
Raphael Deem
794128a053 Merge pull request #981 from kszucs/manifest
Include LICENSE file in manifest
2017-10-17 10:31:19 -07:00
Krisztián Szűcs
e6be3b2313 include LICENSE file in manifest 2017-10-17 16:05:24 +02:00
Raphael Deem
c5cdcf0f95 Merge pull request #975 from ashleysommer/timeouts_documentation
Add documentation for new Timeout values, after #939
2017-10-16 09:13:49 -07:00
Ashley Sommer
ea5b07f636 Update websocket protocol to accommodate changes in HTTP protocol from https://github.com/channelcat/sanic/pull/939
Fixes https://github.com/channelcat/sanic/issues/969
2017-10-16 11:06:33 +10:00
Ashley Sommer
477e6b8663 Add documentation for REQUEST_TIMEOUT, RESPONSE_TIMEOUT and KEEP_ALIVE_TIMEOUT config values.
Fixed some inconsistent default values.
2017-10-16 10:53:45 +10:00
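For reference, a minimal sketch of tuning the three timeouts documented here; the values are illustrative, not the defaults:

```python
from sanic import Sanic

app = Sanic(__name__)
app.config.REQUEST_TIMEOUT = 60      # seconds allowed to receive the full request
app.config.RESPONSE_TIMEOUT = 60     # seconds allowed to produce the response
app.config.KEEP_ALIVE_TIMEOUT = 5    # seconds an idle keep-alive connection is held open
```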
Raphael Deem
a0d8418b40 Merge pull request #965 from samael500/master
fix issue #959
2017-10-13 14:58:46 -07:00
Raphael Deem
006fb08024 Merge pull request #966 from yunstanford/patch-M
Sanic routes should not pass angled params with empty names
2017-10-13 02:18:20 -07:00
lanf0n
4578f6016b fix the condition check used in log_response
`request` class is derived from `dict`, so it will never be `True`.
2017-10-13 16:48:02 +08:00
Raphael Deem
5b06bcc57d Merge pull request #967 from samael500/custom_filename
Custom filename
2017-10-13 01:35:11 -07:00
Raphael Deem
d4bb14a511 Merge pull request #971 from pcinkh/socket_disconnects_speedup
Critical speedup websocket disconnects from O(N) to O(1)
2017-10-13 01:25:14 -07:00
pcinkh
6d2f5da506 Speedup websocket disconnects. 2017-10-11 14:02:26 +03:00
Yun Xu
c96df86111 make flake8 happy 2017-10-09 07:58:04 -07:00
Maks Skorokhod
86f87cf4ac 🔧 no use f'string' 2017-10-09 17:55:35 +03:00
Yun Xu
770a8fb288 raise exception for invalid param syntax 2017-10-09 07:54:39 -07:00
Maks Skorokhod
c4e3a98ea7 add test for custom filename 2017-10-09 17:45:42 +03:00
Maks Skorokhod
07e95dba4f 🔁 customize filename in file response 2017-10-09 17:45:22 +03:00
7
9bc1abcd00 Merge pull request #14 from channelcat/master
merge upstream master branch
2017-10-09 07:19:57 -07:00
Maks Skorokhod
4d515b05f3 fix missed assertion 2017-10-09 17:18:04 +03:00
Maks Skorokhod
64edf7ad9c upd test for connection lost error 2017-10-09 16:00:32 +03:00
Maks Skorokhod
7610c0fb2e 🔧 log Connection lost only if debug 2017-10-09 15:50:36 +03:00
Raphael Deem
0189e4ed59 Merge pull request #962 from ProstoMaxim/fix_logs
Fix logs
2017-10-08 20:16:11 -07:00
Raphael Deem
8018c9b91d Merge pull request #961 from r0fls/fix-920
fix false cookie encoding and output
2017-10-06 23:57:30 -07:00
Max Murashov
4b3920daba Fix logs 2017-10-06 16:53:30 +03:00
Raphael Deem
d876e3ed5c fix false cookie encoding and output 2017-10-05 22:20:50 -07:00
Raphael Deem
086b5daa53 Merge pull request #960 from piotrbulinski/refactor_server_access_log
Refactor access log for server.HttpProtocol
2017-10-05 20:20:28 -07:00
Piotr Buliński
4b877e3f6b Update server.py 2017-10-05 09:28:13 +02:00
Piotr Buliński
8ce749e339 Update server.py 2017-10-05 09:27:18 +02:00
Piotr Buliński
752ddfa7fc Merge branch 'master' into refactor_server_access_log 2017-10-05 09:26:19 +02:00
Raphael Deem
8700c96c4d Merge pull request #942 from yunstanford/patch-logging-refactor
Patch logging refactor
2017-10-05 00:22:02 -07:00
Piotr Bulinski
e3852ceeca Refactor access log for server 2017-10-04 12:50:57 +02:00
Yun Xu
225ea49b6f resolve conflicts again 2017-10-01 01:22:27 -07:00
Raphael Deem
15fd49037f Merge pull request #939 from ashleysommer/keepalive_timeout
Split RequestTimeout, ResponseTimeout, and KeepAliveTimeout into different timeouts
2017-09-30 22:15:50 -07:00
Raphael Deem
2fb4697e12 Merge pull request #952 from ahopkins/patch-1
Update extensions.md
2017-09-29 18:33:30 -07:00
Eli Uriegas
1a9f770317 Merge pull request #957 from lanfon72/master
add sphinx extension to add asyncio-specific markups
2017-09-29 11:17:29 -07:00
lanf0n
62871ec9b3 add sphinx extension to add asyncio-specific markups 2017-09-30 01:16:26 +08:00
Raphael Deem
39c64214ee Merge pull request #953 from r0fls/949
support vhosts in static routes
2017-09-27 01:28:43 -07:00
Raphael Deem
9aec5febb8 support vhosts in static routes 2017-09-27 01:24:49 -07:00
Adam Hopkins
91b2167eba Update extensions.md
Add - [JWT](https://github.com/ahopkins/sanic-jwt): Authentication extension package for JSON Web Tokens (JWT).
2017-09-27 11:07:06 +03:00
Raphael Deem
00d40a35cd Merge pull request #951 from lixxu/master
fix bug and set scheme to http if not provided
2017-09-26 21:57:54 -07:00
lixxu
f96ab02767 set scheme to http if not provided 2017-09-27 09:59:49 +08:00
Raphael Deem
4ce699e57f Merge pull request #944 from blazehu/master
add __repr__ for sanic request
2017-09-25 13:58:09 -07:00
Raphael Deem
4ee042c330 Merge pull request #948 from chiuczek/json-dependency-injection
Use dependency injection to allow alternative json parser or encoder
2017-09-24 21:05:39 -07:00
Yun Xu
0b23f4ff81 resolve conflicts 2017-09-23 06:19:09 -07:00
Hugh McNamara
5cef1634ed use json_loads function in json property of request 2017-09-22 10:19:15 +01:00
Eli Uriegas
1b0286916e Merge pull request #947 from lanfon72/patch-1
to fix if platform is windows.
2017-09-19 10:15:35 -07:00
Hugh McNamara
a8f764c161 make method instead of property for alternative json decoding of request 2017-09-19 18:12:53 +01:00
Hugh McNamara
1d719252cb use dependency injection to allow alternative json parser or encoder 2017-09-19 14:58:49 +01:00
lanf0n
d8cebe1188 to fix if platform is windows. 2017-09-19 18:14:25 +08:00
Eli Uriegas
329ebf6a5d Merge pull request #946 from trthhrtz/patch-1
Update getting_started.md
2017-09-18 11:13:48 -07:00
Kuzma Leshakov
c836441a75 Update getting_started.md
The Hello World example in the main README (https://github.com/channelcat/sanic/blob/master/README.rst) is different: it returns JSON, whereas here text is returned. The following examples, such as Routing (http://sanic.readthedocs.io/en/latest/sanic/routing.html), use JSON again. I therefore suggest making the examples consistent, with JSON as the output.
2017-09-18 11:37:32 +03:00
huyuhan
074d36eeba add __repr__ for sanic request 2017-09-15 21:15:05 +08:00
huyuhan
f6eb35f67d add __repr__ for sanic request 2017-09-15 21:05:25 +08:00
huyuhan
77f70a0792 add __repr__ for sanic request 2017-09-15 20:56:44 +08:00
huyuhan
12dafd07b8 add __repr__ for sanic request 2017-09-15 18:34:56 +08:00
Eli Uriegas
9fb8bec715 Merge pull request #943 from crvv/master
fix #763, sanic can't decode latin1 encoded header value
2017-09-14 13:38:44 -07:00
Wèi Cōngruì
eb1146c6b6 fix #763, sanic can't decode latin1 encoded header value 2017-09-14 19:23:02 +08:00
Yun Xu
730f7c5e41 add doc for customizing logging config 2017-09-13 18:30:38 -07:00
Yun Xu
5cabc9cff2 update doc 2017-09-13 18:16:58 -07:00
Yun Xu
ddc039ed2e update doc 2017-09-13 18:14:46 -07:00
Eli Uriegas
a146ebd856 Merge pull request #941 from aiosin/master
add status codes and teapot example
2017-09-13 11:39:14 -07:00
Yun Xu
5ee7b6caeb fixing small issue 2017-09-13 10:35:34 -07:00
Yun Xu
9c4b0f7b15 fix flake8 2017-09-13 07:40:42 -07:00
aiosin
2e5d1ddff9 add status codes and teapot example 2017-09-13 14:08:29 +02:00
Yun Xu
24bdb1ce98 add unit tests/refactoring 2017-09-12 23:42:42 -07:00
Ashley Sommer
8eb59ad4dc Fixed error where the RequestTimeout test wasn't actually testing the correct behaviour
Fixed error where KeepAliveTimeout wasn't being triggered in the test suite, when using uvloop
Fixed test cases when using other asyncio loops such as uvloop
Fixed Flake8 linting errors
2017-09-13 10:18:36 +10:00
Raphael Deem
d8c8ccd180 Merge pull request #932 from lixxu/master
static files url building using url_for
2017-09-12 12:59:04 -07:00
Yun Xu
a46e004f07 apply new loggers 2017-09-11 22:12:49 -07:00
Ashley Sommer
173f94216a Fixed the delays, and expected responses, in the keepalive_timeout tests 2017-09-12 13:40:43 +10:00
Ashley Sommer
1a74accd65 finished the keepalive_timeout tests 2017-09-12 13:09:42 +10:00
Ashley Sommer
2979e03148 WIP - Split RequestTimeout, ResponseTimout, and KeepAliveTimeout into different timeouts, with different callbacks. 2017-09-11 17:17:33 +10:00
Yun Xu
4bdb9a2c8e prototype 2017-09-10 23:19:09 -07:00
Yun Xu
8f6fa5e9ff old logging cleanup 2017-09-10 18:44:54 -07:00
Yun Xu
986135ff76 remove DefaultFilter 2017-09-10 18:39:42 -07:00
Yun Xu
c9cbc00e36 use access_log as param 2017-09-10 18:38:52 -07:00
Yun Xu
c9a40c180a remove some logging stuff 2017-09-10 11:11:16 -07:00
7
125cb17fcb Merge pull request #13 from channelcat/master
sync from upstream master branch
2017-09-09 16:52:13 -07:00
Raphael Deem
53a5bd2319 Merge pull request #936 from yunstanford/patch-debug-logging
Patch debug logging
2017-09-08 20:49:16 -07:00
Raphael Deem
4fd68f9af3 Merge pull request #935 from iad42/patch-1
Added information on request.token
2017-09-08 20:49:01 -07:00
Yun Xu
c4417b399b fixing debug logging 2017-09-08 17:47:05 -07:00
7
c2a3e42a53 Merge pull request #12 from channelcat/master
merge upstream master branch
2017-09-08 17:39:50 -07:00
Anatoly Ivanov
73c04f5a89 Added information on request.token
The manual lacked info about request.token, which keeps authorization data. See https://github.com/channelcat/sanic/blob/master/sanic/request.py#L84 for details
2017-09-08 14:21:49 +03:00
lixxu
195f707f14 missing '/' in doc 2017-09-06 19:19:59 +08:00
lixxu
bc20dc5c62 use url_for for url building for static files 2017-09-06 19:17:52 +08:00
Raphael Deem
8b4ca51805 Merge pull request #931 from Tim-Erwin/envvar_prefix
make the prefix for environment variables alterable
2017-09-05 11:22:56 -07:00
Tim Mundt
e2e25eb751 fixed flake convention 2017-09-05 11:05:31 +02:00
Tim Mundt
9572ecc5ea test for env var prefix 2017-09-05 10:58:48 +02:00
Tim Mundt
97d8b9e908 documentation for env var prefix; allow passing in the prefix through the app constructor 2017-09-05 10:41:55 +02:00
Tim Mundt
c59a8a60eb make the prefix for environment variables alterable 2017-09-05 09:53:33 +02:00
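A minimal sketch of the alterable prefix, assuming it is passed through the `load_env` constructor argument mentioned in the commits above:

```python
import os
from sanic import Sanic

os.environ["MYAPP_DB_HOST"] = "db.internal"

# load_env takes a custom prefix instead of the default "SANIC_".
app = Sanic(__name__, load_env="MYAPP_")
print(app.config.DB_HOST)  # -> "db.internal"
```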
Raphael Deem
158da0927a Merge pull request #901 from lixxu/master
add name option for route building
2017-08-31 15:29:03 -07:00
Eli Uriegas
78a7338346 Merge pull request #922 from timka/patch-1
Example logging X-Request-Id transparently
2017-08-31 10:35:48 -07:00
Eli Uriegas
90e5c8d39b Merge pull request #904 from jiaxiaolei/master
feat(examples): add `authorized_sanic.py`
2017-08-31 10:35:23 -07:00
Raphael Deem
7a6f2d8336 Merge pull request #926 from manisenkov/patch-1
Fix LICENSE date and name
2017-08-30 17:49:59 -07:00
Maksim Anisenkov
f49554aa57 Fix LICENSE date and name 2017-08-30 15:30:22 +02:00
Timur
0a72168f8f Example logging X-Request-Id transparently 2017-08-29 23:05:57 +03:00
Raphael Deem
5011bfef55 Merge pull request #917 from CharAct3/bugfix/fix_unauthorized
fix #914, change arguments of Unauthorized.__init__
2017-08-24 13:45:58 -07:00
Darren
6038813d03 fix #914, change arguments of Unauthorized.__init__ 2017-08-24 22:59:25 +08:00
Raphael Deem
fee9de96de Merge pull request #908 from Ezi4Zy/master
fix: error param
2017-08-23 16:22:13 -07:00
xmsun
35e028cd99 fix: error param 2017-08-22 16:40:42 +08:00
lixxu
145cdd5c1b Merge branch 'use-route-name-for-method' 2017-08-22 14:02:56 +08:00
lixxu
762b2782ee use name to define route name for different methods on same url 2017-08-22 14:02:38 +08:00
jiaxiaolei
91f031b661 feat(examples): add authorized_sanic.py
You can check whether the client is authorized
to access a resource by using the `authorized` decorator
2017-08-21 22:40:07 +08:00
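A minimal sketch of the pattern the example demonstrates; the token check here is a stand-in, not the example's actual logic:

```python
from functools import wraps

from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

def authorized(f):
    @wraps(f)
    async def decorated(request, *args, **kwargs):
        if request.token == "secret":          # replace with a real check
            return await f(request, *args, **kwargs)
        return json({"status": "not_authorized"}, status=403)
    return decorated

@app.route("/protected")
@authorized
async def protected(request):
    return json({"status": "authorized"})
```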
lixxu
eab809d410 add name option for route building 2017-08-21 18:05:34 +08:00
Raphael Deem
826f1b4713 Merge pull request #898 from jiaxiaolei/master
feat(examples): add `add_task_sanic.py`
2017-08-21 00:32:38 -07:00
Raphael Deem
fa1a95ae91 Merge pull request #900 from yunstanford/patch-default-strict-slashes
Patch default strict slashes
2017-08-21 00:31:42 -07:00
Yun Xu
63babae63d add doc 2017-08-21 00:28:01 -07:00
Raphael Deem
db9924a399 Merge pull request #899 from hatarist/patch-2
Add a line on headers in the "Request Data" docs
2017-08-20 23:51:07 -07:00
Yun Xu
5d23c7644b add unit tests 2017-08-20 23:37:22 -07:00
Yun Xu
ef81a9f547 make strict_slashes default value configurable 2017-08-20 23:11:38 -07:00
7
747c21da70 Merge pull request #11 from channelcat/master
merge upstream master branch
2017-08-20 22:51:19 -07:00
Igor Hatarist
439ff11d13 Added a line on headers in the "Request Data" docs 2017-08-20 19:28:09 +03:00
jiaxiaolei
947364e15f feat(examples): add add_task_sanic.py 2017-08-20 11:11:14 +08:00
Eli Uriegas
750115b727 Merge pull request #894 from pkuphy/patch-1
fix typo
2017-08-18 10:44:35 -07:00
pkuphy
a55efc832d fix typo 2017-08-19 01:03:54 +08:00
Raphael Deem
c96bd21389 Merge pull request #892 from jiaxiaolei/master
docs(README): Make it clear and easy to read.
2017-08-18 02:09:19 -07:00
jiaxiaolei
dd241bd6fa docs(README): Make it clear and easy to read. 2017-08-18 17:00:34 +08:00
Raphael Deem
0dbde7400f Merge pull request #889 from dongweiming/doc
Fix blueprint doc
2017-08-16 00:19:58 -07:00
dongweiming
2587f6753d Fix blueprint doc 2017-08-15 22:04:25 +08:00
Raphael Deem
4155e76a81 Merge pull request #886 from yunstanford/fix-cov-report
Fix cov report
2017-08-14 16:21:50 -07:00
Yun Xu
756bd19181 do not fail if no files for coverage combine 2017-08-10 08:39:02 -07:00
Yun Xu
fbb2344895 fix cov report 2017-08-10 07:55:38 -07:00
7
bda6c85638 Merge pull request #10 from channelcat/master
merge upstream master branch
2017-08-10 07:47:45 -07:00
Raphael Deem
df4a149cd0 Merge pull request #885 from yunstanford/master
add trigger events when using async create_server
2017-08-09 15:59:44 -07:00
Yun Xu
80f27b1db9 add unit tests and make flake8 happy 2017-08-08 22:21:40 -07:00
Yun Xu
d5d1d3b45a add trigger before_start events in create_server 2017-08-08 21:58:10 -07:00
Eli Uriegas
c797c3f22d Merge pull request #883 from miguelgrinberg/websocket-subprotocols
Websocket subprotocol negotiation
2017-08-08 11:49:21 -07:00
Miguel Grinberg
375ed23216 Websocket subprotocol negotiation
Fixes #874
2017-08-08 11:40:44 -07:00
Raphael Deem
7b66a56cad Merge pull request #870 from MichaelYusko/small-amendment
Made small changes for better readability
2017-08-03 18:39:26 -07:00
MichaelYusko
7216bf7835 merge master into local branch 2017-08-03 12:11:47 +03:00
Eli Uriegas
8b24c35ac7 Merge pull request #878 from seemethere/increment_060
Increment to 0.6.0
2017-08-02 19:13:39 -07:00
Eli Uriegas
f80a6ae228 Increment to 0.6.0 2017-08-02 19:11:53 -07:00
7
9b3fbe4593 fixed small doc issue (#877) 2017-08-02 10:15:18 -07:00
Yun Xu
f99a723627 fixed small doc issue 2017-08-02 09:05:33 -07:00
7
181ffb00a7 Merge pull request #9 from channelcat/master
merging upstream master branch
2017-08-02 09:04:31 -07:00
Eli Uriegas
222eca64d7 Merge pull request #875 from cctse/patch-1
add some example links for readme
2017-08-01 09:36:37 -07:00
akc
1b687f3feb add some example links 2017-08-01 16:32:15 +08:00
Eli Uriegas
2228104bff Merge pull request #862 from zyguan/revert-599fbce
revert 599fbce
2017-07-31 13:51:04 -07:00
Raphael Deem
402c3752c4 Merge pull request #871 from Frzk/unauthorized-exception
Simplified the Unauthorized exception __init__ signature.
2017-07-31 12:23:12 -07:00
François KUBLER
69a8bb5e1f Fixed a trailing white space in the docstring. 2017-07-28 22:29:45 +02:00
Raphael Deem
0d76f9e030 Merge pull request #873 from yunstanford/fix-timeout-issue
Fix timeout issue
2017-07-28 09:49:22 -07:00
Yun Xu
eb8f65c58b switch to use dist: precise 2017-07-27 22:21:19 -07:00
7
e0e27a671e Merge pull request #8 from channelcat/master
merge upstream master branch
2017-07-27 20:01:59 -07:00
François KUBLER
b65eb69d9f Simplified the Unauthorized exception __init__ signature.
(again).
Use of **kwargs makes it more straightforward and easier to use.
2017-07-27 23:00:27 +02:00
Raphael Deem
8118e542fb Merge pull request #866 from Nikamura/patch-1
Fix typo in documentation
2017-07-26 13:18:52 -07:00
Raphael Deem
c866759bd4 Merge pull request #868 from cclauss/patch-2
Comment: F821 undefined name is done on purpose
2017-07-26 13:18:29 -07:00
MichaelYusko
429f7377cb Made small changes for better readability 2017-07-26 19:32:23 +03:00
cclauss
40776e5324 Comment: F821 undefined name is done on purpose
Comment helps readers and `# noqa` silences linters
2017-07-26 12:44:30 +02:00
Karolis Mažukna
621343112d Fix typo in documentation 2017-07-25 13:29:17 +03:00
Raphael Deem
eb06e6ba51 Merge pull request #863 from zyguan/issue-760
handle keep-alive timeout gracefully
2017-07-25 00:54:43 -07:00
zyguan
da91b16244 add tests 2017-07-24 18:21:15 +08:00
zyguan
918e2ba8d0 Revert "fix #752"
This reverts commit 599fbcee6e.
2017-07-24 11:53:11 +08:00
zyguan
f50dc83829 handle keep-alive timeout gracefully 2017-07-24 01:37:36 +08:00
Raphael Deem
e8a9b4743b Merge pull request #861 from yunstanford/add-jinja-sanic
Add jinja2-sanic
2017-07-22 20:16:21 -07:00
Yun Xu
f34226425e add jinja2-sanic 2017-07-22 18:41:53 -07:00
7
e27c7ba36f Merge pull request #7 from channelcat/master
merge upstream master branch
2017-07-22 18:28:38 -07:00
Raphael Deem
1aad527956 Merge pull request #824 from Frzk/unauthorized-exception
Simplified the `Unauthorized.__init__` signature.
2017-07-21 23:50:36 -07:00
Raphael Deem
173c62acb6 Merge branch 'master' into unauthorized-exception 2017-07-21 01:54:45 -07:00
Raphael Deem
76605d7dfe Merge pull request #858 from mohd-akram/freebsd-syslog
Fix FreeBSD syslog path
2017-07-20 20:50:25 -07:00
Mohamed Akram
32be1a6496 Fix FreeBSD syslog path 2017-07-20 03:02:40 +04:00
Raphael Deem
c7d43aa544 Merge pull request #853 from yunstanford/patch-proxy-fix
proxy fix
2017-07-17 01:01:26 -07:00
Yun Xu
198bf55b7b flake8 fix 2017-07-14 17:23:18 -07:00
Yun Xu
75378d3567 add remote_addr property for proxy fix 2017-07-14 09:29:16 -07:00
7
55cb371569 Merge pull request #6 from channelcat/master
merge upstream master branch
2017-07-13 20:07:15 -07:00
Raphael Deem
5bb97d25d0 Merge pull request #839 from asvetlov/patch-1
Drop benchmarks from README
2017-07-13 14:41:57 -07:00
Andrew Svetlov
b2017cae77 Drop benchmarks from readme 2017-07-13 23:41:04 +02:00
Raphael Deem
35af903d4a Merge pull request #851 from zenixls2/issue-805
don't let the default LOGGING dictConfig influence already existing configs
2017-07-13 12:49:29 -07:00
zenix
426e00b6f4 don't let dictConfig influence already existing configs 2017-07-13 15:09:04 +09:00
Raphael Deem
8e62b3e438 Merge pull request #850 from r0fls/versioning
Versioning
2017-07-12 22:37:14 -07:00
Raphael Deem
4265ad5f23 add versioning 2017-07-12 22:19:42 -07:00
Raphael Deem
c181eb0539 Merge branch 'master' of https://github.com/channelcat/sanic 2017-07-12 21:12:15 -07:00
Raphael Deem
e0f06753c6 Merge pull request #831 from yunstanford/auto-doc
Improve Documentation.
2017-07-12 20:20:51 -07:00
Raphael Deem
8aafd72ef0 Merge branch 'auto-doc' of https://github.com/yunstanford/sanic 2017-07-12 20:19:30 -07:00
Raphael Deem
48549ce97b Merge pull request #847 from youknowone/gunicorn
ensure loop.close() and sys.exit() in gunicorn worker
2017-07-12 18:19:39 -07:00
Jeong YunWon
47abf83960 Protocol configurable gunicorn worker 2017-07-12 22:30:13 +09:00
Jeong YunWon
be0f3731b4 ensure loop.close() and sys.exit() in gunicorn worker 2017-07-12 22:26:58 +09:00
Raphael Deem
b755431b93 Merge pull request #844 from sfermigier/patch-1
Add missing code block qualifier
2017-07-10 15:26:57 -07:00
Stefane Fermigier
04ff393875 Add missing code block qualifier 2017-07-10 22:11:12 +02:00
Raphael Deem
7841274300 Merge pull request #843 from yunstanford/case-insensitive-check
Case insensitive check
2017-07-10 12:44:25 -07:00
Yun Xu
235687d983 should call lower just once 2017-07-10 12:37:21 -07:00
Yun Xu
3d75e6ed95 case-insensitive check for header fields 2017-07-10 12:29:47 -07:00
Andrew Svetlov
eb9af8bceb Drop aiohttp from benchmark table
The reason: aiohttp with the access log disabled shows about 16,000 RPS on sanic's own benchmark,
which is far faster than the 3,000 RPS in the table.

I'm not a Sanic dev team member. You should not trust users to update this table; manage periodic updates yourself.
If you don't want to do that, it's up to you.
In that case, please just drop the very incorrect and outdated numbers from the README.
2017-07-09 08:18:45 +02:00
7
39ea434513 Merge pull request #5 from channelcat/master
merge upstream master branch
2017-07-08 14:23:00 -07:00
Raphael Deem
f0a956467c Merge pull request #815 from yunstanford/master
add graceful timeout on shutdown
2017-07-08 11:31:37 -07:00
Yun Xu
e48bd08095 make flake8 happy 2017-07-02 10:05:33 -07:00
Yun Xu
5d00717f39 improve doc and remove warnings 2017-07-02 10:02:04 -07:00
Yun Xu
3fff685c44 add auto-doc support 2017-07-01 23:46:34 -07:00
Raphael Deem
1e75265eed Merge pull request #756 from qwesda/master
fixes #755 fragmented headers
2017-06-30 18:24:51 -07:00
Eli Uriegas
b6ac3ef445 Merge pull request #826 from yunstanford/pytest-sanic
Pytest sanic
2017-06-30 18:17:15 -07:00
Raphael Deem
421f78f3e6 Merge pull request #814 from Frzk/forbidden-exception
Added a Forbidden exception
2017-06-30 18:11:23 -07:00
Eli Uriegas
b71fdcfc20 Merge pull request #829 from Frzk/config_doc
Fixed an error: `Sanic.__init__` doesn't have a `load_vars` parameter.
2017-06-30 10:06:30 -07:00
François KUBLER
021e9b228a Fixed a small error: Sanic.__init__ doesn't have a load_vars parameter.
It is `load_env`.
2017-06-30 16:24:41 +02:00
Raphael Deem
00d4533022 Merge pull request #821 from Frzk/bearer-support
Inverted the order of prefixes in Request.token property.
2017-06-29 09:43:34 -07:00
Yun Xu
fd5faeb5dd add an example 2017-06-29 09:14:21 -07:00
Yun Xu
e7c8035ed7 add pytest-sanic 2017-06-29 09:06:17 -07:00
François KUBLER
e427e38da8 Simplified the Unauthorized.__init__ signature.
It doesn't really make sense to have a `realm` parameter in the method signature.
Instead, one can simply set the realm in the `challenge` dict if necessary.

Also fixed the tests accordingly (and added a new one for "Bearer" auth-scheme).
2017-06-29 12:34:52 +02:00
François KUBLER
1f24abc3d2 Fixed support for "Bearer" and "Token" auth-schemes.
Removed the test for "Authentication: Bearer Token <TOKEN>" which was not supposed to exist (see https://github.com/channelcat/sanic/pull/821)
Also added a call to `split` when retrieving the token value to handle cases where there are leading or trailing spaces.
2017-06-29 10:23:49 +02:00
François
76e62779ba Merge branch 'master' into forbidden-exception 2017-06-28 17:25:40 +02:00
Eli Uriegas
1af343ef50 Merge pull request #823 from ojii/cookies-warning
Added a warning to the cookies documentation about security
2017-06-27 19:35:35 -07:00
Jonas Obrist
412ffd1592 Added a warning to the cookies documentation about security 2017-06-28 11:05:59 +09:00
Daniel Schwarz
b141fec573 Merge remote-tracking branch 'upstream/master'
# Conflicts:
#	sanic/server.py
2017-06-27 13:32:49 +02:00
François KUBLER
d2e14abfd5 Inverted the order of prefixes in Request.token property.
As suggested by @allan-simon
See: https://github.com/channelcat/sanic/pull/811#pullrequestreview-46144327
2017-06-27 12:57:47 +02:00
Raphael Deem
d4abca0480 Merge pull request #818 from youknowone/debug
Introduce debug mode for HTTP protocol
2017-06-26 22:02:37 -07:00
Raphael Deem
529f5822ee Merge pull request #819 from r0fls/817
convert environment vars to int if digits
2017-06-26 21:54:54 -07:00
Raphael Deem
395d85a12f use try/except 2017-06-26 21:35:01 -07:00
Raphael Deem
4379a4b067 float logic 2017-06-26 20:59:59 -07:00
Raphael Deem
ad8e1cbf62 convert environment vars to int if digits 2017-06-26 20:49:41 -07:00
Jeong YunWon
dc5a70b0de Introduce debug mode for HTTP protocol 2017-06-26 21:13:13 +09:00
Yun Xu
b5d1f52ea4 make flake8 happy 2017-06-25 10:22:40 -07:00
Yun Xu
221cf235b5 fix a unit test 2017-06-25 01:03:28 -07:00
Yun Xu
7720e31a31 add unit test 2017-06-25 00:51:59 -07:00
Yun Xu
d812affef0 add graceful_shutdown_timeout to gunicorn worker 2017-06-25 00:51:14 -07:00
Yun Xu
5c19eb34bf add graceful_shutdown_timeout 2017-06-24 19:00:33 -07:00
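A minimal sketch, assuming this setting surfaces as the `GRACEFUL_SHUTDOWN_TIMEOUT` config value (seconds to wait for in-flight requests during shutdown):

```python
from sanic import Sanic

app = Sanic(__name__)
app.config.GRACEFUL_SHUTDOWN_TIMEOUT = 15.0  # wait up to 15s for pending requests
```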
7
e18ebaee3d Merge pull request #4 from channelcat/master
merge upstream master branch
2017-06-24 18:21:13 -07:00
Eli Uriegas
dbcbf12456 Merge pull request #811 from Frzk/bearer-support
Added support for 'Authorization: Bearer <TOKEN>' header...
2017-06-23 10:32:21 -07:00
Eli Uriegas
c04b44057c Merge pull request #813 from Frzk/unauthorized-exception
Added an Unauthorized exception
2017-06-23 10:30:51 -07:00
François KUBLER
60aa60f48e Fixed the test for the new Unauthorized exception. 2017-06-23 17:16:31 +02:00
François KUBLER
2848d7c80e Added a Forbidden exception
Also added a small test.
2017-06-23 16:44:57 +02:00
François KUBLER
9fcdacb624 Modified the name of an argument. 2017-06-23 16:29:04 +02:00
François KUBLER
cf1713b085 Added an Unauthorized exception.
Also added a few tests related to this new exception.
2017-06-23 16:12:15 +02:00
7
f049a4ca67 Recycling gunicorn worker (#800)
* add recycling feature to gunicorn worker

* add unit tests

* add more unit tests, and remove redundant trigger_events call

* fixed up unit tests

* make flake8 happy

* address feedbacks

* make flake8 happy

* add doc
2017-06-22 13:26:50 -07:00
François KUBLER
55f860da2f Added support for 'Authorization: Bearer <TOKEN>' header in Request.token property.
Also added a test case for that kind of header.
2017-06-22 18:11:23 +02:00
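A minimal sketch of the behaviour described above (illustrative handler, not the PR's test case):

```python
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/whoami")
async def whoami(request):
    # "Authorization: Token abc123"  -> request.token == "abc123"
    # "Authorization: Bearer abc123" -> request.token == "abc123"
    return text(request.token or "anonymous")
```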
Eli Uriegas
b5369e611c Merge pull request #764 from stopspazzing/master
Clean up of examples.
2017-06-20 14:53:37 -07:00
Jeremy Zimmerman
3d1dd1c6ac re-add extensions.md to fix merge conflict. 2017-06-20 14:49:12 -07:00
Eli Uriegas
10a363b275 Merge pull request #809 from jrocketfingers/feature/allow-textual-responses
Allow textual responses when using test_client and aiohttp 2
2017-06-20 10:30:26 -07:00
Nikola Kolevski
d865c5e2b6 Conform to pep8 2017-06-20 13:22:28 +02:00
Nikola Kolevski
9fac37588c Allow textual responses when using test_client and aiohttp 2 2017-06-20 13:15:30 +02:00
Jeremy Zimmerman
aac0d58417 Delete extensions.md
non-core content moved to wiki
2017-06-19 15:18:32 -07:00
Eli Uriegas
b37e6187d4 Merge pull request #802 from yunstanford/add-match-info
Add match_info property to request class
2017-06-18 10:15:46 -07:00
Yun Xu
20138ee85f add match_info to request 2017-06-17 09:47:58 -07:00
Raphael Deem
6dc569cde5 Merge pull request #795 from ekampf/patch-1
Prevent `run` from overriding logging config set in constructor
2017-06-15 23:39:09 -07:00
Eran Kampf
77cf0b678a Fix has_log value 2017-06-15 11:21:08 -07:00
Eran Kampf
2dfb061063 Prevent run from overriding logging config set in constructor
When creating the `Sanic` instance I provide it with a customized `log_config`.
Calling `run` overrides these settings unless I provide it *again* with the same `log_config`.
This is confusing and error-prone. `run` shouldn't override configuration set in the `Sanic` constructor...
2017-06-15 10:39:00 -07:00
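A minimal sketch of the situation the commit describes; LOG_SETTINGS stands in for a real dictConfig-style dictionary:

```python
from sanic import Sanic

LOG_SETTINGS = {"version": 1, "disable_existing_loggers": False}  # placeholder config

app = Sanic(__name__, log_config=LOG_SETTINGS)

# After this fix, run() keeps the constructor's log_config instead of
# replacing it with the default when log_config is not passed again.
app.run(host="0.0.0.0", port=8000)
```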
7
e4669e2581 Merge pull request #3 from channelcat/master
merge upstream master branch
2017-06-12 22:15:31 -07:00
Eli Uriegas
df47cf72d3 Merge pull request #791 from seemethere/add_docs_requirements
Add docs requirements
2017-06-12 10:55:45 -07:00
Eli Uriegas
ba1b34e375 Add docs requirements
Closes channelcat/sanic#787
2017-06-12 10:29:38 -07:00
Eli Uriegas
950b5ee529 Merge pull request #789 from yunstanford/coverage-report
Coverage report
2017-06-11 23:38:04 -07:00
Raphael Deem
041c48de19 Merge pull request #790 from r0fls/752
fix #752
2017-06-11 23:24:32 -07:00
Raphael Deem
599fbcee6e fix #752 2017-06-11 23:20:04 -07:00
Yun Xu
ce2df8030c quick fix for test_gunicorn_worker test 2017-06-11 09:06:48 -07:00
Yun Xu
47e761bbe2 add coverage report 2017-06-11 08:49:35 -07:00
7
0646baa18d Merge pull request #2 from channelcat/master
remote-tracking with upstream
2017-06-10 11:54:11 -07:00
Eli Uriegas
38997c1b47 Merge pull request #786 from yunstanford/handle_stream_404
Handle stream 404
2017-06-10 10:03:54 -07:00
Yun Xu
acaafabc23 retry build 2017-06-10 09:57:32 -07:00
Yun Xu
6a80bdafa6 add unit tests 2017-06-10 09:48:30 -07:00
Yun Xu
cf30ed745c also should handle InvalidUsage exception 2017-06-10 09:42:48 -07:00
Eli Uriegas
a399fb4044 Merge pull request #785 from youknowone/gunicorn
Gunicorn worker hints app whether it is being terminated
2017-06-09 11:00:12 -07:00
Yun Xu
24b946e850 make flake8 happy 2017-06-09 08:43:23 -07:00
Yun Xu
236daf48ff add unit tests 2017-06-09 08:42:48 -07:00
Yun Xu
4942af27dc handle NotFound 2017-06-09 08:33:34 -07:00
7
3adb90071b Merge pull request #1 from channelcat/master
merge upstream project
2017-06-09 08:23:14 -07:00
Jeong YunWon
29b4a2a08c Gunicorn worker hints app whether it is being terminated
For now, `Sanic.is_running` is set when the worker is started but not
unset when it is about to be stopped. Setting the flag on the quit signal
will not affect in-flight requests, but the `Sanic.is_running` flag can still
be used to support graceful termination.
2017-06-09 14:51:15 +09:00
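A minimal sketch of how the flag can be used for graceful termination of background work (the loop body is a stand-in):

```python
import asyncio

from sanic import Sanic

app = Sanic(__name__)

async def background_worker(app):
    # Keep working only while the worker has not been asked to quit.
    while app.is_running:
        await asyncio.sleep(1)  # stand-in for one unit of real work
```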
Eli Uriegas
e1331fc0a2 Merge pull request #783 from yunstanford/master
add content_type property in request
2017-06-08 17:30:07 -07:00
Yun Xu
3802f8ff65 unit tests 2017-06-08 17:25:22 -07:00
Eli Uriegas
4b0abdbe7c Merge pull request #782 from yunstanford/sanic-transmute-doc
add sanic-transmute
2017-06-08 14:27:29 -07:00
Yun Xu
81889fd7a3 add unit tests 2017-06-07 20:48:07 -07:00
Yun Xu
aac99c45c0 add content_type property in request 2017-06-07 20:46:48 -07:00
Yun Xu
566a6369a5 add sanic-transmute 2017-06-07 20:27:54 -07:00
Eli Uriegas
4fdf340d04 Merge pull request #773 from mbatchkarov/await-json-too
Testing: store JSON response in local request
2017-06-07 12:05:30 -07:00
Miroslav Batchkarov
ddd7145153 check json is None if body is not JSON 2017-06-07 10:03:27 +01:00
Miroslav Batchkarov
3f22b644b6 wrap call to json in try-except to make tests pass 2017-06-07 09:57:07 +01:00
Eli Uriegas
639c9f579d Merge pull request #774 from algtmatt/feature/logging_doc_fixes
Small logging docs fixes
2017-06-05 10:56:00 -07:00
Matthew Snyder
735b8665f1 Small logging docs fixes 2017-06-05 10:42:17 -07:00
Miroslav Batchkarov
199fa50a9d also store json result in local request 2017-06-05 16:24:23 +01:00
Jeremy Zimmerman
aac5ad8504 Merge remote-tracking branch 'origin/master' 2017-06-01 16:53:36 -07:00
Jeremy Zimmerman
349c108ebc re-added request_stream example. 2017-06-01 16:52:56 -07:00
Eli Uriegas
3b464782ef Merge pull request #765 from ttopholm/master
Added content_type to be set for json response
2017-06-01 16:29:08 -05:00
Tue Topholm
3d97fd8d2a Removed whitespace 2017-06-01 23:09:37 +02:00
Tue Topholm
c102e76146 Fixed line width 2017-06-01 23:01:27 +02:00
Jeremy Zimmerman
beee7b68bf reverted back to default 0.0.0.0 host 2017-06-01 14:01:13 -07:00
Tue Topholm
f47e571d92 Added content_type to be set for json response 2017-06-01 22:53:56 +02:00
Jeremy Zimmerman
4b5320a8f0 Clean up of examples. Removes non-core examples, optimizes and restyles remaining to strictly follow PEP 8 styling guidelines. Non-Core examples will be moved to Wiki. 2017-06-01 11:53:05 -07:00
Daniel Schwarz
30c2c89c6b fix partial url parsing 2017-05-30 16:13:49 +02:00
Raphael Deem
4a1d1a0dc1 Merge pull request #750 from xenu256/patch-1
aiomysql has DictCursor
2017-05-29 22:54:09 -07:00
Raphael Deem
360adc9130 Merge pull request #757 from monobot/asyncOrmV020
update asyncorm version example to 0.2.0
2017-05-29 22:51:46 -07:00
Raphael Deem
cc21abe843 Merge pull request #751 from ak04nv/patch-1
Update jinja_example.py
2017-05-28 23:50:16 -07:00
monobot
9a27555763 update asyncorm version example to 0.2.0 2017-05-29 00:01:56 +01:00
Daniel Schwarz
aaef2fbd01 fix flake8 errors 2017-05-28 18:46:07 +02:00
Anton Kochnev
5bb640ca17 Update jinja_example.py
Added python version check for enabling async mode.
2017-05-28 14:37:41 +08:00
Daniel Schwarz
0e5c7a62cb remove debug messages 2017-05-27 22:36:08 +02:00
Daniel Schwarz
1b33e05f74 fix debug log messages 2017-05-27 16:32:39 +02:00
Daniel Schwarz
53a04309ff add header_fragment handling 2017-05-27 16:28:57 +02:00
Daniel Schwarz
dc411651b6 add check for header and value 2017-05-27 15:36:57 +02:00
Daniel Schwarz
514540b90b add debug for header values 2017-05-27 15:32:37 +02:00
Tadas Talaikis
a5249d1f5d aiomysql has DictCursor 2017-05-27 11:06:45 +03:00
Eli Uriegas
21aa3f6578 Merge pull request #748 from messense/feature/websocket-config
Add websocket max_size and max_queue configuration
2017-05-26 10:44:49 -05:00
messense
0024edbbb9 Add websocket max_size and max_queue configuration 2017-05-26 11:15:28 +08:00
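A minimal sketch, assuming the options land as the `WEBSOCKET_MAX_SIZE` and `WEBSOCKET_MAX_QUEUE` config keys:

```python
from sanic import Sanic

app = Sanic(__name__)
app.config.WEBSOCKET_MAX_SIZE = 2 ** 20  # max size of an incoming message, in bytes
app.config.WEBSOCKET_MAX_QUEUE = 32      # max number of queued incoming messages
```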
Raphael Deem
23cb39b557 Merge pull request #744 from algtmatt/feature/from_file_doc_fix
Update config file loader docs
2017-05-24 18:16:10 -07:00
Eli Uriegas
48de321869 Merge pull request #697 from 38elements/stream
Add Request.stream
2017-05-24 16:22:52 -07:00
Raphael Deem
c6d68009d2 Merge pull request #745 from messense/feature/gunicorn-worker-test-case
Add a simple integration test for Gunicorn worker
2017-05-23 20:53:01 -07:00
Raphael Deem
2cab267405 Merge pull request #734 from ashleysommer/static_large_file_stream
Add option to static helper to use streaming for large files.
2017-05-23 20:52:12 -07:00
messense
6bdc0d2e5e Fix Gunicorn worker 2017-05-23 11:28:12 +08:00
messense
3eed81c1eb Add a simple integration test for Gunicorn worker 2017-05-23 11:04:27 +08:00
Raphael Deem
b447807b36 Merge pull request #742 from r0fls/700
changes required for unix socket support
2017-05-22 19:29:32 -07:00
Eli Uriegas
2771c8c32e Merge pull request #738 from messense/feature/conduct
Add Code of Conduct
2017-05-22 15:34:01 -07:00
Eli Uriegas
21a88bc2d3 Add maintainers email address 2017-05-22 15:31:05 -07:00
Matthew Snyder
57c1838f68 Update config file loader docs 2017-05-22 14:10:08 -07:00
Raphael Deem
52b0254ec6 unix socket support; fixes #700 2017-05-21 03:15:06 -07:00
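A minimal sketch, assuming `app.run()` accepts a pre-bound socket via its `sock` argument, which this change lets be an AF_UNIX socket:

```python
import socket

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/")
async def handler(request):
    return text("hello over a unix socket")

if __name__ == "__main__":
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.bind("/tmp/sanic.sock")
    app.run(sock=sock)
```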
Raphael Deem
49631542ce Merge pull request #732 from jrocketfingers/feature/explicit-register-middleware
Extract register_middleware into a method.
2017-05-21 03:06:12 -07:00
Raphael Deem
4b80ffb9eb Merge pull request #740 from r0fls/739
add abort function
2017-05-21 02:20:02 -07:00
Raphael Deem
9efa7c116d remove redundant code; decode response 2017-05-20 23:27:00 -07:00
Raphael Deem
5d9c8d59a0 add abort() test 2017-05-20 14:43:57 -07:00
Raphael Deem
1a60201f68 flake8 spacing 2017-05-20 10:44:09 -07:00
Raphael Deem
f3186abf09 SANIC_EXCEPTIONS -> _sanic_exceptions 2017-05-20 10:29:00 -07:00
Raphael Deem
6bcc0d3c7f Merge remote-tracking branch 'upstream/master' 2017-05-20 02:17:26 -07:00
Raphael Deem
57b9a57dde Update README.rst 2017-05-20 02:17:12 -07:00
Raphael Deem
28994f4b64 update todo 2017-05-20 02:15:45 -07:00
Raphael Deem
588b4712bf add exception decorator 2017-05-20 01:24:34 -07:00
Raphael Deem
d3b6208057 add abort function 2017-05-19 18:52:19 -07:00
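A minimal sketch of the helper, assuming it is importable from `sanic.exceptions` as in later releases:

```python
from sanic import Sanic
from sanic.exceptions import abort
from sanic.response import text

app = Sanic(__name__)

@app.route("/admin")
async def admin(request):
    if request.token != "secret":  # illustrative check
        abort(401)                 # raises the Sanic exception mapped to 401
    return text("welcome")
```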
Ashley Sommer
ef80953b1b Fix flake8 line length error. 2017-05-20 09:56:05 +10:00
ashleysommer
72db1188c7 Add an option to the static() helper to switch on streaming for large files.
By default uses a 1M threshold.
i.e. if the static file to serve is >= 1M, it will stream the file.
This threshold value is configurable by passing an int instead of a bool to `stream_large_files` parameter of `static()`.
2017-05-20 09:56:05 +10:00
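A minimal sketch of the two forms of the option described above (paths and the custom threshold are illustrative):

```python
from sanic import Sanic

app = Sanic(__name__)

# Stream any static file at or above the default 1M threshold.
app.static("/downloads", "./downloads", stream_large_files=True)

# Or pass an int to set a custom threshold in bytes (here 10 MiB).
app.static("/videos", "./videos", stream_large_files=10 * 1024 * 1024)
```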
Raphael Deem
0858d3c544 Merge pull request #733 from ashleysommer/file_stream
Add file_stream response handler
2017-05-19 16:48:12 -07:00
Ashley Sommer
5c5656f981 Moved file_stream tests to test_responses.py 2017-05-20 09:41:36 +10:00
Raphael Deem
58a9c92d75 fix 739 2017-05-19 13:35:04 -07:00
Eli Uriegas
a6dc4646db Merge pull request #737 from 38elements/deploying
Fix Running via Gunicorn in deploying.md
2017-05-19 12:10:36 -07:00
messense
8ff553e926 Add Code of Conduct 2017-05-19 23:09:44 +08:00
38elements
848a5c61f0 Fix Running via Gunicorn in deploying.md 2017-05-19 23:22:57 +09:00
Raphael Deem
d49000e9f4 Merge pull request #736 from fanjindong/examples_read
debug 'Blueprint names must be unique'
2017-05-19 01:45:06 -07:00
fanjindong
a82145c4e6 debug 'Blueprint names must be unique' 2017-05-19 16:26:56 +08:00
Ashley Sommer
181edb7235 Test file() and file_stream() response helpers.
Added test for `file()` response helper and `file_stream()` response helper.
2017-05-19 13:01:21 +10:00
Ashley Sommer
ff2ae11ac8 Remove exception print(e) statement. 2017-05-19 13:00:01 +10:00
Johnny Rocketfingers
3f841f3b21 Switch to non-hardcoded register_middleware. 2017-05-18 22:08:44 +02:00
ashleysommer
181977ad4e Added brief documentation with an example for file_stream
Added test to ensure `file_stream()` works in the test suite.
2017-05-18 18:12:26 +10:00
ashleysommer
e155fe403d Add file_stream response handler
For streaming large static files
Like `file()` but breaks the file into chunks and sends it with a `StreamingHTTPResponse`
Chunk size is configurable and defaults to 4k, which seemed to be the sweet spot in my testing.
Also supports ContentRange same as `file()` does.
2017-05-18 18:04:28 +10:00
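A minimal sketch of a handler using the new helper, assuming `file_stream` is awaitable and that `chunk_size` is the knob behind the 4k default mentioned above:

```python
from sanic import Sanic
from sanic.response import file_stream

app = Sanic(__name__)

@app.route("/backup")
async def backup(request):
    return await file_stream("/tmp/backup.tar.gz", chunk_size=4096)
```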
Johnny Rocketfingers
bf5438d573 Extract register_middleware into a method. 2017-05-18 06:36:11 +02:00
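A minimal sketch of the explicit registration method, assuming an `attach_to` argument that mirrors the `@app.middleware("response")` decorator:

```python
from sanic import Sanic

app = Sanic(__name__)

async def add_server_header(request, response):
    response.headers["X-Served-By"] = "sanic"

# Equivalent to decorating add_server_header with @app.middleware("response").
app.register_middleware(add_server_header, attach_to="response")
```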
Raphael Deem
0e4aaf8856 Merge pull request #731 from jrocketfingers/fix/token-missing-auth-headers
Check that the Authorization headers are actually provided.
2017-05-17 13:10:12 -07:00
Raphael Deem
5c44ce1637 Merge pull request #719 from messense/feature/worker-uvloop
Gunicorn worker should not require uvloop
2017-05-17 12:47:19 -07:00
Raphael Deem
974fe25a11 Merge pull request #722 from messense/feature/ci-without-ext
Add py3*-no-ext test env
2017-05-17 12:47:05 -07:00
Johnny
58bae83558 Add a regression test. 2017-05-17 11:15:45 +02:00
Johnny
5d309af86f Check that the headers are actually provided. 2017-05-17 11:08:50 +02:00
messense
ec857d1c53 Drop tox-travis 2017-05-17 12:21:56 +08:00
messense
2f84cdd708 Fix websocket handler bug on Python3.5 with no uvloop 2017-05-17 12:12:25 +08:00
messense
7cc02e84ed Fix json loads bug on Python 3.5 2017-05-17 12:12:25 +08:00
Raphael Deem
87c2a5bc97 Merge pull request #724 from suoning/doc-logger
update logging doc
2017-05-16 21:10:57 -07:00
Eli Uriegas
826c2e0f4e Merge pull request #725 from 38elements/contributing
Add rule in CONTRIBUTING.md
2017-05-15 14:45:57 -07:00
Eli Uriegas
b5e25e13b7 Merge pull request #727 from argaen/update_aiocache_example
Update aiocache example to latest version
2017-05-15 11:39:02 -07:00
argaen
f9653114d1 Update aiocache example to latest version 2017-05-15 20:30:52 +02:00
Eli Uriegas
6b7e19891b Get rid of un-needed s, Fix some formatting. 2017-05-15 10:54:47 -07:00
38elements
a677f14423 Add rule in CONTRIBUTING.md 2017-05-15 21:28:35 +09:00
suoning
dddce3f30d update logging, Remove the comments 2017-05-15 13:59:03 +08:00
Raphael Deem
be93d670a3 Merge pull request #717 from jrocketfingers/fix/ipv6-access-log
Fix "TypeError: not all arguments converted during string formatting"
2017-05-14 20:28:07 -07:00
suoning
68d4bb6ffe update logging doc 2017-05-15 10:54:30 +08:00
suoning
a27471178a update logger doc 2017-05-15 10:25:19 +08:00
messense
66fcb0cc8f Add py3*-no-ext test env 2017-05-15 10:10:50 +08:00
messense
05d0ddc281 Gunicorn worker should not require uvloop 2017-05-15 00:01:51 +08:00
Johnny Rocketfingers
b1890f50b6 Conform to pep8 2017-05-14 10:15:11 +02:00
Johnny Rocketfingers
b44c707e94 Prevent incorrect tuple size on get_extra_info errors
According to https://docs.python.org/3/library/asyncio-protocol.html#asyncio.BaseTransport.get_extra_info,
get_extra_info fails by returning None. This is an attempt at
normalizing the return value across the AF_INET, AF_INET6 and
erroneous cases.
2017-05-14 09:56:56 +02:00
Johnny
4c7675939a Fix "TypeError: not all arguments converted during string formatting"
socket.getpeername() returns AF_INET6 address family four-tuple, with
flowid and scopeid.

In server's write_response, an exception is raised when an IPv6 client
connects due to four-tuple elements having two unused elements (flowid
and scopeid).

This makes sure that only the first two (host and port) are used in log
string formatting.
2017-05-13 17:35:04 +02:00
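A minimal, plain-Python illustration of the normalization these two commits describe (not Sanic's actual server code):

```python
def peer_for_log(peername):
    """Normalize get_extra_info('peername') output to (host, port) for logging."""
    # AF_INET gives (host, port); AF_INET6 gives (host, port, flowinfo, scope_id);
    # asyncio returns None on failure. Keep only host and port in every case.
    if not peername:
        return "-", 0
    return peername[0], peername[1]

print(peer_for_log(("127.0.0.1", 8000)))    # ('127.0.0.1', 8000)
print(peer_for_log(("::1", 54321, 0, 0)))   # ('::1', 54321)
print(peer_for_log(None))                   # ('-', 0)
```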
Eli Uriegas
fa1b7de52a Merge pull request #706 from messense/feature/remove-log-file
Remove timedRotatingFile log config
2017-05-12 10:56:19 -07:00
Eli Uriegas
666f8c8d3c Merge pull request #712 from stopspazzing/master
Fixed plotly_example, now works
2017-05-12 10:39:57 -07:00
Jeremy Zimmerman
996c0b3280 Fixed with a working example
Remember:
K.I.S.S
2017-05-11 13:40:16 -07:00
Eli Uriegas
f9d428de8b Merge pull request #711 from stopspazzing/master
Use of register_blueprint will be deprecated, why not upgrade?
2017-05-11 12:53:43 -07:00
Jeremy Zimmerman
a17b3f1b84 Use of register_blueprint will be deprecated, why not upgrade? 2017-05-11 12:33:57 -07:00
Jeremy Zimmerman
f39512aa63 double if statement (#707)
* Migrated `%` string formating

* double if statement

combined double 'if' to a single 'if' with 'and'

* Revert "Fix "Prefer `format()` over string interpolation operator" issue"
2017-05-11 11:49:32 -07:00
Raphael Deem
23ee9f64d4 Merge pull request #710 from monobot/asyncorm_update
modify the asyncorm example, with the new lazy querysets
2017-05-11 10:35:52 -07:00
monobot
ad68739df7 modify the asyncorm example, for the new lazy querysets 2017-05-11 18:15:04 +01:00
38elements
a50d8421b8 Add heading in streaming.md 2017-05-11 19:18:58 +09:00
messense
3ea1b07906 Revert "Add access.log and error.log to .gitignore"
This reverts commit fc0d69616c.
2017-05-11 11:19:03 +08:00
messense
c3683662c2 Remove timedRotatingFile log config 2017-05-11 11:18:59 +08:00
Eli Uriegas
861865807a Merge pull request #705 from stopspazzing/patch-2
spelling mistake
2017-05-10 10:52:29 -07:00
Jeremy Zimmerman
18930082e2 spelling mistake
fixed incorrect spelling
2017-05-10 09:38:57 -07:00
38elements
6a14e49479 Replace stream decorator with stream parameter 2017-05-09 22:31:15 +09:00
Raphael Deem
c530d5f016 Merge pull request #704 from seemethere/fix_camel_case
Fix camel case module
2017-05-08 20:50:39 -07:00
Eli Uriegas
ac5e1a6ebd Fix import 2017-05-08 20:47:20 -07:00
Eli Uriegas
bb6de53f28 Fix docs 2017-05-08 20:44:19 -07:00
Eli Uriegas
bfcd499cc2 Remove default_filter module, put into logging 2017-05-08 20:41:34 -07:00
Eli Uriegas
3e88ec18e2 Actually add file >.> 2017-05-08 20:37:44 -07:00
Eli Uriegas
1fb640c313 Fix camel case module 2017-05-08 20:36:57 -07:00
Eli Uriegas
bece3d2bcf Merge pull request #702 from seemethere/increment_054
Increment to 0.5.4
2017-05-08 17:44:50 -07:00
Eli Uriegas
307d866bb6 Increment to 0.5.4 2017-05-08 17:44:23 -07:00
Eli Uriegas
95c9514a44 Merge pull request #701 from seemethere/increment_053
Increment to 0.5.3
2017-05-08 17:42:54 -07:00
Eli Uriegas
768be433d6 Increment to 0.5.3 2017-05-08 17:42:26 -07:00
38elements
4d4f38fb35 is_request_stream for CompositionView and HTTPMethodView 2017-05-09 01:04:03 +09:00
38elements
15ad07f03d Fix streaming.md 2017-05-08 00:10:36 +09:00
38elements
0b53c413a7 Add stream decorator for HTTPMethodView 2017-05-07 21:33:15 +09:00
38elements
931397c7e1 Add stream for CompositionView 2017-05-07 18:38:48 +09:00
38elements
ef2cc7ebf5 Add Request.stream 2017-05-07 18:38:48 +09:00
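A minimal sketch of a streaming handler built on these additions, assuming the `stream=True` route parameter that later replaced the stream decorator and the queue-style `request.stream` of this era:

```python
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/upload", methods=["POST"], stream=True)
async def upload(request):
    total = 0
    while True:
        chunk = await request.stream.get()  # None marks the end of the body
        if chunk is None:
            break
        total += len(chunk)
    return text("received {} bytes".format(total))
```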
Eli Uriegas
8d3fd75ec2 Merge pull request #695 from kszucs/aiopeewee
aiopeewee example
2017-05-05 11:31:37 -07:00
Szucs Krisztian
a42b254c33 use async version of model_to_dict 2017-05-05 10:51:12 +02:00
Szucs Krisztian
d24e1ae110 removed lines from distributed example 2017-05-05 08:25:18 +02:00
Szucs Krisztian
2e7badab4e aiopeewee example 2017-05-05 08:19:09 +02:00
Raphael Deem
7cf3d49f00 Merge pull request #690 from 38elements/sanic_endpoint_test
Remove utils.py
2017-05-03 23:56:13 -07:00
38elements
25037006bf Remove utils.py 2017-05-04 15:52:18 +09:00
Eli Uriegas
f611eb2c2b Merge pull request #686 from 38elements/loop
Remove loop argument in run() and create_server()
2017-05-03 15:27:16 -07:00
38elements
e12c10b087 Remove loop argument in run() and create_server() 2017-05-03 17:52:19 +09:00
Raphael Deem
9527e5ded8 Merge pull request #685 from r0fls/document-create-server
document create_server method
2017-05-02 22:45:51 -07:00
Raphael Deem
23b4b20b4f document create_server method 2017-05-02 22:44:42 -07:00
Eli Uriegas
7f9ecd659c Merge pull request #682 from graingert/fix-readme.rst-encoding
fix README.rst -> long_description encoding
2017-05-02 10:48:45 -07:00
Eli Uriegas
bb31d465f2 Add distribution types 2017-05-02 10:47:55 -07:00
Thomas Grainger
834468e8e7 fix README.rst -> long_description encoding 2017-05-02 17:37:17 +01:00
Eli Uriegas
4720513672 Merge pull request #679 from graingert/verify-readme-rst-pypi
verify readme for PyPI
2017-05-02 09:31:01 -07:00
Eli Uriegas
a480110d43 Merge pull request #681 from 38elements/deprecation
Remove before_start, before_stop, after_start and after_stop
2017-05-02 09:28:33 -07:00
38elements
0a2c95cc10 Remove before_start, before_stop, after_start and after_stop 2017-05-02 23:07:09 +09:00
Thomas Grainger
9d2e32902d check readme in travis 2017-05-02 10:17:24 +01:00
Thomas Grainger
77b6413526 validate readme for PyPI 2017-05-02 10:05:06 +01:00
Thomas Grainger
9e502099e0 add readme to package directly 2017-05-02 10:05:05 +01:00
Eli Uriegas
c6a7e44ae7 Merge pull request #678 from stopspazzing/patch-1
misspelling
2017-05-01 17:27:31 -07:00
Jeremy Zimmerman
1bf06312b8 misspelling
acutal -> actual
2017-05-01 17:07:38 -07:00
Eli Uriegas
c35721abbd Merge pull request #670 from ticosax/set-exit-code-on-error
In case of error when starting sanic
2017-05-01 15:16:02 -07:00
Eli Uriegas
7f3c417078 Merge pull request #677 from abuckenheimer/master
added exception chain rendering in debug #675
2017-05-01 15:15:34 -07:00
Alec Buckenheimer
69511c2783 added exception chain rendering in debug #675 2017-05-01 12:56:33 -04:00
Eli Uriegas
158a94d34c Merge pull request #674 from 38elements/test-for-uri-template
Add test for uri_template
2017-04-30 20:22:51 -07:00
Eli Uriegas
db58bd68f5 Merge pull request #673 from 38elements/routing
Fix docs/sanic/routing.md
2017-04-30 20:22:30 -07:00
38elements
e65f08a2c8 Fix docs/sanic/routing.md 2017-04-30 22:03:16 +09:00
38elements
ab8f616385 Add test for uri_template 2017-04-30 21:57:32 +09:00
Eli Uriegas
6dc6f9bbb5 Merge pull request #671 from banteg/uri-template
Expose matched request uri template
2017-04-28 14:22:59 -07:00
banteg
7754bb995b expose matched request uri template 2017-04-29 02:39:56 +07:00
Nicolas Delaby
d1fefce61c fixup! In case of error when starting sanic 2017-04-28 20:06:44 +02:00
Nicolas Delaby
c3abdab9c4 The main process that spawns subprocesses doesn't run any loop.
let's not try to stop one
2017-04-28 19:57:49 +02:00
Nicolas Delaby
8b13e103fd In case of error when starting sanic
Don't exit with code 0
2017-04-28 19:08:51 +02:00
Eli Uriegas
5f94f65f4f Merge pull request #669 from 38elements/log
Add access.log and error.log to .gitignore
2017-04-28 07:40:42 -07:00
38elements
fc0d69616c Add access.log and error.log to .gitignore 2017-04-28 18:52:15 +09:00
Raphael Deem
656f5b93d6 Merge pull request #668 from seemethere/stop_workers_on_sigint
Add the killing of children
2017-04-27 22:39:48 -07:00
Eli Uriegas
436d37c079 Add the killing of children
Kills child processes when the parent process receives a signal to
shut down.

Solves for #594
2017-04-27 17:47:08 -07:00
Eli Uriegas
ed0081fcf7 Merge pull request #625 from zenixls2/master
based on issue #608, create access log
2017-04-27 14:31:53 -07:00
Eli Uriegas
140062f8a3 Merge pull request #662 from rsrdesarrollo/master
invariant: body after request is processed must be binary
2017-04-27 14:31:28 -07:00
Raphael Deem
75f5fa7c06 Merge pull request #666 from pyx/issue-665
Fix #665 - ImportError not preserved in __main__.py
2017-04-27 00:32:09 -07:00
Philip Xu
9152a1a266 Improved on wording 2017-04-27 03:16:38 -04:00
Philip Xu
ade89ab795 Fix #665 - ImportError not preserved in __main__.py 2017-04-26 22:57:19 -04:00
zenix
95cfdee8b8 Merge branch 'master' of https://github.com/channelcat/sanic 2017-04-26 14:50:42 +09:00
zenix
63a27cc5e2 add document on logging 2017-04-26 14:50:21 +09:00
Eli Uriegas
472face796 Add link to issue tracking sanic projects! 2017-04-25 21:50:49 -07:00
Raphael Deem
8d537a6d0b Merge pull request #663 from mmaybeno/fix_jinja_example_typo
Fix typo for jinja example and converted to dir
2017-04-25 20:29:12 -07:00
Matt Maybeno
b3101d339e Fix typo for jinja example and converted to dir 2017-04-25 20:10:46 -07:00
Raúl Sampedro
85acddddba invariant: body after request is processed must be binary 2017-04-25 12:00:27 +02:00
zenix
c9d747d97f fix merge error 2017-04-25 11:46:13 +09:00
zenix
0bba267808 Merge branch 'master' of https://github.com/channelcat/sanic 2017-04-25 11:07:40 +09:00
Raphael Deem
74cc7be922 Merge branch 'master' into master 2017-04-24 00:47:01 -07:00
zenix
0bbf826b21 fix typo 2017-04-13 16:52:40 +09:00
zenix
02d1900e2f try to fix container error 2017-04-13 16:49:36 +09:00
zenix
73da11b04c switch to use streaming for access log and error log 2017-04-13 16:26:13 +09:00
zenix
4af07e3731 change naming of default log config 2017-04-13 13:49:45 +09:00
zenix
7f60f85cd4 Merge branch 'master' of https://github.com/channelcat/sanic 2017-04-13 13:35:37 +09:00
zenix
c5f137c715 fix original code logic 2017-04-12 18:52:01 +09:00
zenix
66923bc0e3 remove unused param 2017-04-12 18:48:16 +09:00
zenix
8bf7b5a323 remove unused dependency 2017-04-12 18:45:38 +09:00
zenix
36d4d85849 change to use default python config code 2017-04-12 18:44:47 +09:00
zenix
5f0e05f3bf fix flake8 2017-04-12 18:08:06 +09:00
zenix
6fb60ae0b1 Merge branch 'master' of https://github.com/channelcat/sanic 2017-04-12 18:01:12 +09:00
zenix
f872ceb0d9 fix bug in access logging when error happens 2017-04-12 17:39:17 +09:00
zenix
bf46bcf376 Merge branch 'logging' 2017-04-11 19:03:35 +09:00
zenix
f330c3f8c5 add logging based on issue #608, add default config 2017-04-11 18:59:07 +09:00
119 changed files with 5779 additions and 1700 deletions

.gitignore (1 line changed)

@@ -14,3 +14,4 @@ settings.py
docs/_build/
docs/_api/
build/*
.DS_Store

.travis.yml

@@ -1,10 +1,25 @@
sudo: false
dist: precise
language: python
python:
- '3.5'
- '3.6'
install: pip install tox-travis
script: tox
cache:
directories:
- $HOME/.cache/pip
matrix:
include:
- env: TOX_ENV=py35
python: 3.5
- env: TOX_ENV=py35-no-ext
python: 3.5
- env: TOX_ENV=py36
python: 3.6
- env: TOX_ENV=py36-no-ext
python: 3.6
- env: TOX_ENV=flake8
python: 3.6
- env: TOX_ENV=check
python: 3.6
install: pip install -U tox
script: tox -e $TOX_ENV
deploy:
provider: pypi
user: channelcat
@@ -12,3 +27,4 @@ deploy:
secure: OgADRQH3+dTL5swGzXkeRJDNbLpFzwqYnXB4iLD0Npvzj9QnKyQVvkbaeq6VmV9dpEFb5ULaAKYQq19CrXYDm28yanUSn6jdJ4SukaHusi7xt07U6H7pmoX/uZ2WZYqCSLM8cSp8TXY/3oV3rY5Jfj/AibE5XTbim5/lrhsvW6NR+ALzxc0URRPAHDZEPpojTCjSTjpY0aDsaKWg4mXVRMFfY3O68j6KaIoukIZLuoHfePLKrbZxaPG5VxNhMHEaICdxVxE/dO+7pQmQxXuIsEOHK1QiVJ9YrSGcNqgEqhN36kYP8dqMeVB07sv8Xa6o/Uax2/wXS2HEJvuwP1YD6WkoZuo9ZB85bcMdg7BV9jJDbVFVPJwc75BnTLHrMa3Q1KrRlKRDBUXBUsQivPuWhFNwUgvEayq2qSI3aRQR4Z0O+DfboEhXYojSoD64/EWBTZ7vhgbvOTGEdukUQSYrKj9P8jc1s8exomTsAiqdFxTUpzfiammUSL+M93lP4urtahl1jjXFX7gd3DzdEEb0NsGkx5lm/qdsty8/TeAvKUmC+RVU6T856W6MqN0P+yGbpWUARcSE7fwztC3SPxwAuxvIN3BHmRhOUHoORPNG2VpfbnscIzBKJR4v0JKzbpi0IDa66K+tCGsCEvQuL4cxVOtoUySPWNSUAyUWWUrGM2k=
on:
tags: true
distributions: "sdist bdist_wheel"

CONDUCT.md

@@ -0,0 +1,74 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, gender identity and expression, level of experience,
nationality, personal appearance, race, religion, or sexual identity and
orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at sanic-maintainers@googlegroups.com. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at [http://contributor-covenant.org/version/1/4][version]
[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/


@@ -4,6 +4,11 @@ Thank you for your interest! Sanic is always looking for contributors. If you
don't feel comfortable contributing code, adding docstrings to the source files
is very appreciated.
We are committed to providing a friendly, safe and welcoming environment for all,
regardless of gender, sexual orientation, disability, ethnicity, religion,
or similar personal characteristic.
Our [code of conduct](./CONDUCT.md) sets the standards for behavior.
## Installation
To develop on sanic (and mainly to just run the tests) it is highly recommended to
@@ -29,14 +34,19 @@ See it's that simple!
## Pull requests!
So the pull request approval rules are pretty simple:
1. All pull requests must pass unit tests
2. All pull requests must be reviewed and approved by at least
one current collaborator on the project
3. All pull requests must pass flake8 checks
4. If you decide to remove/change anything from any common interface
1. All pull requests must pass unit tests.
2. All pull requests must be reviewed and approved by at least
one current collaborator on the project.
3. All pull requests must pass flake8 checks.
4. All pull requests must be consistent with the existing code.
5. If you decide to remove/change anything from any common interface
a deprecation message should accompany it.
5. If you implement a new feature you should have at least one unit
6. If you implement a new feature you should have at least one unit
test to accompany it.
7. An example must be one of the following:
* Example of how to use Sanic
* Example of how to use Sanic extensions
* Example of how to use Sanic with an asynchronous library
## Documentation


@@ -1,6 +1,6 @@
MIT License
Copyright (c) [year] [fullname]
Copyright (c) 2016-present Channel Cat
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
@@ -18,4 +18,4 @@ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
SOFTWARE.

MANIFEST.in

@@ -0,0 +1,7 @@
include README.rst
include MANIFEST.in
include LICENSE
include setup.py
recursive-exclude * __pycache__
recursive-exclude * *.py[co]


@@ -1,5 +1,5 @@
Sanic
=================================
=====
|Join the chat at https://gitter.im/sanic-python/Lobby| |Build Status| |PyPI| |PyPI version|
@@ -9,33 +9,7 @@ On top of being Flask-like, Sanic supports async request handlers. This means y
Sanic is developed `on GitHub <https://github.com/channelcat/sanic/>`_. Contributions are welcome!
Benchmarks
----------
All tests were run on an AWS medium instance running ubuntu, using 1
process. Each script delivered a small JSON response and was tested with
wrk using 100 connections. Pypy was tested for Falcon and Flask but did
not speed up requests.
+-----------+-----------------------+----------------+---------------+
| Server | Implementation | Requests/sec | Avg Latency |
+===========+=======================+================+===============+
| Sanic | Python 3.5 + uvloop | 33,342 | 2.96ms |
+-----------+-----------------------+----------------+---------------+
| Wheezy | gunicorn + meinheld | 20,244 | 4.94ms |
+-----------+-----------------------+----------------+---------------+
| Falcon | gunicorn + meinheld | 18,972 | 5.27ms |
+-----------+-----------------------+----------------+---------------+
| Bottle | gunicorn + meinheld | 13,596 | 7.36ms |
+-----------+-----------------------+----------------+---------------+
| Flask | gunicorn + meinheld | 4,988 | 20.08ms |
+-----------+-----------------------+----------------+---------------+
| Kyoukai | Python 3.5 + uvloop | 3,889 | 27.44ms |
+-----------+-----------------------+----------------+---------------+
| Aiohttp | Python 3.5 + uvloop | 2,979 | 33.42ms |
+-----------+-----------------------+----------------+---------------+
| Tornado | Python 3.5 | 2,138 | 46.66ms |
+-----------+-----------------------+----------------+---------------+
If you have a project that utilizes Sanic make sure to comment on the `issue <https://github.com/channelcat/sanic/issues/396>`_ that we use to track those projects!
Hello World Example
-------------------
@@ -57,13 +31,13 @@ Hello World Example
Installation
------------
- ``python -m pip install sanic``
- ``pip install sanic``
To install sanic without uvloop or json using bash, you can provide either or both of these environment variables
using any truthy string like `'y', 'yes', 't', 'true', 'on', '1'`; setting the NO_X variable to true will stop that feature's
installation.
- ``SANIC_NO_UVLOOP=true SANIC_NO_UJSON=true python -m pip install sanic``
- ``SANIC_NO_UVLOOP=true SANIC_NO_UJSON=true pip install sanic``
Documentation
@@ -81,11 +55,21 @@ Documentation
:target: https://pypi.python.org/pypi/sanic/
.. |PyPI version| image:: https://img.shields.io/pypi/pyversions/sanic.svg
:target: https://pypi.python.org/pypi/sanic/
Examples
--------
`Non-Core examples <https://github.com/channelcat/sanic/wiki/Examples/>`_. Examples of plugins and Sanic that are outside the scope of Sanic core.
`Extensions <https://github.com/channelcat/sanic/wiki/Extensions/>`_. Sanic extensions created by the community.
`Projects <https://github.com/channelcat/sanic/wiki/Projects/>`_. Sanic in production use.
TODO
----
* Streamed file processing
* http2
* http2
Limitations
-----------
* No wheels for uvloop and httptools on Windows :(


@@ -1,20 +1,225 @@
# Minimal makefile for Sphinx documentation
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SPHINXPROJ = Sanic
SOURCEDIR = .
PAPER =
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " applehelp to make an Apple Help Book"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " epub3 to make an epub3"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " coverage to run coverage check of the documentation (if enabled)"
@echo " dummy to check syntax errors of document sources"
.PHONY: help Makefile
.PHONY: clean
clean:
rm -rf $(BUILDDIR)/*
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: html
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
.PHONY: dirhtml
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
.PHONY: singlehtml
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
.PHONY: pickle
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
.PHONY: json
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
.PHONY: htmlhelp
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
.PHONY: qthelp
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/aiographite.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/aiographite.qhc"
.PHONY: applehelp
applehelp:
$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
@echo
@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
@echo "N.B. You won't be able to view it unless you put it in" \
"~/Library/Documentation/Help or install it in your application" \
"bundle."
.PHONY: devhelp
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/aiographite"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/aiographite"
@echo "# devhelp"
.PHONY: epub
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
.PHONY: epub3
epub3:
$(SPHINXBUILD) -b epub3 $(ALLSPHINXOPTS) $(BUILDDIR)/epub3
@echo
@echo "Build finished. The epub3 file is in $(BUILDDIR)/epub3."
.PHONY: latex
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
.PHONY: latexpdf
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: latexpdfja
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: text
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
.PHONY: man
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
.PHONY: texinfo
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
.PHONY: info
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
.PHONY: gettext
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
.PHONY: changes
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
.PHONY: linkcheck
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
.PHONY: doctest
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
.PHONY: coverage
coverage:
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
@echo "Testing of coverage in the sources finished, look at the " \
"results in $(BUILDDIR)/coverage/python.txt."
.PHONY: xml
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
.PHONY: pseudoxml
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
.PHONY: dummy
dummy:
$(SPHINXBUILD) -b dummy $(ALLSPHINXOPTS) $(BUILDDIR)/dummy
@echo
@echo "Build finished. Dummy builder generates no files."


@@ -13,6 +13,9 @@ import sys
# Add support for Markdown documentation using Recommonmark
from recommonmark.parser import CommonMarkParser
# Add support for auto-doc
from recommonmark.transform import AutoStructify
# Ensure that sanic is present in the path, to allow sphinx-apidoc to
# autogenerate documentation from docstrings
root_directory = os.path.dirname(os.getcwd())
@@ -22,7 +25,7 @@ import sanic
# -- General configuration ------------------------------------------------
extensions = ['sphinx.ext.autodoc']
extensions = ['sphinx.ext.autodoc', 'sphinxcontrib.asyncio']
templates_path = ['_templates']
@@ -140,3 +143,12 @@ epub_exclude_files = ['search.html']
# -- Custom Settings -------------------------------------------------------
suppress_warnings = ['image.nonlocal_uri']
# app setup hook
def setup(app):
app.add_config_value('recommonmark_config', {
'enable_eval_rst': True,
'enable_auto_doc_ref': True,
}, True)
app.add_transform(AutoStructify)


@@ -16,14 +16,17 @@ Guides
sanic/blueprints
sanic/config
sanic/cookies
sanic/decorators
sanic/streaming
sanic/class_based_views
sanic/custom_protocol
sanic/ssl
sanic/logging
sanic/testing
sanic/deploying
sanic/extensions
sanic/contributing
sanic/api_reference
Module Documentation
@@ -32,4 +35,5 @@ Module Documentation
.. toctree::
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`


@@ -1,19 +1,64 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
set SPHINXPROJ=Sanic
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
set I18NSPHINXOPTS=%SPHINXOPTS% .
if NOT "%PAPER%" == "" (
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
)
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if "%1" == "help" (
:help
echo.Please use `make ^<target^>` where ^<target^> is one of
echo. html to make standalone HTML files
echo. dirhtml to make HTML files named index.html in directories
echo. singlehtml to make a single large HTML file
echo. pickle to make pickle files
echo. json to make JSON files
echo. htmlhelp to make HTML files and a HTML help project
echo. qthelp to make HTML files and a qthelp project
echo. devhelp to make HTML files and a Devhelp project
echo. epub to make an epub
echo. epub3 to make an epub3
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
echo. text to make text files
echo. man to make manual pages
echo. texinfo to make Texinfo files
echo. gettext to make PO message catalogs
echo. changes to make an overview over all changed/added/deprecated items
echo. xml to make Docutils-native XML files
echo. pseudoxml to make pseudoxml-XML files for display purposes
echo. linkcheck to check all external links for integrity
echo. doctest to run all doctests embedded in the documentation if enabled
echo. coverage to run coverage check of the documentation if enabled
echo. dummy to check syntax errors of document sources
goto end
)
if "%1" == "clean" (
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
del /q /s %BUILDDIR%\*
goto end
)
REM Check if sphinx-build is available and fallback to Python version if any
%SPHINXBUILD% 1>NUL 2>NUL
if errorlevel 9009 goto sphinx_python
goto sphinx_ok
:sphinx_python
set SPHINXBUILD=python -m sphinx.__init__
%SPHINXBUILD% 2> nul
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
@@ -26,11 +71,211 @@ if errorlevel 9009 (
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
goto end
:sphinx_ok
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
if "%1" == "html" (
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
goto end
)
if "%1" == "dirhtml" (
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
goto end
)
if "%1" == "singlehtml" (
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
goto end
)
if "%1" == "pickle" (
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the pickle files.
goto end
)
if "%1" == "json" (
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the JSON files.
goto end
)
if "%1" == "htmlhelp" (
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
goto end
)
if "%1" == "qthelp" (
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\aiographite.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\aiographite.ghc
goto end
)
if "%1" == "devhelp" (
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished.
goto end
)
if "%1" == "epub" (
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub file is in %BUILDDIR%/epub.
goto end
)
if "%1" == "epub3" (
%SPHINXBUILD% -b epub3 %ALLSPHINXOPTS% %BUILDDIR%/epub3
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub3 file is in %BUILDDIR%/epub3.
goto end
)
if "%1" == "latex" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
if errorlevel 1 exit /b 1
echo.
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdf" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf
cd %~dp0
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdfja" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf-ja
cd %~dp0
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "text" (
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The text files are in %BUILDDIR%/text.
goto end
)
if "%1" == "man" (
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The manual pages are in %BUILDDIR%/man.
goto end
)
if "%1" == "texinfo" (
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
goto end
)
if "%1" == "gettext" (
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
goto end
)
if "%1" == "changes" (
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
if errorlevel 1 exit /b 1
echo.
echo.The overview file is in %BUILDDIR%/changes.
goto end
)
if "%1" == "linkcheck" (
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
if errorlevel 1 exit /b 1
echo.
echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
goto end
)
if "%1" == "doctest" (
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
if errorlevel 1 exit /b 1
echo.
echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
goto end
)
if "%1" == "coverage" (
%SPHINXBUILD% -b coverage %ALLSPHINXOPTS% %BUILDDIR%/coverage
if errorlevel 1 exit /b 1
echo.
echo.Testing of coverage in the sources finished, look at the ^
results in %BUILDDIR%/coverage/python.txt.
goto end
)
if "%1" == "xml" (
%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The XML files are in %BUILDDIR%/xml.
goto end
)
if "%1" == "pseudoxml" (
%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
goto end
)
if "%1" == "dummy" (
%SPHINXBUILD% -b dummy %ALLSPHINXOPTS% %BUILDDIR%/dummy
if errorlevel 1 exit /b 1
echo.
echo.Build finished. Dummy builder generates no files.
goto end
)
:end
popd


@@ -0,0 +1,150 @@
API Reference
=============
Submodules
----------
sanic.app module
----------------
.. automodule:: sanic.app
:members:
:undoc-members:
:show-inheritance:
sanic.blueprints module
-----------------------
.. automodule:: sanic.blueprints
:members:
:undoc-members:
:show-inheritance:
sanic.config module
-------------------
.. automodule:: sanic.config
:members:
:undoc-members:
:show-inheritance:
sanic.constants module
----------------------
.. automodule:: sanic.constants
:members:
:undoc-members:
:show-inheritance:
sanic.cookies module
--------------------
.. automodule:: sanic.cookies
:members:
:undoc-members:
:show-inheritance:
sanic.exceptions module
-----------------------
.. automodule:: sanic.exceptions
:members:
:undoc-members:
:show-inheritance:
sanic.handlers module
---------------------
.. automodule:: sanic.handlers
:members:
:undoc-members:
:show-inheritance:
sanic.log module
----------------
.. automodule:: sanic.log
:members:
:undoc-members:
:show-inheritance:
sanic.request module
--------------------
.. automodule:: sanic.request
:members:
:undoc-members:
:show-inheritance:
sanic.response module
---------------------
.. automodule:: sanic.response
:members:
:undoc-members:
:show-inheritance:
sanic.router module
-------------------
.. automodule:: sanic.router
:members:
:undoc-members:
:show-inheritance:
sanic.server module
-------------------
.. automodule:: sanic.server
:members:
:undoc-members:
:show-inheritance:
sanic.static module
-------------------
.. automodule:: sanic.static
:members:
:undoc-members:
:show-inheritance:
sanic.testing module
--------------------
.. automodule:: sanic.testing
:members:
:undoc-members:
:show-inheritance:
sanic.views module
------------------
.. automodule:: sanic.views
:members:
:undoc-members:
:show-inheritance:
sanic.websocket module
----------------------
.. automodule:: sanic.websocket
:members:
:undoc-members:
:show-inheritance:
sanic.worker module
-------------------
.. automodule:: sanic.worker
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: sanic
:members:
:undoc-members:
:show-inheritance:


@@ -93,7 +93,14 @@ def ignore_404s(request, exception):
Static files can be served globally, under the blueprint prefix.
```python
bp.static('/folder/to/serve', '/web/path')
# suppose bp.name == 'bp'
bp.static('/web/path', '/folder/to/serve')
# you can also pass a name parameter to it for url_for
bp.static('/web/path', '/folder/to/serve', name='uploads')
app.url_for('static', name='bp.uploads', filename='file.txt') == '/bp/web/path/file.txt'
```
## Start and stop
@@ -169,10 +176,10 @@ app.run(host='0.0.0.0', port=8000, debug=True)
If you wish to generate a URL for a route inside of a blueprint, remember that the endpoint name
takes the format `<blueprint_name>.<handler_name>`. For example:
```
```python
@blueprint_v1.route('/')
async def root(request):
url = app.url_for('v1.post_handler', post_id=5) # --> '/v1/post/5'
url = request.app.url_for('v1.post_handler', post_id=5) # --> '/v1/post/5'
return redirect(url)


@@ -1,6 +1,6 @@
# Configuration
Any reasonably complex application will need configuration that is not baked into the acutal code. Settings might be different for different environments or installations.
Any reasonably complex application will need configuration that is not baked into the actual code. Settings might be different for different environments or installations.
## Basics
@@ -29,12 +29,18 @@ In general the convention is to only have UPPERCASE configuration parameters. Th
There are several ways how to load configuration.
### From environment variables.
### From Environment Variables
Any variables defined with the `SANIC_` prefix will be applied to the sanic config. For example, setting `SANIC_REQUEST_TIMEOUT` will be loaded by the application automatically. You can pass the `load_vars` boolean to the Sanic constructor to override that:
Any variables defined with the `SANIC_` prefix will be applied to the sanic config. For example, setting `SANIC_REQUEST_TIMEOUT` will be loaded by the application automatically and fed into the `REQUEST_TIMEOUT` config variable. You can pass a different prefix to Sanic:
```python
app = Sanic(load_vars=False)
app = Sanic(load_env='MYAPP_')
```
Then the above variable would be `MYAPP_REQUEST_TIMEOUT`. If you want to disable loading from environment variables you can set it to `False` instead:
```python
app = Sanic(load_env=False)
```
### From an Object
@@ -52,7 +58,7 @@ You could use a class or any other object as well.
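A minimal sketch of this, assuming Sanic's `from_object` config helper; the `MyConfig` class and its values here are purely illustrative:
```python
from sanic import Sanic

class MyConfig:
    # only UPPERCASE attributes are picked up
    DB_NAME = 'appdb'
    DB_USER = 'appuser'

app = Sanic(__name__)
app.config.from_object(MyConfig)  # copies the uppercase attributes onto app.config
```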
### From a File
Usually you will want to load configuration from a file that is not part of the distributed application. You can load configuration from a file using `from_file(/path/to/config_file)`. However, that requires the program to know the path to the config file. So instead you can specify the location of the config file in an environment variable and tell Sanic to use that to find the config file:
Usually you will want to load configuration from a file that is not part of the distributed application. You can load configuration from a file using `from_pyfile(/path/to/config_file)`. However, that requires the program to know the path to the config file. So instead you can specify the location of the config file in an environment variable and tell Sanic to use that to find the config file:
```
app = Sanic('myapp')
@@ -66,7 +72,7 @@ $ MYAPP_SETTINGS=/path/to/config_file python3 myapp.py
INFO: Goin' Fast @ http://0.0.0.0:8000
```
The config files are regular Python files which are executed in order to load them. This allows you to use arbitrary logic for constructing the right configuration. Only uppercase varibales are added to the configuration. Most commonly the configuration consists of simple key value pairs:
The config files are regular Python files which are executed in order to load them. This allows you to use arbitrary logic for constructing the right configuration. Only uppercase variables are added to the configuration. Most commonly the configuration consists of simple key value pairs:
```
# config_file
@@ -79,8 +85,35 @@ DB_USER = 'appuser'
Out of the box there are just a few predefined values which can be overwritten when creating the application.
| Variable | Default | Description |
| ----------------- | --------- | --------------------------------- |
| REQUEST_MAX_SIZE | 100000000 | How big a request may be (bytes) |
| REQUEST_TIMEOUT | 60 | How long a request can take (sec) |
| KEEP_ALIVE | True | Disables keep-alive when False |
| Variable | Default | Description |
| ------------------ | --------- | --------------------------------------------- |
| REQUEST_MAX_SIZE | 100000000 | How big a request may be (bytes) |
| REQUEST_TIMEOUT | 60 | How long a request can take to arrive (sec) |
| RESPONSE_TIMEOUT | 60 | How long a response can take to process (sec) |
| KEEP_ALIVE | True | Disables keep-alive when False |
| KEEP_ALIVE_TIMEOUT | 5 | How long to hold a TCP connection open (sec) |
### The different Timeout variables:
A request timeout measures the duration of time between the instant when a new open TCP connection is passed to the Sanic backend server, and the instant when the whole HTTP request is received. If the time taken exceeds the `REQUEST_TIMEOUT` value (in seconds), this is considered a Client Error, so Sanic generates an HTTP 408 response and sends that to the client. Adjust this value higher if your clients routinely pass very large request payloads or upload requests very slowly.
A response timeout measures the duration of time between the instant the Sanic server passes the HTTP request to the Sanic App, and the instant an HTTP response is sent to the client. If the time taken exceeds the `RESPONSE_TIMEOUT` value (in seconds), this is considered a Server Error, so Sanic generates an HTTP 503 response and sends that to the client. Adjust this value higher if your application is likely to have long-running processes that delay the generation of a response.
### What is Keep Alive? And what does the Keep Alive Timeout value do?
Keep-Alive is an HTTP feature introduced in HTTP 1.1. When sending an HTTP request, the client (usually a web browser application) can set a Keep-Alive header to indicate to the http server (Sanic) not to close the TCP connection after it has sent the response. This allows the client to reuse the existing TCP connection to send subsequent HTTP requests, and ensures more efficient network traffic for both the client and the server.
The `KEEP_ALIVE` config variable is set to `True` in Sanic by default. If you don't need this feature in your application, set it to `False` to cause all client connections to close immediately after a response is sent, regardless of the Keep-Alive header on the request.
The amount of time the server holds the TCP connection open is decided by the server itself. In Sanic, that value is configured using the `KEEP_ALIVE_TIMEOUT` value. By default, it is set to 5 seconds; this is the same default setting as the Apache HTTP server and is a good balance between allowing enough time for the client to send a new request and not holding open too many connections at once. Do not exceed 75 seconds unless you know your clients are using a browser which supports TCP connections held open for that long.
For reference:
```
Apache httpd server default keepalive timeout = 5 seconds
Nginx server default keepalive timeout = 75 seconds
Nginx performance tuning guidelines uses keepalive = 15 seconds
IE (5-9) client hard keepalive limit = 60 seconds
Firefox client hard keepalive limit = 115 seconds
Opera 11 client hard keepalive limit = 120 seconds
Chrome 13+ client keepalive limit > 300+ seconds
```
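A minimal sketch of overriding these values on the application config, assuming the keys from the table above; the numbers are illustrative only:
```python
from sanic import Sanic

app = Sanic(__name__)

# allow slow clients more time to deliver the whole request (seconds)
app.config.REQUEST_TIMEOUT = 120
# give long-running handlers more time before a 503 is generated (seconds)
app.config.RESPONSE_TIMEOUT = 120
# hold idle keep-alive connections open a little longer (seconds)
app.config.KEEP_ALIVE_TIMEOUT = 15
```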


@@ -1,75 +0,0 @@
# Cookies
Cookies are pieces of data which persist inside a user's browser. Sanic can
both read and write cookies, which are stored as key-value pairs.
## Reading cookies
A user's cookies can be accessed via the `Request` object's `cookies` dictionary.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
test_cookie = request.cookies.get('test')
return text("Test cookie set to: {}".format(test_cookie))
```
## Writing cookies
When returning a response, cookies can be set on the `Response` object.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("There's a cookie up in this response")
response.cookies['test'] = 'It worked!'
response.cookies['test']['domain'] = '.gotta-go-fast.com'
response.cookies['test']['httponly'] = True
return response
```
## Deleting cookies
Cookies can be removed semantically or explicitly.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("Time to eat some cookies muahaha")
# This cookie will be set to expire in 0 seconds
del response.cookies['kill_me']
# This cookie will self destruct in 5 seconds
response.cookies['short_life'] = 'Glad to be here'
response.cookies['short_life']['max-age'] = 5
del response.cookies['favorite_color']
# This cookie will remain unchanged
response.cookies['favorite_color'] = 'blue'
response.cookies['favorite_color'] = 'pink'
del response.cookies['favorite_color']
return response
```
Response cookies can be set like dictionary values and have the following
parameters available:
- `expires` (datetime): The time for the cookie to expire on the
client's browser.
- `path` (string): The subset of URLs to which this cookie applies. Defaults to /.
- `comment` (string): A comment (metadata).
- `domain` (string): Specifies the domain for which the cookie is valid. An
explicitly specified domain must always start with a dot.
- `max-age` (number): Number of seconds the cookie should live for.
- `secure` (boolean): Specifies whether the cookie will only be sent via
HTTPS.
- `httponly` (boolean): Specifies whether the cookie cannot be read by
Javascript.

docs/sanic/cookies.rst

@@ -0,0 +1,87 @@
Cookies
=======
Cookies are pieces of data which persist inside a user's browser. Sanic can
both read and write cookies, which are stored as key-value pairs.
.. warning::
Cookies can be freely altered by the client. Therefore you cannot just store
data such as login information in cookies as-is, as they can be freely altered
by the client. To ensure data you store in cookies is not forged or tampered
with by the client, use something like `itsdangerous`_ to cryptographically
sign the data.
Reading cookies
---------------
A user's cookies can be accessed via the ``Request`` object's ``cookies`` dictionary.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
test_cookie = request.cookies.get('test')
return text("Test cookie set to: {}".format(test_cookie))
Writing cookies
---------------
When returning a response, cookies can be set on the ``Response`` object.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("There's a cookie up in this response")
response.cookies['test'] = 'It worked!'
response.cookies['test']['domain'] = '.gotta-go-fast.com'
response.cookies['test']['httponly'] = True
return response
Deleting cookies
----------------
Cookies can be removed semantically or explicitly.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("Time to eat some cookies muahaha")
# This cookie will be set to expire in 0 seconds
del response.cookies['kill_me']
# This cookie will self destruct in 5 seconds
response.cookies['short_life'] = 'Glad to be here'
response.cookies['short_life']['max-age'] = 5
del response.cookies['favorite_color']
# This cookie will remain unchanged
response.cookies['favorite_color'] = 'blue'
response.cookies['favorite_color'] = 'pink'
del response.cookies['favorite_color']
return response
Response cookies can be set like dictionary values and have the following
parameters available:
- ``expires`` (datetime): The time for the cookie to expire on the client's browser.
- ``path`` (string): The subset of URLs to which this cookie applies. Defaults to /.
- ``comment`` (string): A comment (metadata).
- ``domain`` (string): Specifies the domain for which the cookie is valid. An
explicitly specified domain must always start with a dot.
- ``max-age`` (number): Number of seconds the cookie should live for.
- ``secure`` (boolean): Specifies whether the cookie will only be sent via HTTPS.
- ``httponly`` (boolean): Specifies whether the cookie cannot be read by Javascript.
.. _itsdangerous: https://pythonhosted.org/itsdangerous/


@@ -54,5 +54,25 @@ In order to run Sanic application with Gunicorn, you need to use the special `sa
for Gunicorn `worker-class` argument:
```
gunicorn --bind 0.0.0.0:1337 --worker-class sanic.worker.GunicornWorker
gunicorn myapp:app --bind 0.0.0.0:1337 --worker-class sanic.worker.GunicornWorker
```
If your application suffers from memory leaks, you can configure Gunicorn to gracefully restart a worker
after it has processed a given number of requests. This can be a convenient way to help limit the effects
of the memory leak.
See the [Gunicorn Docs](http://docs.gunicorn.org/en/latest/settings.html#max-requests) for more information.
## Asynchronous support
This is suitable if you *need* to share the sanic process with other applications, in particular the `loop`.
However, be advised that this method does not support using multiple processes, and is not the preferred way
to run the app in general.
Here is an incomplete example (please see `run_async.py` in examples for something more practical):
```python
server = app.create_server(host="0.0.0.0", port=8000)
loop = asyncio.get_event_loop()
task = asyncio.ensure_future(server)
loop.run_forever()
```
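A slightly fuller, hedged sketch under the same assumptions (single process, shared loop); the route and handler are only for illustration:
```python
import asyncio

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/')
async def index(request):
    return text('OK')

# create_server returns a coroutine; schedule it on the shared loop
server = app.create_server(host='0.0.0.0', port=8000)
loop = asyncio.get_event_loop()
asyncio.ensure_future(server)

try:
    loop.run_forever()
except KeyboardInterrupt:
    loop.stop()
```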


@@ -17,6 +17,19 @@ def i_am_ready_to_die(request):
raise ServerError("Something bad happened", status_code=500)
```
You can also use the `abort` function with the appropriate status code:
```python
from sanic.exceptions import abort
from sanic.response import text
@app.route('/youshallnotpass')
def no_no(request):
abort(401)
# this won't happen
text("OK")
```
## Handling exceptions
To override Sanic's default handling of an exception, the `@app.exception`


@@ -1,12 +1,13 @@
# Extensions
A list of Sanic extensions created by the community.
- [Sanic-Plugins-Framework](https://github.com/ashleysommer/sanicpluginsframework): Library for easily creating and using Sanic plugins.
- [Sessions](https://github.com/subyraman/sanic_session): Support for sessions.
Allows using redis, memcache or an in memory store.
- [CORS](https://github.com/ashleysommer/sanic-cors): A port of flask-cors.
- [Compress](https://github.com/subyraman/sanic_compress): Allows you to easily gzip Sanic responses. A port of Flask-Compress.
- [Jinja2](https://github.com/lixxu/sanic-jinja2): Support for Jinja2 template.
- [JWT](https://github.com/ahopkins/sanic-jwt): Authentication extension for JSON Web Tokens (JWT).
- [OpenAPI/Swagger](https://github.com/channelcat/sanic-openapi): OpenAPI support, plus a Swagger UI.
- [Pagination](https://github.com/lixxu/python-paginate): Simple pagination support.
- [Motor](https://github.com/lixxu/sanic-motor): Simple motor wrapper.
@@ -22,3 +23,6 @@ A list of Sanic extensions created by the community.
- [sanic-graphql](https://github.com/graphql-python/sanic-graphql): GraphQL integration with Sanic
- [sanic-prometheus](https://github.com/dkruchinin/sanic-prometheus): Prometheus metrics for Sanic
- [Sanic-RestPlus](https://github.com/ashleysommer/sanic-restplus): A port of Flask-RestPlus for Sanic. Full-featured REST API with SwaggerUI generation.
- [sanic-transmute](https://github.com/yunstanford/sanic-transmute): A Sanic extension that generates APIs from python function and classes, and also generates Swagger UI/documentation automatically.
- [pytest-sanic](https://github.com/yunstanford/pytest-sanic): A pytest plugin for Sanic. It helps you to test your code asynchronously.
- [jinja2-sanic](https://github.com/yunstanford/jinja2-sanic): a jinja2 template renderer for Sanic.([Documentation](http://jinja2-sanic.readthedocs.io/en/latest/))


@@ -9,15 +9,16 @@ syntax, so earlier versions of python won't work.
```python
from sanic import Sanic
from sanic.response import text
from sanic.response import json
app = Sanic(__name__)
app = Sanic()
@app.route("/")
async def test(request):
return text('Hello world!')
return json({"hello": "world"})
app.run(host="0.0.0.0", port=8000, debug=True)
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)
```
3. Run the server: `python3 main.py`

docs/sanic/logging.md

@@ -0,0 +1,85 @@
# Logging
Sanic allows you to do different types of logging (access log, error log) on the requests based on the [python3 logging API](https://docs.python.org/3/howto/logging.html). You should have some basic knowledge of python3 logging if you want to create a new configuration.
### Quick Start
A simple example using default settings would be like this:
```python
from sanic import Sanic, response
app = Sanic('test')
@app.route('/')
async def test(request):
return response.text('Hello World!')
if __name__ == "__main__":
app.run(debug=True, access_log=True)
```
To use your own logging config, simply use `logging.config.dictConfig`, or
pass `log_config` when you initialize `Sanic` app:
```python
app = Sanic('test', log_config=LOGGING_CONFIG)
```
And to disable this logging, simply set access_log=False:
```python
if __name__ == "__main__":
app.run(access_log=False)
```
This would skip calling logging functions when handling requests.
And you can go even further in production to gain extra speed:
```python
if __name__ == "__main__":
# disable debug messages
app.run(debug=False, access_log=False)
```
### Configuration
By default, the log_config parameter is set to use the sanic.log.LOGGING_CONFIG_DEFAULTS dictionary for configuration.
There are three `loggers` used in sanic, and they **must be defined if you want to create your own logging configuration**:
- root:<br>
Used to log internal messages.
- sanic.error:<br>
Used to log error logs.
- sanic.access:<br>
Used to log access logs.
#### Log format:
In addition to the default parameters provided by python (asctime, levelname, message),
Sanic provides additional parameters for the access logger:
- host (str)<br>
request.ip
- request (str)<br>
request.method + " " + request.url
- status (int)<br>
response.status
- byte (int)<br>
len(response.body)
The default access log format is
```python
%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: %(request)s %(message)s %(status)d %(byte)d
```
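A minimal sketch of a custom configuration that defines the three loggers listed above and is passed in via `log_config`; the handler names and levels here are assumptions, not Sanic's defaults:
```python
from sanic import Sanic

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'simple': {
            'format': '%(asctime)s - (%(name)s)[%(levelname)s]: %(message)s',
        },
        'access': {
            'format': '%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: '
                      '%(request)s %(message)s %(status)d %(byte)d',
        },
    },
    'handlers': {
        'console': {'class': 'logging.StreamHandler', 'formatter': 'simple'},
        'access_console': {'class': 'logging.StreamHandler', 'formatter': 'access'},
    },
    'root': {'level': 'INFO', 'handlers': ['console']},
    'loggers': {
        'sanic.error': {'level': 'INFO', 'handlers': ['console'], 'propagate': False},
        'sanic.access': {'level': 'INFO', 'handlers': ['access_console'], 'propagate': False},
    },
}

app = Sanic('test', log_config=LOGGING)
```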


@@ -4,7 +4,7 @@ Middleware are functions which are executed before or after requests to the
server. They can be used to modify the *request to* or *response from*
user-defined handler functions.
Additionally, Sanic providers listeners which allow you to run code at various points of your application's lifecycle.
Additionally, Sanic provides listeners which allow you to run code at various points of your application's lifecycle.
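For instance, a minimal listener sketch, assuming the `@app.listener` decorator and the `'before_server_start'` event name used elsewhere in these docs:
```python
from sanic import Sanic

app = Sanic(__name__)

@app.listener('before_server_start')
async def notify(app, loop):
    # runs on the server's loop before it starts accepting connections
    print('About to start serving')
```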
## Middleware


@@ -71,8 +71,14 @@ The following variables are accessible as properties on `Request` objects:
return text("You are trying to create a user with the following POST: %s" % request.body)
```
- `headers` (dict) - A case-insensitive dictionary that contains the request headers.
- `ip` (str) - IP address of the requester.
- `port` (str) - Port address of the requester.
- `socket` (tuple) - (IP, port) of the requester.
- `app` - a reference to the Sanic application object that is handling this request. This is useful when inside blueprints or other handlers in modules that do not have access to the global `app` object.
```python
@@ -94,6 +100,8 @@ The following variables are accessible as properties on `Request` objects:
- `host`: The host associated with the request: `localhost:8080`
- `path`: The path of the request: `/posts/1/`
- `query_string`: The query string of the request: `foo=bar` or a blank string `''`
- `uri_template`: Template for matching route handler: `/posts/<id>/`
- `token`: The value of Authorization header: `Basic YWRtaW46YWRtaW4=`
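A small, hedged sketch showing several of the properties above in one handler; the route path is arbitrary:
```python
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route('/info')
async def info(request):
    return json({
        'ip': request.ip,                      # IP address of the requester
        'socket': request.socket,              # (IP, port) of the requester
        'uri_template': request.uri_template,  # e.g. '/info'
        'token': request.token,                # value of the Authorization header
    })
```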
## Accessing values using `get` and `getlist`


@@ -60,6 +60,16 @@ async def index(request):
return response.stream(streaming_fn, content_type='text/plain')
```
## File Streaming
For large files, a combination of File and Streaming above
```python
from sanic import response
@app.route('/big_file.png')
async def handle_request(request):
return await response.file_stream('/srv/www/whatever.png')
```
## Redirect
```python


@@ -52,7 +52,7 @@ async def integer_handler(request, integer_arg):
async def number_handler(request, number_arg):
return text('Number - {}'.format(number_arg))
@app.route('/person/<name:[A-z]>')
@app.route('/person/<name:[A-z]+>')
async def person_handler(request, name):
return text('Person - {}'.format(name))
@@ -215,3 +215,120 @@ and `recv` methods to send and receive data respectively.
WebSocket support requires the [websockets](https://github.com/aaugustin/websockets)
package by Aymeric Augustin.
## About `strict_slashes`
You can make `routes` strict about trailing slashes or not; it's configurable.
```python
# provide default strict_slashes value for all routes
app = Sanic('test_route_strict_slash', strict_slashes=True)
# you can also overwrite strict_slashes value for specific route
@app.get('/get', strict_slashes=False)
def handler(request):
return text('OK')
# It also works for blueprints
bp = Blueprint('test_bp_strict_slash', strict_slashes=True)
@bp.get('/bp/get', strict_slashes=False)
def handler(request):
return text('OK')
app.blueprint(bp)
```
## User defined route name
You can pass `name` to change the route name to avoid using the default name (`handler.__name__`).
```python
app = Sanic('test_named_route')
@app.get('/get', name='get_handler')
def handler(request):
return text('OK')
# then you need to use `app.url_for('get_handler')`
# instead of `app.url_for('handler')`
# It also works for blueprints
bp = Blueprint('test_named_bp')
@bp.get('/bp/get', name='get_handler')
def handler(request):
return text('OK')
app.blueprint(bp)
# then you need to use `app.url_for('test_named_bp.get_handler')`
# instead of `app.url_for('test_named_bp.handler')`
# different names can be used for same url with different methods
@app.get('/test', name='route_test')
def handler(request):
return text('OK')
@app.post('/test', name='route_post')
def handler2(request):
return text('OK POST')
@app.put('/test', name='route_put')
def handler3(request):
return text('OK PUT')
# the urls below are the same, you can use any of them
# '/test'
app.url_for('route_test')
# app.url_for('route_post')
# app.url_for('route_put')
# for the same handler name with different methods
# you need to specify the name (it's a url_for limitation)
@app.get('/get')
def handler(request):
return text('OK')
@app.post('/post', name='post_handler')
def handler(request):
return text('OK')
# then
# app.url_for('handler') == '/get'
# app.url_for('post_handler') == '/post'
```
## Build URL for static files
You can now use `url_for` to build urls for static files.
If the route serves a file directly, `filename` can be ignored.
```python
app = Sanic('test_static')
app.static('/static', './static')
app.static('/uploads', './uploads', name='uploads')
app.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
bp = Blueprint('bp', url_prefix='bp')
bp.static('/static', './static')
bp.static('/uploads', './uploads', name='uploads')
bp.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
app.blueprint(bp)
# then build the url
app.url_for('static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='uploads', filename='file.txt') == '/uploads/file.txt'
app.url_for('static', name='best_png') == '/the_best.png'
# blueprint url building
app.url_for('static', name='bp.static', filename='file.txt') == '/bp/static/file.txt'
app.url_for('static', name='bp.uploads', filename='file.txt') == '/bp/uploads/file.txt'
app.url_for('static', name='bp.best_png') == '/bp/static/the_best.png'
```


@@ -6,16 +6,40 @@ filename. The file specified will then be accessible via the given endpoint.
```python
from sanic import Sanic
from sanic.blueprints import Blueprint
app = Sanic(__name__)
# Serves files from the static folder to the URL /static
app.static('/static', './static')
# use url_for to build the url, name defaults to 'static' and can be ignored
app.url_for('static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='static', filename='file.txt') == '/static/file.txt'
# Serves the file /home/ubuntu/test.png when the URL /the_best.png
# is requested
app.static('/the_best.png', '/home/ubuntu/test.png')
app.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
# you can use url_for to build the static file url
# you can ignore the name and filename parameters if you don't define them
app.url_for('static', name='best_png') == '/the_best.png'
app.url_for('static', name='best_png', filename='any') == '/the_best.png'
# you need to define the name for other static files
app.static('/another.png', '/home/ubuntu/another.png', name='another')
app.url_for('static', name='another') == '/another.png'
app.url_for('static', name='another', filename='any') == '/another.png'
# also, you can use static for blueprint
bp = Blueprint('bp', url_prefix='/bp')
bp.static('/static', './static')
# serves the file directly
bp.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
app.blueprint(bp)
app.url_for('static', name='bp.static', filename='file.txt') == '/bp/static/file.txt'
app.url_for('static', name='bp.best_png') == '/bp/test_best.png'
app.run(host="0.0.0.0", port=8000)
```
Note: currently you cannot build a URL for a static file using `url_for`.


@@ -1,5 +1,79 @@
# Streaming
## Request Streaming
Sanic allows you to get request data by stream, as shown below. When the request ends, `request.stream.get()` returns `None`. Only the post, put and patch decorators have a stream argument.
```python
from sanic import Sanic
from sanic.views import CompositionView
from sanic.views import HTTPMethodView
from sanic.views import stream as stream_decorator
from sanic.blueprints import Blueprint
from sanic.response import stream, text
bp = Blueprint('blueprint_request_stream')
app = Sanic('request_stream')
class SimpleView(HTTPMethodView):
@stream_decorator
async def post(self, request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
@app.post('/stream', stream=True)
async def handler(request):
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
body = body.decode('utf-8').replace('1', 'A')
response.write(body)
return stream(streaming)
@bp.put('/bp_stream', stream=True)
async def bp_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
async def post_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
app.blueprint(bp)
app.add_route(SimpleView.as_view(), '/method_view')
view = CompositionView()
view.add(['POST'], post_handler, stream=True)
app.add_route(view, '/composition_view')
if __name__ == '__main__':
app.run(host='127.0.0.1', port=8000)
```
## Response Streaming
Sanic allows you to stream content to the client with the `stream` method. This method accepts a coroutine callback which is passed a `StreamingHTTPResponse` object that is written to. A simple example follows:
```python
@@ -29,4 +103,4 @@ async def index(request):
response.write(record[0])
return stream(stream_from_db)
```
```


@@ -59,15 +59,69 @@ the available arguments to aiohttp can be found
[in the documentation for ClientSession](https://aiohttp.readthedocs.io/en/stable/client_reference.html#client-session).
### Deprecated: `sanic_endpoint_test`
## pytest-sanic
Prior to version 0.3.2, testing was provided through the `sanic_endpoint_test` method. This method will be deprecated in the next major version after 0.4.0; please use the `test_client` instead.
[pytest-sanic](https://github.com/yunstanford/pytest-sanic) is a pytest plugin that helps you test your code asynchronously.
Just write tests like this:
```
from sanic.utils import sanic_endpoint_test
def test_index_returns_200():
request, response = sanic_endpoint_test(app)
assert response.status == 200
```python
async def test_sanic_db_find_by_id(app):
"""
Let's assume that, in db we have,
{
"id": "123",
"name": "Kobe Bryant",
"team": "Lakers",
}
"""
doc = await app.db["players"].find_by_id("123")
assert doc.name == "Kobe Bryant"
assert doc.team == "Lakers"
```
[pytest-sanic](https://github.com/yunstanford/pytest-sanic) also provides some useful fixtures, like loop, unused_port,
test_server, test_client.
```python
# The imports below are assumed; the `loop` and `test_client` fixtures
# (along with `unused_port` and `test_server`) come from the pytest-sanic plugin.
import pytest

from sanic import Sanic, response
from sanic.websocket import WebSocketProtocol


@pytest.yield_fixture
def app():
    app = Sanic("test_sanic_app")

    @app.route("/test_get", methods=['GET'])
    async def test_get(request):
        return response.json({"GET": True})

    @app.route("/test_post", methods=['POST'])
    async def test_post(request):
        return response.json({"POST": True})

    yield app


@pytest.fixture
def test_cli(loop, app, test_client):
    return loop.run_until_complete(test_client(app, protocol=WebSocketProtocol))


#########
# Tests #
#########

async def test_fixture_test_client_get(test_cli):
    """
    GET request
    """
    resp = await test_cli.get('/test_get')
    assert resp.status == 200
    resp_json = await resp.json()
    assert resp_json == {"GET": True}


async def test_fixture_test_client_post(test_cli):
    """
    POST request
    """
    resp = await test_cli.post('/test_post')
    assert resp.status == 200
    resp_json = await resp.json()
    assert resp_json == {"POST": True}
```
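For comparison with the deprecated `sanic_endpoint_test` call shown earlier, a minimal sketch of the `test_client` replacement (the app name, route and handler below are illustrative) could look like:
```python
from sanic import Sanic
from sanic.response import text

app = Sanic('test_client_example')


@app.route('/')
async def index(request):
    return text('OK')


def test_index_returns_200():
    # test_client starts the app on a local port, performs the request,
    # and hands back both the Request and the Response objects
    request, response = app.test_client.get('/')
    assert response.status == 200
```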

50 docs/sanic/versioning.md Normal file

@@ -0,0 +1,50 @@
# Versioning
You can pass the `version` keyword to the route decorators, or to a blueprint initializer. It will result in a `v{version}` URL prefix, where `{version}` is the version number.
## Per route
You can pass a version number to the routes directly.
```python
from sanic import Sanic, response

app = Sanic(__name__)

@app.route('/text', version=1)
def handle_request(request):
    return response.text('Hello world! Version 1')

@app.route('/text', version=2)
def handle_request(request):
    return response.text('Hello world! Version 2')

app.run(port=80)
```
Then with curl:
```bash
curl localhost/v1/text
curl localhost/v2/text
```
## Global blueprint version
You can also pass a version number to the blueprint, which will apply to all routes.
```python
from sanic import response
from sanic.blueprints import Blueprint

bp = Blueprint('test', version=1)

@bp.route('/html')
def handle_request(request):
    return response.html('<p>Hello world!</p>')
```
Then with curl:
```bash
curl localhost/v1/html
```
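The blueprint snippet above leaves out the application wiring; a self-contained sketch that registers it (the app name and port here are illustrative) might be:
```python
from sanic import Sanic, response
from sanic.blueprints import Blueprint

bp = Blueprint('test', version=1)


@bp.route('/html')
def handle_request(request):
    return response.html('<p>Hello world!</p>')

app = Sanic('versioned_blueprint_app')
app.blueprint(bp)  # every route on bp is now served under the /v1 prefix

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=80)
```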


@@ -0,0 +1,17 @@
# -*- coding: utf-8 -*-
import asyncio

from sanic import Sanic

app = Sanic()


async def notify_server_started_after_five_seconds():
    await asyncio.sleep(5)
    print('Server successfully started!')

app.add_task(notify_server_started_after_five_seconds())

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)


@@ -1,26 +0,0 @@
from sanic import Sanic
from sanic import response
import aiohttp
app = Sanic(__name__)
async def fetch(session, url):
"""
Use session object to perform 'get' request on url
"""
async with session.get(url) as result:
return await result.json()
@app.route('/')
async def handle_request(request):
url = "https://api.github.com/repos/channelcat/sanic"
async with aiohttp.ClientSession() as session:
result = await fetch(session, url)
return response.json(result)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, workers=2)


@@ -1,140 +0,0 @@
from sanic import Sanic
from sanic.exceptions import NotFound
from sanic.response import json
from sanic.views import HTTPMethodView
from asyncorm import configure_orm
from asyncorm.exceptions import QuerysetError
from library.models import Book
from library.serializer import BookSerializer
app = Sanic(name=__name__)
@app.listener('before_server_start')
def orm_configure(sanic, loop):
db_config = {'database': 'sanic_example',
'host': 'localhost',
'user': 'sanicdbuser',
'password': 'sanicDbPass',
}
# configure_orm needs a dictionary with:
# * the database configuration
# * the application/s where the models are defined
orm_app = configure_orm({'loop': loop, # always use the sanic loop!
'db_config': db_config,
'modules': ['library', ], # list of apps
})
# orm_app is the object that orchestrates the whole ORM
# sync_db should be run only once, better do that as external command
# it creates the tables in the database!!!!
# orm_app.sync_db()
# for all the 404 lets handle the exceptions
@app.exception(NotFound)
def ignore_404s(request, exception):
return json({'method': request.method,
'status': exception.status_code,
'error': exception.args[0],
'results': None,
})
# now the proper sanic workflow
class BooksView(HTTPMethodView):
def arg_parser(self, request):
parsed_args = {}
for k, v in request.args.items():
parsed_args[k] = v[0]
return parsed_args
async def get(self, request):
filtered_by = self.arg_parser(request)
if filtered_by:
q_books = await Book.objects.filter(**filtered_by)
else:
q_books = await Book.objects.all()
books = [BookSerializer.serialize(book) for book in q_books]
return json({'method': request.method,
'status': 200,
'results': books or None,
'count': len(books),
})
async def post(self, request):
# populate the book with the data in the request
book = Book(**request.json)
# and await on save
await book.save()
return json({'method': request.method,
'status': 201,
'results': BookSerializer.serialize(book),
})
class BookView(HTTPMethodView):
async def get_object(self, request, book_id):
try:
# await on database consults
book = await Book.objects.get(**{'id': book_id})
except QuerysetError as e:
raise NotFound(e.args[0])
return book
async def get(self, request, book_id):
# await on database consults
book = await self.get_object(request, book_id)
return json({'method': request.method,
'status': 200,
'results': BookSerializer.serialize(book),
})
async def put(self, request, book_id):
# await on database consults
book = await self.get_object(request, book_id)
# await on save
await book.save(**request.json)
return json({'method': request.method,
'status': 200,
'results': BookSerializer.serialize(book),
})
async def patch(self, request, book_id):
# await on database consults
book = await self.get_object(request, book_id)
# await on save
await book.save(**request.json)
return json({'method': request.method,
'status': 200,
'results': BookSerializer.serialize(book),
})
async def delete(self, request, book_id):
# await on database consults
book = await self.get_object(request, book_id)
# await on its deletion
await book.delete()
return json({'method': request.method,
'status': 200,
'results': None
})
app.add_route(BooksView.as_view(), '/books/')
app.add_route(BookView.as_view(), '/books/<book_id:int>/')
if __name__ == '__main__':
app.run()


@@ -1,21 +0,0 @@
from asyncorm.model import Model
from asyncorm.fields import CharField, IntegerField, DateField
BOOK_CHOICES = (
('hard cover', 'hard cover book'),
('paperback', 'paperback book')
)
# This is a simple model definition
class Book(Model):
name = CharField(max_length=50)
synopsis = CharField(max_length=255)
book_type = CharField(max_length=15, null=True, choices=BOOK_CHOICES)
pages = IntegerField(null=True)
date_created = DateField(auto_now=True)
class Meta():
ordering = ['name', ]
unique_together = ['name', 'synopsis']


@@ -1,15 +0,0 @@
from asyncorm.model import ModelSerializer, SerializerMethod
from library.models import Book
class BookSerializer(ModelSerializer):
book_type = SerializerMethod()
def get_book_type(self, instance):
return instance.book_type_display()
class Meta():
model = Book
fields = [
'id', 'name', 'synopsis', 'book_type', 'pages', 'date_created'
]


@@ -1,2 +0,0 @@
asyncorm==0.0.7
sanic==0.4.1


@@ -0,0 +1,42 @@
# -*- coding: utf-8 -*-
from sanic import Sanic
from functools import wraps
from sanic.response import json
app = Sanic()
def check_request_for_authorization_status(request):
# Note: Define your check, for instance cookie, session.
flag = True
return flag
def authorized():
def decorator(f):
@wraps(f)
async def decorated_function(request, *args, **kwargs):
# run some method that checks the request
# for the client's authorization status
is_authorized = check_request_for_authorization_status(request)
if is_authorized:
# the user is authorized.
# run the handler method and return the response
response = await f(request, *args, **kwargs)
return response
else:
# the user is not authorized.
return json({'status': 'not_authorized'}, 403)
return decorated_function
return decorator
@app.route("/")
@authorized()
async def test(request):
return json({'status': 'authorized'})
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)


@@ -1,12 +1,12 @@
from sanic import Sanic
from sanic import Blueprint
from sanic.response import json, text
from sanic.response import json
app = Sanic(__name__)
blueprint = Blueprint('name', url_prefix='/my_blueprint')
blueprint2 = Blueprint('name', url_prefix='/my_blueprint2')
blueprint3 = Blueprint('name', url_prefix='/my_blueprint3')
blueprint2 = Blueprint('name2', url_prefix='/my_blueprint2')
blueprint3 = Blueprint('name3', url_prefix='/my_blueprint3')
@blueprint.route('/foo')
@@ -18,6 +18,7 @@ async def foo(request):
async def foo2(request):
return json({'msg': 'hi from blueprint2'})
@blueprint3.websocket('/foo')
async def foo3(request, ws):
while True:


@@ -1,46 +0,0 @@
"""
Example of caching using aiocache package. To run it you will need a Redis
instance running in localhost:6379. You can also try with SimpleMemoryCache.
Running this example you will see that the first call lasts 3 seconds and
the rest are instant because the value is retrieved from the Redis.
If you want more info about the package check
https://github.com/argaen/aiocache
"""
import asyncio
import aiocache
from sanic import Sanic
from sanic.response import json
from sanic.log import log
from aiocache import cached
from aiocache.serializers import JsonSerializer
app = Sanic(__name__)
@app.listener('before_server_start')
def init_cache(sanic, loop):
aiocache.settings.set_defaults(
class_="aiocache.RedisCache",
# class_="aiocache.SimpleMemoryCache",
loop=loop
)
@cached(key="my_custom_key", serializer=JsonSerializer())
async def expensive_call():
log.info("Expensive has been called")
await asyncio.sleep(3)
return {"test": True}
@app.route("/")
async def test(request):
log.info("Received GET /")
return json(await expensive_call())
app.run(host="0.0.0.0", port=8000)


@@ -1,41 +0,0 @@
from sanic import Sanic
from sanic import response
from tornado.platform.asyncio import BaseAsyncIOLoop, to_asyncio_future
from distributed import LocalCluster, Client
app = Sanic(__name__)
def square(x):
return x**2
@app.listener('after_server_start')
async def setup(app, loop):
# configure tornado use asyncio's loop
ioloop = BaseAsyncIOLoop(loop)
# init distributed client
app.client = Client('tcp://localhost:8786', loop=ioloop, start=False)
await to_asyncio_future(app.client._start())
@app.listener('before_server_stop')
async def stop(app, loop):
await to_asyncio_future(app.client._shutdown())
@app.route('/<value:int>')
async def test(request, value):
future = app.client.submit(square, value)
result = await to_asyncio_future(future._result())
return response.text(f'The square of {value} is {result}')
if __name__ == '__main__':
# Distributed cluster should run somewhere else
with LocalCluster(scheduler_port=8786, nanny=False, n_workers=2,
threads_per_worker=1) as cluster:
app.run(host="0.0.0.0", port=8000)


@@ -1,136 +0,0 @@
# This demo requires aioredis and environmental variables established in ENV_VARS
import json
import logging
import os
from datetime import datetime
import aioredis
import sanic
from sanic import Sanic
ENV_VARS = ["REDIS_HOST", "REDIS_PORT",
"REDIS_MINPOOL", "REDIS_MAXPOOL",
"REDIS_PASS", "APP_LOGFILE"]
app = Sanic(name=__name__)
logger = None
@app.middleware("request")
async def log_uri(request):
# Simple middleware to log the URI endpoint that was called
logger.info("URI called: {0}".format(request.url))
@app.listener('before_server_start')
async def before_server_start(app, loop):
logger.info("Starting redis pool")
app.redis_pool = await aioredis.create_pool(
(app.config.REDIS_HOST, int(app.config.REDIS_PORT)),
minsize=int(app.config.REDIS_MINPOOL),
maxsize=int(app.config.REDIS_MAXPOOL),
password=app.config.REDIS_PASS)
@app.listener('after_server_stop')
async def after_server_stop(app, loop):
logger.info("Closing redis pool")
app.redis_pool.close()
await app.redis_pool.wait_closed()
@app.middleware("request")
async def attach_db_connectors(request):
# Just put the db objects in the request for easier access
logger.info("Passing redis pool to request object")
request["redis"] = request.app.redis_pool
@app.route("/state/<user_id>", methods=["GET"])
async def access_state(request, user_id):
try:
# Check to see if the value is in cache, if so lets return that
with await request["redis"] as redis_conn:
state = await redis_conn.get(user_id, encoding="utf-8")
if state:
return sanic.response.json({"msg": "Success",
"status": 200,
"success": True,
"data": json.loads(state),
"finished_at": datetime.now().isoformat()})
# Then state object is not in redis
logger.critical("Unable to find user_data in cache.")
return sanic.response.HTTPResponse({"msg": "User state not found",
"success": False,
"status": 404,
"finished_at": datetime.now().isoformat()}, status=404)
except aioredis.ProtocolError:
logger.critical("Unable to connect to state cache")
return sanic.response.HTTPResponse({"msg": "Internal Server Error",
"status": 500,
"success": False,
"finished_at": datetime.now().isoformat()}, status=500)
@app.route("/state/<user_id>/push", methods=["POST"])
async def set_state(request, user_id):
try:
# Pull a connection from the pool
with await request["redis"] as redis_conn:
# Set the value in cache to your new value
await redis_conn.set(user_id, json.dumps(request.json), expire=1800)
logger.info("Successfully pushed state to cache")
return sanic.response.HTTPResponse({"msg": "Successfully pushed state to cache",
"success": True,
"status": 200,
"finished_at": datetime.now().isoformat()})
except aioredis.ProtocolError:
logger.critical("Unable to connect to state cache")
return sanic.response.HTTPResponse({"msg": "Internal Server Error",
"status": 500,
"success": False,
"finished_at": datetime.now().isoformat()}, status=500)
def configure():
# Setup environment variables
env_vars = [os.environ.get(v, None) for v in ENV_VARS]
if not all(env_vars):
# Send back environment variables that were not set
return False, ", ".join([ENV_VARS[i] for i, flag in enumerate(env_vars) if not flag])
else:
# Add all the env vars to our app config
app.config.update({k: v for k, v in zip(ENV_VARS, env_vars)})
setup_logging()
return True, None
def setup_logging():
logging_format = "[%(asctime)s] %(process)d-%(levelname)s "
logging_format += "%(module)s::%(funcName)s():l%(lineno)d: "
logging_format += "%(message)s"
logging.basicConfig(
filename=app.config.APP_LOGFILE,
format=logging_format,
level=logging.DEBUG)
def main(result, missing):
if result:
try:
app.run(host="0.0.0.0", port=8080, debug=True)
except:
logging.critical("User killed server. Closing")
else:
logging.critical("Unable to start. Missing environment variables [{0}]".format(missing))
if __name__ == "__main__":
result, missing = configure()
logger = logging.getLogger()
main(result, missing)


@@ -37,7 +37,6 @@ server's error_handler to an instance of our CustomHandler
"""
from sanic import Sanic
from sanic import response
app = Sanic(__name__)
@@ -49,8 +48,7 @@ app.error_handler = handler
async def test(request):
# Here, something occurs which causes an unexpected exception
# This exception will flow to our custom handler.
1 / 0
return response.json({"test": True})
raise SanicException('You Broke It!')
app.run(host="0.0.0.0", port=8000, debug=True)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)


@@ -1,27 +0,0 @@
# Render templates in a Flask like way from a "template" directory in the project
from sanic import Sanic
from sanic import response
from jinja2 import Environment, PackageLoader, select_autoescape
app = Sanic(__name__)
# Load the template environment with async support
template_env = Environment(
loader=PackageLoader('yourapplication', 'templates'),
autoescape=select_autoescape(['html', 'xml']),
enable_async=True
)
# Load the template from file
template = template_env.get_template("example_template.html")
@app.route('/')
async def test(request):
data = request.json
rendered_template = await template.render_async(**data)
return response.html(rendered_template)
app.run(host="0.0.0.0", port=8080, debug=True)


@@ -8,11 +8,12 @@ app = Sanic(__name__)
sem = None
@app.listener('before_server_start')
def init(sanic, loop):
global sem
CONCURRENCY_PER_WORKER = 4
sem = asyncio.Semaphore(CONCURRENCY_PER_WORKER, loop=loop)
concurrency_per_worker = 4
sem = asyncio.Semaphore(concurrency_per_worker, loop=loop)
async def bounded_fetch(session, url):
"""


@@ -0,0 +1,86 @@
'''
Based on example from https://github.com/Skyscanner/aiotask-context
and `examples/{override_logging,run_async}.py`.
Needs https://github.com/Skyscanner/aiotask-context/tree/52efbc21e2e1def2d52abb9a8e951f3ce5e6f690 or newer
$ pip install git+https://github.com/Skyscanner/aiotask-context.git
'''
import asyncio
import uuid
import logging
from signal import signal, SIGINT
from sanic import Sanic
from sanic import response
import uvloop
import aiotask_context as context
log = logging.getLogger(__name__)
class RequestIdFilter(logging.Filter):
def filter(self, record):
record.request_id = context.get('X-Request-ID')
return True
LOG_SETTINGS = {
'version': 1,
'disable_existing_loggers': False,
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'level': 'DEBUG',
'formatter': 'default',
'filters': ['requestid'],
},
},
'filters': {
'requestid': {
'()': RequestIdFilter,
},
},
'formatters': {
'default': {
'format': '%(asctime)s %(levelname)s %(name)s:%(lineno)d %(request_id)s | %(message)s',
},
},
'loggers': {
'': {
'level': 'DEBUG',
'handlers': ['console'],
'propagate': True
},
}
}
app = Sanic(__name__, log_config=LOG_SETTINGS)
@app.middleware('request')
async def set_request_id(request):
request_id = request.headers.get('X-Request-ID') or str(uuid.uuid4())
context.set("X-Request-ID", request_id)
@app.route("/")
async def test(request):
log.debug('X-Request-ID: %s', context.get('X-Request-ID'))
log.info('Hello from test!')
return response.json({"test": True})
if __name__ == '__main__':
asyncio.set_event_loop(uvloop.new_event_loop())
server = app.create_server(host="0.0.0.0", port=8000)
loop = asyncio.get_event_loop()
loop.set_task_factory(context.task_factory)
task = asyncio.ensure_future(server)
try:
loop.run_forever()
except:
loop.stop()


@@ -7,6 +7,7 @@ from sanic import response
app = Sanic(__name__)
@app.route('/')
def handle_request(request):
return response.json(
@@ -14,7 +15,8 @@ def handle_request(request):
headers={'X-Served-By': 'sanic'},
status=200
)
@app.route('/unauthorized')
def handle_request(request):
return response.json(


@@ -14,6 +14,8 @@ log = logging.getLogger()
# Set logger to override default basicConfig
sanic = Sanic()
@sanic.route("/")
def test(request):
log.info("received request; responding with 'hey'")


@@ -1,85 +0,0 @@
from sanic import Sanic
from sanic_session import InMemorySessionInterface
from sanic_jinja2 import SanicJinja2
import json
import plotly
import pandas as pd
import numpy as np
app = Sanic(__name__)
jinja = SanicJinja2(app)
session = InMemorySessionInterface(cookie_name=app.name, prefix=app.name)
@app.middleware('request')
async def print_on_request(request):
print(request.headers)
await session.open(request)
@app.middleware('response')
async def print_on_response(request, response):
await session.save(request, response)
@app.route('/')
async def index(request):
rng = pd.date_range('1/1/2011', periods=7500, freq='H')
ts = pd.Series(np.random.randn(len(rng)), index=rng)
graphs = [
dict(
data=[
dict(
x=[1, 2, 3],
y=[10, 20, 30],
type='scatter'
),
],
layout=dict(
title='first graph'
)
),
dict(
data=[
dict(
x=[1, 3, 5],
y=[10, 50, 30],
type='bar'
),
],
layout=dict(
title='second graph'
)
),
dict(
data=[
dict(
x=ts.index, # Can use the pandas data structures directly
y=ts
)
]
)
]
# Add "ids" to each of the graphs to pass up to the client
# for templating
ids = ['graph-{}'.format(i) for i, _ in enumerate(graphs)]
# Convert the figures to JSON
# PlotlyJSONEncoder appropriately converts pandas, datetime, etc
# objects to their JSON equivalents
graphJSON = json.dumps(graphs, cls=plotly.utils.PlotlyJSONEncoder)
return jinja.render('index.html', request,
ids=ids,
graphJSON=graphJSON)
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000, debug=True)


@@ -1,5 +0,0 @@
pandas==0.19.2
plotly==2.0.7
sanic==0.5.0
sanic-jinja2==0.5.1
sanic-session==0.1.3

49 examples/pytest_xdist.py Normal file

@@ -0,0 +1,49 @@
"""pytest-xdist example for sanic server
Install testing tools:
$ pip install pytest pytest-xdist
Run with xdist params:
$ pytest examples/pytest_xdist.py -n 8 # 8 workers
"""
import re
from sanic import Sanic
from sanic.response import text
from sanic.testing import PORT as PORT_BASE, SanicTestClient
import pytest
@pytest.fixture(scope="session")
def test_port(worker_id):
m = re.search(r'[0-9]+', worker_id)
if m:
num_id = m.group(0)
else:
num_id = 0
port = PORT_BASE + int(num_id)
return port
@pytest.fixture(scope="session")
def app():
app = Sanic()
@app.route('/')
async def index(request):
return text('OK')
return app
@pytest.fixture(scope="session")
def client(app, test_port):
return SanicTestClient(app, test_port)
@pytest.mark.parametrize('run_id', range(100))
def test_index(client, run_id):
request, response = client._sanic_endpoint_test('get', '/')
assert response.status == 200
assert response.text == 'OK'


@@ -7,7 +7,8 @@ app = Sanic(__name__)
@app.route('/')
def handle_request(request):
return response.redirect('/redirect')
@app.route('/redirect')
async def test(request):
return response.json({"Redirected": True})


@@ -0,0 +1,10 @@
import requests

# Warning: This is a heavy process.
data = ""
for i in range(1, 250000):
    data += str(i)

r = requests.post('http://0.0.0.0:8000/stream', data=data)
print(r.text)


@@ -0,0 +1,65 @@
from sanic import Sanic
from sanic.views import CompositionView
from sanic.views import HTTPMethodView
from sanic.views import stream as stream_decorator
from sanic.blueprints import Blueprint
from sanic.response import stream, text
bp = Blueprint('blueprint_request_stream')
app = Sanic('request_stream')
class SimpleView(HTTPMethodView):
@stream_decorator
async def post(self, request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
@app.post('/stream', stream=True)
async def handler(request):
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
body = body.decode('utf-8').replace('1', 'A')
response.write(body)
return stream(streaming)
@bp.put('/bp_stream', stream=True)
async def bp_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
async def post_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
app.blueprint(bp)
app.add_route(SimpleView.as_view(), '/method_view')
view = CompositionView()
view.add(['POST'], post_handler, stream=True)
app.add_route(view, '/composition_view')
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)


@@ -18,4 +18,4 @@ async def test(request):
def timeout(request, exception):
return response.text('RequestTimeout from error_handler.', 408)
app.run(host='0.0.0.0', port=8000)
app.run(host='0.0.0.0', port=8000)


@@ -1,12 +1,12 @@
from sanic import Sanic
from sanic import response
from multiprocessing import Event
from signal import signal, SIGINT
import asyncio
import uvloop
app = Sanic(__name__)
@app.route("/")
async def test(request):
return response.json({"answer": "42"})


@@ -1,62 +0,0 @@
# encoding: utf-8
"""
You need the aiomysql package to run this example.
"""
import os
import aiomysql
from sanic import Sanic
from sanic.response import json
database_name = os.environ['DATABASE_NAME']
database_host = os.environ['DATABASE_HOST']
database_user = os.environ['DATABASE_USER']
database_password = os.environ['DATABASE_PASSWORD']
app = Sanic()
@app.listener("before_server_start")
async def get_pool(app, loop):
"""
The first param is the global app instance,
so we can store our connection pool in it,
and it can be used by different requests.
:param args:
:param kwargs:
:return:
"""
app.pool = {
"aiomysql": await aiomysql.create_pool(host=database_host, user=database_user, password=database_password,
db=database_name,
maxsize=5)}
async with app.pool['aiomysql'].acquire() as conn:
async with conn.cursor() as cur:
await cur.execute('DROP TABLE IF EXISTS sanic_polls')
await cur.execute("""CREATE TABLE sanic_polls (
id serial primary key,
question varchar(50),
pub_date timestamp
);""")
for i in range(0, 100):
await cur.execute("""INSERT INTO sanic_polls
(id, question, pub_date) VALUES ({}, {}, now())
""".format(i, i))
@app.route("/")
async def test(request):
result = []
data = {}
async with app.pool['aiomysql'].acquire() as conn:
async with conn.cursor() as cur:
await cur.execute("SELECT question, pub_date FROM sanic_polls")
async for row in cur:
result.append({"question": row[0], "pub_date": row[1]})
if result:
    data['data'] = result
return json(data)
if __name__ == '__main__':
app.run(host="127.0.0.1", workers=4, port=12000)


@@ -1,65 +0,0 @@
""" To run this example you need additional aiopg package
"""
import os
import asyncio
import uvloop
import aiopg
from sanic import Sanic
from sanic.response import json
database_name = os.environ['DATABASE_NAME']
database_host = os.environ['DATABASE_HOST']
database_user = os.environ['DATABASE_USER']
database_password = os.environ['DATABASE_PASSWORD']
connection = 'postgres://{0}:{1}@{2}/{3}'.format(database_user,
database_password,
database_host,
database_name)
async def get_pool():
return await aiopg.create_pool(connection)
app = Sanic(name=__name__)
@app.listener('before_server_start')
async def prepare_db(app, loop):
"""
Let's create some table and add some data
"""
async with aiopg.create_pool(connection) as pool:
async with pool.acquire() as conn:
async with conn.cursor() as cur:
await cur.execute('DROP TABLE IF EXISTS sanic_polls')
await cur.execute("""CREATE TABLE sanic_polls (
id serial primary key,
question varchar(50),
pub_date timestamp
);""")
for i in range(0, 100):
await cur.execute("""INSERT INTO sanic_polls
(id, question, pub_date) VALUES ({}, {}, now())
""".format(i, i))
@app.route("/")
async def handle(request):
result = []
async def test_select():
async with aiopg.create_pool(connection) as pool:
async with pool.acquire() as conn:
async with conn.cursor() as cur:
await cur.execute("SELECT question, pub_date FROM sanic_polls")
async for row in cur:
result.append({"question": row[0], "pub_date": row[1]})
res = await test_select()
return json({'polls': result})
if __name__ == '__main__':
app.run(host='0.0.0.0',
port=8000,
debug=True)


@@ -1,67 +0,0 @@
""" To run this example you need additional aiopg package
"""
import os
import asyncio
import datetime
import uvloop
from aiopg.sa import create_engine
import sqlalchemy as sa
from sanic import Sanic
from sanic.response import json
database_name = os.environ['DATABASE_NAME']
database_host = os.environ['DATABASE_HOST']
database_user = os.environ['DATABASE_USER']
database_password = os.environ['DATABASE_PASSWORD']
connection = 'postgres://{0}:{1}@{2}/{3}'.format(database_user,
database_password,
database_host,
database_name)
metadata = sa.MetaData()
polls = sa.Table('sanic_polls', metadata,
sa.Column('id', sa.Integer, primary_key=True),
sa.Column('question', sa.String(50)),
sa.Column("pub_date", sa.DateTime))
app = Sanic(name=__name__)
@app.listener('before_server_start')
async def prepare_db(app, loop):
""" Let's add some data
"""
async with create_engine(connection) as engine:
async with engine.acquire() as conn:
await conn.execute('DROP TABLE IF EXISTS sanic_polls')
await conn.execute("""CREATE TABLE sanic_polls (
id serial primary key,
question varchar(50),
pub_date timestamp
);""")
for i in range(0, 100):
await conn.execute(
polls.insert().values(question=i,
pub_date=datetime.datetime.now())
)
@app.route("/")
async def handle(request):
async with create_engine(connection) as engine:
async with engine.acquire() as conn:
result = []
async for row in conn.execute(polls.select()):
result.append({"question": row.question,
"pub_date": row.pub_date})
return json({"polls": result})
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)


@@ -1,34 +0,0 @@
""" To run this example you need additional aioredis package
"""
from sanic import Sanic, response
import aioredis
app = Sanic(__name__)
@app.route("/")
async def handle(request):
async with request.app.redis_pool.get() as redis:
await redis.set('test-my-key', 'value')
val = await redis.get('test-my-key')
return response.text(val.decode('utf-8'))
@app.listener('before_server_start')
async def before_server_start(app, loop):
app.redis_pool = await aioredis.create_pool(
('localhost', 6379),
minsize=5,
maxsize=10,
loop=loop
)
@app.listener('after_server_stop')
async def after_server_stop(app, loop):
app.redis_pool.close()
await app.redis_pool.wait_closed()
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)


@@ -1,51 +0,0 @@
import os
import asyncio
import uvloop
from asyncpg import connect, create_pool
from sanic import Sanic
from sanic.response import json
DB_CONFIG = {
'host': '<host>',
'user': '<user>',
'password': '<password>',
'port': '<port>',
'database': '<database>'
}
def jsonify(records):
"""
Parse asyncpg record response into JSON format
"""
return [dict(r.items()) for r in records]
app = Sanic(__name__)
@app.listener('before_server_start')
async def register_db(app, loop):
app.pool = await create_pool(**DB_CONFIG, loop=loop, max_size=100)
async with app.pool.acquire() as connection:
await connection.execute('DROP TABLE IF EXISTS sanic_post')
await connection.execute("""CREATE TABLE sanic_post (
id serial primary key,
content varchar(50),
post_date timestamp
);""")
for i in range(0, 1000):
await connection.execute(f"""INSERT INTO sanic_post
(id, content, post_date) VALUES ({i}, {i}, now())""")
@app.get('/')
async def root_get(request):
async with app.pool.acquire() as connection:
results = await connection.fetch('SELECT * FROM sanic_post')
return json({'posts': jsonify(results)})
if __name__ == '__main__':
app.run(host='127.0.0.1', port=8080)


@@ -1,41 +0,0 @@
""" sanic motor (async driver for mongodb) example
Required packages:
pymongo==3.4.0
motor==1.1
sanic==0.2.0
"""
from sanic import Sanic
from sanic import response
app = Sanic('motor_mongodb')
def get_db():
from motor.motor_asyncio import AsyncIOMotorClient
mongo_uri = "mongodb://127.0.0.1:27017/test"
client = AsyncIOMotorClient(mongo_uri)
return client['test']
@app.route('/objects', methods=['GET'])
async def get(request):
db = get_db()
docs = await db.test_col.find().to_list(length=100)
for doc in docs:
doc['id'] = str(doc['_id'])
del doc['_id']
return response.json(docs)
@app.route('/post', methods=['POST'])
async def new(request):
doc = request.json
print(doc)
db = get_db()
object_id = await db.test_col.save(doc)
return response.json({'object_id': str(object_id)})
if __name__ == "__main__":
app.run(host='0.0.0.0', port=8000, debug=True)


@@ -1,116 +0,0 @@
## You need the following additional packages for this example
# aiopg
# peewee_async
# peewee
## sanic imports
from sanic import Sanic
from sanic.response import json
## peewee_async related imports
import peewee
from peewee import Model, BaseModel
from peewee_async import Manager, PostgresqlDatabase, execute
from functools import partial
# we instantiate a custom loop so we can pass it to our db manager
## from peewee_async docs:
# Also there's no need to connect and re-connect before executing async queries
# with manager! It's all automatic. But you can run Manager.connect() or
# Manager.close() when you need it.
class AsyncManager(Manager):
"""Inherit the peewee_async manager with our own object
configuration
database.allow_sync = False
"""
def __init__(self, _model_class, *args, **kwargs):
super(AsyncManager, self).__init__(*args, **kwargs)
self._model_class = _model_class
self.database.allow_sync = False
def _do_fill(self, method, *args, **kwargs):
_class_method = getattr(super(AsyncManager, self), method)
pf = partial(_class_method, self._model_class)
return pf(*args, **kwargs)
def new(self, *args, **kwargs):
return self._do_fill('create', *args, **kwargs)
def get(self, *args, **kwargs):
return self._do_fill('get', *args, **kwargs)
def execute(self, query):
return execute(query)
def _get_meta_db_class(db):
"""creating a declartive class model for db"""
class _BlockedMeta(BaseModel):
def __new__(cls, name, bases, attrs):
_instance = super(_BlockedMeta, cls).__new__(cls, name, bases, attrs)
_instance.objects = AsyncManager(_instance, db)
return _instance
class _Base(Model, metaclass=_BlockedMeta):
def to_dict(self):
return self._data
class Meta:
database=db
return _Base
def declarative_base(*args, **kwargs):
"""Returns a new Modeled Class after inheriting meta and Model classes"""
db = PostgresqlDatabase(*args, **kwargs)
return _get_meta_db_class(db)
AsyncBaseModel = declarative_base(database='test',
host='127.0.0.1',
user='postgres',
password='mysecretpassword')
# let's create a simple key value store:
class KeyValue(AsyncBaseModel):
key = peewee.CharField(max_length=40, unique=True)
text = peewee.TextField(default='')
app = Sanic('peewee_example')
@app.route('/post/<key>/<value>')
async def post(request, key, value):
"""
Save get parameters to database
"""
obj = await KeyValue.objects.new(key=key, text=value)
return json({'object_id': obj.id})
@app.route('/get')
async def get(request):
"""
Load all objects from database
"""
all_objects = await KeyValue.objects.execute(KeyValue.select())
serialized_obj = []
for obj in all_objects:
serialized_obj.append({
'id': obj.id,
'key': obj.key,
'value': obj.text}
)
return json({'objects': serialized_obj})
if __name__ == "__main__":
app.run(host='0.0.0.0', port=8000)

13 examples/teapot.py Normal file

@@ -0,0 +1,13 @@
from sanic import Sanic
from sanic import response as res

app = Sanic(__name__)


@app.route("/")
async def test(req):
    return res.text("I\'m a teapot", status=418)

if __name__ == '__main__':
    app.run(host="0.0.0.0", port=8000)


@@ -18,33 +18,42 @@ def test_sync(request):
return response.json({"test": True})
@app.route("/dynamic/<name>/<id:int>")
def test_params(request, name, id):
return response.text("yeehaww {} {}".format(name, id))
@app.route("/dynamic/<name>/<i:int>")
def test_params(request, name, i):
return response.text("yeehaww {} {}".format(name, i))
@app.route("/exception")
def exception(request):
raise ServerError("It's dead jim")
@app.route("/await")
async def test_await(request):
import asyncio
await asyncio.sleep(5)
return response.text("I'm feeling sleepy")
@app.route("/file")
async def test_file(request):
return await response.file(os.path.abspath("setup.py"))
@app.route("/file_stream")
async def test_file_stream(request):
return await response.file_stream(os.path.abspath("setup.py"),
chunk_size=1024)
# ----------------------------------------------- #
# Exceptions
# ----------------------------------------------- #
@app.exception(ServerError)
async def test(request, exception):
return response.json({"exception": "{}".format(exception), "status": exception.status_code}, status=exception.status_code)
return response.json({"exception": "{}".format(exception), "status": exception.status_code},
status=exception.status_code)
# ----------------------------------------------- #
@@ -63,7 +72,8 @@ def post_json(request):
@app.route("/query_string")
def query_string(request):
return response.json({"parsed": True, "args": request.args, "url": request.url, "query_string": request.query_string})
return response.json({"parsed": True, "args": request.args, "url": request.url,
"query_string": request.query_string})
# ----------------------------------------------- #

23 examples/unix_socket.py Normal file

@@ -0,0 +1,23 @@
from sanic import Sanic
from sanic import response
import socket
import os

app = Sanic(__name__)


@app.route("/test")
async def test(request):
    return response.text("OK")

if __name__ == '__main__':
    server_address = './uds_socket'
    # Make sure the socket does not already exist
    try:
        os.unlink(server_address)
    except OSError:
        if os.path.exists(server_address):
            raise

    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.bind(server_address)

    app.run(sock=sock)


@@ -3,6 +3,7 @@ from sanic import response
app = Sanic(__name__)
@app.route('/')
async def index(request):
# generate a URL for the endpoint `post_handler`
@@ -10,9 +11,10 @@ async def index(request):
# the URL is `/posts/5`, redirect to it
return response.redirect(url)
@app.route('/posts/<post_id>')
async def post_handler(request, post_id):
return response.text('Post - {}'.format(post_id))
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)
app.run(host="0.0.0.0", port=8000, debug=True)


@@ -11,29 +11,29 @@ from sanic.blueprints import Blueprint
app = Sanic()
bp = Blueprint("bp", host="bp.example.com")
@app.route('/', host=["example.com",
"somethingelse.com",
"therestofyourdomains.com"])
async def hello(request):
return response.text("Some defaults")
@app.route('/', host="example.com")
async def hello(request):
return response.text("Answer")
@app.route('/', host="sub.example.com")
async def hello(request):
return response.text("42")
@bp.route("/question")
async def hello(request):
return response.text("What is the meaning of life?")
@bp.route("/answer")
async def hello(request):
return response.text("42")
app.register_blueprint(bp)
app.blueprint(bp)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)


@@ -20,4 +20,5 @@ async def feed(request, ws):
if __name__ == '__main__':
app.run()
app.run(host="0.0.0.0", port=8000, debug=True)


@@ -9,3 +9,4 @@ pytest
tox
ujson
uvloop
gunicorn

4 requirements-docs.txt Normal file

@@ -0,0 +1,4 @@
sphinx
sphinx_rtd_theme
recommonmark
sphinxcontrib-asyncio


@@ -1,6 +1,6 @@
from sanic.app import Sanic
from sanic.blueprints import Blueprint
__version__ = '0.5.2'
__version__ = '0.7.0'
__all__ = ['Sanic', 'Blueprint']


@@ -1,7 +1,7 @@
from argparse import ArgumentParser
from importlib import import_module
from sanic.log import log
from sanic.log import logger
from sanic.app import Sanic
if __name__ == "__main__":
@@ -35,10 +35,10 @@ if __name__ == "__main__":
app.run(host=args.host, port=args.port,
workers=args.workers, debug=args.debug, ssl=ssl)
except ImportError:
log.error("No module named {} found.\n"
" Example File: project/sanic_server.py -> app\n"
" Example Module: project.sanic_server.app"
.format(module_name))
except ImportError as e:
logger.error("No module named {} found.\n"
" Example File: project/sanic_server.py -> app\n"
" Example Module: project.sanic_server.app"
.format(e.name))
except ValueError as e:
log.error("{}".format(e))
logger.error("{}".format(e))


@@ -1,4 +1,5 @@
import logging
import logging.config
import re
import warnings
from asyncio import get_event_loop, ensure_future, CancelledError
@@ -13,7 +14,7 @@ from sanic.config import Config
from sanic.constants import HTTP_METHODS
from sanic.exceptions import ServerError, URLBuildError, SanicException
from sanic.handlers import ErrorHandler
from sanic.log import log
from sanic.log import logger, error_logger, LOGGING_CONFIG_DEFAULTS
from sanic.response import HTTPResponse, StreamingHTTPResponse
from sanic.router import Router
from sanic.server import serve, serve_multiple, HttpProtocol, Signal
@@ -26,22 +27,19 @@ from sanic.websocket import WebSocketProtocol, ConnectionClosed
class Sanic:
def __init__(self, name=None, router=None, error_handler=None,
load_env=True, request_class=None):
# Only set up a default log handler if the
# end-user application didn't set anything up.
if not logging.root.handlers and log.level == logging.NOTSET:
formatter = logging.Formatter(
"%(asctime)s: %(levelname)s: %(message)s")
handler = logging.StreamHandler()
handler.setFormatter(formatter)
log.addHandler(handler)
log.setLevel(logging.INFO)
load_env=True, request_class=None,
strict_slashes=False, log_config=None,
configure_logging=True):
# Get name from previous stack frame
if name is None:
frame_records = stack()[1]
name = getmodulename(frame_records[1])
# logging
if configure_logging:
logging.config.dictConfig(log_config or LOGGING_CONFIG_DEFAULTS)
self.name = name
self.router = router or Router()
self.request_class = request_class
@@ -51,12 +49,15 @@ class Sanic:
self.response_middleware = deque()
self.blueprints = {}
self._blueprint_order = []
self.configure_logging = configure_logging
self.debug = None
self.sock = None
self.strict_slashes = strict_slashes
self.listeners = defaultdict(list)
self.is_running = False
self.is_request_stream = False
self.websocket_enabled = False
self.websocket_tasks = []
self.websocket_tasks = set()
# Register alternative method names
self.go_fast = self.run
@@ -105,12 +106,16 @@ class Sanic:
# Decorator
def route(self, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False):
strict_slashes=None, stream=False, version=None, name=None):
"""Decorate a function to be registered as a route
:param uri: path of the URL
:param methods: list or tuple of methods allowed
:param host:
:param strict_slashes:
:param stream:
:param version:
:param name: user defined route name for url_for
:return: decorated function
"""
@@ -119,44 +124,67 @@ class Sanic:
if not uri.startswith('/'):
uri = '/' + uri
if stream:
self.is_request_stream = True
if strict_slashes is None:
strict_slashes = self.strict_slashes
def response(handler):
if stream:
handler.is_stream = stream
self.router.add(uri=uri, methods=methods, handler=handler,
host=host, strict_slashes=strict_slashes)
host=host, strict_slashes=strict_slashes,
version=version, name=name)
return handler
return response
# Shorthand method decorators
def get(self, uri, host=None, strict_slashes=False):
def get(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=frozenset({"GET"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version,
name=name)
def post(self, uri, host=None, strict_slashes=False):
def post(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=frozenset({"POST"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def put(self, uri, host=None, strict_slashes=False):
def put(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=frozenset({"PUT"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def head(self, uri, host=None, strict_slashes=False):
def head(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=frozenset({"HEAD"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version,
name=name)
def options(self, uri, host=None, strict_slashes=False):
def options(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=frozenset({"OPTIONS"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version,
name=name)
def patch(self, uri, host=None, strict_slashes=False):
def patch(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=frozenset({"PATCH"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def delete(self, uri, host=None, strict_slashes=False):
def delete(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=frozenset({"DELETE"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version,
name=name)
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False):
strict_slashes=None, version=None, name=None):
"""A helper method to register class instance or
functions as a handler to the application url
routes.
@@ -166,28 +194,46 @@ class Sanic:
:param methods: list or tuple of methods allowed, these are overridden
if using a HTTPMethodView
:param host:
:param strict_slashes:
:param version:
:param name: user defined route name for url_for
:return: function or class instance
"""
stream = False
# Handle HTTPMethodView differently
if hasattr(handler, 'view_class'):
methods = set()
for method in HTTP_METHODS:
if getattr(handler.view_class, method.lower(), None):
_handler = getattr(handler.view_class, method.lower(), None)
if _handler:
methods.add(method)
if hasattr(_handler, 'is_stream'):
stream = True
# handle composition view differently
if isinstance(handler, CompositionView):
methods = handler.handlers.keys()
for _handler in handler.handlers.values():
if hasattr(_handler, 'is_stream'):
stream = True
break
if strict_slashes is None:
strict_slashes = self.strict_slashes
self.route(uri=uri, methods=methods, host=host,
strict_slashes=strict_slashes)(handler)
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)(handler)
return handler
# Decorator
def websocket(self, uri, host=None, strict_slashes=False):
def websocket(self, uri, host=None, strict_slashes=None,
subprotocols=None, name=None):
"""Decorate a function to be registered as a websocket route
:param uri: path of the URL
:param subprotocols: optional list of strings with the supported
subprotocols
:param host:
:return: decorated function
"""
@@ -198,17 +244,25 @@ class Sanic:
if not uri.startswith('/'):
uri = '/' + uri
if strict_slashes is None:
strict_slashes = self.strict_slashes
def response(handler):
async def websocket_handler(request, *args, **kwargs):
request.app = self
protocol = request.transport.get_protocol()
ws = await protocol.websocket_handshake(request)
try:
protocol = request.transport.get_protocol()
except AttributeError:
# On Python3.5 the Transport classes in asyncio do not
# have a get_protocol() method as in uvloop
protocol = request.transport._protocol
ws = await protocol.websocket_handshake(request, subprotocols)
# schedule the application handler
# its future is kept in self.websocket_tasks in case it
# needs to be cancelled due to the server being stopped
fut = ensure_future(handler(request, ws, *args, **kwargs))
self.websocket_tasks.append(fut)
self.websocket_tasks.add(fut)
try:
await fut
except (CancelledError, ConnectionClosed):
@@ -218,16 +272,19 @@ class Sanic:
self.router.add(uri=uri, handler=websocket_handler,
methods=frozenset({'GET'}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, name=name)
return handler
return response
def add_websocket_route(self, handler, uri, host=None,
strict_slashes=False):
strict_slashes=None, name=None):
"""A helper method to register a function as a websocket route."""
return self.websocket(uri, host=host,
strict_slashes=strict_slashes)(handler)
if strict_slashes is None:
strict_slashes = self.strict_slashes
return self.websocket(uri, host=host, strict_slashes=strict_slashes,
name=name)(handler)
def enable_websocket(self, enable=True):
"""Enable or disable the support for websocket.
@@ -267,34 +324,38 @@ class Sanic:
return response
def register_middleware(self, middleware, attach_to='request'):
if attach_to == 'request':
self.request_middleware.append(middleware)
if attach_to == 'response':
self.response_middleware.appendleft(middleware)
return middleware
# Decorator
def middleware(self, middleware_or_request):
"""Decorate and register middleware to be called before a request.
Can either be called as @app.middleware or @app.middleware('request')
"""
def register_middleware(middleware, attach_to='request'):
if attach_to == 'request':
self.request_middleware.append(middleware)
if attach_to == 'response':
self.response_middleware.appendleft(middleware)
return middleware
# Detect which way this was called, @middleware or @middleware('AT')
if callable(middleware_or_request):
return register_middleware(middleware_or_request)
return self.register_middleware(middleware_or_request)
else:
return partial(register_middleware,
return partial(self.register_middleware,
attach_to=middleware_or_request)
# Static Files
def static(self, uri, file_or_directory, pattern=r'/?.+',
use_modified_since=True, use_content_range=False):
use_modified_since=True, use_content_range=False,
stream_large_files=False, name='static', host=None,
strict_slashes=None):
"""Register a root to serve files from. The input can either be a
file or a directory. See
"""
static_register(self, uri, file_or_directory, pattern,
use_modified_since, use_content_range)
use_modified_since, use_content_range,
stream_large_files, name, host, strict_slashes)
def blueprint(self, blueprint, **options):
"""Register a blueprint on the application.
@@ -344,12 +405,31 @@ class Sanic:
URLBuildError
"""
# find the route by the supplied view name
uri, route = self.router.find_route_by_view_name(view_name)
kw = {}
# special static files url_for
if view_name == 'static':
kw.update(name=kwargs.pop('name', 'static'))
elif view_name.endswith('.static'): # blueprint.static
kwargs.pop('name', None)
kw.update(name=view_name)
if not uri or not route:
raise URLBuildError(
'Endpoint with name `{}` was not found'.format(
view_name))
uri, route = self.router.find_route_by_view_name(view_name, **kw)
if not (uri and route):
raise URLBuildError('Endpoint with name `{}` was not found'.format(
view_name))
if view_name == 'static' or view_name.endswith('.static'):
filename = kwargs.pop('filename', None)
# it's static folder
if '<file_uri:' in uri:
folder_ = uri.split('<file_uri:', 1)[0]
if folder_.endswith('/'):
folder_ = folder_[:-1]
if filename.startswith('/'):
filename = filename[1:]
uri = '{}/{}'.format(folder_, filename)
if uri != '/' and uri.endswith('/'):
uri = uri[:-1]
@@ -373,6 +453,16 @@ class Sanic:
if netloc is None and external:
netloc = self.config.get('SERVER_NAME', '')
if external:
if not scheme:
if ':' in netloc[:8]:
scheme = netloc[:8].split(':', 1)[0]
else:
scheme = 'http'
if '://' in netloc[:8]:
netloc = netloc.split('://', 1)[-1]
for match in matched_params:
name, _type, pattern = self.router.parse_parameter_string(
match)
@@ -453,7 +543,8 @@ class Sanic:
# -------------------------------------------- #
# Fetch handler from router
handler, args, kwargs = self.router.get(request)
handler, args, kwargs, uri = self.router.get(request)
request.uri_template = uri
if handler is None:
raise ServerError(
("'None' was returned while requesting a "
@@ -487,9 +578,9 @@ class Sanic:
try:
response = await self._run_response_middleware(request,
response)
except:
log.exception(
'Exception occured in one of response middleware handlers'
except BaseException:
error_logger.exception(
'Exception occurred in one of response middleware handlers'
)
# pass the response to the correct callback
@@ -510,36 +601,31 @@ class Sanic:
# Execution
# -------------------------------------------------------------------- #
def run(self, host="127.0.0.1", port=8000, debug=False, before_start=None,
after_start=None, before_stop=None, after_stop=None, ssl=None,
sock=None, workers=1, loop=None, protocol=None,
backlog=100, stop_event=None, register_sys_signals=True):
def run(self, host=None, port=None, debug=False, ssl=None,
sock=None, workers=1, protocol=None,
backlog=100, stop_event=None, register_sys_signals=True,
access_log=True):
"""Run the HTTP Server and listen until keyboard interrupt or term
signal. On termination, drain connections before closing.
:param host: Address to host on
:param port: Port to host on
:param debug: Enables debug output (slows server)
:param before_start: Functions to be executed before the server starts
accepting connections
:param after_start: Functions to be executed after the server starts
accepting connections
:param before_stop: Functions to be executed when a stop signal is
received before it is respected
:param after_stop: Functions to be executed when all requests are
complete
:param ssl: SSLContext, or location of certificate and key
for SSL encryption of worker(s)
:param sock: Socket for the server to accept connections from
:param workers: Number of processes
received before it is respected
:param loop:
:param backlog:
:param stop_event:
:param register_sys_signals:
:param protocol: Subclass of asyncio protocol class
:return: Nothing
"""
if sock is None:
host, port = host or "127.0.0.1", port or 8000
if protocol is None:
protocol = (WebSocketProtocol if self.websocket_enabled
else HttpProtocol)
@@ -549,11 +635,10 @@ class Sanic:
warnings.warn("stop_event will be removed from future versions.",
DeprecationWarning)
server_settings = self._helper(
host=host, port=port, debug=debug, before_start=before_start,
after_start=after_start, before_stop=before_stop,
after_stop=after_stop, ssl=ssl, sock=sock, workers=workers,
loop=loop, protocol=protocol, backlog=backlog,
register_sys_signals=register_sys_signals)
host=host, port=port, debug=debug, ssl=ssl, sock=sock,
workers=workers, protocol=protocol, backlog=backlog,
register_sys_signals=register_sys_signals,
access_log=access_log)
try:
self.is_running = True
@@ -561,12 +646,13 @@ class Sanic:
serve(**server_settings)
else:
serve_multiple(server_settings, workers)
except:
log.exception(
except BaseException:
error_logger.exception(
'Experienced exception while trying to serve')
raise
finally:
self.is_running = False
log.info("Server Stopped")
logger.info("Server Stopped")
def stop(self):
"""This kills the Sanic"""
@@ -576,16 +662,19 @@ class Sanic:
"""gunicorn compatibility"""
return self
async def create_server(self, host="127.0.0.1", port=8000, debug=False,
before_start=None, after_start=None,
before_stop=None, after_stop=None, ssl=None,
sock=None, loop=None, protocol=None,
backlog=100, stop_event=None):
async def create_server(self, host=None, port=None, debug=False,
ssl=None, sock=None, protocol=None,
backlog=100, stop_event=None,
access_log=True):
"""Asynchronous version of `run`.
NOTE: This does not support multiprocessing and is not the preferred
way to run a Sanic application.
"""
if sock is None:
host, port = host or "127.0.0.1", port or 8000
if protocol is None:
protocol = (WebSocketProtocol if self.websocket_enabled
else HttpProtocol)
@@ -594,15 +683,31 @@ class Sanic:
warnings.simplefilter('default')
warnings.warn("stop_event will be removed from future versions.",
DeprecationWarning)
server_settings = self._helper(
host=host, port=port, debug=debug, before_start=before_start,
after_start=after_start, before_stop=before_stop,
after_stop=after_stop, ssl=ssl, sock=sock,
loop=loop or get_event_loop(), protocol=protocol,
backlog=backlog, run_async=True)
host=host, port=port, debug=debug, ssl=ssl, sock=sock,
loop=get_event_loop(), protocol=protocol,
backlog=backlog, run_async=True,
access_log=access_log)
# Trigger before_start events
await self.trigger_events(
server_settings.get('before_start', []),
server_settings.get('loop')
)
return await serve(**server_settings)
async def trigger_events(self, events, loop):
"""Trigger events (functions or async)
:param events: one or more sync or async functions to execute
:param loop: event loop
"""
for event in events:
result = event(loop)
if isawaitable(result):
await result
async def _run_request_middleware(self, request):
# The if improves speed. I don't know why
if self.request_middleware:
@@ -625,13 +730,11 @@ class Sanic:
break
return response
def _helper(self, host="127.0.0.1", port=8000, debug=False,
before_start=None, after_start=None, before_stop=None,
after_stop=None, ssl=None, sock=None, workers=1, loop=None,
def _helper(self, host=None, port=None, debug=False,
ssl=None, sock=None, workers=1, loop=None,
protocol=HttpProtocol, backlog=100, stop_event=None,
register_sys_signals=True, run_async=False):
register_sys_signals=True, run_async=False, access_log=True):
"""Helper function used by `run` and `create_server`."""
if isinstance(ssl, dict):
# try common aliases
cert = ssl.get('cert') or ssl.get('certificate')
@@ -646,23 +749,6 @@ class Sanic:
warnings.simplefilter('default')
warnings.warn("stop_event will be removed from future versions.",
DeprecationWarning)
if loop is not None:
if debug:
warnings.simplefilter('default')
warnings.warn("Passing a loop will be deprecated in version"
" 0.4.0 https://github.com/channelcat/sanic/"
"pull/335 has more information.",
DeprecationWarning)
# Deprecate this
if any(arg is not None for arg in (after_stop, after_start,
before_start, before_stop)):
if debug:
warnings.simplefilter('default')
warnings.warn("Passing a before_start, before_stop, after_start or"
"after_stop callback will be deprecated in next "
"major version after 0.4.0",
DeprecationWarning)
self.error_handler.debug = debug
self.debug = debug
@@ -670,6 +756,8 @@ class Sanic:
server_settings = {
'protocol': protocol,
'request_class': self.request_class,
'is_request_stream': self.is_request_stream,
'router': self.router,
'host': host,
'port': port,
'sock': sock,
@@ -679,39 +767,40 @@ class Sanic:
'request_handler': self.handle_request,
'error_handler': self.error_handler,
'request_timeout': self.config.REQUEST_TIMEOUT,
'response_timeout': self.config.RESPONSE_TIMEOUT,
'keep_alive_timeout': self.config.KEEP_ALIVE_TIMEOUT,
'request_max_size': self.config.REQUEST_MAX_SIZE,
'keep_alive': self.config.KEEP_ALIVE,
'loop': loop,
'register_sys_signals': register_sys_signals,
'backlog': backlog
'backlog': backlog,
'access_log': access_log,
'websocket_max_size': self.config.WEBSOCKET_MAX_SIZE,
'websocket_max_queue': self.config.WEBSOCKET_MAX_QUEUE,
'graceful_shutdown_timeout': self.config.GRACEFUL_SHUTDOWN_TIMEOUT
}
# -------------------------------------------- #
# Register start/stop events
# -------------------------------------------- #
for event_name, settings_name, reverse, args in (
("before_server_start", "before_start", False, before_start),
("after_server_start", "after_start", False, after_start),
("before_server_stop", "before_stop", True, before_stop),
("after_server_stop", "after_stop", True, after_stop),
for event_name, settings_name, reverse in (
("before_server_start", "before_start", False),
("after_server_start", "after_start", False),
("before_server_stop", "before_stop", True),
("after_server_stop", "after_stop", True),
):
listeners = self.listeners[event_name].copy()
if args:
if callable(args):
listeners.append(args)
else:
listeners.extend(args)
if reverse:
listeners.reverse()
# Prepend sanic to the arguments when listeners are triggered
listeners = [partial(listener, self) for listener in listeners]
server_settings[settings_name] = listeners
if debug:
log.setLevel(logging.DEBUG)
if self.configure_logging and debug:
logger.setLevel(logging.DEBUG)
if self.config.LOGO is not None:
log.debug(self.config.LOGO)
logger.debug(self.config.LOGO)
if run_async:
server_settings['run_async'] = True
@@ -721,6 +810,6 @@ class Sanic:
proto = "http"
if ssl is not None:
proto = "https"
log.info('Goin\' Fast @ {}://{}:{}'.format(proto, host, port))
logger.info('Goin\' Fast @ {}://{}:{}'.format(proto, host, port))
return server_settings
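
For orientation, a minimal sketch of how the reworked create_server() signature above can be used to embed Sanic in an existing asyncio loop; the handler, host and port are illustrative, and access_log=False simply opts out of the new per-request access logging.

import asyncio

from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route('/')
async def index(request):
    return json({'hello': 'world'})

# create_server() is the async counterpart of run(); it does not fork
# workers, so it is mainly useful when another loop already exists.
server_coro = app.create_server(host='127.0.0.1', port=8000,
                                access_log=False)
loop = asyncio.get_event_loop()
asyncio.ensure_future(server_coro)
loop.run_forever()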

View File

@@ -4,8 +4,8 @@ from sanic.constants import HTTP_METHODS
from sanic.views import CompositionView
FutureRoute = namedtuple('Route',
['handler', 'uri', 'methods',
'host', 'strict_slashes'])
['handler', 'uri', 'methods', 'host',
'strict_slashes', 'stream', 'version', 'name'])
FutureListener = namedtuple('Listener', ['handler', 'uri', 'methods', 'host'])
FutureMiddleware = namedtuple('Route', ['middleware', 'args', 'kwargs'])
FutureException = namedtuple('Route', ['handler', 'args', 'kwargs'])
@@ -14,11 +14,16 @@ FutureStatic = namedtuple('Route',
class Blueprint:
def __init__(self, name, url_prefix=None, host=None):
def __init__(self, name,
url_prefix=None,
host=None, version=None,
strict_slashes=False):
"""Create a new blueprint
:param name: unique name of the blueprint
:param url_prefix: URL to be prefixed before all route URLs
:param strict_slashes: enforce the trailing slash of the URI exactly as given
"""
self.name = name
self.url_prefix = url_prefix
@@ -30,6 +35,8 @@ class Blueprint:
self.listeners = defaultdict(list)
self.middlewares = []
self.statics = []
self.version = version
self.strict_slashes = strict_slashes
def register(self, app, options):
"""Register the blueprint to the sanic app."""
@@ -43,12 +50,17 @@ class Blueprint:
future.handler.__blueprintname__ = self.name
# Prepend the blueprint URI prefix if available
uri = url_prefix + future.uri if url_prefix else future.uri
app.route(
uri=uri[1:] if uri.startswith('//') else uri,
methods=future.methods,
host=future.host or self.host,
strict_slashes=future.strict_slashes
)(future.handler)
version = future.version or self.version
app.route(uri=uri[1:] if uri.startswith('//') else uri,
methods=future.methods,
host=future.host or self.host,
strict_slashes=future.strict_slashes,
stream=future.stream,
version=version,
name=future.name,
)(future.handler)
for future in self.websocket_routes:
# attach the blueprint name to the handler so that it can be
@@ -56,19 +68,20 @@ class Blueprint:
future.handler.__blueprintname__ = self.name
# Prepend the blueprint URI prefix if available
uri = url_prefix + future.uri if url_prefix else future.uri
app.websocket(
uri=uri,
host=future.host or self.host,
strict_slashes=future.strict_slashes
)(future.handler)
app.websocket(uri=uri,
host=future.host or self.host,
strict_slashes=future.strict_slashes,
name=future.name,
)(future.handler)
# Middleware
for future in self.middlewares:
if future.args or future.kwargs:
app.middleware(*future.args,
**future.kwargs)(future.middleware)
app.register_middleware(future.middleware,
*future.args,
**future.kwargs)
else:
app.middleware(future.middleware)
app.register_middleware(future.middleware)
# Exceptions
for future in self.exceptions:
@@ -87,26 +100,35 @@ class Blueprint:
app.listener(event)(listener)
def route(self, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False):
strict_slashes=None, stream=False, version=None, name=None):
"""Create a blueprint route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:param methods: list of acceptable HTTP methods.
"""
if strict_slashes is None:
strict_slashes = self.strict_slashes
def decorator(handler):
route = FutureRoute(handler, uri, methods, host, strict_slashes)
route = FutureRoute(
handler, uri, methods, host, strict_slashes, stream, version,
name)
self.routes.append(route)
return handler
return decorator
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False):
strict_slashes=None, version=None, name=None):
"""Create a blueprint route from a function.
:param handler: function for handling uri requests. Accepts function,
or class instance with a view_class method.
:param uri: endpoint at which the route will be accessible.
:param methods: list of acceptable HTTP methods.
:param host:
:param strict_slashes:
:param version:
:param name: user defined route name for url_for
:return: function or class instance
"""
# Handle HTTPMethodView differently
@@ -117,26 +139,36 @@ class Blueprint:
if getattr(handler.view_class, method.lower(), None):
methods.add(method)
if strict_slashes is None:
strict_slashes = self.strict_slashes
# handle composition view differently
if isinstance(handler, CompositionView):
methods = handler.handlers.keys()
self.route(uri=uri, methods=methods, host=host,
strict_slashes=strict_slashes)(handler)
strict_slashes=strict_slashes, version=version,
name=name)(handler)
return handler
def websocket(self, uri, host=None, strict_slashes=False):
def websocket(self, uri, host=None, strict_slashes=None, version=None,
name=None):
"""Create a blueprint websocket route from a decorated function.
:param uri: endpoint at which the route will be accessible.
"""
if strict_slashes is None:
strict_slashes = self.strict_slashes
def decorator(handler):
route = FutureRoute(handler, uri, [], host, strict_slashes)
route = FutureRoute(handler, uri, [], host, strict_slashes,
False, version, name)
self.websocket_routes.append(route)
return handler
return decorator
def add_websocket_route(self, handler, uri, host=None):
def add_websocket_route(self, handler, uri, host=None, version=None,
name=None):
"""Create a blueprint websocket route from a function.
:param handler: function for handling uri requests. Accepts function,
@@ -144,7 +176,7 @@ class Blueprint:
:param uri: endpoint at which the route will be accessible.
:return: function or class instance
"""
self.websocket(uri=uri, host=host)(handler)
self.websocket(uri=uri, host=host, version=version, name=name)(handler)
return handler
def listener(self, event):
@@ -186,34 +218,57 @@ class Blueprint:
:param uri: endpoint at which the route will be accessible.
:param file_or_directory: Static asset.
"""
name = kwargs.pop('name', 'static')
if not name.startswith(self.name + '.'):
name = '{}.{}'.format(self.name, name)
kwargs.update(name=name)
strict_slashes = kwargs.get('strict_slashes')
if strict_slashes is None and self.strict_slashes is not None:
kwargs.update(strict_slashes=self.strict_slashes)
static = FutureStatic(uri, file_or_directory, args, kwargs)
self.statics.append(static)
# Shorthand method decorators
def get(self, uri, host=None, strict_slashes=False):
def get(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=["GET"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version,
name=name)
def post(self, uri, host=None, strict_slashes=False):
def post(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=["POST"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def put(self, uri, host=None, strict_slashes=False):
def put(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=["PUT"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def head(self, uri, host=None, strict_slashes=False):
def head(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=["HEAD"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version,
name=name)
def options(self, uri, host=None, strict_slashes=False):
def options(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=["OPTIONS"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version,
name=name)
def patch(self, uri, host=None, strict_slashes=False):
def patch(self, uri, host=None, strict_slashes=None, stream=False,
version=None, name=None):
return self.route(uri, methods=["PATCH"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version, name=name)
def delete(self, uri, host=None, strict_slashes=False):
def delete(self, uri, host=None, strict_slashes=None, version=None,
name=None):
return self.route(uri, methods=["DELETE"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version,
name=name)
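
A short sketch of the new Blueprint options introduced above: version= prefixes every blueprint route with /v<version> (a route-level version would override it), strict_slashes becomes a blueprint-wide default, and name= registers the handler under a blueprint-qualified name. The blueprint, prefix and route names here are made up for illustration.

from sanic import Sanic, Blueprint
from sanic.response import text

bp = Blueprint('api', url_prefix='/api', version=1, strict_slashes=True)

@bp.get('/users', name='user_list')
async def list_users(request):
    return text('users')

app = Sanic(__name__)
app.blueprint(bp)
# The handler is registered as 'api.user_list' and served at /v1/api/users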

View File

@@ -1,7 +1,7 @@
import os
import types
SANIC_PREFIX = 'SANIC_'
@@ -29,12 +29,18 @@ class Config(dict):
▌ ▐ ▀▀▄▄▄▀
▀▀▄▄▀
"""
self.REQUEST_MAX_SIZE = 100000000 # 100 megababies
self.REQUEST_MAX_SIZE = 100000000 # 100 megabytes
self.REQUEST_TIMEOUT = 60 # 60 seconds
self.RESPONSE_TIMEOUT = 60 # 60 seconds
self.KEEP_ALIVE = keep_alive
self.KEEP_ALIVE_TIMEOUT = 5 # 5 seconds
self.WEBSOCKET_MAX_SIZE = 2 ** 20 # 1 megabyte

self.WEBSOCKET_MAX_QUEUE = 32
self.GRACEFUL_SHUTDOWN_TIMEOUT = 15.0 # 15 sec
if load_env:
self.load_environment_vars()
prefix = SANIC_PREFIX if load_env is True else load_env
self.load_environment_vars(prefix=prefix)
def __getattr__(self, attr):
try:
@@ -98,12 +104,18 @@ class Config(dict):
if key.isupper():
self[key] = getattr(obj, key)
def load_environment_vars(self):
def load_environment_vars(self, prefix=SANIC_PREFIX):
"""
Looks for any SANIC_ prefixed environment variables and applies
Looks for prefixed environment variables and applies
them to the configuration if present.
"""
for k, v in os.environ.items():
if k.startswith(SANIC_PREFIX):
_, config_key = k.split(SANIC_PREFIX, 1)
self[config_key] = v
if k.startswith(prefix):
_, config_key = k.split(prefix, 1)
try:
self[config_key] = int(v)
except ValueError:
try:
self[config_key] = float(v)
except ValueError:
self[config_key] = v
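
A minimal sketch of the updated environment loading: with the default load_env=True any SANIC_-prefixed variable is applied with the prefix stripped, passing a string swaps in a custom prefix, and numeric strings are now cast to int, then float, before falling back to str. The MYAPP_ prefix and variable names are invented for the example.

import os

from sanic.config import Config

os.environ['SANIC_REQUEST_TIMEOUT'] = '30'     # becomes the int 30
os.environ['MYAPP_RATE_LIMIT'] = '2.5'         # becomes the float 2.5

default_config = Config()                      # picks up SANIC_* variables
custom_config = Config(load_env='MYAPP_')      # picks up MYAPP_* variables

assert default_config.REQUEST_TIMEOUT == 30
assert custom_config.RATE_LIMIT == 2.5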

View File

@@ -98,7 +98,8 @@ class Cookie(dict):
def __setitem__(self, key, value):
if key not in self._keys:
raise KeyError("Unknown cookie property")
return super().__setitem__(key, value)
if value is not False:
return super().__setitem__(key, value)
def encode(self, encoding):
output = ['%s=%s' % (self.key, _quote(self.value))]
@@ -116,9 +117,8 @@ class Cookie(dict):
))
except AttributeError:
output.append('%s=%s' % (self._keys[key], value))
elif key in self._flags:
if self[key]:
output.append(self._keys[key])
elif key in self._flags and self[key]:
output.append(self._keys[key])
else:
output.append('%s=%s' % (self._keys[key], value))
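
In practice the cookie change above means boolean flags are only emitted when truthy, and assigning False is silently dropped rather than producing an invalid header. A rough sketch (handler and cookie names are illustrative):

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/login')
async def login(request):
    response = text('ok')
    response.cookies['session'] = 'abc123'
    response.cookies['session']['httponly'] = True   # "HttpOnly" flag is written
    response.cookies['session']['secure'] = False    # assignment is ignored
    return response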

View File

@@ -1,3 +1,5 @@
from sanic.response import STATUS_CODES
TRACEBACK_STYLE = '''
<style>
body {
@@ -47,6 +49,10 @@ TRACEBACK_STYLE = '''
padding: 5px 10px;
}
.tb-border {
padding-top: 20px;
}
.frame-descriptor {
background-color: #e2eafb;
}
@@ -63,12 +69,9 @@ TRACEBACK_WRAPPER_HTML = '''
{style}
</head>
<body>
<h1>{exc_name}</h1>
<h3><code>{exc_value}</code></h3>
<div class="tb-wrapper">
<p class="tb-header">Traceback (most recent call last):</p>
{frame_html}
<p class="summary">
{inner_html}
<div class="summary">
<p>
<b>{exc_name}: {exc_value}</b>
while handling path <code>{path}</code>
</p>
@@ -77,6 +80,24 @@ TRACEBACK_WRAPPER_HTML = '''
</html>
'''
TRACEBACK_WRAPPER_INNER_HTML = '''
<h1>{exc_name}</h1>
<h3><code>{exc_value}</code></h3>
<div class="tb-wrapper">
<p class="tb-header">Traceback (most recent call last):</p>
{frame_html}
</div>
'''
TRACEBACK_BORDER = '''
<div class="tb-border">
<b><i>
The above exception was the direct cause of the
following exception:
</i></b>
</div>
'''
TRACEBACK_LINE_HTML = '''
<div class="frame-line">
<p class="frame-descriptor">
@@ -96,6 +117,20 @@ INTERNAL_SERVER_ERROR_HTML = '''
'''
_sanic_exceptions = {}
def add_status_code(code):
"""
Decorator used for adding exceptions to _sanic_exceptions.
"""
def class_decorator(cls):
cls.status_code = code
_sanic_exceptions[code] = cls
return cls
return class_decorator
class SanicException(Exception):
def __init__(self, message, status_code=None):
@@ -105,24 +140,34 @@ class SanicException(Exception):
self.status_code = status_code
@add_status_code(404)
class NotFound(SanicException):
status_code = 404
pass
@add_status_code(400)
class InvalidUsage(SanicException):
status_code = 400
pass
@add_status_code(500)
class ServerError(SanicException):
status_code = 500
pass
class URLBuildError(SanicException):
status_code = 500
@add_status_code(503)
class ServiceUnavailable(SanicException):
"""The server is currently unavailable (because it is overloaded or
down for maintenance). Generally, this is a temporary state."""
pass
class URLBuildError(ServerError):
pass
class FileNotFound(NotFound):
status_code = 404
pass
def __init__(self, message, path, relative_url):
super().__init__(message)
@@ -130,20 +175,30 @@ class FileNotFound(NotFound):
self.relative_url = relative_url
@add_status_code(408)
class RequestTimeout(SanicException):
status_code = 408
"""The Web server (running the Web site) thinks that there has been too
long an interval of time between 1) the establishment of an IP
connection (socket) between the client and the server and
2) the receipt of any data on that socket, so the server has dropped
the connection. The socket connection has actually been lost - the Web
server has 'timed out' on that particular socket connection.
"""
pass
@add_status_code(413)
class PayloadTooLarge(SanicException):
status_code = 413
pass
class HeaderNotFound(SanicException):
status_code = 400
class HeaderNotFound(InvalidUsage):
pass
@add_status_code(416)
class ContentRangeError(SanicException):
status_code = 416
pass
def __init__(self, message, content_range):
super().__init__(message)
@@ -153,5 +208,75 @@ class ContentRangeError(SanicException):
}
@add_status_code(403)
class Forbidden(SanicException):
pass
class InvalidRangeType(ContentRangeError):
pass
@add_status_code(401)
class Unauthorized(SanicException):
"""
Unauthorized exception (401 HTTP status code).
:param message: Message describing the exception.
:param status_code: HTTP Status code.
:param scheme: Name of the authentication scheme to be used.
When present, kwargs is used to complete the WWW-Authentication header.
Examples::
# With a Basic auth-scheme, realm MUST be present:
raise Unauthorized("Auth required.",
scheme="Basic",
realm="Restricted Area")
# With a Digest auth-scheme, things are a bit more complicated:
raise Unauthorized("Auth required.",
scheme="Digest",
realm="Restricted Area",
qop="auth, auth-int",
algorithm="MD5",
nonce="abcdef",
opaque="zyxwvu")
# With a Bearer auth-scheme, realm is optional so you can write:
raise Unauthorized("Auth required.", scheme="Bearer")
# or, if you want to specify the realm:
raise Unauthorized("Auth required.",
scheme="Bearer",
realm="Restricted Area")
"""
def __init__(self, message, status_code=None, scheme=None, **kwargs):
super().__init__(message, status_code)
# if auth-scheme is specified, set "WWW-Authenticate" header
if scheme is not None:
values = ["{!s}={!r}".format(k, v) for k, v in kwargs.items()]
challenge = ', '.join(values)
self.headers = {
"WWW-Authenticate": "{} {}".format(scheme, challenge).rstrip()
}
def abort(status_code, message=None):
"""
Raise an exception based on SanicException. The response body defaults to
the standard HTTP message for the given status code unless one is provided.
:param status_code: The HTTP status code to return.
:param message: The HTTP response body. Defaults to the messages
in response.py for the given status code.
"""
if message is None:
message = STATUS_CODES.get(status_code)
# These are stored as bytes in the STATUS_CODES dict
message = message.decode('utf8')
sanic_exception = _sanic_exceptions.get(status_code, SanicException)
raise sanic_exception(message=message, status_code=status_code)
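
A brief sketch of the new helpers defined above: abort() raises the exception registered for a status code (falling back to SanicException), and Unauthorized can attach a WWW-Authenticate challenge. The route paths and realm are illustrative.

from sanic import Sanic
from sanic.exceptions import abort, Unauthorized
from sanic.response import text

app = Sanic(__name__)

@app.route('/item/<item_id:int>')
async def get_item(request, item_id):
    if item_id > 100:
        abort(404)                      # raises NotFound with the default body
    return text('item {}'.format(item_id))

@app.route('/admin')
async def admin(request):
    raise Unauthorized('Auth required.', scheme='Bearer', realm='Admin Area')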

View File

@@ -9,8 +9,10 @@ from sanic.exceptions import (
SanicException,
TRACEBACK_LINE_HTML,
TRACEBACK_STYLE,
TRACEBACK_WRAPPER_HTML)
from sanic.log import log
TRACEBACK_WRAPPER_HTML,
TRACEBACK_WRAPPER_INNER_HTML,
TRACEBACK_BORDER)
from sanic.log import logger
from sanic.response import text, html
@@ -24,19 +26,31 @@ class ErrorHandler:
self.cached_handlers = {}
self.debug = False
def _render_traceback_html(self, exception, request):
exc_type, exc_value, tb = sys.exc_info()
frames = extract_tb(tb)
def _render_exception(self, exception):
frames = extract_tb(exception.__traceback__)
frame_html = []
for frame in frames:
frame_html.append(TRACEBACK_LINE_HTML.format(frame))
return TRACEBACK_WRAPPER_INNER_HTML.format(
exc_name=exception.__class__.__name__,
exc_value=exception,
frame_html=''.join(frame_html))
def _render_traceback_html(self, exception, request):
exc_type, exc_value, tb = sys.exc_info()
exceptions = []
while exc_value:
exceptions.append(self._render_exception(exc_value))
exc_value = exc_value.__cause__
return TRACEBACK_WRAPPER_HTML.format(
style=TRACEBACK_STYLE,
exc_name=exc_type.__name__,
exc_value=exc_value,
frame_html=''.join(frame_html),
exc_name=exception.__class__.__name__,
exc_value=exception,
inner_html=TRACEBACK_BORDER.join(reversed(exceptions)),
path=request.path)
def add(self, exception, handler):
@@ -72,12 +86,13 @@ class ErrorHandler:
self.log(format_exc())
if self.debug:
url = getattr(request, 'url', 'unknown')
response_message = (
'Exception raised in exception handler "{}" '
'for uri: "{}"\n{}').format(
handler.__name__, url, format_exc())
log.error(response_message)
return text(response_message, 500)
response_message = ('Exception raised in exception handler '
'"%s" for uri: "%s"\n%s')
logger.error(response_message,
handler.__name__, url, format_exc())
return text(response_message % (
handler.__name__, url, format_exc()), 500)
else:
return text('An error occurred while handling an error', 500)
return response
@@ -87,7 +102,7 @@ class ErrorHandler:
Override this method in an ErrorHandler subclass to prevent
logging exceptions.
"""
getattr(log, level)(message)
getattr(logger, level)(message)
def default(self, request, exception):
self.log(format_exc())
@@ -100,10 +115,9 @@ class ErrorHandler:
elif self.debug:
html_output = self._render_traceback_html(exception, request)
response_message = (
'Exception occurred while handling uri: "{}"\n{}'.format(
request.url, format_exc()))
log.error(response_message)
response_message = ('Exception occurred while handling uri: '
'"%s"\n%s')
logger.error(response_message, request.url, format_exc())
return html(html_output, status=500)
else:
return html(INTERNAL_SERVER_ERROR_HTML, status=500)
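
The debug error page now walks exception chains via __cause__, so a handler like the hedged sketch below would render both tracebacks separated by the "direct cause" border instead of only the outermost one.

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/broken')
async def broken(request):
    try:
        1 / 0
    except ZeroDivisionError as exc:
        # "raise ... from ..." sets __cause__, which the new
        # _render_traceback_html() follows when building the debug page.
        raise ValueError('post-processing failed') from exc

if __name__ == '__main__':
    app.run(debug=True)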

View File

@@ -1,3 +1,63 @@
import logging
import sys
log = logging.getLogger('sanic')
LOGGING_CONFIG_DEFAULTS = dict(
version=1,
disable_existing_loggers=False,
loggers={
"root": {
"level": "INFO",
"handlers": ["console"]
},
"sanic.error": {
"level": "INFO",
"handlers": ["error_console"],
"propagate": True,
"qualname": "sanic.error"
},
"sanic.access": {
"level": "INFO",
"handlers": ["access_console"],
"propagate": True,
"qualname": "sanic.access"
}
},
handlers={
"console": {
"class": "logging.StreamHandler",
"formatter": "generic",
"stream": sys.stdout
},
"error_console": {
"class": "logging.StreamHandler",
"formatter": "generic",
"stream": sys.stderr
},
"access_console": {
"class": "logging.StreamHandler",
"formatter": "access",
"stream": sys.stdout
},
},
formatters={
"generic": {
"format": "%(asctime)s [%(process)d] [%(levelname)s] %(message)s",
"datefmt": "[%Y-%m-%d %H:%M:%S %z]",
"class": "logging.Formatter"
},
"access": {
"format": "%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: " +
"%(request)s %(message)s %(status)d %(byte)d",
"datefmt": "[%Y-%m-%d %H:%M:%S %z]",
"class": "logging.Formatter"
},
}
)
logger = logging.getLogger('root')
error_logger = logging.getLogger('sanic.error')
access_logger = logging.getLogger('sanic.access')
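
Since these are ordinary stdlib loggers, application code can import and adjust them directly; a small sketch, assuming the default config above is in effect:

import logging

from sanic.log import logger, error_logger, access_logger

logger.info('general server message')          # "console" handler -> stdout
error_logger.error('something went wrong')     # "error_console" handler -> stderr
access_logger.setLevel(logging.WARNING)        # quiets the per-request access lines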

View File

@@ -1,3 +1,5 @@
import sys
import json
from cgi import parse_header
from collections import namedtuple
from http.cookies import SimpleCookie
@@ -7,13 +9,19 @@ from urllib.parse import parse_qs, urlunparse
try:
from ujson import loads as json_loads
except ImportError:
from json import loads as json_loads
if sys.version_info[:2] == (3, 5):
def json_loads(data):
# on Python 3.5 json.loads only supports str not bytes
return json.loads(data.decode())
else:
json_loads = json.loads
from sanic.exceptions import InvalidUsage
from sanic.log import log
from sanic.log import error_logger
DEFAULT_HTTP_CONTENT_TYPE = "application/octet-stream"
# HTTP/1.1: https://www.w3.org/Protocols/rfc2616/rfc2616-sec7.html#sec7.2.1
# > If the media type remains unknown, the recipient SHOULD treat it
# > as type "application/octet-stream"
@@ -38,7 +46,8 @@ class Request(dict):
__slots__ = (
'app', 'headers', 'version', 'method', '_cookies', 'transport',
'body', 'parsed_json', 'parsed_args', 'parsed_form', 'parsed_files',
'_ip', '_parsed_url',
'_ip', '_parsed_url', 'uri_template', 'stream', '_remote_addr',
'_socket', '_port'
)
def __init__(self, url_bytes, headers, version, method, transport):
@@ -57,17 +66,31 @@ class Request(dict):
self.parsed_form = None
self.parsed_files = None
self.parsed_args = None
self.uri_template = None
self._cookies = None
self.stream = None
def __repr__(self):
if self.method is None or not self.path:
return '<{0}>'.format(self.__class__.__name__)
return '<{0}: {1} {2}>'.format(self.__class__.__name__,
self.method,
self.path)
@property
def json(self):
if self.parsed_json is None:
try:
self.parsed_json = json_loads(self.body)
except Exception:
if not self.body:
return None
raise InvalidUsage("Failed when parsing body as json")
self.load_json()
return self.parsed_json
def load_json(self, loads=json_loads):
try:
self.parsed_json = loads(self.body)
except Exception:
if not self.body:
return None
raise InvalidUsage("Failed when parsing body as json")
return self.parsed_json
@@ -77,11 +100,15 @@ class Request(dict):
:return: token related to request
"""
prefixes = ('Bearer', 'Token')
auth_header = self.headers.get('Authorization')
if 'Token ' in auth_header:
return auth_header.partition('Token ')[-1]
else:
return auth_header
if auth_header is not None:
for prefix in prefixes:
if prefix in auth_header:
return auth_header.partition(prefix)[-1].strip()
return auth_header
@property
def form(self):
@@ -101,7 +128,7 @@ class Request(dict):
self.parsed_form, self.parsed_files = (
parse_multipart_form(self.body, boundary))
except Exception:
log.exception("Failed when parsing form")
error_logger.exception("Failed when parsing form")
return self.parsed_form
@@ -129,7 +156,7 @@ class Request(dict):
@property
def cookies(self):
if self._cookies is None:
cookie = self.headers.get('Cookie') or self.headers.get('cookie')
cookie = self.headers.get('Cookie')
if cookie is not None:
cookies = SimpleCookie()
cookies.load(cookie)
@@ -141,10 +168,46 @@ class Request(dict):
@property
def ip(self):
if not hasattr(self, '_ip'):
self._ip = self.transport.get_extra_info('peername')
if not hasattr(self, '_socket'):
self._get_address()
return self._ip
@property
def port(self):
if not hasattr(self, '_socket'):
self._get_address()
return self._port
@property
def socket(self):
if not hasattr(self, '_socket'):
self._get_socket()
return self._socket
def _get_address(self):
self._socket = (self.transport.get_extra_info('peername') or
(None, None))
self._ip, self._port = self._socket
@property
def remote_addr(self):
"""Attempt to return the original client ip based on X-Forwarded-For.
:return: original client ip.
"""
if not hasattr(self, '_remote_addr'):
forwarded_for = self.headers.get('X-Forwarded-For', '').split(',')
remote_addrs = [
addr for addr in [
addr.strip() for addr in forwarded_for
] if addr
]
if len(remote_addrs) > 0:
self._remote_addr = remote_addrs[0]
else:
self._remote_addr = ''
return self._remote_addr
@property
def scheme(self):
if self.app.websocket_enabled \
@@ -164,6 +227,15 @@ class Request(dict):
# so pull it from the headers
return self.headers.get('Host', '')
@property
def content_type(self):
return self.headers.get('Content-Type', DEFAULT_HTTP_CONTENT_TYPE)
@property
def match_info(self):
"""return matched info after resolving route"""
return self.app.router.get(self)[2]
@property
def path(self):
return self._parsed_url.path.decode('utf-8')
@@ -215,15 +287,15 @@ def parse_multipart_form(body, boundary):
break
colon_index = form_line.index(':')
form_header_field = form_line[0:colon_index]
form_header_field = form_line[0:colon_index].lower()
form_header_value, form_parameters = parse_header(
form_line[colon_index + 2:])
if form_header_field == 'Content-Disposition':
if form_header_field == 'content-disposition':
if 'filename' in form_parameters:
file_name = form_parameters['filename']
field_name = form_parameters.get('name')
elif form_header_field == 'Content-Type':
elif form_header_field == 'content-type':
file_type = form_header_value
post_data = form_part[line_index:-4]
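
Taken together, the request changes above expose the peer address, the forwarded client address, the content type and a normalized auth token as properties. A hedged sketch of reading them in a handler; the route path is illustrative, and the exact shape of request.ip depends on the transport's peername.

from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route('/whoami')
async def whoami(request):
    return json({
        'ip': request.ip,                        # peer address from the transport
        'port': request.port,
        'remote_addr': request.remote_addr,      # first X-Forwarded-For entry, or ''
        'content_type': request.content_type,    # defaults to application/octet-stream
        'token': request.token,                  # Bearer/Token prefix stripped if present
    })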

View File

@@ -3,20 +3,14 @@ from os import path
try:
from ujson import dumps as json_dumps
except:
except BaseException:
from json import dumps as json_dumps
from aiofiles import open as open_async
from sanic.cookies import CookieJar
COMMON_STATUS_CODES = {
200: b'OK',
400: b'Bad Request',
404: b'Not Found',
500: b'Internal Server Error',
}
ALL_STATUS_CODES = {
STATUS_CODES = {
100: b'Continue',
101: b'Switching Protocols',
102: b'Processing',
@@ -56,6 +50,7 @@ ALL_STATUS_CODES = {
415: b'Unsupported Media Type',
416: b'Requested Range Not Satisfiable',
417: b'Expectation Failed',
418: b'I\'m a teapot',
422: b'Unprocessable Entity',
423: b'Locked',
424: b'Failed Dependency',
@@ -63,6 +58,7 @@ ALL_STATUS_CODES = {
428: b'Precondition Required',
429: b'Too Many Requests',
431: b'Request Header Fields Too Large',
451: b'Unavailable For Legal Reasons',
500: b'Internal Server Error',
501: b'Not Implemented',
502: b'Bad Gateway',
@@ -109,8 +105,9 @@ class BaseHTTPResponse:
class StreamingHTTPResponse(BaseHTTPResponse):
__slots__ = (
'transport', 'streaming_fn',
'status', 'content_type', 'headers', '_cookies')
'transport', 'streaming_fn', 'status',
'content_type', 'headers', '_cookies'
)
def __init__(self, streaming_fn, status=200, headers=None,
content_type='text/plain'):
@@ -159,11 +156,10 @@ class StreamingHTTPResponse(BaseHTTPResponse):
headers = self._parse_headers()
# Try to pull from the common codes first
# Speeds up response rate 6% over pulling from all
status = COMMON_STATUS_CODES.get(self.status)
if not status:
status = ALL_STATUS_CODES.get(self.status)
if self.status == 200:
status = b'OK'
else:
status = STATUS_CODES.get(self.status)
return (b'HTTP/%b %d %b\r\n'
b'%b'
@@ -206,11 +202,10 @@ class HTTPResponse(BaseHTTPResponse):
headers = self._parse_headers()
# Try to pull from the common codes first
# Speeds up response rate 6% over pulling from all
status = COMMON_STATUS_CODES.get(self.status)
if not status:
status = ALL_STATUS_CODES.get(self.status, b'UNKNOWN RESPONSE')
if self.status == 200:
status = b'OK'
else:
status = STATUS_CODES.get(self.status, b'UNKNOWN RESPONSE')
return (b'HTTP/%b %d %b\r\n'
b'Connection: %b\r\n'
@@ -233,22 +228,26 @@ class HTTPResponse(BaseHTTPResponse):
return self._cookies
def json(body, status=200, headers=None, **kwargs):
def json(body, status=200, headers=None,
content_type="application/json", dumps=json_dumps,
**kwargs):
"""
Returns response object with body in json format.
:param body: Response data to be serialized.
:param status: Response code.
:param headers: Custom Headers.
:param kwargs: Remaining arguments that are passed to the json encoder.
"""
return HTTPResponse(json_dumps(body, **kwargs), headers=headers,
status=status, content_type="application/json")
return HTTPResponse(dumps(body, **kwargs), headers=headers,
status=status, content_type=content_type)
def text(body, status=200, headers=None,
content_type="text/plain; charset=utf-8"):
"""
Returns response object with body in text format.
:param body: Response data to be encoded.
:param status: Response code.
:param headers: Custom Headers.
@@ -263,6 +262,7 @@ def raw(body, status=200, headers=None,
content_type="application/octet-stream"):
"""
Returns response object without encoding the body.
:param body: Response data.
:param status: Response code.
:param headers: Custom Headers.
@@ -275,6 +275,7 @@ def raw(body, status=200, headers=None,
def html(body, status=200, headers=None):
"""
Returns response object with body in html format.
:param body: Response data to be encoded.
:param status: Response code.
:param headers: Custom Headers.
@@ -283,15 +284,22 @@ def html(body, status=200, headers=None):
content_type="text/html; charset=utf-8")
async def file(location, mime_type=None, headers=None, _range=None):
async def file(
location, mime_type=None, headers=None, filename=None, _range=None):
"""Return a response object with file data.
:param location: Location of file on system.
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param filename: Override filename.
:param _range:
"""
filename = path.split(location)[-1]
headers = headers or {}
if filename:
headers.setdefault(
'Content-Disposition',
'attachment; filename="{}"'.format(filename))
filename = filename or path.split(location)[-1]
async with open_async(location, mode='rb') as _file:
if _range:
@@ -303,13 +311,66 @@ async def file(location, mime_type=None, headers=None, _range=None):
out_stream = await _file.read()
mime_type = mime_type or guess_type(filename)[0] or 'text/plain'
return HTTPResponse(status=200,
headers=headers,
content_type=mime_type,
body_bytes=out_stream)
async def file_stream(
location, chunk_size=4096, mime_type=None, headers=None,
filename=None, _range=None):
"""Return a streaming response object with file data.
:param location: Location of file on system.
:param chunk_size: The size of each chunk in the stream (in bytes)
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param filename: Override filename.
:param _range:
"""
headers = headers or {}
if filename:
headers.setdefault(
'Content-Disposition',
'attachment; filename="{}"'.format(filename))
filename = filename or path.split(location)[-1]
_file = await open_async(location, mode='rb')
async def _streaming_fn(response):
nonlocal _file, chunk_size
try:
if _range:
chunk_size = min((_range.size, chunk_size))
await _file.seek(_range.start)
to_send = _range.size
while to_send > 0:
content = await _file.read(chunk_size)
if len(content) < 1:
break
to_send -= len(content)
response.write(content)
else:
while True:
content = await _file.read(chunk_size)
if len(content) < 1:
break
response.write(content)
finally:
await _file.close()
return # Returning from this fn closes the stream
mime_type = mime_type or guess_type(filename)[0] or 'text/plain'
if _range:
headers['Content-Range'] = 'bytes %s-%s/%s' % (
_range.start, _range.end, _range.total)
return StreamingHTTPResponse(streaming_fn=_streaming_fn,
status=200,
headers=headers,
content_type=mime_type)
def stream(
streaming_fn, status=200, headers=None,
content_type="text/plain; charset=utf-8"):

View File

@@ -8,7 +8,7 @@ from sanic.views import CompositionView
Route = namedtuple(
'Route',
['handler', 'methods', 'pattern', 'parameters', 'name'])
['handler', 'methods', 'pattern', 'parameters', 'name', 'uri'])
Parameter = namedtuple('Parameter', ['name', 'cast'])
REGEX_TYPES = {
@@ -67,6 +67,8 @@ class Router:
def __init__(self):
self.routes_all = {}
self.routes_names = {}
self.routes_static_files = {}
self.routes_static = {}
self.routes_dynamic = defaultdict(list)
self.routes_always_check = []
@@ -91,6 +93,10 @@ class Router:
pattern = 'string'
if ':' in parameter_string:
name, pattern = parameter_string.split(':', 1)
if not name:
raise ValueError(
"Invalid parameter syntax: {}".format(parameter_string)
)
default = (str, pattern)
# Pull from pre-configured types
@@ -98,32 +104,8 @@ class Router:
return name, _type, pattern
def add(self, uri, methods, handler, host=None, strict_slashes=False):
# add regular version
self._add(uri, methods, handler, host)
if strict_slashes:
return
# Add versions with and without trailing /
slash_is_missing = (
not uri[-1] == '/'
and not self.routes_all.get(uri + '/', False)
)
without_slash_is_missing = (
uri[-1] == '/'
and not self.routes_all.get(uri[:-1], False)
and not uri == '/'
)
# add version with trailing slash
if slash_is_missing:
self._add(uri + '/', methods, handler, host)
# add version without trailing slash
elif without_slash_is_missing:
self._add(uri[:-1], methods, handler, host)
def _add(self, uri, methods, handler, host=None):
def add(self, uri, methods, handler, host=None, strict_slashes=False,
version=None, name=None):
"""Add a handler to the route list
:param uri: path to match
@@ -131,6 +113,52 @@ class Router:
provided, any method is allowed
:param handler: request handler function.
When executed, it should provide a response object.
:param strict_slashes: enforce the trailing slash of the URI exactly as given
:param version: current version of the route or blueprint. See
docs for further details.
:return: Nothing
"""
if version is not None:
version = re.escape(str(version).strip('/').lstrip('v'))
uri = "/".join(["/v{}".format(version), uri.lstrip('/')])
# add regular version
self._add(uri, methods, handler, host, name)
if strict_slashes:
return
# Add versions with and without trailing /
slashed_methods = self.routes_all.get(uri + '/', frozenset({}))
if isinstance(methods, Iterable):
_slash_is_missing = all(method in slashed_methods for
method in methods)
else:
_slash_is_missing = methods in slashed_methods
slash_is_missing = (
not uri[-1] == '/' and not _slash_is_missing
)
without_slash_is_missing = (
uri[-1] == '/' and not
self.routes_all.get(uri[:-1], False) and not
uri == '/'
)
# add version with trailing slash
if slash_is_missing:
self._add(uri + '/', methods, handler, host, name)
# add version without trailing slash
elif without_slash_is_missing:
self._add(uri[:-1], methods, handler, host, name)
def _add(self, uri, methods, handler, host=None, name=None):
"""Add a handler to the route list
:param uri: path to match
:param methods: sequence of accepted method names. If none are
provided, any method is allowed
:param handler: request handler function.
When executed, it should provide a response object.
:param name: user defined route name for url_for
:return: Nothing
"""
if host is not None:
@@ -144,7 +172,7 @@ class Router:
"host strings, not {!r}".format(host))
for host_ in host:
self.add(uri, methods, handler, host_)
self.add(uri, methods, handler, host_, name)
return
# Dict for faster lookups of if method allowed
@@ -212,22 +240,38 @@ class Router:
else:
route = self.routes_all.get(uri)
# prefix the handler name with the blueprint name
# if available
# special prefix for static files
is_static = False
if name and name.startswith('_static_'):
is_static = True
name = name.split('_static_', 1)[-1]
if hasattr(handler, '__blueprintname__'):
handler_name = '{}.{}'.format(
handler.__blueprintname__, name or handler.__name__)
else:
handler_name = name or getattr(handler, '__name__', None)
if route:
route = merge_route(route, methods, handler)
else:
# prefix the handler name with the blueprint name
# if available
if hasattr(handler, '__blueprintname__'):
handler_name = '{}.{}'.format(
handler.__blueprintname__, handler.__name__)
else:
handler_name = getattr(handler, '__name__', None)
route = Route(
handler=handler, methods=methods, pattern=pattern,
parameters=parameters, name=handler_name)
parameters=parameters, name=handler_name, uri=uri)
self.routes_all[uri] = route
if is_static:
pair = self.routes_static_files.get(handler_name)
if not (pair and (pair[0] + '/' == uri or uri + '/' == pair[0])):
self.routes_static_files[handler_name] = (uri, route)
else:
pair = self.routes_names.get(handler_name)
if not (pair and (pair[0] + '/' == uri or uri + '/' == pair[0])):
self.routes_names[handler_name] = (uri, route)
if properties['unhashable']:
self.routes_always_check.append(route)
elif parameters:
@@ -248,6 +292,16 @@ class Router:
uri = host + uri
try:
route = self.routes_all.pop(uri)
for handler_name, pairs in self.routes_names.items():
if pairs[0] == uri:
self.routes_names.pop(handler_name)
break
for handler_name, pairs in self.routes_static_files.items():
if pairs[0] == uri:
self.routes_static_files.pop(handler_name)
break
except KeyError:
raise RouteDoesNotExist("Route was not registered: {}".format(uri))
@@ -263,20 +317,20 @@ class Router:
self._get.cache_clear()
@lru_cache(maxsize=ROUTER_CACHE_SIZE)
def find_route_by_view_name(self, view_name):
def find_route_by_view_name(self, view_name, name=None):
"""Find a route in the router based on the specified view name.
:param view_name: string of view name to search by
:param kwargs: additional params, usually for static files
:return: tuple containing (uri, Route)
"""
if not view_name:
return (None, None)
for uri, route in self.routes_all.items():
if route.name == view_name:
return uri, route
if view_name == 'static' or view_name.endswith('.static'):
return self.routes_static_files.get(name, (None, None))
return (None, None)
return self.routes_names.get(view_name, (None, None))
def get(self, request):
"""Get a request handler based on the URL of the request, or raises an
@@ -344,4 +398,18 @@ class Router:
route_handler = route.handler
if hasattr(route_handler, 'handlers'):
route_handler = route_handler.handlers[method]
return route_handler, [], kwargs
return route_handler, [], kwargs, route.uri
def is_stream_handler(self, request):
""" Handler for request is stream or not.
:param request: Request object
:return: bool
"""
try:
handler = self.get(request)[0]
except (NotFound, InvalidUsage):
return False
if (hasattr(handler, 'view_class') and
hasattr(handler.view_class, request.method.lower())):
handler = getattr(handler.view_class, request.method.lower())
return hasattr(handler, 'is_stream')
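
A minimal sketch of the versioned, named routes that the new add() supports: the /v2 prefix comes from version=2, and the name feeds url_for() lookups (url_for itself lives in the portion of app.py not shown in this hunk). The route and name are illustrative.

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/status', version=2, name='health')
async def status(request):
    return text('ok')

# The router stores the route under the user-defined name,
# so it can be resolved back to its versioned URI.
assert app.url_for('health') == '/v2/status'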

View File

@@ -1,7 +1,6 @@
import asyncio
import os
import traceback
import warnings
from functools import partial
from inspect import isawaitable
from multiprocessing import Process
@@ -25,10 +24,12 @@ try:
except ImportError:
async_loop = asyncio
from sanic.log import log
from sanic.log import logger, access_logger
from sanic.response import HTTPResponse
from sanic.request import Request
from sanic.exceptions import (
RequestTimeout, PayloadTooLarge, InvalidUsage, ServerError)
RequestTimeout, PayloadTooLarge, InvalidUsage, ServerError,
ServiceUnavailable)
current_time = None
@@ -63,39 +64,62 @@ class HttpProtocol(asyncio.Protocol):
# request params
'parser', 'request', 'url', 'headers',
# request config
'request_handler', 'request_timeout', 'request_max_size',
'request_class',
'request_handler', 'request_timeout', 'response_timeout',
'keep_alive_timeout', 'request_max_size', 'request_class',
'is_request_stream', 'router',
# enable or disable access log purpose
'access_log',
# connection management
'_total_request_size', '_timeout_handler', '_last_communication_time')
'_total_request_size', '_request_timeout_handler',
'_response_timeout_handler', '_keep_alive_timeout_handler',
'_last_request_time', '_last_response_time', '_is_stream_handler')
def __init__(self, *, loop, request_handler, error_handler,
signal=Signal(), connections=set(), request_timeout=60,
request_max_size=None, request_class=None,
keep_alive=True):
response_timeout=60, keep_alive_timeout=5,
request_max_size=None, request_class=None, access_log=True,
keep_alive=True, is_request_stream=False, router=None,
state=None, debug=False, **kwargs):
self.loop = loop
self.transport = None
self.request = None
self.parser = None
self.url = None
self.headers = None
self.router = router
self.signal = signal
self.access_log = access_log
self.connections = connections
self.request_handler = request_handler
self.error_handler = error_handler
self.request_timeout = request_timeout
self.response_timeout = response_timeout
self.keep_alive_timeout = keep_alive_timeout
self.request_max_size = request_max_size
self.request_class = request_class or Request
self.is_request_stream = is_request_stream
self._is_stream_handler = False
self._total_request_size = 0
self._timeout_handler = None
self._request_timeout_handler = None
self._response_timeout_handler = None
self._keep_alive_timeout_handler = None
self._last_request_time = None
self._last_response_time = None
self._request_handler_task = None
self._request_stream_task = None
self._keep_alive = keep_alive
self._header_fragment = b''
self.state = state if state else {}
if 'requests_count' not in self.state:
self.state['requests_count'] = 0
self._debug = debug
@property
def keep_alive(self):
return (self._keep_alive
and not self.signal.stopped
and self.parser.should_keep_alive())
return (
self._keep_alive and
not self.signal.stopped and
self.parser.should_keep_alive())
# -------------------------------------------- #
# Connection
@@ -103,27 +127,72 @@ class HttpProtocol(asyncio.Protocol):
def connection_made(self, transport):
self.connections.add(self)
self._timeout_handler = self.loop.call_later(
self.request_timeout, self.connection_timeout)
self._request_timeout_handler = self.loop.call_later(
self.request_timeout, self.request_timeout_callback)
self.transport = transport
self._last_request_time = current_time
def connection_lost(self, exc):
self.connections.discard(self)
self._timeout_handler.cancel()
if self._request_timeout_handler:
self._request_timeout_handler.cancel()
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
if self._keep_alive_timeout_handler:
self._keep_alive_timeout_handler.cancel()
def connection_timeout(self):
# Check if
def request_timeout_callback(self):
# See the docstring in the RequestTimeout exception, to see
# exactly what this timeout is checking for.
# Check if elapsed time since request initiated exceeds our
# configured maximum request timeout value
time_elapsed = current_time - self._last_request_time
if time_elapsed < self.request_timeout:
time_left = self.request_timeout - time_elapsed
self._timeout_handler = (
self.loop.call_later(time_left, self.connection_timeout))
self._request_timeout_handler = (
self.loop.call_later(time_left,
self.request_timeout_callback)
)
else:
if self._request_stream_task:
self._request_stream_task.cancel()
if self._request_handler_task:
self._request_handler_task.cancel()
exception = RequestTimeout('Request Timeout')
self.write_error(exception)
try:
raise RequestTimeout('Request Timeout')
except RequestTimeout as exception:
self.write_error(exception)
def response_timeout_callback(self):
# Check if elapsed time since response was initiated exceeds our
# configured maximum request timeout value
time_elapsed = current_time - self._last_request_time
if time_elapsed < self.response_timeout:
time_left = self.response_timeout - time_elapsed
self._response_timeout_handler = (
self.loop.call_later(time_left,
self.response_timeout_callback)
)
else:
try:
raise ServiceUnavailable('Response Timeout')
except ServiceUnavailable as exception:
self.write_error(exception)
def keep_alive_timeout_callback(self):
# Check if elapsed time since last response exceeds our configured
# maximum keep alive timeout value
time_elapsed = current_time - self._last_response_time
if time_elapsed < self.keep_alive_timeout:
time_left = self.keep_alive_timeout - time_elapsed
self._keep_alive_timeout_handler = (
self.loop.call_later(time_left,
self.keep_alive_timeout_callback)
)
else:
logger.info('KeepAlive Timeout. Closing connection.')
self.transport.close()
self.transport = None
# -------------------------------------------- #
# Parsing
@@ -143,22 +212,41 @@ class HttpProtocol(asyncio.Protocol):
self.headers = []
self.parser = HttpRequestParser(self)
# requests count
self.state['requests_count'] = self.state['requests_count'] + 1
# Parse request chunk or close connection
try:
self.parser.feed_data(data)
except HttpParserError:
exception = InvalidUsage('Bad Request')
message = 'Bad Request'
if self._debug:
message += '\n' + traceback.format_exc()
exception = InvalidUsage(message)
self.write_error(exception)
def on_url(self, url):
self.url = url
if not self.url:
self.url = url
else:
self.url += url
def on_header(self, name, value):
if name == b'Content-Length' and int(value) > self.request_max_size:
exception = PayloadTooLarge('Payload Too Large')
self.write_error(exception)
self._header_fragment += name
self.headers.append((name.decode().casefold(), value.decode()))
if value is not None:
if self._header_fragment == b'Content-Length' \
and int(value) > self.request_max_size:
exception = PayloadTooLarge('Payload Too Large')
self.write_error(exception)
try:
value = value.decode()
except UnicodeDecodeError:
value = value.decode('latin_1')
self.headers.append(
(self._header_fragment.decode().casefold(), value))
self._header_fragment = b''
def on_headers_complete(self):
self.request = self.request_class(
@@ -168,14 +256,42 @@ class HttpProtocol(asyncio.Protocol):
method=self.parser.get_method().decode(),
transport=self.transport
)
# Remove any existing KeepAlive handler here,
# It will be recreated if required on the new request.
if self._keep_alive_timeout_handler:
self._keep_alive_timeout_handler.cancel()
self._keep_alive_timeout_handler = None
if self.is_request_stream:
self._is_stream_handler = self.router.is_stream_handler(
self.request)
if self._is_stream_handler:
self.request.stream = asyncio.Queue()
self.execute_request_handler()
def on_body(self, body):
if self.is_request_stream and self._is_stream_handler:
self._request_stream_task = self.loop.create_task(
self.request.stream.put(body))
return
self.request.body.append(body)
def on_message_complete(self):
if self.request.body:
self.request.body = b''.join(self.request.body)
# Entire request (headers and whole body) is received.
# We can cancel and remove the request timeout handler now.
if self._request_timeout_handler:
self._request_timeout_handler.cancel()
self._request_timeout_handler = None
if self.is_request_stream and self._is_stream_handler:
self._request_stream_task = self.loop.create_task(
self.request.stream.put(None))
return
self.request.body = b''.join(self.request.body)
self.execute_request_handler()
def execute_request_handler(self):
self._response_timeout_handler = self.loop.call_later(
self.response_timeout, self.response_timeout_callback)
self._last_request_time = current_time
self._request_handler_task = self.loop.create_task(
self.request_handler(
self.request,
@@ -185,26 +301,53 @@ class HttpProtocol(asyncio.Protocol):
# -------------------------------------------- #
# Responding
# -------------------------------------------- #
def log_response(self, response):
if self.access_log:
extra = {
'status': getattr(response, 'status', 0),
}
if isinstance(response, HTTPResponse):
extra['byte'] = len(response.body)
else:
extra['byte'] = -1
extra['host'] = 'UNKNOWN'
if self.request is not None:
if self.request.ip:
extra['host'] = '{0[0]}:{0[1]}'.format(self.request.ip)
extra['request'] = '{0} {1}'.format(self.request.method,
self.request.url)
else:
extra['request'] = 'nil'
access_logger.info('', extra=extra)
def write_response(self, response):
"""
Writes response content synchronously to the transport.
"""
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
try:
keep_alive = self.keep_alive
self.transport.write(
response.output(
self.request.version, keep_alive,
self.request_timeout))
self.keep_alive_timeout))
self.log_response(response)
except AttributeError:
log.error(
('Invalid response object for url {}, '
'Expected Type: HTTPResponse, Actual Type: {}').format(
self.url, type(response)))
logger.error('Invalid response object for url %s, '
'Expected Type: HTTPResponse, Actual Type: %s',
self.url, type(response))
self.write_error(ServerError('Invalid response type'))
except RuntimeError:
log.error(
'Connection lost before response written @ {}'.format(
self.request.ip))
if self._debug:
logger.error('Connection lost before response written @ %s',
self.request.ip)
keep_alive = False
except Exception as e:
self.bail_out(
"Writing response failed, connection closed {}".format(
@@ -212,8 +355,12 @@ class HttpProtocol(asyncio.Protocol):
finally:
if not keep_alive:
self.transport.close()
self.transport = None
else:
self._last_request_time = current_time
self._keep_alive_timeout_handler = self.loop.call_later(
self.keep_alive_timeout,
self.keep_alive_timeout_callback)
self._last_response_time = current_time
self.cleanup()
async def stream_response(self, response):
@@ -222,22 +369,25 @@ class HttpProtocol(asyncio.Protocol):
the transport to the response so the response consumer can
write to the response as needed.
"""
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
try:
keep_alive = self.keep_alive
response.transport = self.transport
await response.stream(
self.request.version, keep_alive, self.request_timeout)
self.request.version, keep_alive, self.keep_alive_timeout)
self.log_response(response)
except AttributeError:
log.error(
('Invalid response object for url {}, '
'Expected Type: HTTPResponse, Actual Type: {}').format(
self.url, type(response)))
logger.error('Invalid response object for url %s, '
'Expected Type: HTTPResponse, Actual Type: %s',
self.url, type(response))
self.write_error(ServerError('Invalid response type'))
except RuntimeError:
log.error(
'Connection lost before response written @ {}'.format(
self.request.ip))
if self._debug:
logger.error('Connection lost before response written @ %s',
self.request.ip)
keep_alive = False
except Exception as e:
self.bail_out(
"Writing response failed, connection closed {}".format(
@@ -245,46 +395,63 @@ class HttpProtocol(asyncio.Protocol):
finally:
if not keep_alive:
self.transport.close()
self.transport = None
else:
self._last_request_time = current_time
self._keep_alive_timeout_handler = self.loop.call_later(
self.keep_alive_timeout,
self.keep_alive_timeout_callback)
self._last_response_time = current_time
self.cleanup()
def write_error(self, exception):
# An error _is_ a response.
# Don't throw a response timeout, when a response _is_ given.
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
response = None
try:
response = self.error_handler.response(self.request, exception)
version = self.request.version if self.request else '1.1'
self.transport.write(response.output(version))
except RuntimeError:
log.error(
'Connection lost before error written @ {}'.format(
self.request.ip if self.request else 'Unknown'))
if self._debug:
logger.error('Connection lost before error written @ %s',
self.request.ip if self.request else 'Unknown')
except Exception as e:
self.bail_out(
"Writing error failed, connection closed {}".format(repr(e)),
from_error=True)
"Writing error failed, connection closed {}".format(
repr(e)), from_error=True
)
finally:
if self.parser and (self.keep_alive
or getattr(response, 'status', 0) == 408):
self.log_response(response)
self.transport.close()
def bail_out(self, message, from_error=False):
if from_error or self.transport.is_closing():
log.error(
("Transport closed @ {} and exception "
"experienced during error handling").format(
self.transport.get_extra_info('peername')))
log.debug(
'Exception:\n{}'.format(traceback.format_exc()))
logger.error("Transport closed @ %s and exception "
"experienced during error handling",
self.transport.get_extra_info('peername'))
logger.debug('Exception:\n%s', traceback.format_exc())
else:
exception = ServerError(message)
self.write_error(exception)
log.error(message)
logger.error(message)
def cleanup(self):
"""This is called when KeepAlive feature is used,
it resets the connection in order for it to be able
to handle receiving another request on the same connection."""
self.parser = None
self.request = None
self.url = None
self.headers = None
self._request_handler_task = None
self._request_stream_task = None
self._total_request_size = 0
self._is_stream_handler = False
def close_if_idle(self):
"""Close the connection if a request is not being sent or received
@@ -296,6 +463,14 @@ class HttpProtocol(asyncio.Protocol):
return True
return False
def close(self):
"""
Force close the connection.
"""
if self.transport is not None:
self.transport.close()
self.transport = None
def update_current_time(loop):
"""Cache the current time, since it is needed at the end of every
@@ -323,10 +498,14 @@ def trigger_events(events, loop):
def serve(host, port, request_handler, error_handler, before_start=None,
after_start=None, before_stop=None, after_stop=None, debug=False,
request_timeout=60, ssl=None, sock=None, request_max_size=None,
reuse_port=False, loop=None, protocol=HttpProtocol, backlog=100,
request_timeout=60, response_timeout=60, keep_alive_timeout=5,
ssl=None, sock=None, request_max_size=None, reuse_port=False,
loop=None, protocol=HttpProtocol, backlog=100,
register_sys_signals=True, run_async=False, connections=None,
signal=Signal(), request_class=None, keep_alive=True):
signal=Signal(), request_class=None, access_log=True,
keep_alive=True, is_request_stream=False, router=None,
websocket_max_size=None, websocket_max_queue=None, state=None,
graceful_shutdown_timeout=15.0):
"""Start asynchronous HTTP Server on an individual process.
:param host: Address to host on
@@ -345,6 +524,8 @@ def serve(host, port, request_handler, error_handler, before_start=None,
`app` instance and `loop`
:param debug: enables debug output (slows server)
:param request_timeout: time in seconds
:param response_timeout: time in seconds
:param keep_alive_timeout: time in seconds
:param ssl: SSLContext
:param sock: Socket for the server to accept connections from
:param request_max_size: size in bytes, `None` for no limit
@@ -352,6 +533,9 @@ def serve(host, port, request_handler, error_handler, before_start=None,
:param loop: asyncio compatible event loop
:param protocol: subclass of asyncio protocol class
:param request_class: Request class to use
:param access_log: disable/enable access log
:param is_request_stream: disable/enable Request.stream
:param router: Router object
:return: Nothing
"""
if not run_async:
@@ -361,8 +545,6 @@ def serve(host, port, request_handler, error_handler, before_start=None,
if debug:
loop.set_debug(debug)
trigger_events(before_start, loop)
connections = connections if connections is not None else set()
server = partial(
protocol,
@@ -372,9 +554,18 @@ def serve(host, port, request_handler, error_handler, before_start=None,
request_handler=request_handler,
error_handler=error_handler,
request_timeout=request_timeout,
response_timeout=response_timeout,
keep_alive_timeout=keep_alive_timeout,
request_max_size=request_max_size,
request_class=request_class,
access_log=access_log,
keep_alive=keep_alive,
is_request_stream=is_request_stream,
router=router,
websocket_max_size=websocket_max_size,
websocket_max_queue=websocket_max_queue,
state=state,
debug=debug,
)
server_coroutine = loop.create_server(
@@ -386,6 +577,7 @@ def serve(host, port, request_handler, error_handler, before_start=None,
sock=sock,
backlog=backlog
)
# Instead of pulling time at the end of every request,
# pull it once per minute
loop.call_soon(partial(update_current_time, loop))
@@ -393,10 +585,12 @@ def serve(host, port, request_handler, error_handler, before_start=None,
if run_async:
return server_coroutine
trigger_events(before_start, loop)
try:
http_server = loop.run_until_complete(server_coroutine)
except:
log.exception("Unable to start server")
except BaseException:
logger.exception("Unable to start server")
return
trigger_events(after_start, loop)
@@ -407,14 +601,14 @@ def serve(host, port, request_handler, error_handler, before_start=None,
try:
loop.add_signal_handler(_signal, loop.stop)
except NotImplementedError:
log.warn('Sanic tried to use loop.add_signal_handler but it is'
' not implemented on this platform.')
logger.warning('Sanic tried to use loop.add_signal_handler '
'but it is not implemented on this platform.')
pid = os.getpid()
try:
log.info('Starting worker [{}]'.format(pid))
logger.info('Starting worker [%s]', pid)
loop.run_forever()
finally:
log.info("Stopping worker [{}]".format(pid))
logger.info("Stopping worker [%s]", pid)
# Run the on_stop function if provided
trigger_events(before_stop, loop)
@@ -428,8 +622,28 @@ def serve(host, port, request_handler, error_handler, before_start=None,
for connection in connections:
connection.close_if_idle()
while connections:
# Graceful shutdown timeout.
# We should provide graceful_shutdown_timeout,
# instead of letting connections hang forever.
# Let's roughly calculate time.
start_shutdown = 0
while connections and (start_shutdown < graceful_shutdown_timeout):
loop.run_until_complete(asyncio.sleep(0.1))
start_shutdown = start_shutdown + 0.1
# Force close non-idle connection after waiting for
# graceful_shutdown_timeout
coros = []
for conn in connections:
if hasattr(conn, "websocket") and conn.websocket:
coros.append(
conn.websocket.close_connection(after_handshake=True)
)
else:
conn.close()
_shutdown = asyncio.gather(*coros, loop=loop)
loop.run_until_complete(_shutdown)
trigger_events(after_stop, loop)
@@ -445,12 +659,6 @@ def serve_multiple(server_settings, workers):
:param stop_event: if provided, is used as a stop signal
:return:
"""
if server_settings.get('loop', None) is not None:
if server_settings.get('debug', False):
warnings.simplefilter('default')
warnings.warn("Passing a loop will be deprecated in version 0.4.0"
" https://github.com/channelcat/sanic/pull/335"
" has more information.", DeprecationWarning)
server_settings['reuse_port'] = True
# Handling when custom socket is not provided.
@@ -464,8 +672,9 @@ def serve_multiple(server_settings, workers):
server_settings['port'] = None
def sig_handler(signal, frame):
log.info("Received signal {}. Shutting down.".format(
Signals(signal).name))
logger.info("Received signal %s. Shutting down.", Signals(signal).name)
for process in processes:
os.kill(process.pid, SIGINT)
signal_func(SIGINT, lambda s, f: sig_handler(s, f))
signal_func(SIGTERM, lambda s, f: sig_handler(s, f))
@@ -484,5 +693,3 @@ def serve_multiple(server_settings, workers):
for process in processes:
process.terminate()
server_settings.get('sock').close()
asyncio.get_event_loop().stop()
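The serve() signature above gains several tuning knobs: response_timeout, keep_alive_timeout, access_log, is_request_stream, the websocket limits and graceful_shutdown_timeout. A minimal sketch of setting them from application code, assuming the corresponding upper-cased app.config keys are what app.run() forwards into serve():

from sanic import Sanic
from sanic.response import text

app = Sanic('timeout_demo')

# Assumed config keys mirroring the new serve() keyword arguments.
app.config.REQUEST_TIMEOUT = 60              # seconds to receive the full request
app.config.RESPONSE_TIMEOUT = 60             # seconds allowed to produce a response
app.config.KEEP_ALIVE = True
app.config.KEEP_ALIVE_TIMEOUT = 5            # idle seconds before a kept-alive socket closes
app.config.GRACEFUL_SHUTDOWN_TIMEOUT = 15.0  # seconds to wait before force-closing connections


@app.route('/')
async def handler(request):
    return text('OK')


if __name__ == '__main__':
    app.run(host='127.0.0.1', port=8000)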


@@ -13,11 +13,13 @@ from sanic.exceptions import (
InvalidUsage,
)
from sanic.handlers import ContentRangeHandler
from sanic.response import file, HTTPResponse
from sanic.response import file, file_stream, HTTPResponse
def register(app, uri, file_or_directory, pattern,
use_modified_since, use_content_range):
use_modified_since, use_content_range,
stream_large_files, name='static', host=None,
strict_slashes=None):
# TODO: Though sanic is not a file server, I feel like we should at least
# make a good effort here. Modified-since is nice, but we could
# also look into etags, expires, and caching
@@ -34,6 +36,11 @@ def register(app, uri, file_or_directory, pattern,
server's
:param use_content_range: If true, process header for range requests
and send the file part that is requested
:param stream_large_files: If true, use the file_stream() handler rather
than the file() handler to send the file.
If this is an integer, it represents the
threshold size (in bytes) at which to switch to file_stream()
:param name: user defined name used for url_for
"""
# If we're not trying to match a file directly,
# serve from the folder
@@ -93,6 +100,17 @@ def register(app, uri, file_or_directory, pattern,
headers=headers,
content_type=guess_type(file_path)[0] or 'text/plain')
else:
if stream_large_files:
if isinstance(stream_large_files, int):
threshold = stream_large_files
else:
threshold = 1024 * 1024
if not stats:
stats = await stat(file_path)
if stats.st_size >= threshold:
return await file_stream(file_path, headers=headers,
_range=_range)
return await file(file_path, headers=headers, _range=_range)
except ContentRangeError:
raise
@@ -101,4 +119,9 @@ def register(app, uri, file_or_directory, pattern,
path=file_or_directory,
relative_url=file_uri)
app.route(uri, methods=['GET', 'HEAD'])(_handler)
# special prefix for static files
if not name.startswith('_static_'):
name = '_static_{}'.format(name)
app.route(uri, methods=['GET', 'HEAD'], name=name, host=host,
strict_slashes=strict_slashes)(_handler)
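A rough usage sketch of the extended static registration, assuming app.static() forwards these keyword arguments to register() as shown above; the paths and route name are illustrative:

from sanic import Sanic

app = Sanic('static_demo')

# True streams any file at or above the default 1 MiB threshold via file_stream().
app.static('/static', './static', stream_large_files=True,
           name='assets', strict_slashes=True)

# An integer is used as the streaming threshold in bytes.
app.static('/downloads', './downloads',
           stream_large_files=10 * 1024 * 1024)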


@@ -1,14 +1,16 @@
import traceback
from json import JSONDecodeError
from sanic.log import log
from sanic.log import logger
HOST = '127.0.0.1'
PORT = 42101
class SanicTestClient:
def __init__(self, app):
def __init__(self, app, port=PORT):
self.app = app
self.port = port
async def _local_request(self, method, uri, cookies=None, *args, **kwargs):
import aiohttp
@@ -16,9 +18,9 @@ class SanicTestClient:
url = uri
else:
url = 'http://{host}:{port}{uri}'.format(
host=HOST, port=PORT, uri=uri)
host=HOST, port=self.port, uri=uri)
log.info(url)
logger.info(url)
conn = aiohttp.TCPConnector(verify_ssl=False)
async with aiohttp.ClientSession(
cookies=cookies, connector=conn) as session:
@@ -28,6 +30,14 @@ class SanicTestClient:
response.text = await response.text()
except UnicodeDecodeError as e:
response.text = None
try:
response.json = await response.json()
except (JSONDecodeError,
UnicodeDecodeError,
aiohttp.ClientResponseError):
response.json = None
response.body = await response.read()
return response
@@ -52,12 +62,12 @@ class SanicTestClient:
**request_kwargs)
results[-1] = response
except Exception as e:
log.error(
logger.error(
'Exception:\n{}'.format(traceback.format_exc()))
exceptions.append(e)
self.app.stop()
self.app.run(host=HOST, debug=debug, port=PORT, **server_kwargs)
self.app.run(host=HOST, debug=debug, port=self.port, **server_kwargs)
self.app.listeners['after_server_start'].pop()
if exceptions:
@@ -67,14 +77,14 @@ class SanicTestClient:
try:
request, response = results
return request, response
except:
except BaseException:
raise ValueError(
"Request and response object expected, got ({})".format(
results))
else:
try:
return results[-1]
except:
except BaseException:
raise ValueError(
"Request object expected, got ({})".format(results))


@@ -1,17 +0,0 @@
import warnings
from sanic.testing import SanicTestClient
def sanic_endpoint_test(app, method='get', uri='/', gather_request=True,
debug=False, server_kwargs={},
*request_args, **request_kwargs):
warnings.warn(
"Use of sanic_endpoint_test will be deprecated in"
"the next major version after 0.4.0. Please use the `test_client` "
"available on the app object.", DeprecationWarning)
test_client = SanicTestClient(app)
return test_client._sanic_endpoint_test(
method, uri, gather_request, debug, server_kwargs,
*request_args, **request_kwargs)
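With the helper removed, tests move to the test_client bound to the app (the test_create_task diff below shows the migration). Roughly:

# Before (sanic.utils, now removed):
# request, response = sanic_endpoint_test(app, uri='/')
# After:
request, response = app.test_client.get('/')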


@@ -64,6 +64,11 @@ class HTTPMethodView:
return view
def stream(func):
func.is_stream = True
return func
class CompositionView:
"""Simple method-function mapped view for the sanic.
You can add handler functions to methods (get, post, put, patch, delete)
@@ -83,7 +88,9 @@ class CompositionView:
def __init__(self):
self.handlers = {}
def add(self, methods, handler):
def add(self, methods, handler, stream=False):
if stream:
handler.is_stream = stream
for method in methods:
if method not in HTTP_METHODS:
raise InvalidUsage(
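A minimal sketch of the new stream flag on CompositionView, assuming the request-streaming queue API introduced alongside it (await request.stream.get() yields body chunks and then None); the handler and route are illustrative:

from sanic import Sanic
from sanic.response import text
from sanic.views import CompositionView

app = Sanic('stream_view_demo')


async def upload_handler(request):
    # With stream=True the body is not pre-read; it arrives as queued chunks.
    body = b''
    while True:
        chunk = await request.stream.get()
        if chunk is None:
            break
        body += chunk
    return text('received {} bytes'.format(len(body)))


view = CompositionView()
view.add(['POST'], upload_handler, stream=True)
app.add_route(view, '/upload')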


@@ -6,14 +6,25 @@ from websockets import ConnectionClosed # noqa
class WebSocketProtocol(HttpProtocol):
def __init__(self, *args, **kwargs):
def __init__(self, *args, websocket_max_size=None,
websocket_max_queue=None, **kwargs):
super().__init__(*args, **kwargs)
self.websocket = None
self.websocket_max_size = websocket_max_size
self.websocket_max_queue = websocket_max_queue
def connection_timeout(self):
# timeouts make no sense for websocket routes
# timeouts make no sense for websocket routes
def request_timeout_callback(self):
if self.websocket is None:
super().connection_timeout()
super().request_timeout_callback()
def response_timeout_callback(self):
if self.websocket is None:
super().response_timeout_callback()
def keep_alive_timeout_callback(self):
if self.websocket is None:
super().keep_alive_timeout_callback()
def connection_lost(self, exc):
if self.websocket is not None:
@@ -38,7 +49,7 @@ class WebSocketProtocol(HttpProtocol):
else:
super().write_response(response)
async def websocket_handshake(self, request):
async def websocket_handshake(self, request, subprotocols=None):
# let the websockets package do the handshake with the client
headers = []
@@ -54,6 +65,17 @@ class WebSocketProtocol(HttpProtocol):
except InvalidHandshake:
raise InvalidUsage('Invalid websocket request')
subprotocol = None
if subprotocols and 'Sec-Websocket-Protocol' in request.headers:
# select a subprotocol
client_subprotocols = [p.strip() for p in request.headers[
'Sec-Websocket-Protocol'].split(',')]
for p in client_subprotocols:
if p in subprotocols:
subprotocol = p
set_header('Sec-Websocket-Protocol', subprotocol)
break
# write the 101 response back to the client
rv = b'HTTP/1.1 101 Switching Protocols\r\n'
for k, v in headers:
@@ -62,6 +84,11 @@ class WebSocketProtocol(HttpProtocol):
request.transport.write(rv)
# hook up the websocket protocol
self.websocket = WebSocketCommonProtocol()
self.websocket = WebSocketCommonProtocol(
max_size=self.websocket_max_size,
max_queue=self.websocket_max_queue
)
self.websocket.subprotocol = subprotocol
self.websocket.connection_made(request.transport)
self.websocket.connection_open()
return self.websocket
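A rough sketch of the new websocket knobs, assuming the app-level websocket route accepts a subprotocols list that reaches websocket_handshake(), and that WEBSOCKET_MAX_SIZE / WEBSOCKET_MAX_QUEUE config keys feed the protocol constructor:

from sanic import Sanic

app = Sanic('ws_demo')

# Assumed config keys matching the new WebSocketProtocol keyword arguments.
app.config.WEBSOCKET_MAX_SIZE = 2 ** 20   # max message size in bytes
app.config.WEBSOCKET_MAX_QUEUE = 32       # max queued incoming messages


@app.websocket('/feed', subprotocols=['json', 'msgpack'])
async def feed(request, ws):
    # ws.subprotocol is the first offered subprotocol the client also listed
    # in Sec-Websocket-Protocol, or None if there was no overlap.
    while True:
        data = await ws.recv()
        await ws.send(data)


if __name__ == '__main__':
    app.run(host='127.0.0.1', port=8000)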


@@ -3,13 +3,18 @@ import sys
import signal
import asyncio
import logging
import traceback
try:
import ssl
except ImportError:
ssl = None
import uvloop
try:
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
pass
import gunicorn.workers.base as base
from sanic.server import trigger_events, serve, HttpProtocol, Signal
@@ -18,6 +23,9 @@ from sanic.websocket import WebSocketProtocol
class GunicornWorker(base.Worker):
http_protocol = HttpProtocol
websocket_protocol = WebSocketProtocol
def __init__(self, *args, **kw): # pragma: no cover
super().__init__(*args, **kw)
cfg = self.cfg
@@ -25,7 +33,7 @@ class GunicornWorker(base.Worker):
self.ssl_context = self._create_ssl_context(cfg)
else:
self.ssl_context = None
self.servers = []
self.servers = {}
self.connections = set()
self.exit_code = 0
self.signal = Signal()
@@ -34,7 +42,6 @@ class GunicornWorker(base.Worker):
# create new event_loop after fork
asyncio.get_event_loop().close()
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
@@ -42,11 +49,10 @@ class GunicornWorker(base.Worker):
def run(self):
is_debug = self.log.loglevel == logging.DEBUG
protocol = (WebSocketProtocol if self.app.callable.websocket_enabled
else HttpProtocol)
protocol = (
self.websocket_protocol if self.app.callable.websocket_enabled
else self.http_protocol)
self._server_settings = self.app.callable._helper(
host=None,
port=None,
loop=self.loop,
debug=is_debug,
protocol=protocol,
@@ -68,10 +74,16 @@ class GunicornWorker(base.Worker):
trigger_events(self._server_settings.get('before_stop', []),
self.loop)
self.loop.run_until_complete(self.close())
except BaseException:
traceback.print_exc()
finally:
trigger_events(self._server_settings.get('after_stop', []),
self.loop)
self.loop.close()
try:
trigger_events(self._server_settings.get('after_stop', []),
self.loop)
except BaseException:
traceback.print_exc()
finally:
self.loop.close()
sys.exit(self.exit_code)
@@ -90,16 +102,39 @@ class GunicornWorker(base.Worker):
for conn in self.connections:
conn.close_if_idle()
while self.connections:
# graceful shutdown timeout
start_shutdown = 0
graceful_shutdown_timeout = self.cfg.graceful_timeout
while self.connections and \
(start_shutdown < graceful_shutdown_timeout):
await asyncio.sleep(0.1)
start_shutdown = start_shutdown + 0.1
# Force close non-idle connection after waiting for
# graceful_shutdown_timeout
coros = []
for conn in self.connections:
if hasattr(conn, "websocket") and conn.websocket:
coros.append(
conn.websocket.close_connection(after_handshake=False)
)
else:
conn.close()
_shutdown = asyncio.gather(*coros, loop=self.loop)
await _shutdown
async def _run(self):
for sock in self.sockets:
self.servers.append(await serve(
state = dict(requests_count=0)
self._server_settings["host"] = None
self._server_settings["port"] = None
server = await serve(
sock=sock,
connections=self.connections,
state=state,
**self._server_settings
))
)
self.servers[server] = state
async def _check_alive(self):
# If our parent changed then we shut down.
@@ -108,7 +143,14 @@ class GunicornWorker(base.Worker):
while self.alive:
self.notify()
if pid == os.getpid() and self.ppid != os.getppid():
req_count = sum(
self.servers[srv]["requests_count"] for srv in self.servers
)
if self.max_requests and req_count > self.max_requests:
self.alive = False
self.log.info("Max requests exceeded, shutting down: %s",
self)
elif pid == os.getpid() and self.ppid != os.getppid():
self.alive = False
self.log.info("Parent changed, shutting down: %s", self)
else:
@@ -158,9 +200,11 @@ class GunicornWorker(base.Worker):
def handle_quit(self, sig, frame):
self.alive = False
self.app.callable.is_running = False
self.cfg.worker_int(self)
def handle_abort(self, sig, frame):
self.alive = False
self.exit_code = 1
self.cfg.worker_abort(self)
sys.exit(1)
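With per-server request counting in place, the worker now honours gunicorn's standard settings; an illustrative invocation would be along the lines of gunicorn myapp:app --worker-class sanic.worker.GunicornWorker --max-requests 1000 --graceful-timeout 30, where max_requests feeds the shutdown branch above and graceful_timeout bounds the close() wait loop.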


@@ -9,14 +9,27 @@ from distutils.util import strtobool
from setuptools import setup
with codecs.open(os.path.join(os.path.abspath(os.path.dirname(
__file__)), 'sanic', '__init__.py'), 'r', 'latin1') as fp:
def open_local(paths, mode='r', encoding='utf8'):
path = os.path.join(
os.path.abspath(os.path.dirname(__file__)),
*paths
)
return codecs.open(path, mode, encoding)
with open_local(['sanic', '__init__.py'], encoding='latin1') as fp:
try:
version = re.findall(r"^__version__ = '([^']+)'\r?$",
fp.read(), re.M)[0]
except IndexError:
raise RuntimeError('Unable to determine version.')
with open_local(['README.rst']) as rm:
long_description = rm.read()
setup_kwargs = {
'name': 'sanic',
'version': version,
@@ -26,6 +39,7 @@ setup_kwargs = {
'author_email': 'channelcat@gmail.com',
'description': (
'A microframework based on uvloop, httptools, and learnings of flask'),
'long_description': long_description,
'packages': ['sanic'],
'platforms': 'any',
'classifiers': [
@@ -45,7 +59,7 @@ requirements = [
uvloop,
ujson,
'aiofiles>=0.3.0',
'websockets>=3.2',
'websockets>=4.0',
]
if strtobool(os.environ.get("SANIC_NO_UJSON", "no")):
print("Installing without uJSON")


@@ -0,0 +1 @@
I am just a regular static file that needs to have its uri decoded

Binary file added: tests/static/bp/python.png (11 KiB)

@@ -0,0 +1 @@
I am just a regular static file


@@ -1,16 +1,42 @@
import asyncio
import inspect
import pytest
from sanic import Sanic
from sanic.blueprints import Blueprint
from sanic.response import json, text
from sanic.exceptions import NotFound, ServerError, InvalidUsage
from sanic.constants import HTTP_METHODS
# ------------------------------------------------------------ #
# GET
# ------------------------------------------------------------ #
@pytest.mark.parametrize('method', HTTP_METHODS)
def test_versioned_routes_get(method):
app = Sanic('test_shorhand_routes_get')
bp = Blueprint('test_text')
method = method.lower()
func = getattr(bp, method)
if callable(func):
@func('/{}'.format(method), version=1)
def handler(request):
return text('OK')
else:
print(func)
raise
app.blueprint(bp)
client_method = getattr(app.test_client, method)
request, response = client_method('/v1/{}'.format(method))
assert response.status == 200
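The version parameter prefixes the matched path with '/v<version>', as the parametrised test above exercises. A minimal sketch outside the test harness (route and handler illustrative):

from sanic import Sanic
from sanic.blueprints import Blueprint
from sanic.response import text

app = Sanic('versioned_demo')
bp = Blueprint('api')


@bp.get('/users', version=2)
def users(request):
    return text('v2 users')


app.blueprint(bp)
# GET /v2/users -> 200; GET /users -> 404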
def test_bp():
app = Sanic('test_text')
bp = Blueprint('test_text')
@@ -21,6 +47,7 @@ def test_bp():
app.blueprint(bp)
request, response = app.test_client.get('/')
assert app.is_request_stream is False
assert response.text == 'Hello'
@@ -40,6 +67,7 @@ def test_bp_strict_slash():
request, response = app.test_client.get('/get')
assert response.text == 'OK'
assert response.json == None
request, response = app.test_client.get('/get/')
assert response.status == 404
@@ -50,6 +78,65 @@ def test_bp_strict_slash():
request, response = app.test_client.post('/post')
assert response.status == 404
def test_bp_strict_slash_default_value():
app = Sanic('test_route_strict_slash')
bp = Blueprint('test_text', strict_slashes=True)
@bp.get('/get')
def handler(request):
return text('OK')
@bp.post('/post/')
def handler(request):
return text('OK')
app.blueprint(bp)
request, response = app.test_client.get('/get/')
assert response.status == 404
request, response = app.test_client.post('/post')
assert response.status == 404
def test_bp_strict_slash_without_passing_default_value():
app = Sanic('test_route_strict_slash')
bp = Blueprint('test_text')
@bp.get('/get')
def handler(request):
return text('OK')
@bp.post('/post/')
def handler(request):
return text('OK')
app.blueprint(bp)
request, response = app.test_client.get('/get/')
assert response.text == 'OK'
request, response = app.test_client.post('/post')
assert response.text == 'OK'
def test_bp_strict_slash_default_value_can_be_overwritten():
app = Sanic('test_route_strict_slash')
bp = Blueprint('test_text', strict_slashes=True)
@bp.get('/get', strict_slashes=False)
def handler(request):
return text('OK')
@bp.post('/post/', strict_slashes=False)
def handler(request):
return text('OK')
app.blueprint(bp)
request, response = app.test_client.get('/get/')
assert response.text == 'OK'
request, response = app.test_client.post('/post')
assert response.text == 'OK'
def test_bp_with_url_prefix():
app = Sanic('test_text')
@@ -268,38 +355,48 @@ def test_bp_shorthand():
@blueprint.get('/get')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.put('/put')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.post('/post')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.head('/head')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.options('/options')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.patch('/patch')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.delete('/delete')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.websocket('/ws')
async def handler(request, ws):
assert request.stream is None
ev.set()
app.blueprint(blueprint)
assert app.is_request_stream is False
request, response = app.test_client.get('/get')
assert response.text == 'OK'


@@ -19,15 +19,21 @@ def test_load_from_object():
def test_auto_load_env():
environ["SANIC_TEST_ANSWER"] = "42"
app = Sanic()
assert app.config.TEST_ANSWER == "42"
assert app.config.TEST_ANSWER == 42
del environ["SANIC_TEST_ANSWER"]
def test_auto_load_env():
def test_dont_load_env():
environ["SANIC_TEST_ANSWER"] = "42"
app = Sanic(load_env=False)
assert getattr(app.config, 'TEST_ANSWER', None) == None
del environ["SANIC_TEST_ANSWER"]
def test_load_env_prefix():
environ["MYAPP_TEST_ANSWER"] = "42"
app = Sanic(load_env='MYAPP_')
assert app.config.TEST_ANSWER == 42
del environ["MYAPP_TEST_ANSWER"]
def test_load_from_file():
app = Sanic('test_load_from_file')
config = b"""


@@ -25,6 +25,25 @@ def test_cookies():
assert response.text == 'Cookies are: working!'
assert response_cookies['right_back'].value == 'at you'
@pytest.mark.parametrize("httponly,expected", [
(False, False),
(True, True),
])
def test_false_cookies_encoded(httponly, expected):
app = Sanic('test_text')
@app.route('/')
def handler(request):
response = text('hello cookies')
response.cookies['hello'] = 'world'
response.cookies['hello']['httponly'] = httponly
return text(response.cookies['hello'].encode('utf8'))
request, response = app.test_client.get('/')
assert ('HttpOnly' in response.text) == expected
@pytest.mark.parametrize("httponly,expected", [
(False, False),
(True, True),
@@ -34,7 +53,7 @@ def test_false_cookies(httponly, expected):
@app.route('/')
def handler(request):
response = text('Cookies are: {}'.format(request.cookies['test']))
response = text('hello cookies')
response.cookies['right_back'] = 'at you'
response.cookies['right_back']['httponly'] = httponly
return response
@@ -43,7 +62,7 @@ def test_false_cookies(httponly, expected):
response_cookies = SimpleCookie()
response_cookies.load(response.headers.get('Set-Cookie', {}))
'HttpOnly' in response_cookies == expected
assert ('HttpOnly' in response_cookies['right_back'].output()) == expected
def test_http2_cookies():
app = Sanic('test_http2_cookies')


@@ -1,16 +1,17 @@
import sanic
from sanic.utils import sanic_endpoint_test
from sanic import Sanic
from sanic.response import text
from threading import Event
import asyncio
def test_create_task():
e = Event()
async def coro():
await asyncio.sleep(0.05)
e.set()
app = sanic.Sanic()
app = Sanic('test_create_task')
app.add_task(coro)
@app.route('/early')
@@ -22,9 +23,8 @@ def test_create_task():
await asyncio.sleep(0.1)
return text(e.is_set())
request, response = sanic_endpoint_test(app, uri='/early')
request, response = app.test_client.get('/early')
assert response.body == b'False'
request, response = sanic_endpoint_test(app, uri='/late')
request, response = app.test_client.get('/late')
assert response.body == b'True'


@@ -3,7 +3,8 @@ from bs4 import BeautifulSoup
from sanic import Sanic
from sanic.response import text
from sanic.exceptions import InvalidUsage, ServerError, NotFound
from sanic.exceptions import InvalidUsage, ServerError, NotFound, Unauthorized
from sanic.exceptions import Forbidden, abort
class SanicExceptionTestException(Exception):
@@ -26,10 +27,45 @@ def exception_app():
def handler_404(request):
raise NotFound("OK")
@app.route('/403')
def handler_403(request):
raise Forbidden("Forbidden")
@app.route('/401')
def handler_401(request):
raise Unauthorized("Unauthorized")
@app.route('/401/basic')
def handler_401_basic(request):
raise Unauthorized("Unauthorized", scheme="Basic", realm="Sanic")
@app.route('/401/digest')
def handler_401_digest(request):
raise Unauthorized("Unauthorized",
scheme="Digest",
realm="Sanic",
qop="auth, auth-int",
algorithm="MD5",
nonce="abcdef",
opaque="zyxwvu")
@app.route('/401/bearer')
def handler_401_bearer(request):
raise Unauthorized("Unauthorized", scheme="Bearer")
@app.route('/invalid')
def handler_invalid(request):
raise InvalidUsage("OK")
@app.route('/abort/401')
def handler_401_error(request):
abort(401)
@app.route('/abort')
def handler_500_error(request):
abort(500)
return text("OK")
@app.route('/divide_by_zero')
def handle_unhandled_exception(request):
1 / 0
@@ -44,8 +80,10 @@ def exception_app():
return app
def test_catch_exception_list():
app = Sanic('exception_list')
@app.exception([SanicExceptionTestException, NotFound])
def exception_list(request, exception):
return text("ok")
@@ -60,6 +98,7 @@ def test_catch_exception_list():
request, response = app.test_client.get('/')
assert response.text == 'ok'
def test_no_exception(exception_app):
"""Test that a route works without an exception"""
request, response = exception_app.test_client.get('/')
@@ -85,6 +124,38 @@ def test_not_found_exception(exception_app):
assert response.status == 404
def test_forbidden_exception(exception_app):
"""Test the built-in Forbidden exception"""
request, response = exception_app.test_client.get('/403')
assert response.status == 403
def test_unauthorized_exception(exception_app):
"""Test the built-in Unauthorized exception"""
request, response = exception_app.test_client.get('/401')
assert response.status == 401
request, response = exception_app.test_client.get('/401/basic')
assert response.status == 401
assert response.headers.get('WWW-Authenticate') is not None
assert response.headers.get('WWW-Authenticate') == "Basic realm='Sanic'"
request, response = exception_app.test_client.get('/401/digest')
assert response.status == 401
auth_header = response.headers.get('WWW-Authenticate')
assert auth_header is not None
assert auth_header.startswith('Digest')
assert "qop='auth, auth-int'" in auth_header
assert "algorithm='MD5'" in auth_header
assert "nonce='abcdef'" in auth_header
assert "opaque='zyxwvu'" in auth_header
request, response = exception_app.test_client.get('/401/bearer')
assert response.status == 401
assert response.headers.get('WWW-Authenticate') == "Bearer"
def test_handled_unhandled_exception(exception_app):
"""Test that an exception not built into sanic is handled"""
request, response = exception_app.test_client.get('/divide_by_zero')
@@ -97,6 +168,7 @@ def test_handled_unhandled_exception(exception_app):
"The server encountered an internal error and "
"cannot complete your request.")
def test_exception_in_exception_handler(exception_app):
"""Test that an exception thrown in an error handler is handled"""
request, response = exception_app.test_client.get(
@@ -114,10 +186,19 @@ def test_exception_in_exception_handler_debug_off(exception_app):
assert response.body == b'An error occurred while handling an error'
def test_exception_in_exception_handler_debug_off(exception_app):
def test_exception_in_exception_handler_debug_on(exception_app):
"""Test that an exception thrown in an error handler is handled"""
request, response = exception_app.test_client.get(
'/error_in_error_handler_handler',
debug=True)
assert response.status == 500
assert response.body.startswith(b'Exception raised in exception ')
def test_abort(exception_app):
"""Test the abort function"""
request, response = exception_app.test_client.get('/abort/401')
assert response.status == 401
request, response = exception_app.test_client.get('/abort')
assert response.status == 500

Some files were not shown because too many files have changed in this diff.