Compare commits

239 Commits
0.5.4 ... 0.6.0

Author SHA1 Message Date
Eli Uriegas
8b24c35ac7 Merge pull request #878 from seemethere/increment_060
Increment to 0.6.0
2017-08-02 19:13:39 -07:00
Eli Uriegas
f80a6ae228 Increment to 0.6.0 2017-08-02 19:11:53 -07:00
7
9b3fbe4593 fixed small doc issue (#877) 2017-08-02 10:15:18 -07:00
Eli Uriegas
222eca64d7 Merge pull request #875 from cctse/patch-1
add some example links for readme
2017-08-01 09:36:37 -07:00
akc
1b687f3feb add some example links 2017-08-01 16:32:15 +08:00
Eli Uriegas
2228104bff Merge pull request #862 from zyguan/revert-599fbce
revert 599fbce
2017-07-31 13:51:04 -07:00
Raphael Deem
402c3752c4 Merge pull request #871 from Frzk/unauthorized-exception
Simplified the Unauthorized exception __init__ signature.
2017-07-31 12:23:12 -07:00
François KUBLER
69a8bb5e1f Fixed a trailing white space in the docstring. 2017-07-28 22:29:45 +02:00
Raphael Deem
0d76f9e030 Merge pull request #873 from yunstanford/fix-timeout-issue
Fix timeout issue
2017-07-28 09:49:22 -07:00
Yun Xu
eb8f65c58b switch to use dist: precise 2017-07-27 22:21:19 -07:00
7
e0e27a671e Merge pull request #8 from channelcat/master
merge upstream master branch
2017-07-27 20:01:59 -07:00
François KUBLER
b65eb69d9f Simplified the Unauthorized exception __init__ signature.
(again).
Use of **kwargs makes it more straightforward and easier to use.
2017-07-27 23:00:27 +02:00
Raphael Deem
8118e542fb Merge pull request #866 from Nikamura/patch-1
Fix typo in documentation
2017-07-26 13:18:52 -07:00
Raphael Deem
c866759bd4 Merge pull request #868 from cclauss/patch-2
Comment: F821 undefined name is done on purpose
2017-07-26 13:18:29 -07:00
cclauss
40776e5324 Comment: F821 undefined name is done on purpose
Comment helps readers and `# noqa` silences linters
2017-07-26 12:44:30 +02:00
Karolis Mažukna
621343112d Fix typo in documentation 2017-07-25 13:29:17 +03:00
Raphael Deem
eb06e6ba51 Merge pull request #863 from zyguan/issue-760
handle keep-alive timeout gracefully
2017-07-25 00:54:43 -07:00
zyguan
da91b16244 add tests 2017-07-24 18:21:15 +08:00
zyguan
918e2ba8d0 Revert "fix #752"
This reverts commit 599fbcee6e.
2017-07-24 11:53:11 +08:00
zyguan
f50dc83829 handle keep-alive timeout gracefully 2017-07-24 01:37:36 +08:00
Raphael Deem
e8a9b4743b Merge pull request #861 from yunstanford/add-jinja-sanic
Add jinja2-sanic
2017-07-22 20:16:21 -07:00
Yun Xu
f34226425e add jinja2-sanic 2017-07-22 18:41:53 -07:00
7
e27c7ba36f Merge pull request #7 from channelcat/master
merge upstream master branch
2017-07-22 18:28:38 -07:00
Raphael Deem
1aad527956 Merge pull request #824 from Frzk/unauthorized-exception
Simplified the `Unauthorized.__init__` signature.
2017-07-21 23:50:36 -07:00
Raphael Deem
173c62acb6 Merge branch 'master' into unauthorized-exception 2017-07-21 01:54:45 -07:00
Raphael Deem
76605d7dfe Merge pull request #858 from mohd-akram/freebsd-syslog
Fix FreeBSD syslog path
2017-07-20 20:50:25 -07:00
Mohamed Akram
32be1a6496 Fix FreeBSD syslog path 2017-07-20 03:02:40 +04:00
Raphael Deem
c7d43aa544 Merge pull request #853 from yunstanford/patch-proxy-fix
proxy fix
2017-07-17 01:01:26 -07:00
Yun Xu
198bf55b7b flake8 fix 2017-07-14 17:23:18 -07:00
Yun Xu
75378d3567 add remote_addr property for proxy fix 2017-07-14 09:29:16 -07:00
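A minimal sketch of how the property referenced above might be used when the app sits behind a proxy that sets forwarding headers; the route and fallback value are illustrative assumptions, not part of the commit:

```python
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/ip')
async def client_ip(request):
    # remote_addr is intended to reflect the real client address behind a
    # proxy (see the commit above); the fallback string is an assumption.
    return text(request.remote_addr or 'unknown')
```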
7
55cb371569 Merge pull request #6 from channelcat/master
merge upstream master branch
2017-07-13 20:07:15 -07:00
Raphael Deem
5bb97d25d0 Merge pull request #839 from asvetlov/patch-1
Drop benchmarks from README
2017-07-13 14:41:57 -07:00
Andrew Svetlov
b2017cae77 Drop benchmarks from readme 2017-07-13 23:41:04 +02:00
Raphael Deem
35af903d4a Merge pull request #851 from zenixls2/issue-805
don't let the default LOGGING dictConfig influence already existing configs
2017-07-13 12:49:29 -07:00
zenix
426e00b6f4 don't let dictConfig influence already existing configs 2017-07-13 15:09:04 +09:00
Raphael Deem
8e62b3e438 Merge pull request #850 from r0fls/versioning
Versioning
2017-07-12 22:37:14 -07:00
Raphael Deem
4265ad5f23 add versioning 2017-07-12 22:19:42 -07:00
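A short sketch of the route versioning added in PR #850, assuming routes accept a `version` keyword that mounts them under a `/v<version>` prefix; treat the exact prefix as an assumption:

```python
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

# With version=1, this handler is expected to serve GET /v1/users.
@app.route('/users', version=1)
async def users_v1(request):
    return json({'version': 1, 'users': []})
```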
Raphael Deem
c181eb0539 Merge branch 'master' of https://github.com/channelcat/sanic 2017-07-12 21:12:15 -07:00
Raphael Deem
e0f06753c6 Merge pull request #831 from yunstanford/auto-doc
Improve Documentation.
2017-07-12 20:20:51 -07:00
Raphael Deem
8aafd72ef0 Merge branch 'auto-doc' of https://github.com/yunstanford/sanic 2017-07-12 20:19:30 -07:00
Raphael Deem
48549ce97b Merge pull request #847 from youknowone/gunicorn
ensure loop.close() and sys.exit() in gunicorn worker
2017-07-12 18:19:39 -07:00
Jeong YunWon
be0f3731b4 ensure loop.close() and sys.exit() in gunicorn worker 2017-07-12 22:26:58 +09:00
Raphael Deem
b755431b93 Merge pull request #844 from sfermigier/patch-1
Add missing code block qualifier
2017-07-10 15:26:57 -07:00
Stefane Fermigier
04ff393875 Add missing code block qualifier 2017-07-10 22:11:12 +02:00
Raphael Deem
7841274300 Merge pull request #843 from yunstanford/case-insensitive-check
Case insensitive check
2017-07-10 12:44:25 -07:00
Yun Xu
235687d983 should call lower just once 2017-07-10 12:37:21 -07:00
Yun Xu
3d75e6ed95 case-insensitive check for header fields 2017-07-10 12:29:47 -07:00
Andrew Svetlov
eb9af8bceb Drop aiohttp from benchmark table
The reason: with its access log disabled, aiohttp reaches about 16,000 RPS on Sanic's own benchmark, which is far faster than the 3,000 RPS listed in the table.

I'm not a Sanic dev team member. You should not trust users to keep this table updated; manage periodic updates yourself. If you don't want to do that, it's up to you. In that case, please just drop the very inaccurate and outdated numbers from the README.
2017-07-09 08:18:45 +02:00
7
39ea434513 Merge pull request #5 from channelcat/master
merge upstream master branch
2017-07-08 14:23:00 -07:00
Raphael Deem
f0a956467c Merge pull request #815 from yunstanford/master
add graceful timeout when shutdown
2017-07-08 11:31:37 -07:00
Yun Xu
e48bd08095 make flake8 happy 2017-07-02 10:05:33 -07:00
Yun Xu
5d00717f39 improve doc and remove warnings 2017-07-02 10:02:04 -07:00
Yun Xu
3fff685c44 add auto-doc support 2017-07-01 23:46:34 -07:00
Raphael Deem
1e75265eed Merge pull request #756 from qwesda/master
fixes #755 fragmented headers
2017-06-30 18:24:51 -07:00
Eli Uriegas
b6ac3ef445 Merge pull request #826 from yunstanford/pytest-sanic
Pytest sanic
2017-06-30 18:17:15 -07:00
Raphael Deem
421f78f3e6 Merge pull request #814 from Frzk/forbidden-exception
Added a Forbidden exception
2017-06-30 18:11:23 -07:00
Eli Uriegas
b71fdcfc20 Merge pull request #829 from Frzk/config_doc
Fixed an error: `Sanic.__init__` doesn't have a `load_vars` parameter.
2017-06-30 10:06:30 -07:00
François KUBLER
021e9b228a Fixed a small error: Sanic.__init__ doesn't have a load_vars parameter.
It is `load_env`.
2017-06-30 16:24:41 +02:00
Raphael Deem
00d4533022 Merge pull request #821 from Frzk/bearer-support
Inverted the order of prefixes in Request.token property.
2017-06-29 09:43:34 -07:00
Yun Xu
fd5faeb5dd add an example 2017-06-29 09:14:21 -07:00
Yun Xu
e7c8035ed7 add pytest-sanic 2017-06-29 09:06:17 -07:00
François KUBLER
e427e38da8 Simplified the Unauthorized.__init__ signature.
It doesn't really make sense to have a `realm` parameter in the method signature.
Instead, one can simply set the realm in the `challenge` dict if necessary.

Also fixed the tests accordingly (and added a new one for "Bearer" auth-scheme).
2017-06-29 12:34:52 +02:00
François KUBLER
1f24abc3d2 Fixed support for "Bearer" and "Token" auth-schemes.
Removed the test for "Authentication: Bearer Token <TOKEN>" which was not supposed to exist (see https://github.com/channelcat/sanic/pull/821)
Also added a call to `split` when retrieving the token value to handle cases where there are leading or trailing spaces.
2017-06-29 10:23:49 +02:00
François
76e62779ba Merge branch 'master' into forbidden-exception 2017-06-28 17:25:40 +02:00
Eli Uriegas
1af343ef50 Merge pull request #823 from ojii/cookies-warning
Added a warning to the cookies documentation about security
2017-06-27 19:35:35 -07:00
Jonas Obrist
412ffd1592 Added a warning to the cookies documentation about security 2017-06-28 11:05:59 +09:00
Daniel Schwarz
b141fec573 Merge remote-tracking branch 'upstream/master'
# Conflicts:
#	sanic/server.py
2017-06-27 13:32:49 +02:00
François KUBLER
d2e14abfd5 Inverted the order of prefixes in Request.token property.
As suggested by @allan-simon
See: https://github.com/channelcat/sanic/pull/811#pullrequestreview-46144327
2017-06-27 12:57:47 +02:00
Raphael Deem
d4abca0480 Merge pull request #818 from youknowone/debug
Introduce debug mode for HTTP protocol
2017-06-26 22:02:37 -07:00
Raphael Deem
529f5822ee Merge pull request #819 from r0fls/817
convert environment vars to int if digits
2017-06-26 21:54:54 -07:00
Raphael Deem
395d85a12f use try/except 2017-06-26 21:35:01 -07:00
Raphael Deem
4379a4b067 float logic 2017-06-26 20:59:59 -07:00
Raphael Deem
ad8e1cbf62 convert environment vars to int if digits 2017-06-26 20:49:41 -07:00
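A sketch of the conversion logic these three commits describe (int first, then a float fallback, via try/except); the helper name is hypothetical and not part of the Sanic API:

```python
def coerce_env_value(value):
    # Hypothetical helper mirroring the commits above: values that parse as
    # int or float become numbers, anything else stays a string.
    try:
        return int(value)
    except ValueError:
        try:
            return float(value)
        except ValueError:
            return value

assert coerce_env_value('8000') == 8000
assert coerce_env_value('0.5') == 0.5
assert coerce_env_value('DEBUG') == 'DEBUG'
```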
Jeong YunWon
dc5a70b0de Introduce debug mode for HTTP protocol 2017-06-26 21:13:13 +09:00
Yun Xu
b5d1f52ea4 make flake8 happy 2017-06-25 10:22:40 -07:00
Yun Xu
221cf235b5 fix a unit test 2017-06-25 01:03:28 -07:00
Yun Xu
7720e31a31 add unit test 2017-06-25 00:51:59 -07:00
Yun Xu
d812affef0 add graceful_shutdown_timeout to gunicorn worker 2017-06-25 00:51:14 -07:00
Yun Xu
5c19eb34bf add graceful_shutdown_timeout 2017-06-24 19:00:33 -07:00
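A minimal sketch of how the setting above might be configured, assuming it is exposed as a `GRACEFUL_SHUTDOWN_TIMEOUT` config key (name assumed from the commit message):

```python
from sanic import Sanic

app = Sanic(__name__)

# Seconds to wait for in-flight requests before the worker closes
# connections on shutdown (key name assumed from the commit above).
app.config.GRACEFUL_SHUTDOWN_TIMEOUT = 15.0
```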
7
e18ebaee3d Merge pull request #4 from channelcat/master
merge upstream master branch
2017-06-24 18:21:13 -07:00
Eli Uriegas
dbcbf12456 Merge pull request #811 from Frzk/bearer-support
Added support for 'Authorization: Bearer <TOKEN>' header...
2017-06-23 10:32:21 -07:00
Eli Uriegas
c04b44057c Merge pull request #813 from Frzk/unauthorized-exception
Added an Unauthorized exception
2017-06-23 10:30:51 -07:00
François KUBLER
60aa60f48e Fixed the test for the new Unauthorized exception. 2017-06-23 17:16:31 +02:00
François KUBLER
2848d7c80e Added a Forbidden exception
Also added a small test.
2017-06-23 16:44:57 +02:00
François KUBLER
9fcdacb624 Modified the name of an argument. 2017-06-23 16:29:04 +02:00
François KUBLER
cf1713b085 Added a Unauthorized exception.
Also added a few tests related to this new exception.
2017-06-23 16:12:15 +02:00
7
f049a4ca67 Recycling gunicorn worker (#800)
* add recycling feature to gunicorn worker

* add unit tests

* add more unit tests, and remove redundant trigger_events call

* fixed up unit tests

* make flake8 happy

* address feedbacks

* make flake8 happy

* add doc
2017-06-22 13:26:50 -07:00
François KUBLER
55f860da2f Added support for 'Authorization: Bearer <TOKEN>' header in Request.token property.
Also added a test case for that kind of header.
2017-06-22 18:11:23 +02:00
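A minimal sketch of the behavior described above, assuming `request.token` strips either the `Token` or `Bearer` prefix from the `Authorization` header:

```python
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route('/whoami')
async def whoami(request):
    # For "Authorization: Bearer abc123" (or "Token abc123"),
    # request.token is expected to be "abc123".
    return json({'token': request.token})
```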
Eli Uriegas
b5369e611c Merge pull request #764 from stopspazzing/master
Clean up of examples.
2017-06-20 14:53:37 -07:00
Jeremy Zimmerman
3d1dd1c6ac re-add extensions.md to fix merge conflict. 2017-06-20 14:49:12 -07:00
Eli Uriegas
10a363b275 Merge pull request #809 from jrocketfingers/feature/allow-textual-responses
Allow textual responses when using test_client and aiohttp 2
2017-06-20 10:30:26 -07:00
Nikola Kolevski
d865c5e2b6 Conform to pep8 2017-06-20 13:22:28 +02:00
Nikola Kolevski
9fac37588c Allow textual responses when using test_client and aiohttp 2 2017-06-20 13:15:30 +02:00
Jeremy Zimmerman
aac0d58417 Delete extensions.md
non-core content moved to wiki
2017-06-19 15:18:32 -07:00
Eli Uriegas
b37e6187d4 Merge pull request #802 from yunstanford/add-match-info
Add match_info property to request class
2017-06-18 10:15:46 -07:00
Yun Xu
20138ee85f add match_info to request 2017-06-17 09:47:58 -07:00
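A short example of the property added above; the assumption is that `request.match_info` exposes the resolved path parameters as a dict:

```python
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route('/posts/<post_id>')
async def show_post(request, post_id):
    # Expected to return something like {"post_id": "5"} for /posts/5.
    return json(request.match_info)
```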
Raphael Deem
6dc569cde5 Merge pull request #795 from ekampf/patch-1
Prevent `run` from overriding logging config set in constructor
2017-06-15 23:39:09 -07:00
Eran Kampf
77cf0b678a Fix has_log value 2017-06-15 11:21:08 -07:00
Eran Kampf
2dfb061063 Prevent run from overriding logging config set in constructor
When creating the `Sanic` instance I provide it with a customized `log_config`.
Calling `run` overrides these settings unless I provide it *again* with the same `log_config`.
This is confusing and error-prone. `run` shouldn't override configuration set in the `Sanic` constructor...
2017-06-15 10:39:00 -07:00
7
e4669e2581 Merge pull request #3 from channelcat/master
merge upstream master branch
2017-06-12 22:15:31 -07:00
Eli Uriegas
df47cf72d3 Merge pull request #791 from seemethere/add_docs_requirements
Add docs requirements
2017-06-12 10:55:45 -07:00
Eli Uriegas
ba1b34e375 Add docs requirements
Closes channelcat/sanic#787
2017-06-12 10:29:38 -07:00
Eli Uriegas
950b5ee529 Merge pull request #789 from yunstanford/coverage-report
Coverage report
2017-06-11 23:38:04 -07:00
Raphael Deem
041c48de19 Merge pull request #790 from r0fls/752
fix #752
2017-06-11 23:24:32 -07:00
Raphael Deem
599fbcee6e fix #752 2017-06-11 23:20:04 -07:00
Yun Xu
ce2df8030c quick fix for test_gunicorn_worker test 2017-06-11 09:06:48 -07:00
Yun Xu
47e761bbe2 add coverage report 2017-06-11 08:49:35 -07:00
7
0646baa18d Merge pull request #2 from channelcat/master
remote-tracking with upstream
2017-06-10 11:54:11 -07:00
Eli Uriegas
38997c1b47 Merge pull request #786 from yunstanford/handle_stream_404
Handle stream 404
2017-06-10 10:03:54 -07:00
Yun Xu
acaafabc23 retry build 2017-06-10 09:57:32 -07:00
Yun Xu
6a80bdafa6 add unit tests 2017-06-10 09:48:30 -07:00
Yun Xu
cf30ed745c also should handle InvalidUsage exception 2017-06-10 09:42:48 -07:00
Eli Uriegas
a399fb4044 Merge pull request #785 from youknowone/gunicorn
Gunicorn worker hints app whether it is being terminated
2017-06-09 11:00:12 -07:00
Yun Xu
24b946e850 make flake8 happy 2017-06-09 08:43:23 -07:00
Yun Xu
236daf48ff add unit tests 2017-06-09 08:42:48 -07:00
Yun Xu
4942af27dc handle NotFound 2017-06-09 08:33:34 -07:00
7
3adb90071b Merge pull request #1 from channelcat/master
merge upstream project
2017-06-09 08:23:14 -07:00
Jeong YunWon
29b4a2a08c Gunicorn worker hints app whether it is being terminated
For now, `Sanic.is_running` is set when the worker is started but not
unset when it is about to be stopped. Setting the flag on the quit signal
will not affect requests already in flight, but the `Sanic.is_running` flag
can still be used to support graceful termination.
2017-06-09 14:51:15 +09:00
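A hedged sketch of how the flag described above could be used for graceful termination; the background-task wiring is illustrative and not taken from the commit:

```python
import asyncio

from sanic import Sanic

app = Sanic(__name__)

async def drain_queue(app):
    # Hypothetical worker loop: stop picking up new work once the gunicorn
    # worker receives a quit signal and app.is_running is cleared.
    while app.is_running:
        await asyncio.sleep(1)  # stand-in for real work

@app.listener('after_server_start')
async def start_background(app, loop):
    loop.create_task(drain_queue(app))
```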
Eli Uriegas
e1331fc0a2 Merge pull request #783 from yunstanford/master
add content_type property in request
2017-06-08 17:30:07 -07:00
Yun Xu
3802f8ff65 unit tests 2017-06-08 17:25:22 -07:00
Eli Uriegas
4b0abdbe7c Merge pull request #782 from yunstanford/sanic-transmute-doc
add sanic-transmute
2017-06-08 14:27:29 -07:00
Yun Xu
81889fd7a3 add unit tests 2017-06-07 20:48:07 -07:00
Yun Xu
aac99c45c0 add content_type property in request 2017-06-07 20:46:48 -07:00
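A small example of the property referenced above; the assumption is that `request.content_type` mirrors the request's `Content-Type` header:

```python
from sanic import Sanic
from sanic.response import json, text

app = Sanic(__name__)

@app.route('/echo', methods=['POST'])
async def echo(request):
    # Branch on the declared media type of the incoming body.
    if request.content_type == 'application/json':
        return json(request.json)
    return text('send JSON, please', status=415)
```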
Yun Xu
566a6369a5 add sanic-transmute 2017-06-07 20:27:54 -07:00
Eli Uriegas
4fdf340d04 Merge pull request #773 from mbatchkarov/await-json-too
Testing: store JSON response in local request
2017-06-07 12:05:30 -07:00
Miroslav Batchkarov
ddd7145153 check json is None if body is not JSON 2017-06-07 10:03:27 +01:00
Miroslav Batchkarov
3f22b644b6 wrap call to json in try-except to make tests pass 2017-06-07 09:57:07 +01:00
Eli Uriegas
639c9f579d Merge pull request #774 from algtmatt/feature/logging_doc_fixes
Small logging docs fixes
2017-06-05 10:56:00 -07:00
Matthew Snyder
735b8665f1 Small logging docs fixes 2017-06-05 10:42:17 -07:00
Miroslav Batchkarov
199fa50a9d also store json result in local request 2017-06-05 16:24:23 +01:00
Jeremy Zimmerman
aac5ad8504 Merge remote-tracking branch 'origin/master' 2017-06-01 16:53:36 -07:00
Jeremy Zimmerman
349c108ebc re-added request_stream example. 2017-06-01 16:52:56 -07:00
Eli Uriegas
3b464782ef Merge pull request #765 from ttopholm/master
Added content_type to be set for json response
2017-06-01 16:29:08 -05:00
Tue Topholm
3d97fd8d2a Removed whitespace 2017-06-01 23:09:37 +02:00
Tue Topholm
c102e76146 Fixed line width 2017-06-01 23:01:27 +02:00
Jeremy Zimmerman
beee7b68bf reverted back to default 0.0.0.0 host 2017-06-01 14:01:13 -07:00
Tue Topholm
f47e571d92 Added content_type to be set for json response 2017-06-01 22:53:56 +02:00
Jeremy Zimmerman
4b5320a8f0 Clean up of examples. Removes non-core examples, optimizes and restyles remaining to strictly follow PEP 8 styling guidelines. Non-Core examples will be moved to Wiki. 2017-06-01 11:53:05 -07:00
Daniel Schwarz
30c2c89c6b fix partial url parsing 2017-05-30 16:13:49 +02:00
Raphael Deem
4a1d1a0dc1 Merge pull request #750 from xenu256/patch-1
aiomysql has DictCursor
2017-05-29 22:54:09 -07:00
Raphael Deem
360adc9130 Merge pull request #757 from monobot/asyncOrmV020
update asyncorm version example to 0.2.0
2017-05-29 22:51:46 -07:00
Raphael Deem
cc21abe843 Merge pull request #751 from ak04nv/patch-1
Update jinja_example.py
2017-05-28 23:50:16 -07:00
monobot
9a27555763 update asyncorm version example to 0.2.0 2017-05-29 00:01:56 +01:00
Daniel Schwarz
aaef2fbd01 fix flake8 errors 2017-05-28 18:46:07 +02:00
Anton Kochnev
5bb640ca17 Update jinja_example.py
Added python version check for enabling async mode.
2017-05-28 14:37:41 +08:00
Daniel Schwarz
0e5c7a62cb remove debug messages 2017-05-27 22:36:08 +02:00
Daniel Schwarz
1b33e05f74 fix debug log messages 2017-05-27 16:32:39 +02:00
Daniel Schwarz
53a04309ff add header_fragment handling 2017-05-27 16:28:57 +02:00
Daniel Schwarz
dc411651b6 add check for header and value 2017-05-27 15:36:57 +02:00
Daniel Schwarz
514540b90b add debug for header values 2017-05-27 15:32:37 +02:00
Tadas Talaikis
a5249d1f5d aiomysql has DictCursor 2017-05-27 11:06:45 +03:00
Eli Uriegas
21aa3f6578 Merge pull request #748 from messense/feature/websocket-config
Add websocket max_size and max_queue configuration
2017-05-26 10:44:49 -05:00
messense
0024edbbb9 Add websocket max_size and max_queue configuration 2017-05-26 11:15:28 +08:00
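A minimal sketch of the configuration added above, assuming it is read from `WEBSOCKET_MAX_SIZE` and `WEBSOCKET_MAX_QUEUE` config keys (key names assumed from the PR title):

```python
from sanic import Sanic

app = Sanic(__name__)

# Assumed key names: cap incoming websocket message size and the number
# of queued incoming messages handed to the websocket protocol.
app.config.WEBSOCKET_MAX_SIZE = 2 ** 20   # 1 MiB per message
app.config.WEBSOCKET_MAX_QUEUE = 32
```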
Raphael Deem
23cb39b557 Merge pull request #744 from algtmatt/feature/from_file_doc_fix
Update config file loader docs
2017-05-24 18:16:10 -07:00
Eli Uriegas
48de321869 Merge pull request #697 from 38elements/stream
Add Request.stream
2017-05-24 16:22:52 -07:00
Raphael Deem
c6d68009d2 Merge pull request #745 from messense/feature/gunicorn-worker-test-case
Add a simple integration test for Gunicorn worker
2017-05-23 20:53:01 -07:00
Raphael Deem
2cab267405 Merge pull request #734 from ashleysommer/static_large_file_stream
Add option to static helper to use streaming for large files.
2017-05-23 20:52:12 -07:00
messense
6bdc0d2e5e Fix Gunicorn worker 2017-05-23 11:28:12 +08:00
messense
3eed81c1eb Add a simple integration test for Gunicorn worker 2017-05-23 11:04:27 +08:00
Raphael Deem
b447807b36 Merge pull request #742 from r0fls/700
changes required for unix socket support
2017-05-22 19:29:32 -07:00
Eli Uriegas
2771c8c32e Merge pull request #738 from messense/feature/conduct
Add Code of Conduct
2017-05-22 15:34:01 -07:00
Eli Uriegas
21a88bc2d3 Add maintainers email address 2017-05-22 15:31:05 -07:00
Matthew Snyder
57c1838f68 Update config file loader docs 2017-05-22 14:10:08 -07:00
Raphael Deem
52b0254ec6 unix socket support; fixes #700 2017-05-21 03:15:06 -07:00
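A sketch of the unix socket support referenced above, assuming `app.run()` accepts a pre-bound socket via its `sock` argument; the socket path is illustrative:

```python
import os
import socket

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/')
async def index(request):
    return text('served over a unix domain socket')

if __name__ == '__main__':
    path = '/tmp/sanic.sock'  # illustrative path
    if os.path.exists(path):
        os.remove(path)
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.bind(path)
    app.run(sock=sock)
```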
Raphael Deem
49631542ce Merge pull request #732 from jrocketfingers/feature/explicit-register-middleware
Extract register_middleware into a method.
2017-05-21 03:06:12 -07:00
Raphael Deem
4b80ffb9eb Merge pull request #740 from r0fls/739
add abort function
2017-05-21 02:20:02 -07:00
Raphael Deem
9efa7c116d remove redundant code; decode response 2017-05-20 23:27:00 -07:00
Raphael Deem
5d9c8d59a0 add abort() test 2017-05-20 14:43:57 -07:00
Raphael Deem
1a60201f68 flake8 spacing 2017-05-20 10:44:09 -07:00
Raphael Deem
f3186abf09 SANIC_EXCEPTIONS -> _sanic_exceptions 2017-05-20 10:29:00 -07:00
Raphael Deem
6bcc0d3c7f Merge remote-tracking branch 'upstream/master' 2017-05-20 02:17:26 -07:00
Raphael Deem
57b9a57dde Update README.rst 2017-05-20 02:17:12 -07:00
Raphael Deem
28994f4b64 update todo 2017-05-20 02:15:45 -07:00
Raphael Deem
588b4712bf add exception decorator 2017-05-20 01:24:34 -07:00
Raphael Deem
d3b6208057 add abort function 2017-05-19 18:52:19 -07:00
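A short example of the helper added above, assuming it is importable as `sanic.exceptions.abort` and raises the exception mapped to the given status code; the token check is illustrative:

```python
from sanic import Sanic
from sanic.exceptions import abort
from sanic.response import text

app = Sanic(__name__)

@app.route('/admin')
async def admin_only(request):
    if request.token != 'letmein':   # illustrative check
        abort(403)                   # raises the 403 (Forbidden) exception
    return text('welcome, admin')
```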
Ashley Sommer
ef80953b1b Fix flake8 line length error. 2017-05-20 09:56:05 +10:00
ashleysommer
72db1188c7 Add an option to the static() helper to switch on streaming for large files.
By default it uses a 1M threshold, i.e. if the static file to serve is >= 1M it will be streamed.
This threshold value is configurable by passing an int instead of a bool to `stream_large_files` parameter of `static()`.
2017-05-20 09:56:05 +10:00
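A minimal sketch of the option described above, assuming the `static()` helper takes `stream_large_files` as either a bool or an int threshold in bytes:

```python
from sanic import Sanic

app = Sanic(__name__)

# Stream any static file at or above the default 1M threshold.
app.static('/downloads', './downloads', stream_large_files=True)

# Or pass an int to choose the threshold yourself (here, 4 MiB).
app.static('/videos', './videos', stream_large_files=4 * 1024 * 1024)
```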
Raphael Deem
0858d3c544 Merge pull request #733 from ashleysommer/file_stream
Add file_stream response handler
2017-05-19 16:48:12 -07:00
Ashley Sommer
5c5656f981 Moved file_stream tests to test_responses.py 2017-05-20 09:41:36 +10:00
Raphael Deem
58a9c92d75 fix 739 2017-05-19 13:35:04 -07:00
Eli Uriegas
a6dc4646db Merge pull request #737 from 38elements/deploying
Fix Running via Gunicorn in deploying.md
2017-05-19 12:10:36 -07:00
messense
8ff553e926 Add Code of Conduct 2017-05-19 23:09:44 +08:00
38elements
848a5c61f0 Fix Running via Gunicorn in deploying.md 2017-05-19 23:22:57 +09:00
Raphael Deem
d49000e9f4 Merge pull request #736 from fanjindong/examples_read
debug 'Blueprint names must be unique'
2017-05-19 01:45:06 -07:00
fanjindong
a82145c4e6 debug 'Blueprint names must be unique' 2017-05-19 16:26:56 +08:00
Ashley Sommer
181edb7235 Test file() and file_stream() response helpers.
Added test for `file()` response helper and `file_stream()` response helper.
2017-05-19 13:01:21 +10:00
Ashley Sommer
ff2ae11ac8 Remove exception print(e) statement. 2017-05-19 13:00:01 +10:00
Johnny Rocketfingers
3f841f3b21 Switch to non-hardcoded register_middleware. 2017-05-18 22:08:44 +02:00
ashleysommer
181977ad4e Added brief documentation with an example for file_stream
Added test to ensure `file_stream()` works in the test suite.
2017-05-18 18:12:26 +10:00
ashleysommer
e155fe403d Add file_stream response handler
For streaming large static files
Like `file()` but breaks the file into chunks and sends it with a `StreamingHTTPResponse`
Chunk size is configurable but defaults to 4k; that seemed to be the sweet spot in my testing.
Also supports ContentRange the same as `file()` does.
2017-05-18 18:04:28 +10:00
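A short sketch of the handler described above, assuming `file_stream` is exposed from `sanic.response` and awaited like `file()`; the path and chunk size are illustrative:

```python
from sanic import Sanic
from sanic.response import file_stream

app = Sanic(__name__)

@app.route('/backup')
async def backup(request):
    # Sends the file in chunks via a streaming response instead of loading
    # it into memory; the commit above notes a 4k default chunk size.
    return await file_stream('/var/backups/latest.tar.gz', chunk_size=4096)
```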
Johnny Rocketfingers
bf5438d573 Extract register_middleware into a method. 2017-05-18 06:36:11 +02:00
Raphael Deem
0e4aaf8856 Merge pull request #731 from jrocketfingers/fix/token-missing-auth-headers
Check that the Authorization headers are actually provided.
2017-05-17 13:10:12 -07:00
Raphael Deem
5c44ce1637 Merge pull request #719 from messense/feature/worker-uvloop
Gunicorn worker should not require uvloop
2017-05-17 12:47:19 -07:00
Raphael Deem
974fe25a11 Merge pull request #722 from messense/feature/ci-without-ext
Add py3*-no-ext test env
2017-05-17 12:47:05 -07:00
Johnny
58bae83558 Add a regression test. 2017-05-17 11:15:45 +02:00
Johnny
5d309af86f Check that the headers are actually provided. 2017-05-17 11:08:50 +02:00
messense
ec857d1c53 Drop tox-travis 2017-05-17 12:21:56 +08:00
messense
2f84cdd708 Fix websocket handler bug on Python3.5 with no uvloop 2017-05-17 12:12:25 +08:00
messense
7cc02e84ed Fix json loads bug on Python 3.5 2017-05-17 12:12:25 +08:00
Raphael Deem
87c2a5bc97 Merge pull request #724 from suoning/doc-logger
update logging doc
2017-05-16 21:10:57 -07:00
Eli Uriegas
826c2e0f4e Merge pull request #725 from 38elements/contributing
Add rule in CONTRIBUTING.md
2017-05-15 14:45:57 -07:00
Eli Uriegas
b5e25e13b7 Merge pull request #727 from argaen/update_aiocache_example
Update aiocache example to latest version
2017-05-15 11:39:02 -07:00
argaen
f9653114d1 Update aiocache example to latest version 2017-05-15 20:30:52 +02:00
Eli Uriegas
6b7e19891b Get rid of un-needed s, Fix some formatting. 2017-05-15 10:54:47 -07:00
38elements
a677f14423 Add rule in CONTRIBUTING.md 2017-05-15 21:28:35 +09:00
suoning
dddce3f30d update logging, Remove the comments 2017-05-15 13:59:03 +08:00
Raphael Deem
be93d670a3 Merge pull request #717 from jrocketfingers/fix/ipv6-access-log
Fix "TypeError: not all arguments converted during string formatting"
2017-05-14 20:28:07 -07:00
suoning
68d4bb6ffe update logging doc 2017-05-15 10:54:30 +08:00
suoning
a27471178a update logger doc 2017-05-15 10:25:19 +08:00
messense
66fcb0cc8f Add py3*-no-ext test env 2017-05-15 10:10:50 +08:00
messense
05d0ddc281 Gunicorn worker should not require uvloop 2017-05-15 00:01:51 +08:00
Johnny Rocketfingers
b1890f50b6 Conform to pep8 2017-05-14 10:15:11 +02:00
Johnny Rocketfingers
b44c707e94 Prevent incorrect tuple size on get_extra_info errors
According to https://docs.python.org/3/library/asyncio-protocol.html#asyncio.BaseTransport.get_extra_info,
get_extra_info fails by returning None. This is an attempt in
normalization of the response in cases of AF_INET, AF_INET6 and
erroneous return values.
2017-05-14 09:56:56 +02:00
Johnny
4c7675939a Fix "TypeError: not all arguments converted during string formatting"
socket.getpeername() returns an AF_INET6 address-family four-tuple that
includes flowinfo and scopeid.

In the server's write_response, an exception is raised when an IPv6 client
connects because the four-tuple carries two unused elements (flowinfo
and scopeid).

This makes sure that only the first two (host and port) are used in log
string formatting.
2017-05-13 17:35:04 +02:00
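A sketch of the normalization these two commits describe; the function and variable names are illustrative:

```python
def peer_host_port(transport):
    # transport.get_extra_info('peername') may return a 2-tuple for AF_INET,
    # a 4-tuple (host, port, flowinfo, scopeid) for AF_INET6, or None on error.
    peername = transport.get_extra_info('peername')
    if peername is None:
        return None, None
    # Keep only host and port so log string formatting sees two values.
    return peername[0], peername[1]
```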
Eli Uriegas
fa1b7de52a Merge pull request #706 from messense/feature/remove-log-file
Remove timedRotatingFile log config
2017-05-12 10:56:19 -07:00
Eli Uriegas
666f8c8d3c Merge pull request #712 from stopspazzing/master
Fixed plotly_example, now works
2017-05-12 10:39:57 -07:00
Jeremy Zimmerman
996c0b3280 Fixed with a working example
Remember:
K.I.S.S
2017-05-11 13:40:16 -07:00
Eli Uriegas
f9d428de8b Merge pull request #711 from stopspazzing/master
Use of register_blueprint will be deprecated, why not upgrade?
2017-05-11 12:53:43 -07:00
Jeremy Zimmerman
a17b3f1b84 Use of register_blueprint will be deprecated, why not upgrade? 2017-05-11 12:33:57 -07:00
Jeremy Zimmerman
f39512aa63 double if statement (#707)
* Migrated `%` string formatting

* double if statement

combined double 'if' to a single 'if' with 'and'

* Revert "Fix "Prefer `format()` over string interpolation operator" issue"
2017-05-11 11:49:32 -07:00
Raphael Deem
23ee9f64d4 Merge pull request #710 from monobot/asyncorm_update
modify the asyncorm example, with the new lazy querysets
2017-05-11 10:35:52 -07:00
monobot
ad68739df7 modify the asyncorm example, for the new lazy querysets 2017-05-11 18:15:04 +01:00
38elements
a50d8421b8 Add heading in streaming.md 2017-05-11 19:18:58 +09:00
messense
3ea1b07906 Revert "Add access.log and error.log to .gitignore"
This reverts commit fc0d69616c.
2017-05-11 11:19:03 +08:00
messense
c3683662c2 Remove timedRotatingFile log config 2017-05-11 11:18:59 +08:00
Eli Uriegas
861865807a Merge pull request #705 from stopspazzing/patch-2
spelling mistake
2017-05-10 10:52:29 -07:00
Jeremy Zimmerman
18930082e2 spelling mistake
fixed incorrect spelling
2017-05-10 09:38:57 -07:00
38elements
6a14e49479 Replace stream decorator with stream parameter 2017-05-09 22:31:15 +09:00
Raphael Deem
c530d5f016 Merge pull request #704 from seemethere/fix_camel_case
Fix camel case module
2017-05-08 20:50:39 -07:00
Eli Uriegas
ac5e1a6ebd Fix import 2017-05-08 20:47:20 -07:00
Eli Uriegas
bb6de53f28 Fix docs 2017-05-08 20:44:19 -07:00
Eli Uriegas
bfcd499cc2 Remove default_filter module, put into logging 2017-05-08 20:41:34 -07:00
Eli Uriegas
3e88ec18e2 Actually add file >.> 2017-05-08 20:37:44 -07:00
Eli Uriegas
1fb640c313 Fix camel case module 2017-05-08 20:36:57 -07:00
38elements
4d4f38fb35 is_request_stream for CompositionView and HTTPMethodView 2017-05-09 01:04:03 +09:00
38elements
15ad07f03d Fix streaming.md 2017-05-08 00:10:36 +09:00
38elements
0b53c413a7 Add stream decorator for HTTPMethodView 2017-05-07 21:33:15 +09:00
38elements
931397c7e1 Add stream for CompositionView 2017-05-07 18:38:48 +09:00
38elements
ef2cc7ebf5 Add Request.stream 2017-05-07 18:38:48 +09:00
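A sketch of the request streaming API these commits introduce, assuming a `stream=True` route parameter and that `request.stream` is an async queue yielding `None` at the end of the body:

```python
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route('/upload', methods=['POST'], stream=True)
async def upload(request):
    chunks = []
    while True:
        chunk = await request.stream.get()
        if chunk is None:          # end of request body
            break
        chunks.append(chunk)
    return text('received {} bytes'.format(len(b''.join(chunks))))
```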
93 changed files with 2861 additions and 1454 deletions

.gitignore (3 lines changed)

@@ -14,5 +14,4 @@ settings.py
docs/_build/
docs/_api/
build/*
access.log
error.log
.DS_Store


@@ -1,10 +1,25 @@
sudo: false
dist: precise
language: python
python:
- '3.5'
- '3.6'
install: pip install tox-travis
script: tox
cache:
directories:
- $HOME/.cache/pip
matrix:
include:
- env: TOX_ENV=py35
python: 3.5
- env: TOX_ENV=py35-no-ext
python: 3.5
- env: TOX_ENV=py36
python: 3.6
- env: TOX_ENV=py36-no-ext
python: 3.6
- env: TOX_ENV=flake8
python: 3.6
- env: TOX_ENV=check
python: 3.6
install: pip install -U tox
script: tox -e $TOX_ENV
deploy:
provider: pypi
user: channelcat

CONDUCT.md (new file, 74 lines)

@@ -0,0 +1,74 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, gender identity and expression, level of experience,
nationality, personal appearance, race, religion, or sexual identity and
orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at sanic-maintainers@googlegroups.com. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at [http://contributor-covenant.org/version/1/4][version]
[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/


@@ -4,6 +4,11 @@ Thank you for your interest! Sanic is always looking for contributors. If you
don't feel comfortable contributing code, adding docstrings to the source files
is very appreciated.
We are committed to providing a friendly, safe and welcoming environment for all,
regardless of gender, sexual orientation, disability, ethnicity, religion,
or similar personal characteristic.
Our [code of conduct](./CONDUCT.md) sets the standards for behavior.
## Installation
To develop on sanic (and mainly to just run the tests) it is highly recommend to
@@ -29,14 +34,19 @@ See it's that simple!
## Pull requests!
So the pull request approval rules are pretty simple:
1. All pull requests must pass unit tests
1. All pull requests must pass unit tests.
2. All pull requests must be reviewed and approved by at least
one current collaborator on the project
3. All pull requests must pass flake8 checks
4. If you decide to remove/change anything from any common interface
one current collaborator on the project.
3. All pull requests must pass flake8 checks.
4. All pull requests must be consistent with the existing code.
5. If you decide to remove/change anything from any common interface
a deprecation message should accompany it.
5. If you implement a new feature you should have at least one unit
6. If you implement a new feature you should have at least one unit
test to accompany it.
7. An example must be one of the following:
* Example of how to use Sanic
* Example of how to use Sanic extensions
* Example of how to use Sanic and asynchronous library
## Documentation


@@ -11,34 +11,6 @@ Sanic is developed `on GitHub <https://github.com/channelcat/sanic/>`_. Contribu
If you have a project that utilizes Sanic make sure to comment on the `issue <https://github.com/channelcat/sanic/issues/396>`_ that we use to track those projects!
Benchmarks
----------
All tests were run on an AWS medium instance running ubuntu, using 1
process. Each script delivered a small JSON response and was tested with
wrk using 100 connections. Pypy was tested for Falcon and Flask but did
not speed up requests.
+-----------+-----------------------+----------------+---------------+
| Server | Implementation | Requests/sec | Avg Latency |
+===========+=======================+================+===============+
| Sanic | Python 3.5 + uvloop | 33,342 | 2.96ms |
+-----------+-----------------------+----------------+---------------+
| Wheezy | gunicorn + meinheld | 20,244 | 4.94ms |
+-----------+-----------------------+----------------+---------------+
| Falcon | gunicorn + meinheld | 18,972 | 5.27ms |
+-----------+-----------------------+----------------+---------------+
| Bottle | gunicorn + meinheld | 13,596 | 7.36ms |
+-----------+-----------------------+----------------+---------------+
| Flask | gunicorn + meinheld | 4,988 | 20.08ms |
+-----------+-----------------------+----------------+---------------+
| Kyoukai | Python 3.5 + uvloop | 3,889 | 27.44ms |
+-----------+-----------------------+----------------+---------------+
| Aiohttp | Python 3.5 + uvloop | 2,979 | 33.42ms |
+-----------+-----------------------+----------------+---------------+
| Tornado | Python 3.5 | 2,138 | 46.66ms |
+-----------+-----------------------+----------------+---------------+
Hello World Example
-------------------
@@ -84,9 +56,18 @@ Documentation
.. |PyPI version| image:: https://img.shields.io/pypi/pyversions/sanic.svg
:target: https://pypi.python.org/pypi/sanic/
Examples
--------
`Non-Core examples <https://github.com/channelcat/sanic/wiki/Examples/>`_. Examples of plugins and Sanic that are outside the scope of Sanic core.
`Extensions <https://github.com/channelcat/sanic/wiki/Extensions/>`_. Sanic extensions created by the community.
`Projects <https://github.com/channelcat/sanic/wiki/Projects/>`_. Sanic in production use.
TODO
----
* Streamed file processing
* http2
Limitations


@@ -1,20 +1,225 @@
# Minimal makefile for Sphinx documentation
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SPHINXPROJ = Sanic
SOURCEDIR = .
PAPER =
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " applehelp to make an Apple Help Book"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " epub3 to make an epub3"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " coverage to run coverage check of the documentation (if enabled)"
@echo " dummy to check syntax errors of document sources"
.PHONY: help Makefile
.PHONY: clean
clean:
rm -rf $(BUILDDIR)/*
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: html
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
.PHONY: dirhtml
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
.PHONY: singlehtml
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
.PHONY: pickle
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
.PHONY: json
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
.PHONY: htmlhelp
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
.PHONY: qthelp
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/aiographite.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/aiographite.qhc"
.PHONY: applehelp
applehelp:
$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
@echo
@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
@echo "N.B. You won't be able to view it unless you put it in" \
"~/Library/Documentation/Help or install it in your application" \
"bundle."
.PHONY: devhelp
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/aiographite"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/aiographite"
@echo "# devhelp"
.PHONY: epub
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
.PHONY: epub3
epub3:
$(SPHINXBUILD) -b epub3 $(ALLSPHINXOPTS) $(BUILDDIR)/epub3
@echo
@echo "Build finished. The epub3 file is in $(BUILDDIR)/epub3."
.PHONY: latex
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
.PHONY: latexpdf
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: latexpdfja
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: text
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
.PHONY: man
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
.PHONY: texinfo
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
.PHONY: info
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
.PHONY: gettext
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
.PHONY: changes
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
.PHONY: linkcheck
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
.PHONY: doctest
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
.PHONY: coverage
coverage:
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
@echo "Testing of coverage in the sources finished, look at the " \
"results in $(BUILDDIR)/coverage/python.txt."
.PHONY: xml
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
.PHONY: pseudoxml
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
.PHONY: dummy
dummy:
$(SPHINXBUILD) -b dummy $(ALLSPHINXOPTS) $(BUILDDIR)/dummy
@echo
@echo "Build finished. Dummy builder generates no files."


@@ -13,6 +13,9 @@ import sys
# Add support for Markdown documentation using Recommonmark
from recommonmark.parser import CommonMarkParser
# Add support for auto-doc
from recommonmark.transform import AutoStructify
# Ensure that sanic is present in the path, to allow sphinx-apidoc to
# autogenerate documentation from docstrings
root_directory = os.path.dirname(os.getcwd())
@@ -140,3 +143,12 @@ epub_exclude_files = ['search.html']
# -- Custom Settings -------------------------------------------------------
suppress_warnings = ['image.nonlocal_uri']
# app setup hook
def setup(app):
app.add_config_value('recommonmark_config', {
'enable_eval_rst': True,
'enable_auto_doc_ref': True,
}, True)
app.add_transform(AutoStructify)


@@ -16,6 +16,7 @@ Guides
sanic/blueprints
sanic/config
sanic/cookies
sanic/decorators
sanic/streaming
sanic/class_based_views
sanic/custom_protocol
@@ -25,6 +26,7 @@ Guides
sanic/deploying
sanic/extensions
sanic/contributing
sanic/api_reference
Module Documentation
@@ -33,4 +35,5 @@ Module Documentation
.. toctree::
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`


@@ -1,19 +1,64 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
set SPHINXPROJ=Sanic
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
set I18NSPHINXOPTS=%SPHINXOPTS% .
if NOT "%PAPER%" == "" (
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
)
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if "%1" == "help" (
:help
echo.Please use `make ^<target^>` where ^<target^> is one of
echo. html to make standalone HTML files
echo. dirhtml to make HTML files named index.html in directories
echo. singlehtml to make a single large HTML file
echo. pickle to make pickle files
echo. json to make JSON files
echo. htmlhelp to make HTML files and a HTML help project
echo. qthelp to make HTML files and a qthelp project
echo. devhelp to make HTML files and a Devhelp project
echo. epub to make an epub
echo. epub3 to make an epub3
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
echo. text to make text files
echo. man to make manual pages
echo. texinfo to make Texinfo files
echo. gettext to make PO message catalogs
echo. changes to make an overview over all changed/added/deprecated items
echo. xml to make Docutils-native XML files
echo. pseudoxml to make pseudoxml-XML files for display purposes
echo. linkcheck to check all external links for integrity
echo. doctest to run all doctests embedded in the documentation if enabled
echo. coverage to run coverage check of the documentation if enabled
echo. dummy to check syntax errors of document sources
goto end
)
if "%1" == "clean" (
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
del /q /s %BUILDDIR%\*
goto end
)
REM Check if sphinx-build is available and fallback to Python version if any
%SPHINXBUILD% 1>NUL 2>NUL
if errorlevel 9009 goto sphinx_python
goto sphinx_ok
:sphinx_python
set SPHINXBUILD=python -m sphinx.__init__
%SPHINXBUILD% 2> nul
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
@@ -26,11 +71,211 @@ if errorlevel 9009 (
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
goto end
:sphinx_ok
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
if "%1" == "html" (
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
goto end
)
if "%1" == "dirhtml" (
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
goto end
)
if "%1" == "singlehtml" (
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
goto end
)
if "%1" == "pickle" (
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the pickle files.
goto end
)
if "%1" == "json" (
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the JSON files.
goto end
)
if "%1" == "htmlhelp" (
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
goto end
)
if "%1" == "qthelp" (
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\aiographite.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\aiographite.ghc
goto end
)
if "%1" == "devhelp" (
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished.
goto end
)
if "%1" == "epub" (
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub file is in %BUILDDIR%/epub.
goto end
)
if "%1" == "epub3" (
%SPHINXBUILD% -b epub3 %ALLSPHINXOPTS% %BUILDDIR%/epub3
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub3 file is in %BUILDDIR%/epub3.
goto end
)
if "%1" == "latex" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
if errorlevel 1 exit /b 1
echo.
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdf" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf
cd %~dp0
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdfja" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf-ja
cd %~dp0
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "text" (
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The text files are in %BUILDDIR%/text.
goto end
)
if "%1" == "man" (
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The manual pages are in %BUILDDIR%/man.
goto end
)
if "%1" == "texinfo" (
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
goto end
)
if "%1" == "gettext" (
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
goto end
)
if "%1" == "changes" (
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
if errorlevel 1 exit /b 1
echo.
echo.The overview file is in %BUILDDIR%/changes.
goto end
)
if "%1" == "linkcheck" (
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
if errorlevel 1 exit /b 1
echo.
echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
goto end
)
if "%1" == "doctest" (
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
if errorlevel 1 exit /b 1
echo.
echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
goto end
)
if "%1" == "coverage" (
%SPHINXBUILD% -b coverage %ALLSPHINXOPTS% %BUILDDIR%/coverage
if errorlevel 1 exit /b 1
echo.
echo.Testing of coverage in the sources finished, look at the ^
results in %BUILDDIR%/coverage/python.txt.
goto end
)
if "%1" == "xml" (
%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The XML files are in %BUILDDIR%/xml.
goto end
)
if "%1" == "pseudoxml" (
%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
goto end
)
if "%1" == "dummy" (
%SPHINXBUILD% -b dummy %ALLSPHINXOPTS% %BUILDDIR%/dummy
if errorlevel 1 exit /b 1
echo.
echo.Build finished. Dummy builder generates no files.
goto end
)
:end
popd


@@ -0,0 +1,150 @@
API Reference
=============
Submodules
----------
sanic.app module
----------------
.. automodule:: sanic.app
:members:
:undoc-members:
:show-inheritance:
sanic.blueprints module
-----------------------
.. automodule:: sanic.blueprints
:members:
:undoc-members:
:show-inheritance:
sanic.config module
-------------------
.. automodule:: sanic.config
:members:
:undoc-members:
:show-inheritance:
sanic.constants module
----------------------
.. automodule:: sanic.constants
:members:
:undoc-members:
:show-inheritance:
sanic.cookies module
--------------------
.. automodule:: sanic.cookies
:members:
:undoc-members:
:show-inheritance:
sanic.exceptions module
-----------------------
.. automodule:: sanic.exceptions
:members:
:undoc-members:
:show-inheritance:
sanic.handlers module
---------------------
.. automodule:: sanic.handlers
:members:
:undoc-members:
:show-inheritance:
sanic.log module
----------------
.. automodule:: sanic.log
:members:
:undoc-members:
:show-inheritance:
sanic.request module
--------------------
.. automodule:: sanic.request
:members:
:undoc-members:
:show-inheritance:
sanic.response module
---------------------
.. automodule:: sanic.response
:members:
:undoc-members:
:show-inheritance:
sanic.router module
-------------------
.. automodule:: sanic.router
:members:
:undoc-members:
:show-inheritance:
sanic.server module
-------------------
.. automodule:: sanic.server
:members:
:undoc-members:
:show-inheritance:
sanic.static module
-------------------
.. automodule:: sanic.static
:members:
:undoc-members:
:show-inheritance:
sanic.testing module
--------------------
.. automodule:: sanic.testing
:members:
:undoc-members:
:show-inheritance:
sanic.views module
------------------
.. automodule:: sanic.views
:members:
:undoc-members:
:show-inheritance:
sanic.websocket module
----------------------
.. automodule:: sanic.websocket
:members:
:undoc-members:
:show-inheritance:
sanic.worker module
-------------------
.. automodule:: sanic.worker
:members:
:undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: sanic
:members:
:undoc-members:
:show-inheritance:


@@ -169,7 +169,7 @@ app.run(host='0.0.0.0', port=8000, debug=True)
If you wish to generate a URL for a route inside of a blueprint, remember that the endpoint name
takes the format `<blueprint_name>.<handler_name>`. For example:
```
```python
@blueprint_v1.route('/')
async def root(request):
url = app.url_for('v1.post_handler', post_id=5) # --> '/v1/post/5'

View File

@@ -31,10 +31,10 @@ There are several ways how to load configuration.
### From environment variables.
Any variables defined with the `SANIC_` prefix will be applied to the sanic config. For example, setting `SANIC_REQUEST_TIMEOUT` will be loaded by the application automatically. You can pass the `load_vars` boolean to the Sanic constructor to override that:
Any variables defined with the `SANIC_` prefix will be applied to the sanic config. For example, setting `SANIC_REQUEST_TIMEOUT` will be loaded by the application automatically. You can pass the `load_env` boolean to the Sanic constructor to override that:
```python
app = Sanic(load_vars=False)
app = Sanic(load_env=False)
```
### From an Object
@@ -52,7 +52,7 @@ You could use a class or any other object as well.
### From a File
Usually you will want to load configuration from a file that is not part of the distributed application. You can load configuration from a file using `from_file(/path/to/config_file)`. However, that requires the program to know the path to the config file. So instead you can specify the location of the config file in an environment variable and tell Sanic to use that to find the config file:
Usually you will want to load configuration from a file that is not part of the distributed application. You can load configuration from a file using `from_pyfile(/path/to/config_file)`. However, that requires the program to know the path to the config file. So instead you can specify the location of the config file in an environment variable and tell Sanic to use that to find the config file:
```
app = Sanic('myapp')
@@ -66,7 +66,7 @@ $ MYAPP_SETTINGS=/path/to/config_file python3 myapp.py
INFO: Goin' Fast @ http://0.0.0.0:8000
```
The config files are regular Python files which are executed in order to load them. This allows you to use arbitrary logic for constructing the right configuration. Only uppercase varibales are added to the configuration. Most commonly the configuration consists of simple key value pairs:
The config files are regular Python files which are executed in order to load them. This allows you to use arbitrary logic for constructing the right configuration. Only uppercase variables are added to the configuration. Most commonly the configuration consists of simple key value pairs:
```
# config_file


@@ -1,75 +0,0 @@
# Cookies
Cookies are pieces of data which persist inside a user's browser. Sanic can
both read and write cookies, which are stored as key-value pairs.
## Reading cookies
A user's cookies can be accessed via the `Request` object's `cookies` dictionary.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
test_cookie = request.cookies.get('test')
return text("Test cookie set to: {}".format(test_cookie))
```
## Writing cookies
When returning a response, cookies can be set on the `Response` object.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("There's a cookie up in this response")
response.cookies['test'] = 'It worked!'
response.cookies['test']['domain'] = '.gotta-go-fast.com'
response.cookies['test']['httponly'] = True
return response
```
## Deleting cookies
Cookies can be removed semantically or explicitly.
```python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("Time to eat some cookies muahaha")
# This cookie will be set to expire in 0 seconds
del response.cookies['kill_me']
# This cookie will self destruct in 5 seconds
response.cookies['short_life'] = 'Glad to be here'
response.cookies['short_life']['max-age'] = 5
del response.cookies['favorite_color']
# This cookie will remain unchanged
response.cookies['favorite_color'] = 'blue'
response.cookies['favorite_color'] = 'pink'
del response.cookies['favorite_color']
return response
```
Response cookies can be set like dictionary values and have the following
parameters available:
- `expires` (datetime): The time for the cookie to expire on the
client's browser.
- `path` (string): The subset of URLs to which this cookie applies. Defaults to /.
- `comment` (string): A comment (metadata).
- `domain` (string): Specifies the domain for which the cookie is valid. An
explicitly specified domain must always start with a dot.
- `max-age` (number): Number of seconds the cookie should live for.
- `secure` (boolean): Specifies whether the cookie will only be sent via
HTTPS.
- `httponly` (boolean): Specifies whether the cookie cannot be read by
Javascript.

docs/sanic/cookies.rst Normal file
View File

@@ -0,0 +1,87 @@
Cookies
=======
Cookies are pieces of data which persist inside a user's browser. Sanic can
both read and write cookies, which are stored as key-value pairs.
.. warning::
    Cookies can be freely altered by the client, so you cannot store data such
    as login information in them as-is. To ensure the data you store in cookies
    is not forged or tampered with, use something like `itsdangerous`_ to
    cryptographically sign the data.
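A minimal sketch of that idea using the ``Signer`` class from `itsdangerous`_ (the secret key and cookie name are illustrative only):

.. code-block:: python

    from itsdangerous import Signer

    signer = Signer('a-secret-kept-out-of-source-control')

    # Sign the value before storing it in the cookie ...
    response.cookies['session_token'] = signer.sign('user-42').decode('utf-8')

    # ... and verify the signature when reading it back; a tampered value
    # raises itsdangerous.BadSignature.
    value = signer.unsign(request.cookies['session_token']).decode('utf-8')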
Reading cookies
---------------
A user's cookies can be accessed via the ``Request`` object's ``cookies`` dictionary.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
test_cookie = request.cookies.get('test')
return text("Test cookie set to: {}".format(test_cookie))
Writing cookies
---------------
When returning a response, cookies can be set on the ``Response`` object.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("There's a cookie up in this response")
response.cookies['test'] = 'It worked!'
response.cookies['test']['domain'] = '.gotta-go-fast.com'
response.cookies['test']['httponly'] = True
return response
Deleting cookies
----------------
Cookies can be removed semantically or explicitly.
.. code-block:: python
from sanic.response import text
@app.route("/cookie")
async def test(request):
response = text("Time to eat some cookies muahaha")
# This cookie will be set to expire in 0 seconds
del response.cookies['kill_me']
# This cookie will self destruct in 5 seconds
response.cookies['short_life'] = 'Glad to be here'
response.cookies['short_life']['max-age'] = 5
del response.cookies['favorite_color']
# This cookie will remain unchanged
response.cookies['favorite_color'] = 'blue'
response.cookies['favorite_color'] = 'pink'
del response.cookies['favorite_color']
return response
Response cookies can be set like dictionary values and have the following
parameters available:
- ``expires`` (datetime): The time for the cookie to expire on the client's browser.
- ``path`` (string): The subset of URLs to which this cookie applies. Defaults to /.
- ``comment`` (string): A comment (metadata).
- ``domain`` (string): Specifies the domain for which the cookie is valid. An
explicitly specified domain must always start with a dot.
- ``max-age`` (number): Number of seconds the cookie should live for.
- ``secure`` (boolean): Specifies whether the cookie will only be sent via HTTPS.
- ``httponly`` (boolean): Specifies whether the cookie cannot be read by Javascript.
.. _itsdangerous: https://pythonhosted.org/itsdangerous/

View File

@@ -54,9 +54,15 @@ In order to run Sanic application with Gunicorn, you need to use the special `sa
for Gunicorn `worker-class` argument:
```
gunicorn --bind 0.0.0.0:1337 --worker-class sanic.worker.GunicornWorker
gunicorn myapp:app --bind 0.0.0.0:1337 --worker-class sanic.worker.GunicornWorker
```
If your application suffers from memory leaks, you can configure Gunicorn to gracefully restart a worker
after it has processed a given number of requests. This can be a convenient way to help limit the effects
of the memory leak.
See the [Gunicorn Docs](http://docs.gunicorn.org/en/latest/settings.html#max-requests) for more information.
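For example (the worker count and request threshold here are illustrative, not recommendations), that restart behaviour is controlled with Gunicorn's `max-requests` setting:
```
gunicorn myapp:app --bind 0.0.0.0:1337 --worker-class sanic.worker.GunicornWorker \
    --workers 4 --max-requests 1000 --max-requests-jitter 50
```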
## Asynchronous support
This is suitable if you *need* to share the sanic process with other applications, in particular the `loop`.
However, be advised that this method does not support using multiple processes, and is not the preferred way

View File

@@ -17,6 +17,19 @@ def i_am_ready_to_die(request):
raise ServerError("Something bad happened", status_code=500)
```
You can also use the `abort` function with the appropriate status code:
```python
from sanic.exceptions import abort
from sanic.response import text
@app.route('/youshallnotpass')
def no_no(request):
abort(401)
# this won't happen
text("OK")
```
## Handling exceptions
To override Sanic's default handling of an exception, the `@app.exception`
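A minimal sketch of a handler registered with this decorator (the handler name and message are illustrative):
```python
from sanic.exceptions import NotFound
from sanic.response import text

@app.exception(NotFound)
async def ignore_404s(request, exception):
    return text("Oops, nothing at {}".format(request.url))
```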

View File

@@ -22,3 +22,6 @@ A list of Sanic extensions created by the community.
- [sanic-graphql](https://github.com/graphql-python/sanic-graphql): GraphQL integration with Sanic
- [sanic-prometheus](https://github.com/dkruchinin/sanic-prometheus): Prometheus metrics for Sanic
- [Sanic-RestPlus](https://github.com/ashleysommer/sanic-restplus): A port of Flask-RestPlus for Sanic. Full-featured REST API with SwaggerUI generation.
- [sanic-transmute](https://github.com/yunstanford/sanic-transmute): A Sanic extension that generates APIs from python function and classes, and also generates Swagger UI/documentation automatically.
- [pytest-sanic](https://github.com/yunstanford/pytest-sanic): A pytest plugin for Sanic. It helps you to test your code asynchronously.
- [jinja2-sanic](https://github.com/yunstanford/jinja2-sanic): a jinja2 template renderer for Sanic.([Documentation](http://jinja2-sanic.readthedocs.io/en/latest/))

View File

@@ -1,11 +1,11 @@
# Logging
Sanic allows you to do different types of logging (access log, error log) on the requests based on the [python3 logging API](https://docs.python.org/3/howto/logging.html). You should have some basic knowledge on python3 logging if you want do create a new configuration.
Sanic allows you to do different types of logging (access log, error log) on the requests based on the [python3 logging API](https://docs.python.org/3/howto/logging.html). You should have some basic knowledge on python3 logging if you want to create a new configuration.
### Quck Start
### Quick Start
A simple example using default setting would be like this:
A simple example using default settings would be like this:
```python
from sanic import Sanic
@@ -14,7 +14,7 @@ from sanic.config import LOGGING
# The default logging handlers are ['accessStream', 'errorStream']
# but we change it to use other handlers here for demo purpose
LOGGING['loggers']['network']['handlers'] = [
'accessTimedRotatingFile', 'errorTimedRotationgFile']
'accessSysLog', 'errorSysLog']
app = Sanic('test')
@@ -26,8 +26,6 @@ if __name__ == "__main__":
app.run(log_config=LOGGING)
```
After the program starts, it will log all information/requests to access.log and error.log in your working directory.
To disable logging, simply assign log_config=None:
```python
@@ -76,21 +74,14 @@ By default, log_config parameter is set to use sanic.config.LOGGING dictionary f
(Notice that in Docker you have to enable everything by yourself)
- accessTimedRotatingFile (using [logging.handlers.TimedRotatingFileHandler](https://docs.python.org/3/library/logging.handlers.html#logging.handlers.TimedRotatingFileHandler))<br>
For requests information logging to file with daily rotation support.
- errorTimedRotatingFile (using [logging.handlers.TimedRotatingFileHandler](https://docs.python.org/3/library/logging.handlers.html#logging.handlers.TimedRotatingFileHandler))<br>
For error message and traceback logging to file with daily rotation support.
And `filters`:
- accessFilter (using sanic.defaultFilter.DefaultFilter)<br>
- accessFilter (using sanic.log.DefaultFilter)<br>
The filter that allows only levels in `DEBUG`, `INFO`, and `NONE(0)`
- errorFilter (using sanic.defaultFilter.DefaultFilter)<br>
The filter taht allows only levels in `WARNING`, `ERROR`, and `CRITICAL`
- errorFilter (using sanic.log.DefaultFilter)<br>
The filter that allows only levels in `WARNING`, `ERROR`, and `CRITICAL`
There are two `loggers` used in sanic, and **must be defined if you want to create your own logging configuration**:

View File

@@ -60,6 +60,16 @@ async def index(request):
return response.stream(streaming_fn, content_type='text/plain')
```
## File Streaming
For large files, use a combination of the File and Streaming responses described above:
```python
from sanic import response
@app.route('/big_file.png')
async def handle_request(request):
return await response.file_stream('/srv/www/whatever.png')
```
## Redirect
```python

View File

@@ -214,4 +214,3 @@ and `recv` methods to send and receive data respectively.
WebSocket support requires the [websockets](https://github.com/aaugustin/websockets)
package by Aymeric Augustin.

View File

@@ -1,5 +1,79 @@
# Streaming
## Request Streaming
Sanic allows you to get request data as a stream, as shown below. When the request ends, `request.stream.get()` returns `None`. Only the post, put, and patch decorators have a `stream` argument.
```python
from sanic import Sanic
from sanic.views import CompositionView
from sanic.views import HTTPMethodView
from sanic.views import stream as stream_decorator
from sanic.blueprints import Blueprint
from sanic.response import stream, text
bp = Blueprint('blueprint_request_stream')
app = Sanic('request_stream')
class SimpleView(HTTPMethodView):
@stream_decorator
async def post(self, request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
@app.post('/stream', stream=True)
async def handler(request):
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
body = body.decode('utf-8').replace('1', 'A')
response.write(body)
return stream(streaming)
@bp.put('/bp_stream', stream=True)
async def bp_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
async def post_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
app.blueprint(bp)
app.add_route(SimpleView.as_view(), '/method_view')
view = CompositionView()
view.add(['POST'], post_handler, stream=True)
app.add_route(view, '/composition_view')
if __name__ == '__main__':
app.run(host='127.0.0.1', port=8000)
```
## Response Streaming
Sanic allows you to stream content to the client with the `stream` method. This method accepts a coroutine callback which is passed a `StreamingHTTPResponse` object to write to. A simple example follows:
```python
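# A minimal sketch of response streaming (the handler and callback names are
# illustrative; response.write is used as in the request-streaming example above):
from sanic import Sanic
from sanic.response import stream

app = Sanic(__name__)

@app.route("/")
async def index(request):
    async def streaming_fn(response):
        response.write('foo,')
        response.write('bar')
    return stream(streaming_fn, content_type='text/csv')
```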

View File

@@ -57,3 +57,71 @@ def test_post_json_request_includes_data():
More information about
the available arguments to aiohttp can be found
[in the documentation for ClientSession](https://aiohttp.readthedocs.io/en/stable/client_reference.html#client-session).
## pytest-sanic
[pytest-sanic](https://github.com/yunstanford/pytest-sanic) is a pytest plugin that helps you test your code asynchronously.
Just write tests like this:
```python
async def test_sanic_db_find_by_id(app):
"""
Let's assume that, in db we have,
{
"id": "123",
"name": "Kobe Bryant",
"team": "Lakers",
}
"""
doc = await app.db["players"].find_by_id("123")
assert doc.name == "Kobe Bryant"
assert doc.team == "Lakers"
```
[pytest-sanic](https://github.com/yunstanford/pytest-sanic) also provides some useful fixtures, such as `loop`, `unused_port`,
`test_server`, and `test_client`.
```python
@pytest.yield_fixture
def app():
app = Sanic("test_sanic_app")
@app.route("/test_get", methods=['GET'])
async def test_get(request):
return response.json({"GET": True})
@app.route("/test_post", methods=['POST'])
async def test_post(request):
return response.json({"POST": True})
yield app
@pytest.fixture
def test_cli(loop, app, test_client):
return loop.run_until_complete(test_client(app, protocol=WebSocketProtocol))
#########
# Tests #
#########
async def test_fixture_test_client_get(test_cli):
"""
GET request
"""
resp = await test_cli.get('/test_get')
assert resp.status == 200
resp_json = await resp.json()
assert resp_json == {"GET": True}
async def test_fixture_test_client_post(test_cli):
"""
POST request
"""
resp = await test_cli.post('/test_post')
assert resp.status == 200
resp_json = await resp.json()
assert resp_json == {"POST": True}
```

docs/sanic/versioning.md Normal file
View File

@@ -0,0 +1,50 @@
# Versioning
You can pass the `version` keyword to the route decorators, or to a blueprint initializer. It will result in a `v{version}` URL prefix, where `{version}` is the version number.
## Per route
You can pass a version number to the routes directly.
```python
from sanic import response
@app.route('/text', version=1)
def handle_request(request):
return response.text('Hello world! Version 1')
@app.route('/text', version=2)
def handle_request(request):
return response.text('Hello world! Version 2')
app.run(port=80)
```
Then with curl:
```bash
curl localhost/v1/text
curl localhost/v2/text
```
## Global blueprint version
You can also pass a version number to the blueprint, which will apply to all routes.
```python
from sanic import response
from sanic.blueprints import Blueprint
bp = Blueprint('test', version=1)
@bp.route('/html')
def handle_request(request):
return response.html('<p>Hello world!</p>')
```
Then with curl:
```bash
curl localhost/v1/html
```

View File

@@ -1,26 +0,0 @@
from sanic import Sanic
from sanic import response
import aiohttp
app = Sanic(__name__)
async def fetch(session, url):
"""
Use session object to perform 'get' request on url
"""
async with session.get(url) as result:
return await result.json()
@app.route('/')
async def handle_request(request):
url = "https://api.github.com/repos/channelcat/sanic"
async with aiohttp.ClientSession() as session:
result = await fetch(session, url)
return response.json(result)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, workers=2)

View File

@@ -1,140 +0,0 @@
from sanic import Sanic
from sanic.exceptions import NotFound
from sanic.response import json
from sanic.views import HTTPMethodView
from asyncorm import configure_orm
from asyncorm.exceptions import QuerysetError
from library.models import Book
from library.serializer import BookSerializer
app = Sanic(name=__name__)
@app.listener('before_server_start')
def orm_configure(sanic, loop):
db_config = {'database': 'sanic_example',
'host': 'localhost',
'user': 'sanicdbuser',
'password': 'sanicDbPass',
}
# configure_orm needs a dictionary with:
# * the database configuration
# * the application/s where the models are defined
orm_app = configure_orm({'loop': loop, # always use the sanic loop!
'db_config': db_config,
'modules': ['library', ], # list of apps
})
# orm_app is the object that orchestrates the whole ORM
# sync_db should be run only once, better do that as external command
# it creates the tables in the database!!!!
# orm_app.sync_db()
# for all the 404s, let's handle the exceptions
@app.exception(NotFound)
def ignore_404s(request, exception):
return json({'method': request.method,
'status': exception.status_code,
'error': exception.args[0],
'results': None,
})
# now the proper sanic workflow
class BooksView(HTTPMethodView):
def arg_parser(self, request):
parsed_args = {}
for k, v in request.args.items():
parsed_args[k] = v[0]
return parsed_args
async def get(self, request):
filtered_by = self.arg_parser(request)
if filtered_by:
q_books = await Book.objects.filter(**filtered_by)
else:
q_books = await Book.objects.all()
books = [BookSerializer.serialize(book) for book in q_books]
return json({'method': request.method,
'status': 200,
'results': books or None,
'count': len(books),
})
async def post(self, request):
# populate the book with the data in the request
book = Book(**request.json)
# and await on save
await book.save()
return json({'method': request.method,
'status': 201,
'results': BookSerializer.serialize(book),
})
class BookView(HTTPMethodView):
async def get_object(self, request, book_id):
try:
# await on database consults
book = await Book.objects.get(**{'id': book_id})
except QuerysetError as e:
raise NotFound(e.args[0])
return book
async def get(self, request, book_id):
# await on database consults
book = await self.get_object(request, book_id)
return json({'method': request.method,
'status': 200,
'results': BookSerializer.serialize(book),
})
async def put(self, request, book_id):
# await on database consults
book = await self.get_object(request, book_id)
# await on save
await book.save(**request.json)
return json({'method': request.method,
'status': 200,
'results': BookSerializer.serialize(book),
})
async def patch(self, request, book_id):
# await on database consults
book = await self.get_object(request, book_id)
# await on save
await book.save(**request.json)
return json({'method': request.method,
'status': 200,
'results': BookSerializer.serialize(book),
})
async def delete(self, request, book_id):
# await on database consults
book = await self.get_object(request, book_id)
# await on its deletion
await book.delete()
return json({'method': request.method,
'status': 200,
'results': None
})
app.add_route(BooksView.as_view(), '/books/')
app.add_route(BookView.as_view(), '/books/<book_id:int>/')
if __name__ == '__main__':
app.run()

View File

@@ -1,21 +0,0 @@
from asyncorm.model import Model
from asyncorm.fields import CharField, IntegerField, DateField
BOOK_CHOICES = (
('hard cover', 'hard cover book'),
('paperback', 'paperback book')
)
# This is a simple model definition
class Book(Model):
name = CharField(max_length=50)
synopsis = CharField(max_length=255)
book_type = CharField(max_length=15, null=True, choices=BOOK_CHOICES)
pages = IntegerField(null=True)
date_created = DateField(auto_now=True)
class Meta():
ordering = ['name', ]
unique_together = ['name', 'synopsis']

View File

@@ -1,15 +0,0 @@
from asyncorm.model import ModelSerializer, SerializerMethod
from library.models import Book
class BookSerializer(ModelSerializer):
book_type = SerializerMethod()
def get_book_type(self, instance):
return instance.book_type_display()
class Meta():
model = Book
fields = [
'id', 'name', 'synopsis', 'book_type', 'pages', 'date_created'
]

View File

@@ -1,2 +0,0 @@
asyncorm==0.0.7
sanic==0.4.1

View File

@@ -1,12 +1,12 @@
from sanic import Sanic
from sanic import Blueprint
from sanic.response import json, text
from sanic.response import json
app = Sanic(__name__)
blueprint = Blueprint('name', url_prefix='/my_blueprint')
blueprint2 = Blueprint('name', url_prefix='/my_blueprint2')
blueprint3 = Blueprint('name', url_prefix='/my_blueprint3')
blueprint2 = Blueprint('name2', url_prefix='/my_blueprint2')
blueprint3 = Blueprint('name3', url_prefix='/my_blueprint3')
@blueprint.route('/foo')
@@ -18,6 +18,7 @@ async def foo(request):
async def foo2(request):
return json({'msg': 'hi from blueprint2'})
@blueprint3.websocket('/foo')
async def foo3(request, ws):
while True:

View File

@@ -1,46 +0,0 @@
"""
Example of caching using aiocache package. To run it you will need a Redis
instance running in localhost:6379. You can also try with SimpleMemoryCache.
Running this example you will see that the first call lasts 3 seconds and
the rest are instant because the value is retrieved from Redis.
If you want more info about the package check
https://github.com/argaen/aiocache
"""
import asyncio
import aiocache
from sanic import Sanic
from sanic.response import json
from sanic.log import log
from aiocache import cached
from aiocache.serializers import JsonSerializer
app = Sanic(__name__)
@app.listener('before_server_start')
def init_cache(sanic, loop):
aiocache.settings.set_defaults(
class_="aiocache.RedisCache",
# class_="aiocache.SimpleMemoryCache",
loop=loop
)
@cached(key="my_custom_key", serializer=JsonSerializer())
async def expensive_call():
log.info("Expensive has been called")
await asyncio.sleep(3)
return {"test": True}
@app.route("/")
async def test(request):
log.info("Received GET /")
return json(await expensive_call())
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,41 +0,0 @@
from sanic import Sanic
from sanic import response
from tornado.platform.asyncio import BaseAsyncIOLoop, to_asyncio_future
from distributed import LocalCluster, Client
app = Sanic(__name__)
def square(x):
return x**2
@app.listener('after_server_start')
async def setup(app, loop):
# configure tornado use asyncio's loop
ioloop = BaseAsyncIOLoop(loop)
# init distributed client
app.client = Client('tcp://localhost:8786', loop=ioloop, start=False)
await to_asyncio_future(app.client._start())
@app.listener('before_server_stop')
async def stop(app, loop):
await to_asyncio_future(app.client._shutdown())
@app.route('/<value:int>')
async def test(request, value):
future = app.client.submit(square, value)
result = await to_asyncio_future(future._result())
return response.text(f'The square of {value} is {result}')
if __name__ == '__main__':
# Distributed cluster should run somewhere else
with LocalCluster(scheduler_port=8786, nanny=False, n_workers=2,
threads_per_worker=1) as cluster:
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,136 +0,0 @@
# This demo requires aioredis and environment variables established in ENV_VARS
import json
import logging
import os
from datetime import datetime
import aioredis
import sanic
from sanic import Sanic
ENV_VARS = ["REDIS_HOST", "REDIS_PORT",
"REDIS_MINPOOL", "REDIS_MAXPOOL",
"REDIS_PASS", "APP_LOGFILE"]
app = Sanic(name=__name__)
logger = None
@app.middleware("request")
async def log_uri(request):
# Simple middleware to log the URI endpoint that was called
logger.info("URI called: {0}".format(request.url))
@app.listener('before_server_start')
async def before_server_start(app, loop):
logger.info("Starting redis pool")
app.redis_pool = await aioredis.create_pool(
(app.config.REDIS_HOST, int(app.config.REDIS_PORT)),
minsize=int(app.config.REDIS_MINPOOL),
maxsize=int(app.config.REDIS_MAXPOOL),
password=app.config.REDIS_PASS)
@app.listener('after_server_stop')
async def after_server_stop(app, loop):
logger.info("Closing redis pool")
app.redis_pool.close()
await app.redis_pool.wait_closed()
@app.middleware("request")
async def attach_db_connectors(request):
# Just put the db objects in the request for easier access
logger.info("Passing redis pool to request object")
request["redis"] = request.app.redis_pool
@app.route("/state/<user_id>", methods=["GET"])
async def access_state(request, user_id):
try:
# Check to see if the value is in cache, if so lets return that
with await request["redis"] as redis_conn:
state = await redis_conn.get(user_id, encoding="utf-8")
if state:
return sanic.response.json({"msg": "Success",
"status": 200,
"success": True,
"data": json.loads(state),
"finished_at": datetime.now().isoformat()})
# Then state object is not in redis
logger.critical("Unable to find user_data in cache.")
return sanic.response.HTTPResponse({"msg": "User state not found",
"success": False,
"status": 404,
"finished_at": datetime.now().isoformat()}, status=404)
except aioredis.ProtocolError:
logger.critical("Unable to connect to state cache")
return sanic.response.HTTPResponse({"msg": "Internal Server Error",
"status": 500,
"success": False,
"finished_at": datetime.now().isoformat()}, status=500)
@app.route("/state/<user_id>/push", methods=["POST"])
async def set_state(request, user_id):
try:
# Pull a connection from the pool
with await request["redis"] as redis_conn:
# Set the value in cache to your new value
await redis_conn.set(user_id, json.dumps(request.json), expire=1800)
logger.info("Successfully pushed state to cache")
return sanic.response.HTTPResponse({"msg": "Successfully pushed state to cache",
"success": True,
"status": 200,
"finished_at": datetime.now().isoformat()})
except aioredis.ProtocolError:
logger.critical("Unable to connect to state cache")
return sanic.response.HTTPResponse({"msg": "Internal Server Error",
"status": 500,
"success": False,
"finished_at": datetime.now().isoformat()}, status=500)
def configure():
# Setup environment variables
env_vars = [os.environ.get(v, None) for v in ENV_VARS]
if not all(env_vars):
# Send back environment variables that were not set
return False, ", ".join([ENV_VARS[i] for i, flag in env_vars if not flag])
else:
# Add all the env vars to our app config
app.config.update({k: v for k, v in zip(ENV_VARS, env_vars)})
setup_logging()
return True, None
def setup_logging():
logging_format = "[%(asctime)s] %(process)d-%(levelname)s "
logging_format += "%(module)s::%(funcName)s():l%(lineno)d: "
logging_format += "%(message)s"
logging.basicConfig(
filename=app.config.APP_LOGFILE,
format=logging_format,
level=logging.DEBUG)
def main(result, missing):
if result:
try:
app.run(host="0.0.0.0", port=8080, debug=True)
except:
logging.critical("User killed server. Closing")
else:
logging.critical("Unable to start. Missing environment variables [{0}]".format(missing))
if __name__ == "__main__":
result, missing = configure()
logger = logging.getLogger()
main(result, missing)

View File

@@ -37,7 +37,6 @@ server's error_handler to an instance of our CustomHandler
"""
from sanic import Sanic
from sanic import response
app = Sanic(__name__)
@@ -49,8 +48,7 @@ app.error_handler = handler
async def test(request):
# Here, something occurs which causes an unexpected exception
# This exception will flow to our custom handler.
1 / 0
return response.json({"test": True})
raise SanicException('You Broke It!')
app.run(host="0.0.0.0", port=8000, debug=True)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -1,28 +0,0 @@
# Render templates in a Flask like way from a "template" directory in
# the project
from sanic import Sanic
from sanic import response
from jinja2 import Environment, PackageLoader, select_autoescape
app = Sanic(__name__)
# Load the template environment with async support
template_env = Environment(
loader=PackageLoader('jinja_example', 'templates'),
autoescape=select_autoescape(['html', 'xml']),
enable_async=True
)
# Load the template from file
template = template_env.get_template("example_template.html")
@app.route('/')
async def test(request):
rendered_template = await template.render_async(
knights='that say nih; asynchronously')
return response.html(rendered_template)
app.run(host="0.0.0.0", port=8080, debug=True)

View File

@@ -1,8 +0,0 @@
aiofiles==0.3.1
httptools==0.0.9
Jinja2==2.9.6
MarkupSafe==1.0
sanic==0.5.2
ujson==1.35
uvloop==0.8.0
websockets==3.3

View File

@@ -1,10 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<title>My Webpage</title>
</head>
<body>
<h1>Hello World</h1>
<p>knights - {{ knights }}</p>
</body>
</html>

View File

@@ -8,11 +8,12 @@ app = Sanic(__name__)
sem = None
@app.listener('before_server_start')
def init(sanic, loop):
global sem
CONCURRENCY_PER_WORKER = 4
sem = asyncio.Semaphore(CONCURRENCY_PER_WORKER, loop=loop)
concurrency_per_worker = 4
sem = asyncio.Semaphore(concurrency_per_worker, loop=loop)
async def bounded_fetch(session, url):
"""

View File

@@ -7,6 +7,7 @@ from sanic import response
app = Sanic(__name__)
@app.route('/')
def handle_request(request):
return response.json(
@@ -15,6 +16,7 @@ def handle_request(request):
status=200
)
@app.route('/unauthorized')
def handle_request(request):
return response.json(

View File

@@ -14,6 +14,8 @@ log = logging.getLogger()
# Set logger to override default basicConfig
sanic = Sanic()
@sanic.route("/")
def test(request):
log.info("received request; responding with 'hey'")

View File

@@ -1,85 +0,0 @@
from sanic import Sanic
from sanic_session import InMemorySessionInterface
from sanic_jinja2 import SanicJinja2
import json
import plotly
import pandas as pd
import numpy as np
app = Sanic(__name__)
jinja = SanicJinja2(app)
session = InMemorySessionInterface(cookie_name=app.name, prefix=app.name)
@app.middleware('request')
async def print_on_request(request):
print(request.headers)
await session.open(request)
@app.middleware('response')
async def print_on_response(request, response):
await session.save(request, response)
@app.route('/')
async def index(request):
rng = pd.date_range('1/1/2011', periods=7500, freq='H')
ts = pd.Series(np.random.randn(len(rng)), index=rng)
graphs = [
dict(
data=[
dict(
x=[1, 2, 3],
y=[10, 20, 30],
type='scatter'
),
],
layout=dict(
title='first graph'
)
),
dict(
data=[
dict(
x=[1, 3, 5],
y=[10, 50, 30],
type='bar'
),
],
layout=dict(
title='second graph'
)
),
dict(
data=[
dict(
x=ts.index, # Can use the pandas data structures directly
y=ts
)
]
)
]
# Add "ids" to each of the graphs to pass up to the client
# for templating
ids = ['graph-{}'.format(i) for i, _ in enumerate(graphs)]
# Convert the figures to JSON
# PlotlyJSONEncoder appropriately converts pandas, datetime, etc
# objects to their JSON equivalents
graphJSON = json.dumps(graphs, cls=plotly.utils.PlotlyJSONEncoder)
return jinja.render('index.html', request,
ids=ids,
graphJSON=graphJSON)
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000, debug=True)

View File

@@ -1,5 +0,0 @@
pandas==0.19.2
plotly==2.0.7
sanic==0.5.0
sanic-jinja2==0.5.1
sanic-session==0.1.3

View File

@@ -8,6 +8,7 @@ app = Sanic(__name__)
def handle_request(request):
return response.redirect('/redirect')
@app.route('/redirect')
async def test(request):
return response.json({"Redirected": True})

View File

@@ -0,0 +1,10 @@
import requests
# Warning: This is a heavy process.
data = ""
for i in range(1, 250000):
data += str(i)
r = requests.post('http://0.0.0.0:8000/stream', data=data)
print(r.text)

View File

@@ -0,0 +1,65 @@
from sanic import Sanic
from sanic.views import CompositionView
from sanic.views import HTTPMethodView
from sanic.views import stream as stream_decorator
from sanic.blueprints import Blueprint
from sanic.response import stream, text
bp = Blueprint('blueprint_request_stream')
app = Sanic('request_stream')
class SimpleView(HTTPMethodView):
@stream_decorator
async def post(self, request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
@app.post('/stream', stream=True)
async def handler(request):
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
body = body.decode('utf-8').replace('1', 'A')
response.write(body)
return stream(streaming)
@bp.put('/bp_stream', stream=True)
async def bp_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
async def post_handler(request):
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
app.blueprint(bp)
app.add_route(SimpleView.as_view(), '/method_view')
view = CompositionView()
view.add(['POST'], post_handler, stream=True)
app.add_route(view, '/composition_view')
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)

View File

@@ -1,12 +1,12 @@
from sanic import Sanic
from sanic import response
from multiprocessing import Event
from signal import signal, SIGINT
import asyncio
import uvloop
app = Sanic(__name__)
@app.route("/")
async def test(request):
return response.json({"answer": "42"})

View File

@@ -1,62 +0,0 @@
# encoding: utf-8
"""
You need the aiomysql package.
"""
import os
import aiomysql
from sanic import Sanic
from sanic.response import json
database_name = os.environ['DATABASE_NAME']
database_host = os.environ['DATABASE_HOST']
database_user = os.environ['DATABASE_USER']
database_password = os.environ['DATABASE_PASSWORD']
app = Sanic()
@app.listener("before_server_start")
async def get_pool(app, loop):
"""
The first param is the global app instance, so we can store our
connection pool in it, and it can be used by different requests.
:param args:
:param kwargs:
:return:
"""
app.pool = {
"aiomysql": await aiomysql.create_pool(host=database_host, user=database_user, password=database_password,
db=database_name,
maxsize=5)}
async with app.pool['aiomysql'].acquire() as conn:
async with conn.cursor() as cur:
await cur.execute('DROP TABLE IF EXISTS sanic_polls')
await cur.execute("""CREATE TABLE sanic_polls (
id serial primary key,
question varchar(50),
pub_date timestamp
);""")
for i in range(0, 100):
await cur.execute("""INSERT INTO sanic_polls
(id, question, pub_date) VALUES ({}, {}, now())
""".format(i, i))
@app.route("/")
async def test(request):
result = []
data = {}
async with app.pool['aiomysql'].acquire() as conn:
async with conn.cursor() as cur:
await cur.execute("SELECT question, pub_date FROM sanic_polls")
async for row in cur:
result.append({"question": row[0], "pub_date": row[1]})
if result or len(result) > 0:
data['data'] = result
return json(data)
if __name__ == '__main__':
app.run(host="127.0.0.1", workers=4, port=12000)

View File

@@ -1,61 +0,0 @@
from sanic import Sanic
from sanic.response import json
from aiopeewee import AioModel, AioMySQLDatabase, model_to_dict
from peewee import CharField, TextField, DateTimeField
from peewee import ForeignKeyField, PrimaryKeyField
db = AioMySQLDatabase('test', user='root', password='',
host='127.0.0.1', port=3306)
class User(AioModel):
username = CharField()
class Meta:
database = db
class Blog(AioModel):
user = ForeignKeyField(User)
title = CharField(max_length=25)
content = TextField(default='')
pub_date = DateTimeField(null=True)
pk = PrimaryKeyField()
class Meta:
database = db
app = Sanic(__name__)
@app.listener('before_server_start')
async def setup(app, loop):
# create connection pool
await db.connect(loop)
# create table if not exists
await db.create_tables([User, Blog], safe=True)
@app.listener('before_server_stop')
async def stop(app, loop):
# close connection pool
await db.close()
@app.post('/users')
async def add_user(request):
user = await User.create(**request.json)
return json(await model_to_dict(user))
@app.get('/users/count')
async def user_count(request):
count = await User.select().count()
return json({'count': count})
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,65 +0,0 @@
""" To run this example you need additional aiopg package
"""
import os
import asyncio
import uvloop
import aiopg
from sanic import Sanic
from sanic.response import json
database_name = os.environ['DATABASE_NAME']
database_host = os.environ['DATABASE_HOST']
database_user = os.environ['DATABASE_USER']
database_password = os.environ['DATABASE_PASSWORD']
connection = 'postgres://{0}:{1}@{2}/{3}'.format(database_user,
database_password,
database_host,
database_name)
async def get_pool():
return await aiopg.create_pool(connection)
app = Sanic(name=__name__)
@app.listener('before_server_start')
async def prepare_db(app, loop):
"""
Let's create some table and add some data
"""
async with aiopg.create_pool(connection) as pool:
async with pool.acquire() as conn:
async with conn.cursor() as cur:
await cur.execute('DROP TABLE IF EXISTS sanic_polls')
await cur.execute("""CREATE TABLE sanic_polls (
id serial primary key,
question varchar(50),
pub_date timestamp
);""")
for i in range(0, 100):
await cur.execute("""INSERT INTO sanic_polls
(id, question, pub_date) VALUES ({}, {}, now())
""".format(i, i))
@app.route("/")
async def handle(request):
result = []
async def test_select():
async with aiopg.create_pool(connection) as pool:
async with pool.acquire() as conn:
async with conn.cursor() as cur:
await cur.execute("SELECT question, pub_date FROM sanic_polls")
async for row in cur:
result.append({"question": row[0], "pub_date": row[1]})
res = await test_select()
return json({'polls': result})
if __name__ == '__main__':
app.run(host='0.0.0.0',
port=8000,
debug=True)

View File

@@ -1,67 +0,0 @@
""" To run this example you need additional aiopg package
"""
import os
import asyncio
import datetime
import uvloop
from aiopg.sa import create_engine
import sqlalchemy as sa
from sanic import Sanic
from sanic.response import json
database_name = os.environ['DATABASE_NAME']
database_host = os.environ['DATABASE_HOST']
database_user = os.environ['DATABASE_USER']
database_password = os.environ['DATABASE_PASSWORD']
connection = 'postgres://{0}:{1}@{2}/{3}'.format(database_user,
database_password,
database_host,
database_name)
metadata = sa.MetaData()
polls = sa.Table('sanic_polls', metadata,
sa.Column('id', sa.Integer, primary_key=True),
sa.Column('question', sa.String(50)),
sa.Column("pub_date", sa.DateTime))
app = Sanic(name=__name__)
@app.listener('before_server_start')
async def prepare_db(app, loop):
""" Let's add some data
"""
async with create_engine(connection) as engine:
async with engine.acquire() as conn:
await conn.execute('DROP TABLE IF EXISTS sanic_polls')
await conn.execute("""CREATE TABLE sanic_polls (
id serial primary key,
question varchar(50),
pub_date timestamp
);""")
for i in range(0, 100):
await conn.execute(
polls.insert().values(question=i,
pub_date=datetime.datetime.now())
)
@app.route("/")
async def handle(request):
async with create_engine(connection) as engine:
async with engine.acquire() as conn:
result = []
async for row in conn.execute(polls.select()):
result.append({"question": row.question,
"pub_date": row.pub_date})
return json({"polls": result})
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)

View File

@@ -1,34 +0,0 @@
""" To run this example you need additional aioredis package
"""
from sanic import Sanic, response
import aioredis
app = Sanic(__name__)
@app.route("/")
async def handle(request):
async with request.app.redis_pool.get() as redis:
await redis.set('test-my-key', 'value')
val = await redis.get('test-my-key')
return response.text(val.decode('utf-8'))
@app.listener('before_server_start')
async def before_server_start(app, loop):
app.redis_pool = await aioredis.create_pool(
('localhost', 6379),
minsize=5,
maxsize=10,
loop=loop
)
@app.listener('after_server_stop')
async def after_server_stop(app, loop):
app.redis_pool.close()
await app.redis_pool.wait_closed()
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,51 +0,0 @@
import os
import asyncio
import uvloop
from asyncpg import connect, create_pool
from sanic import Sanic
from sanic.response import json
DB_CONFIG = {
'host': '<host>',
'user': '<user>',
'password': '<password>',
'port': '<port>',
'database': '<database>'
}
def jsonify(records):
"""
Parse asyncpg record response into JSON format
"""
return [dict(r.items()) for r in records]
app = Sanic(__name__)
@app.listener('before_server_start')
async def register_db(app, loop):
app.pool = await create_pool(**DB_CONFIG, loop=loop, max_size=100)
async with app.pool.acquire() as connection:
await connection.execute('DROP TABLE IF EXISTS sanic_post')
await connection.execute("""CREATE TABLE sanic_post (
id serial primary key,
content varchar(50),
post_date timestamp
);""")
for i in range(0, 1000):
await connection.execute(f"""INSERT INTO sanic_post
(id, content, post_date) VALUES ({i}, {i}, now())""")
@app.get('/')
async def root_get(request):
async with app.pool.acquire() as connection:
results = await connection.fetch('SELECT * FROM sanic_post')
return json({'posts': jsonify(results)})
if __name__ == '__main__':
app.run(host='127.0.0.1', port=8080)

View File

@@ -1,41 +0,0 @@
""" sanic motor (async driver for mongodb) example
Required packages:
pymongo==3.4.0
motor==1.1
sanic==0.2.0
"""
from sanic import Sanic
from sanic import response
app = Sanic('motor_mongodb')
def get_db():
from motor.motor_asyncio import AsyncIOMotorClient
mongo_uri = "mongodb://127.0.0.1:27017/test"
client = AsyncIOMotorClient(mongo_uri)
return client['test']
@app.route('/objects', methods=['GET'])
async def get(request):
db = get_db()
docs = await db.test_col.find().to_list(length=100)
for doc in docs:
doc['id'] = str(doc['_id'])
del doc['_id']
return response.json(docs)
@app.route('/post', methods=['POST'])
async def new(request):
doc = request.json
print(doc)
db = get_db()
object_id = await db.test_col.save(doc)
return response.json({'object_id': str(object_id)})
if __name__ == "__main__":
app.run(host='0.0.0.0', port=8000, debug=True)

View File

@@ -1,116 +0,0 @@
## You need the following additional packages for this example
# aiopg
# peewee_async
# peewee
## sanic imports
from sanic import Sanic
from sanic.response import json
## peewee_async related imports
import peewee
from peewee import Model, BaseModel
from peewee_async import Manager, PostgresqlDatabase, execute
from functools import partial
# we instantiate a custom loop so we can pass it to our db manager
## from peewee_async docs:
# Also there's no need to connect and re-connect before executing async queries
# with manager! It's all automatic. But you can run Manager.connect() or
# Manager.close() when you need it.
class AsyncManager(Manager):
"""Inherit the peewee_async manager with our own object
configuration
database.allow_sync = False
"""
def __init__(self, _model_class, *args, **kwargs):
super(AsyncManager, self).__init__(*args, **kwargs)
self._model_class = _model_class
self.database.allow_sync = False
def _do_fill(self, method, *args, **kwargs):
_class_method = getattr(super(AsyncManager, self), method)
pf = partial(_class_method, self._model_class)
return pf(*args, **kwargs)
def new(self, *args, **kwargs):
return self._do_fill('create', *args, **kwargs)
def get(self, *args, **kwargs):
return self._do_fill('get', *args, **kwargs)
def execute(self, query):
return execute(query)
def _get_meta_db_class(db):
"""creating a declartive class model for db"""
class _BlockedMeta(BaseModel):
def __new__(cls, name, bases, attrs):
_instance = super(_BlockedMeta, cls).__new__(cls, name, bases, attrs)
_instance.objects = AsyncManager(_instance, db)
return _instance
class _Base(Model, metaclass=_BlockedMeta):
def to_dict(self):
return self._data
class Meta:
database=db
return _Base
def declarative_base(*args, **kwargs):
"""Returns a new Modeled Class after inheriting meta and Model classes"""
db = PostgresqlDatabase(*args, **kwargs)
return _get_meta_db_class(db)
AsyncBaseModel = declarative_base(database='test',
host='127.0.0.1',
user='postgres',
password='mysecretpassword')
# let's create a simple key value store:
class KeyValue(AsyncBaseModel):
key = peewee.CharField(max_length=40, unique=True)
text = peewee.TextField(default='')
app = Sanic('peewee_example')
@app.route('/post/<key>/<value>')
async def post(request, key, value):
"""
Save get parameters to database
"""
obj = await KeyValue.objects.new(key=key, text=value)
return json({'object_id': obj.id})
@app.route('/get')
async def get(request):
"""
Load all objects from database
"""
all_objects = await KeyValue.objects.execute(KeyValue.select())
serialized_obj = []
for obj in all_objects:
serialized_obj.append({
'id': obj.id,
'key': obj.key,
'value': obj.text}
)
return json({'objects': serialized_obj})
if __name__ == "__main__":
app.run(host='0.0.0.0', port=8000)

View File

@@ -19,32 +19,41 @@ def test_sync(request):
@app.route("/dynamic/<name>/<id:int>")
def test_params(request, name, id):
return response.text("yeehaww {} {}".format(name, id))
def test_params(request, name, i):
return response.text("yeehaww {} {}".format(name, i))
@app.route("/exception")
def exception(request):
raise ServerError("It's dead jim")
@app.route("/await")
async def test_await(request):
import asyncio
await asyncio.sleep(5)
return response.text("I'm feeling sleepy")
@app.route("/file")
async def test_file(request):
return await response.file(os.path.abspath("setup.py"))
@app.route("/file_stream")
async def test_file_stream(request):
return await response.file_stream(os.path.abspath("setup.py"),
chunk_size=1024)
# ----------------------------------------------- #
# Exceptions
# ----------------------------------------------- #
@app.exception(ServerError)
async def test(request, exception):
return response.json({"exception": "{}".format(exception), "status": exception.status_code}, status=exception.status_code)
return response.json({"exception": "{}".format(exception), "status": exception.status_code},
status=exception.status_code)
# ----------------------------------------------- #
@@ -63,7 +72,8 @@ def post_json(request):
@app.route("/query_string")
def query_string(request):
return response.json({"parsed": True, "args": request.args, "url": request.url, "query_string": request.query_string})
return response.json({"parsed": True, "args": request.args, "url": request.url,
"query_string": request.query_string})
# ----------------------------------------------- #

examples/unix_socket.py Normal file
View File

@@ -0,0 +1,23 @@
from sanic import Sanic
from sanic import response
import socket
import os
app = Sanic(__name__)
@app.route("/test")
async def test(request):
return response.text("OK")
if __name__ == '__main__':
server_address = './uds_socket'
# Make sure the socket does not already exist
try:
os.unlink(server_address)
except OSError:
if os.path.exists(server_address):
raise
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.bind(server_address)
app.run(sock=sock)

View File

@@ -3,6 +3,7 @@ from sanic import response
app = Sanic(__name__)
@app.route('/')
async def index(request):
# generate a URL for the endpoint `post_handler`
@@ -10,6 +11,7 @@ async def index(request):
# the URL is `/posts/5`, redirect to it
return response.redirect(url)
@app.route('/posts/<post_id>')
async def post_handler(request, post_id):
return response.text('Post - {}'.format(post_id))

View File

@@ -11,29 +11,29 @@ from sanic.blueprints import Blueprint
app = Sanic()
bp = Blueprint("bp", host="bp.example.com")
@app.route('/', host=["example.com",
"somethingelse.com",
"therestofyourdomains.com"])
async def hello(request):
return response.text("Some defaults")
@app.route('/', host="example.com")
async def hello(request):
return response.text("Answer")
@app.route('/', host="sub.example.com")
async def hello(request):
return response.text("42")
@bp.route("/question")
async def hello(request):
return response.text("What is the meaning of life?")
@bp.route("/answer")
async def hello(request):
return response.text("42")
app.register_blueprint(bp)
app.blueprint(bp)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)

View File

@@ -20,4 +20,5 @@ async def feed(request, ws):
if __name__ == '__main__':
app.run()
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -9,3 +9,4 @@ pytest
tox
ujson
uvloop
gunicorn

requirements-docs.txt Normal file
View File

@@ -0,0 +1,3 @@
sphinx
sphinx_rtd_theme
recommonmark

View File

@@ -1,6 +1,6 @@
from sanic.app import Sanic
from sanic.blueprints import Blueprint
__version__ = '0.5.4'
__version__ = '0.6.0'
__all__ = ['Sanic', 'Blueprint']

View File

@@ -60,6 +60,7 @@ class Sanic:
self.sock = None
self.listeners = defaultdict(list)
self.is_running = False
self.is_request_stream = False
self.websocket_enabled = False
self.websocket_tasks = []
@@ -110,12 +111,14 @@ class Sanic:
# Decorator
def route(self, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False):
strict_slashes=False, stream=False, version=None):
"""Decorate a function to be registered as a route
:param uri: path of the URL
:param methods: list or tuple of methods allowed
:param host:
:param strict_slashes:
:param stream:
:return: decorated function
"""
@@ -124,44 +127,56 @@ class Sanic:
if not uri.startswith('/'):
uri = '/' + uri
if stream:
self.is_request_stream = True
def response(handler):
if stream:
handler.is_stream = stream
self.router.add(uri=uri, methods=methods, handler=handler,
host=host, strict_slashes=strict_slashes)
host=host, strict_slashes=strict_slashes,
version=version)
return handler
return response
# Shorthand method decorators
def get(self, uri, host=None, strict_slashes=False):
def get(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=frozenset({"GET"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version)
def post(self, uri, host=None, strict_slashes=False):
def post(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=frozenset({"POST"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version)
def put(self, uri, host=None, strict_slashes=False):
def put(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=frozenset({"PUT"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version)
def head(self, uri, host=None, strict_slashes=False):
def head(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=frozenset({"HEAD"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version)
def options(self, uri, host=None, strict_slashes=False):
def options(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=frozenset({"OPTIONS"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version)
def patch(self, uri, host=None, strict_slashes=False):
def patch(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=frozenset({"PATCH"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version)
def delete(self, uri, host=None, strict_slashes=False):
def delete(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=frozenset({"DELETE"}), host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version)
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False):
strict_slashes=False, version=None):
"""A helper method to register class instance or
functions as a handler to the application url
routes.
@@ -173,20 +188,29 @@ class Sanic:
:param host:
:return: function or class instance
"""
stream = False
# Handle HTTPMethodView differently
if hasattr(handler, 'view_class'):
methods = set()
for method in HTTP_METHODS:
if getattr(handler.view_class, method.lower(), None):
_handler = getattr(handler.view_class, method.lower(), None)
if _handler:
methods.add(method)
if hasattr(_handler, 'is_stream'):
stream = True
# handle composition view differently
if isinstance(handler, CompositionView):
methods = handler.handlers.keys()
for _handler in handler.handlers.values():
if hasattr(_handler, 'is_stream'):
stream = True
break
self.route(uri=uri, methods=methods, host=host,
strict_slashes=strict_slashes)(handler)
strict_slashes=strict_slashes, stream=stream,
version=version)(handler)
return handler
# Decorator
@@ -206,7 +230,12 @@ class Sanic:
def response(handler):
async def websocket_handler(request, *args, **kwargs):
request.app = self
try:
protocol = request.transport.get_protocol()
except AttributeError:
# On Python3.5 the Transport classes in asyncio do not
# have a get_protocol() method as in uvloop
protocol = request.transport._protocol
ws = await protocol.websocket_handshake(request)
# schedule the application handler
@@ -272,34 +301,37 @@ class Sanic:
return response
# Decorator
def middleware(self, middleware_or_request):
"""Decorate and register middleware to be called before a request.
Can either be called as @app.middleware or @app.middleware('request')
"""
def register_middleware(middleware, attach_to='request'):
def register_middleware(self, middleware, attach_to='request'):
if attach_to == 'request':
self.request_middleware.append(middleware)
if attach_to == 'response':
self.response_middleware.appendleft(middleware)
return middleware
# Decorator
def middleware(self, middleware_or_request):
"""Decorate and register middleware to be called before a request.
Can either be called as @app.middleware or @app.middleware('request')
"""
# Detect which way this was called, @middleware or @middleware('AT')
if callable(middleware_or_request):
return register_middleware(middleware_or_request)
return self.register_middleware(middleware_or_request)
else:
return partial(register_middleware,
return partial(self.register_middleware,
attach_to=middleware_or_request)
# Static Files
def static(self, uri, file_or_directory, pattern=r'/?.+',
use_modified_since=True, use_content_range=False):
use_modified_since=True, use_content_range=False,
stream_large_files=False):
"""Register a root to serve files from. The input can either be a
file or a directory. See
"""
static_register(self, uri, file_or_directory, pattern,
use_modified_since, use_content_range)
use_modified_since, use_content_range,
stream_large_files)
def blueprint(self, blueprint, **options):
"""Register a blueprint on the application.
@@ -516,10 +548,10 @@ class Sanic:
# Execution
# -------------------------------------------------------------------- #
def run(self, host="127.0.0.1", port=8000, debug=False, ssl=None,
def run(self, host=None, port=None, debug=False, ssl=None,
sock=None, workers=1, protocol=None,
backlog=100, stop_event=None, register_sys_signals=True,
log_config=LOGGING):
log_config=None):
"""Run the HTTP Server and listen until keyboard interrupt or term
signal. On termination, drain connections before closing.
@@ -537,7 +569,11 @@ class Sanic:
:param protocol: Subclass of asyncio protocol class
:return: Nothing
"""
if sock is None:
host, port = host or "127.0.0.1", port or 8000
if log_config:
self.log_config = log_config
logging.config.dictConfig(log_config)
if protocol is None:
protocol = (WebSocketProtocol if self.websocket_enabled
@@ -551,7 +587,7 @@ class Sanic:
host=host, port=port, debug=debug, ssl=ssl, sock=sock,
workers=workers, protocol=protocol, backlog=backlog,
register_sys_signals=register_sys_signals,
has_log=log_config is not None)
has_log=self.log_config is not None)
try:
self.is_running = True
@@ -575,7 +611,7 @@ class Sanic:
"""gunicorn compatibility"""
return self
async def create_server(self, host="127.0.0.1", port=8000, debug=False,
async def create_server(self, host=None, port=None, debug=False,
ssl=None, sock=None, protocol=None,
backlog=100, stop_event=None,
log_config=LOGGING):
@@ -584,6 +620,9 @@ class Sanic:
NOTE: This does not support multiprocessing and is not the preferred
way to run a Sanic application.
"""
if sock is None:
host, port = host or "127.0.0.1", port or 8000
if log_config:
logging.config.dictConfig(log_config)
if protocol is None:
@@ -624,12 +663,11 @@ class Sanic:
break
return response
def _helper(self, host="127.0.0.1", port=8000, debug=False,
def _helper(self, host=None, port=None, debug=False,
ssl=None, sock=None, workers=1, loop=None,
protocol=HttpProtocol, backlog=100, stop_event=None,
register_sys_signals=True, run_async=False, has_log=True):
"""Helper function used by `run` and `create_server`."""
if isinstance(ssl, dict):
# try common aliases
cert = ssl.get('cert') or ssl.get('certificate')
@@ -651,6 +689,8 @@ class Sanic:
server_settings = {
'protocol': protocol,
'request_class': self.request_class,
'is_request_stream': self.is_request_stream,
'router': self.router,
'host': host,
'port': port,
'sock': sock,
@@ -665,7 +705,10 @@ class Sanic:
'loop': loop,
'register_sys_signals': register_sys_signals,
'backlog': backlog,
'has_log': has_log
'has_log': has_log,
'websocket_max_size': self.config.WEBSOCKET_MAX_SIZE,
'websocket_max_queue': self.config.WEBSOCKET_MAX_QUEUE,
'graceful_shutdown_timeout': self.config.GRACEFUL_SHUTDOWN_TIMEOUT
}
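
Taken together, run() and create_server() now fall back to 127.0.0.1:8000 only when no socket is passed, log_config is optional, and the websocket/graceful-shutdown limits travel from app.config into server_settings. A sketch under those assumptions (the values below are arbitrary):

    from sanic import Sanic
    from sanic.response import text

    app = Sanic(__name__)

    # These config keys are read in _helper() and forwarded to the server.
    app.config.WEBSOCKET_MAX_SIZE = 2 ** 20          # 1 MiB frames
    app.config.WEBSOCKET_MAX_QUEUE = 32
    app.config.GRACEFUL_SHUTDOWN_TIMEOUT = 30.0      # seconds to drain connections

    @app.route('/')
    async def handler(request):
        return text('OK')

    if __name__ == '__main__':
        # host/port default to 127.0.0.1:8000; log_config=None leaves the
        # existing logging configuration untouched.
        app.run(log_config=None)
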
# -------------------------------------------- #

View File

@@ -4,8 +4,8 @@ from sanic.constants import HTTP_METHODS
from sanic.views import CompositionView
FutureRoute = namedtuple('Route',
['handler', 'uri', 'methods',
'host', 'strict_slashes'])
['handler', 'uri', 'methods', 'host',
'strict_slashes', 'stream', 'version'])
FutureListener = namedtuple('Listener', ['handler', 'uri', 'methods', 'host'])
FutureMiddleware = namedtuple('Route', ['middleware', 'args', 'kwargs'])
FutureException = namedtuple('Route', ['handler', 'args', 'kwargs'])
@@ -14,7 +14,7 @@ FutureStatic = namedtuple('Route',
class Blueprint:
def __init__(self, name, url_prefix=None, host=None):
def __init__(self, name, url_prefix=None, host=None, version=None):
"""Create a new blueprint
:param name: unique name of the blueprint
@@ -30,6 +30,7 @@ class Blueprint:
self.listeners = defaultdict(list)
self.middlewares = []
self.statics = []
self.version = version
def register(self, app, options):
"""Register the blueprint to the sanic app."""
@@ -43,11 +44,16 @@ class Blueprint:
future.handler.__blueprintname__ = self.name
# Prepend the blueprint URI prefix if available
uri = url_prefix + future.uri if url_prefix else future.uri
version = future.version or self.version
app.route(
uri=uri[1:] if uri.startswith('//') else uri,
methods=future.methods,
host=future.host or self.host,
strict_slashes=future.strict_slashes
strict_slashes=future.strict_slashes,
stream=future.stream,
version=version
)(future.handler)
for future in self.websocket_routes:
@@ -65,10 +71,11 @@ class Blueprint:
# Middleware
for future in self.middlewares:
if future.args or future.kwargs:
app.middleware(*future.args,
**future.kwargs)(future.middleware)
app.register_middleware(future.middleware,
*future.args,
**future.kwargs)
else:
app.middleware(future.middleware)
app.register_middleware(future.middleware)
# Exceptions
for future in self.exceptions:
@@ -87,20 +94,21 @@ class Blueprint:
app.listener(event)(listener)
def route(self, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False):
strict_slashes=False, stream=False, version=None):
"""Create a blueprint route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:param methods: list of acceptable HTTP methods.
"""
def decorator(handler):
route = FutureRoute(handler, uri, methods, host, strict_slashes)
route = FutureRoute(
handler, uri, methods, host, strict_slashes, stream, version)
self.routes.append(route)
return handler
return decorator
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False):
strict_slashes=False, version=None):
"""Create a blueprint route from a function.
:param handler: function for handling uri requests. Accepts function,
@@ -122,21 +130,22 @@ class Blueprint:
methods = handler.handlers.keys()
self.route(uri=uri, methods=methods, host=host,
strict_slashes=strict_slashes)(handler)
strict_slashes=strict_slashes, version=version)(handler)
return handler
def websocket(self, uri, host=None, strict_slashes=False):
def websocket(self, uri, host=None, strict_slashes=False, version=None):
"""Create a blueprint websocket route from a decorated function.
:param uri: endpoint at which the route will be accessible.
"""
def decorator(handler):
route = FutureRoute(handler, uri, [], host, strict_slashes)
route = FutureRoute(handler, uri, [], host, strict_slashes,
False, version)
self.websocket_routes.append(route)
return handler
return decorator
def add_websocket_route(self, handler, uri, host=None):
def add_websocket_route(self, handler, uri, host=None, version=None):
"""Create a blueprint websocket route from a function.
:param handler: function for handling uri requests. Accepts function,
@@ -144,7 +153,7 @@ class Blueprint:
:param uri: endpoint at which the route will be accessible.
:return: function or class instance
"""
self.websocket(uri=uri, host=host)(handler)
self.websocket(uri=uri, host=host, version=version)(handler)
return handler
def listener(self, event):
@@ -190,30 +199,36 @@ class Blueprint:
self.statics.append(static)
# Shorthand method decorators
def get(self, uri, host=None, strict_slashes=False):
def get(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=["GET"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version)
def post(self, uri, host=None, strict_slashes=False):
def post(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=["POST"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version)
def put(self, uri, host=None, strict_slashes=False):
def put(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=["PUT"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version)
def head(self, uri, host=None, strict_slashes=False):
def head(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=["HEAD"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version)
def options(self, uri, host=None, strict_slashes=False):
def options(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=["OPTIONS"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version)
def patch(self, uri, host=None, strict_slashes=False):
def patch(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=["PATCH"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, stream=stream,
version=version)
def delete(self, uri, host=None, strict_slashes=False):
def delete(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=["DELETE"], host=host,
strict_slashes=strict_slashes)
strict_slashes=strict_slashes, version=version)
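
A short sketch of the new version and stream options on blueprint routes. The route-level version overrides the blueprint-level one, the router mounts the result under /v{version} (see the router hunk further down), and stream=True is only meaningful for body-carrying methods such as POST:

    import asyncio
    from sanic import Sanic
    from sanic.blueprints import Blueprint
    from sanic.response import text

    bp = Blueprint('api', url_prefix='/api', version=1)

    @bp.get('/ping')                              # served at /v1/api/ping
    async def ping(request):
        return text('pong')

    @bp.post('/upload', stream=True, version=2)   # per-route override: /v2/api/upload
    async def upload(request):
        chunks = []
        while True:
            body = await request.stream.get()     # request.stream is an asyncio.Queue
            if body is None:
                break
            chunks.append(body)
        return text(str(len(b''.join(chunks))))

    app = Sanic(__name__)
    app.blueprint(bp)
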

View File

@@ -1,21 +1,23 @@
from sanic.defaultFilter import DefaultFilter
import os
import sys
import syslog
import platform
import types
from sanic.log import DefaultFilter
SANIC_PREFIX = 'SANIC_'
_address_dict = {
'Windows': ('localhost', 514),
'Darwin': '/var/run/syslog',
'Linux': '/dev/log',
'FreeBSD': '/dev/log'
'FreeBSD': '/var/run/log'
}
LOGGING = {
'version': 1,
'disable_existing_loggers': False,
'filters': {
'accessFilter': {
'()': DefaultFilter,
@@ -76,24 +78,6 @@ LOGGING = {
'filters': ['errorFilter'],
'formatter': 'simple'
},
'accessTimedRotatingFile': {
'class': 'logging.handlers.TimedRotatingFileHandler',
'filters': ['accessFilter'],
'formatter': 'access',
'when': 'D',
'interval': 1,
'backupCount': 7,
'filename': 'access.log'
},
'errorTimedRotatingFile': {
'class': 'logging.handlers.TimedRotatingFileHandler',
'filters': ['errorFilter'],
'when': 'D',
'interval': 1,
'backupCount': 7,
'filename': 'error.log',
'formatter': 'simple'
}
},
'loggers': {
'sanic': {
@@ -139,9 +123,12 @@ class Config(dict):
▌ ▐ ▀▀▄▄▄▀
▀▀▄▄▀
"""
self.REQUEST_MAX_SIZE = 100000000 # 100 megababies
self.REQUEST_MAX_SIZE = 100000000 # 100 megabytes
self.REQUEST_TIMEOUT = 60 # 60 seconds
self.KEEP_ALIVE = keep_alive
self.WEBSOCKET_MAX_SIZE = 2 ** 20  # 1 megabyte
self.WEBSOCKET_MAX_QUEUE = 32
self.GRACEFUL_SHUTDOWN_TIMEOUT = 15.0 # 15 sec
if load_env:
self.load_environment_vars()
@@ -210,10 +197,16 @@ class Config(dict):
def load_environment_vars(self):
"""
Looks for any SANIC_ prefixed environment variables and applies
Looks for any ``SANIC_`` prefixed environment variables and applies
them to the configuration if present.
"""
for k, v in os.environ.items():
if k.startswith(SANIC_PREFIX):
_, config_key = k.split(SANIC_PREFIX, 1)
try:
self[config_key] = int(v)
except ValueError:
try:
self[config_key] = float(v)
except ValueError:
self[config_key] = v
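
load_environment_vars() now tries int, then float, and falls back to the raw string. A quick illustration, assuming load_env defaults to True so the variables are picked up when the app is created:

    import os
    from sanic import Sanic

    os.environ['SANIC_REQUEST_TIMEOUT'] = '30'              # becomes int(30)
    os.environ['SANIC_GRACEFUL_SHUTDOWN_TIMEOUT'] = '7.5'   # becomes float(7.5)
    os.environ['SANIC_MY_SETTING'] = 'hello'                # stays a string

    app = Sanic(__name__)
    assert app.config.REQUEST_TIMEOUT == 30
    assert app.config.GRACEFUL_SHUTDOWN_TIMEOUT == 7.5
    assert app.config.MY_SETTING == 'hello'
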

View File

@@ -116,8 +116,7 @@ class Cookie(dict):
))
except AttributeError:
output.append('%s=%s' % (self._keys[key], value))
elif key in self._flags:
if self[key]:
elif key in self._flags and self[key]:
output.append(self._keys[key])
else:
output.append('%s=%s' % (self._keys[key], value))

View File

@@ -1,13 +0,0 @@
import logging
class DefaultFilter(logging.Filter):
def __init__(self, param=None):
self.param = param
def filter(self, record):
if self.param is None:
return True
if record.levelno in self.param:
return True
return False

View File

@@ -1,3 +1,5 @@
from sanic.response import ALL_STATUS_CODES, COMMON_STATUS_CODES
TRACEBACK_STYLE = '''
<style>
body {
@@ -115,6 +117,20 @@ INTERNAL_SERVER_ERROR_HTML = '''
'''
_sanic_exceptions = {}
def add_status_code(code):
"""
Decorator used for adding exceptions to _sanic_exceptions.
"""
def class_decorator(cls):
cls.status_code = code
_sanic_exceptions[code] = cls
return cls
return class_decorator
class SanicException(Exception):
def __init__(self, message, status_code=None):
@@ -124,24 +140,27 @@ class SanicException(Exception):
self.status_code = status_code
@add_status_code(404)
class NotFound(SanicException):
status_code = 404
pass
@add_status_code(400)
class InvalidUsage(SanicException):
status_code = 400
pass
@add_status_code(500)
class ServerError(SanicException):
status_code = 500
pass
class URLBuildError(SanicException):
status_code = 500
class URLBuildError(ServerError):
pass
class FileNotFound(NotFound):
status_code = 404
pass
def __init__(self, message, path, relative_url):
super().__init__(message)
@@ -149,20 +168,23 @@ class FileNotFound(NotFound):
self.relative_url = relative_url
@add_status_code(408)
class RequestTimeout(SanicException):
status_code = 408
pass
@add_status_code(413)
class PayloadTooLarge(SanicException):
status_code = 413
pass
class HeaderNotFound(SanicException):
status_code = 400
class HeaderNotFound(InvalidUsage):
pass
@add_status_code(416)
class ContentRangeError(SanicException):
status_code = 416
pass
def __init__(self, message, content_range):
super().__init__(message)
@@ -172,5 +194,69 @@ class ContentRangeError(SanicException):
}
@add_status_code(403)
class Forbidden(SanicException):
pass
class InvalidRangeType(ContentRangeError):
pass
@add_status_code(401)
class Unauthorized(SanicException):
"""
Unauthorized exception (401 HTTP status code).
:param message: Message describing the exception.
:param scheme: Name of the authentication scheme to be used.
When present, kwargs is used to complete the WWW-Authenticate header.
Examples::
# With a Basic auth-scheme, realm MUST be present:
raise Unauthorized("Auth required.", "Basic", realm="Restricted Area")
# With a Digest auth-scheme, things are a bit more complicated:
raise Unauthorized("Auth required.",
"Digest",
realm="Restricted Area",
qop="auth, auth-int",
algorithm="MD5",
nonce="abcdef",
opaque="zyxwvu")
# With a Bearer auth-scheme, realm is optional so you can write:
raise Unauthorized("Auth required.", "Bearer")
# or, if you want to specify the realm:
raise Unauthorized("Auth required.", "Bearer", realm="Restricted Area")
"""
def __init__(self, message, scheme, **kwargs):
super().__init__(message)
values = ["{!s}={!r}".format(k, v) for k, v in kwargs.items()]
challenge = ', '.join(values)
self.headers = {
"WWW-Authenticate": "{} {}".format(scheme, challenge).rstrip()
}
def abort(status_code, message=None):
"""
Raise an exception based on SanicException. The response body defaults to
the HTTP response message appropriate for the given status code, unless one is provided.
:param status_code: The HTTP status code to return.
:param message: The HTTP response body. Defaults to the messages
in response.py for the given status code.
"""
if message is None:
message = COMMON_STATUS_CODES.get(status_code,
ALL_STATUS_CODES.get(status_code))
# These are stored as bytes in the STATUS_CODES dict
message = message.decode('utf8')
sanic_exception = _sanic_exceptions.get(status_code, SanicException)
raise sanic_exception(message=message, status_code=status_code)
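
abort() looks up the exception class registered for the status code (falling back to SanicException) and raises it with the given or default message, while Unauthorized now builds its WWW-Authenticate header from keyword arguments. A minimal sketch; the routes and realm value are placeholders:

    from sanic import Sanic
    from sanic.exceptions import abort, Unauthorized

    app = Sanic(__name__)

    @app.route('/missing')
    async def missing(request):
        # 404 is registered via @add_status_code, so this raises NotFound
        # with the given message as the response body.
        abort(404, 'Nothing to see here')

    @app.route('/secret')
    async def secret(request):
        # The **kwargs are rendered into the WWW-Authenticate challenge.
        raise Unauthorized("Auth required.", "Basic", realm="Restricted Area")
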

View File

@@ -1,4 +1,18 @@
import logging
class DefaultFilter(logging.Filter):
def __init__(self, param=None):
self.param = param
def filter(self, record):
if self.param is None:
return True
if record.levelno in self.param:
return True
return False
log = logging.getLogger('sanic')
netlog = logging.getLogger('network')

View File

@@ -1,3 +1,5 @@
import sys
import json
from cgi import parse_header
from collections import namedtuple
from http.cookies import SimpleCookie
@@ -7,7 +9,12 @@ from urllib.parse import parse_qs, urlunparse
try:
from ujson import loads as json_loads
except ImportError:
from json import loads as json_loads
if sys.version_info[:2] == (3, 5):
def json_loads(data):
# on Python 3.5 json.loads only supports str not bytes
return json.loads(data.decode())
else:
json_loads = json.loads
from sanic.exceptions import InvalidUsage
from sanic.log import log
@@ -38,7 +45,7 @@ class Request(dict):
__slots__ = (
'app', 'headers', 'version', 'method', '_cookies', 'transport',
'body', 'parsed_json', 'parsed_args', 'parsed_form', 'parsed_files',
'_ip', '_parsed_url', 'uri_template'
'_ip', '_parsed_url', 'uri_template', 'stream', '_remote_addr'
)
def __init__(self, url_bytes, headers, version, method, transport):
@@ -59,6 +66,7 @@ class Request(dict):
self.parsed_args = None
self.uri_template = None
self._cookies = None
self.stream = None
@property
def json(self):
@@ -78,10 +86,14 @@ class Request(dict):
:return: token related to request
"""
prefixes = ('Bearer', 'Token')
auth_header = self.headers.get('Authorization')
if 'Token ' in auth_header:
return auth_header.partition('Token ')[-1]
else:
if auth_header is not None:
for prefix in prefixes:
if prefix in auth_header:
return auth_header.partition(prefix)[-1].strip()
return auth_header
@property
@@ -130,7 +142,7 @@ class Request(dict):
@property
def cookies(self):
if self._cookies is None:
cookie = self.headers.get('Cookie') or self.headers.get('cookie')
cookie = self.headers.get('Cookie')
if cookie is not None:
cookies = SimpleCookie()
cookies.load(cookie)
@@ -143,9 +155,29 @@ class Request(dict):
@property
def ip(self):
if not hasattr(self, '_ip'):
self._ip = self.transport.get_extra_info('peername')
self._ip = (self.transport.get_extra_info('peername') or
(None, None))
return self._ip
@property
def remote_addr(self):
"""Attempt to return the original client ip based on X-Forwarded-For.
:return: original client ip.
"""
if not hasattr(self, '_remote_addr'):
forwarded_for = self.headers.get('X-Forwarded-For', '').split(',')
remote_addrs = [
addr for addr in [
addr.strip() for addr in forwarded_for
] if addr
]
if len(remote_addrs) > 0:
self._remote_addr = remote_addrs[0]
else:
self._remote_addr = ''
return self._remote_addr
@property
def scheme(self):
if self.app.websocket_enabled \
@@ -165,6 +197,15 @@ class Request(dict):
# so pull it from the headers
return self.headers.get('Host', '')
@property
def content_type(self):
return self.headers.get('Content-Type', DEFAULT_HTTP_CONTENT_TYPE)
@property
def match_info(self):
"""return matched info after resolving route"""
return self.app.router.get(self)[2]
@property
def path(self):
return self._parsed_url.path.decode('utf-8')
@@ -216,15 +257,15 @@ def parse_multipart_form(body, boundary):
break
colon_index = form_line.index(':')
form_header_field = form_line[0:colon_index]
form_header_field = form_line[0:colon_index].lower()
form_header_value, form_parameters = parse_header(
form_line[colon_index + 2:])
if form_header_field == 'Content-Disposition':
if form_header_field == 'content-disposition':
if 'filename' in form_parameters:
file_name = form_parameters['filename']
field_name = form_parameters.get('name')
elif form_header_field == 'Content-Type':
elif form_header_field == 'content-type':
file_type = form_header_value
post_data = form_part[line_index:-4]
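
A handler-level sketch of the new Request properties added above; note that remote_addr only reflects whatever X-Forwarded-For your proxy actually sets, and the route below is a placeholder:

    from sanic import Sanic
    from sanic.response import json

    app = Sanic(__name__)

    @app.route('/whoami/<name>')
    async def whoami(request, name):
        return json({
            'peer_ip': request.ip[0],              # socket peer, as before
            'client_ip': request.remote_addr,      # first X-Forwarded-For entry, or ''
            'content_type': request.content_type,  # falls back to DEFAULT_HTTP_CONTENT_TYPE
            'token': request.token,                # handles both 'Bearer' and 'Token' prefixes
            'match_info': request.match_info,      # {'name': ...} for this route
        })
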

View File

@@ -233,22 +233,25 @@ class HTTPResponse(BaseHTTPResponse):
return self._cookies
def json(body, status=200, headers=None, **kwargs):
def json(body, status=200, headers=None,
content_type="application/json", **kwargs):
"""
Returns response object with body in json format.
:param body: Response data to be serialized.
:param status: Response code.
:param headers: Custom Headers.
:param kwargs: Remaining arguments that are passed to the json encoder.
"""
return HTTPResponse(json_dumps(body, **kwargs), headers=headers,
status=status, content_type="application/json")
status=status, content_type=content_type)
def text(body, status=200, headers=None,
content_type="text/plain; charset=utf-8"):
"""
Returns response object with body in text format.
:param body: Response data to be encoded.
:param status: Response code.
:param headers: Custom Headers.
@@ -263,6 +266,7 @@ def raw(body, status=200, headers=None,
content_type="application/octet-stream"):
"""
Returns response object without encoding the body.
:param body: Response data.
:param status: Response code.
:param headers: Custom Headers.
@@ -275,6 +279,7 @@ def raw(body, status=200, headers=None,
def html(body, status=200, headers=None):
"""
Returns response object with body in html format.
:param body: Response data to be encoded.
:param status: Response code.
:param headers: Custom Headers.
@@ -310,6 +315,53 @@ async def file(location, mime_type=None, headers=None, _range=None):
body_bytes=out_stream)
async def file_stream(location, chunk_size=4096, mime_type=None, headers=None,
_range=None):
"""Return a streaming response object with file data.
:param location: Location of file on system.
:param chunk_size: The size of each chunk in the stream (in bytes)
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param _range:
"""
filename = path.split(location)[-1]
_file = await open_async(location, mode='rb')
async def _streaming_fn(response):
nonlocal _file, chunk_size
try:
if _range:
chunk_size = min((_range.size, chunk_size))
await _file.seek(_range.start)
to_send = _range.size
while to_send > 0:
content = await _file.read(chunk_size)
if len(content) < 1:
break
to_send -= len(content)
response.write(content)
else:
while True:
content = await _file.read(chunk_size)
if len(content) < 1:
break
response.write(content)
finally:
await _file.close()
return # Returning from this fn closes the stream
mime_type = mime_type or guess_type(filename)[0] or 'text/plain'
if _range:
headers['Content-Range'] = 'bytes %s-%s/%s' % (
_range.start, _range.end, _range.total)
return StreamingHTTPResponse(streaming_fn=_streaming_fn,
status=200,
headers=headers,
content_type=mime_type)
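
file_stream() can also be called directly from a handler; a minimal sketch, assuming the path exists and with an arbitrary chunk size:

    from sanic import Sanic
    from sanic.response import file_stream

    app = Sanic(__name__)

    @app.route('/download')
    async def download(request):
        # Streams the file in 8 KiB chunks instead of reading it fully into memory.
        return await file_stream('/var/log/big.log', chunk_size=8192)
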
def stream(
streaming_fn, status=200, headers=None,
content_type="text/plain; charset=utf-8"):

View File

@@ -98,8 +98,25 @@ class Router:
return name, _type, pattern
def add(self, uri, methods, handler, host=None, strict_slashes=False):
def add(self, uri, methods, handler, host=None, strict_slashes=False,
version=None):
"""Add a handler to the route list
:param uri: path to match
:param methods: sequence of accepted method names. If none are
provided, any method is allowed
:param handler: request handler function.
When executed, it should provide a response object.
:param strict_slashes: whether to match the trailing slash strictly
:param version: current version of the route or blueprint. See
docs for further details.
:return: Nothing
"""
if version is not None:
if uri.startswith('/'):
uri = "/".join(["/v{}".format(str(version)), uri[1:]])
else:
uri = "/".join(["/v{}".format(str(version)), uri])
# add regular version
self._add(uri, methods, handler, host)
@@ -345,3 +362,17 @@ class Router:
if hasattr(route_handler, 'handlers'):
route_handler = route_handler.handlers[method]
return route_handler, [], kwargs, route.uri
def is_stream_handler(self, request):
""" Handler for request is stream or not.
:param request: Request object
:return: bool
"""
try:
handler = self.get(request)[0]
except (NotFound, InvalidUsage):
return False
if (hasattr(handler, 'view_class') and
hasattr(handler.view_class, request.method.lower())):
handler = getattr(handler.view_class, request.method.lower())
return hasattr(handler, 'is_stream')
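
The version handling above simply prepends /v{version} to the URI before registration, so the same option is usable straight from app.route. A sketch of the resulting URLs:

    from sanic import Sanic
    from sanic.response import text

    app = Sanic(__name__)

    @app.route('/users', version=1)       # served at /v1/users
    async def users_v1(request):
        return text('v1')

    @app.route('/users', version='2.1')   # versions are stringified, so /v2.1/users also works
    async def users_v21(request):
        return text('v2.1')
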

View File

@@ -64,22 +64,25 @@ class HttpProtocol(asyncio.Protocol):
'parser', 'request', 'url', 'headers',
# request config
'request_handler', 'request_timeout', 'request_max_size',
'request_class',
'request_class', 'is_request_stream', 'router',
# enable or disable access log / error log purpose
'has_log',
# connection management
'_total_request_size', '_timeout_handler', '_last_communication_time')
'_total_request_size', '_timeout_handler', '_last_communication_time',
'_is_stream_handler')
def __init__(self, *, loop, request_handler, error_handler,
signal=Signal(), connections=set(), request_timeout=60,
request_max_size=None, request_class=None, has_log=True,
keep_alive=True):
keep_alive=True, is_request_stream=False, router=None,
state=None, debug=False, **kwargs):
self.loop = loop
self.transport = None
self.request = None
self.parser = None
self.url = None
self.headers = None
self.router = router
self.signal = signal
self.has_log = has_log
self.connections = connections
@@ -88,17 +91,26 @@ class HttpProtocol(asyncio.Protocol):
self.request_timeout = request_timeout
self.request_max_size = request_max_size
self.request_class = request_class or Request
self.is_request_stream = is_request_stream
self._is_stream_handler = False
self._total_request_size = 0
self._timeout_handler = None
self._last_request_time = None
self._request_handler_task = None
self._request_stream_task = None
self._keep_alive = keep_alive
self._header_fragment = b''
self.state = state if state else {}
if 'requests_count' not in self.state:
self.state['requests_count'] = 0
self._debug = debug
@property
def keep_alive(self):
return (self._keep_alive
and not self.signal.stopped
and self.parser.should_keep_alive())
return (
self._keep_alive and
not self.signal.stopped and
self.parser.should_keep_alive())
# -------------------------------------------- #
# Connection
@@ -123,9 +135,13 @@ class HttpProtocol(asyncio.Protocol):
self._timeout_handler = (
self.loop.call_later(time_left, self.connection_timeout))
else:
if self._request_stream_task:
self._request_stream_task.cancel()
if self._request_handler_task:
self._request_handler_task.cancel()
exception = RequestTimeout('Request Timeout')
try:
raise RequestTimeout('Request Timeout')
except RequestTimeout as exception:
self.write_error(exception)
# -------------------------------------------- #
@@ -146,22 +162,39 @@ class HttpProtocol(asyncio.Protocol):
self.headers = []
self.parser = HttpRequestParser(self)
# requests count
self.state['requests_count'] = self.state['requests_count'] + 1
# Parse request chunk or close connection
try:
self.parser.feed_data(data)
except HttpParserError:
exception = InvalidUsage('Bad Request')
message = 'Bad Request'
if self._debug:
message += '\n' + traceback.format_exc()
exception = InvalidUsage(message)
self.write_error(exception)
def on_url(self, url):
if not self.url:
self.url = url
else:
self.url += url
def on_header(self, name, value):
if name == b'Content-Length' and int(value) > self.request_max_size:
self._header_fragment += name
if value is not None:
if self._header_fragment == b'Content-Length' \
and int(value) > self.request_max_size:
exception = PayloadTooLarge('Payload Too Large')
self.write_error(exception)
self.headers.append((name.decode().casefold(), value.decode()))
self.headers.append(
(self._header_fragment.decode().casefold(),
value.decode()))
self._header_fragment = b''
def on_headers_complete(self):
self.request = self.request_class(
@@ -171,13 +204,29 @@ class HttpProtocol(asyncio.Protocol):
method=self.parser.get_method().decode(),
transport=self.transport
)
if self.is_request_stream:
self._is_stream_handler = self.router.is_stream_handler(
self.request)
if self._is_stream_handler:
self.request.stream = asyncio.Queue()
self.execute_request_handler()
def on_body(self, body):
if self.is_request_stream and self._is_stream_handler:
self._request_stream_task = self.loop.create_task(
self.request.stream.put(body))
return
self.request.body.append(body)
def on_message_complete(self):
if self.is_request_stream and self._is_stream_handler:
self._request_stream_task = self.loop.create_task(
self.request.stream.put(None))
return
self.request.body = b''.join(self.request.body)
self.execute_request_handler()
def execute_request_handler(self):
self._request_handler_task = self.loop.create_task(
self.request_handler(
self.request,
@@ -201,8 +250,9 @@ class HttpProtocol(asyncio.Protocol):
netlog.info('', extra={
'status': response.status,
'byte': len(response.body),
'host': '%s:%d' % self.request.ip,
'request': '%s %s' % (self.request.method,
'host': '{0}:{1}'.format(self.request.ip[0],
self.request.ip[1]),
'request': '{0} {1}'.format(self.request.method,
self.request.url)
})
except AttributeError:
@@ -242,8 +292,9 @@ class HttpProtocol(asyncio.Protocol):
netlog.info('', extra={
'status': response.status,
'byte': -1,
'host': '%s:%d' % self.request.ip,
'request': '%s %s' % (self.request.method,
'host': '{0}:{1}'.format(self.request.ip[0],
self.request.ip[1]),
'request': '{0} {1}'.format(self.request.method,
self.request.url)
})
except AttributeError:
@@ -268,6 +319,7 @@ class HttpProtocol(asyncio.Protocol):
self.cleanup()
def write_error(self, exception):
response = None
try:
response = self.error_handler.response(self.request, exception)
version = self.request.version if self.request else '1.1'
@@ -282,19 +334,22 @@ class HttpProtocol(asyncio.Protocol):
from_error=True)
finally:
if self.has_log:
extra = {
'status': response.status,
'host': '',
'request': str(self.request) + str(self.url)
}
if response and isinstance(response, HTTPResponse):
extra = dict()
if isinstance(response, HTTPResponse):
extra['status'] = response.status
extra['byte'] = len(response.body)
else:
extra['status'] = 0
extra['byte'] = -1
if self.request:
extra['host'] = '%s:%d' % self.request.ip,
extra['request'] = '%s %s' % (self.request.method,
self.url)
else:
extra['host'] = 'UNKNOWN'
extra['request'] = 'nil'
if self.parser and not (self.keep_alive
and extra['status'] == 408):
netlog.info('', extra=extra)
self.transport.close()
@@ -317,7 +372,9 @@ class HttpProtocol(asyncio.Protocol):
self.url = None
self.headers = None
self._request_handler_task = None
self._request_stream_task = None
self._total_request_size = 0
self._is_stream_handler = False
def close_if_idle(self):
"""Close the connection if a request is not being sent or received
@@ -329,6 +386,14 @@ class HttpProtocol(asyncio.Protocol):
return True
return False
def close(self):
"""
Force close the connection.
"""
if self.transport is not None:
self.transport.close()
self.transport = None
def update_current_time(loop):
"""Cache the current time, since it is needed at the end of every
@@ -359,7 +424,10 @@ def serve(host, port, request_handler, error_handler, before_start=None,
request_timeout=60, ssl=None, sock=None, request_max_size=None,
reuse_port=False, loop=None, protocol=HttpProtocol, backlog=100,
register_sys_signals=True, run_async=False, connections=None,
signal=Signal(), request_class=None, has_log=True, keep_alive=True):
signal=Signal(), request_class=None, has_log=True, keep_alive=True,
is_request_stream=False, router=None, websocket_max_size=None,
websocket_max_queue=None, state=None,
graceful_shutdown_timeout=15.0):
"""Start asynchronous HTTP Server on an individual process.
:param host: Address to host on
@@ -386,6 +454,8 @@ def serve(host, port, request_handler, error_handler, before_start=None,
:param protocol: subclass of asyncio protocol class
:param request_class: Request class to use
:param has_log: disable/enable access log and error log
:param is_request_stream: disable/enable Request.stream
:param router: Router object
:return: Nothing
"""
if not run_async:
@@ -395,8 +465,6 @@ def serve(host, port, request_handler, error_handler, before_start=None,
if debug:
loop.set_debug(debug)
trigger_events(before_start, loop)
connections = connections if connections is not None else set()
server = partial(
protocol,
@@ -410,6 +478,12 @@ def serve(host, port, request_handler, error_handler, before_start=None,
request_class=request_class,
has_log=has_log,
keep_alive=keep_alive,
is_request_stream=is_request_stream,
router=router,
websocket_max_size=websocket_max_size,
websocket_max_queue=websocket_max_queue,
state=state,
debug=debug,
)
server_coroutine = loop.create_server(
@@ -421,6 +495,7 @@ def serve(host, port, request_handler, error_handler, before_start=None,
sock=sock,
backlog=backlog
)
# Instead of pulling time at the end of every request,
# pull it once per minute
loop.call_soon(partial(update_current_time, loop))
@@ -428,6 +503,8 @@ def serve(host, port, request_handler, error_handler, before_start=None,
if run_async:
return server_coroutine
trigger_events(before_start, loop)
try:
http_server = loop.run_until_complete(server_coroutine)
except:
@@ -463,8 +540,26 @@ def serve(host, port, request_handler, error_handler, before_start=None,
for connection in connections:
connection.close_if_idle()
while connections:
# Graceful shutdown timeout.
# We should provide graceful_shutdown_timeout,
# instead of letting connections hang forever.
# Let's roughly calculate time.
start_shutdown = 0
while connections and (start_shutdown < graceful_shutdown_timeout):
loop.run_until_complete(asyncio.sleep(0.1))
start_shutdown = start_shutdown + 0.1
# Force close non-idle connection after waiting for
# graceful_shutdown_timeout
coros = []
for conn in connections:
if hasattr(conn, "websocket") and conn.websocket:
coros.append(conn.websocket.close_connection(force=True))
else:
conn.close()
_shutdown = asyncio.gather(*coros, loop=loop)
loop.run_until_complete(_shutdown)
trigger_events(after_stop, loop)
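
The protocol changes above push body chunks into request.stream (an asyncio.Queue) from on_body() and signal the end with None from on_message_complete(). The consuming side therefore looks like this, a sketch mirroring the streaming tests added further down:

    import asyncio
    from sanic import Sanic
    from sanic.response import stream

    app = Sanic(__name__)

    @app.post('/echo', stream=True)
    async def echo(request):
        assert isinstance(request.stream, asyncio.Queue)

        async def streaming(response):
            while True:
                body = await request.stream.get()   # chunks queued by on_body()
                if body is None:                    # sentinel from on_message_complete()
                    break
                response.write(body.decode('utf-8'))
        return stream(streaming)
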

View File

@@ -13,11 +13,12 @@ from sanic.exceptions import (
InvalidUsage,
)
from sanic.handlers import ContentRangeHandler
from sanic.response import file, HTTPResponse
from sanic.response import file, file_stream, HTTPResponse
def register(app, uri, file_or_directory, pattern,
use_modified_since, use_content_range):
use_modified_since, use_content_range,
stream_large_files):
# TODO: Though sanic is not a file server, I feel like we should at least
# make a good effort here. Modified-since is nice, but we could
# also look into etags, expires, and caching
@@ -34,6 +35,10 @@ def register(app, uri, file_or_directory, pattern,
server's
:param use_content_range: If true, process header for range requests
and sends the file part that is requested
:param stream_large_files: If true, use the file_stream() handler rather
than the file() handler to send the file
If this is an integer, this represents the
threshold size to switch to file_stream()
"""
# If we're not trying to match a file directly,
# serve from the folder
@@ -93,6 +98,17 @@ def register(app, uri, file_or_directory, pattern,
headers=headers,
content_type=guess_type(file_path)[0] or 'text/plain')
else:
if stream_large_files:
if isinstance(stream_large_files, int):
threshold = stream_large_files
else:
threshold = 1024*1000
if not stats:
stats = await stat(file_path)
if stats.st_size >= threshold:
return await file_stream(file_path, headers=headers,
_range=_range)
return await file(file_path, headers=headers, _range=_range)
except ContentRangeError:
raise
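
app.static() exposes this as stream_large_files: True uses the default threshold shown above (1024*1000 bytes, roughly 1 MB), while an integer sets a custom threshold in bytes. A sketch with placeholder paths:

    from sanic import Sanic

    app = Sanic(__name__)

    # Files at or above the threshold are served via file_stream(), smaller ones via file().
    app.static('/downloads', '/srv/downloads', stream_large_files=True)

    # Or pick your own threshold, here 10 KB.
    app.static('/media', '/srv/media', stream_large_files=10 * 1024)
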

View File

@@ -1,4 +1,5 @@
import traceback
from json import JSONDecodeError
from sanic.log import log
@@ -28,6 +29,14 @@ class SanicTestClient:
response.text = await response.text()
except UnicodeDecodeError as e:
response.text = None
try:
response.json = await response.json()
except (JSONDecodeError,
UnicodeDecodeError,
aiohttp.ClientResponseError):
response.json = None
response.body = await response.read()
return response

View File

@@ -64,6 +64,11 @@ class HTTPMethodView:
return view
def stream(func):
func.is_stream = True
return func
class CompositionView:
"""Simple method-function mapped view for the sanic.
You can add handler functions to methods (get, post, put, patch, delete)
@@ -83,7 +88,9 @@ class CompositionView:
def __init__(self):
self.handlers = {}
def add(self, methods, handler):
def add(self, methods, handler, stream=False):
if stream:
handler.is_stream = stream
for method in methods:
if method not in HTTP_METHODS:
raise InvalidUsage(
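
The same streaming flag reaches class-based views through the new stream decorator and the stream argument to CompositionView.add(). A sketch, with placeholder route names:

    import asyncio
    from sanic import Sanic
    from sanic.response import text
    from sanic.views import HTTPMethodView, CompositionView
    from sanic.views import stream as stream_decorator

    app = Sanic(__name__)

    class UploadView(HTTPMethodView):
        def get(self, request):
            return text('OK')                 # regular, buffered handler

        @stream_decorator                     # marks the method with is_stream = True
        async def post(self, request):
            parts = []
            while True:
                body = await request.stream.get()
                if body is None:
                    break
                parts.append(body)
            return text(str(len(b''.join(parts))))

    app.add_route(UploadView.as_view(), '/upload')

    async def post_handler(request):
        parts = []
        while True:
            body = await request.stream.get()
            if body is None:
                break
            parts.append(body)
        return text(str(len(b''.join(parts))))

    view = CompositionView()
    view.add(['POST'], post_handler, stream=True)   # same flag on a composed view
    app.add_route(view, '/composed')
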

View File

@@ -6,9 +6,12 @@ from websockets import ConnectionClosed # noqa
class WebSocketProtocol(HttpProtocol):
def __init__(self, *args, **kwargs):
def __init__(self, *args, websocket_max_size=None,
websocket_max_queue=None, **kwargs):
super().__init__(*args, **kwargs)
self.websocket = None
self.websocket_max_size = websocket_max_size
self.websocket_max_queue = websocket_max_queue
def connection_timeout(self):
# timeouts make no sense for websocket routes
@@ -62,6 +65,9 @@ class WebSocketProtocol(HttpProtocol):
request.transport.write(rv)
# hook up the websocket protocol
self.websocket = WebSocketCommonProtocol()
self.websocket = WebSocketCommonProtocol(
max_size=self.websocket_max_size,
max_queue=self.websocket_max_queue
)
self.websocket.connection_made(request.transport)
return self.websocket
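
The websocket limits are no longer hard-coded; they are forwarded per connection to WebSocketCommonProtocol(max_size=..., max_queue=...) from app.config (see the _helper hunk near the top). A sketch of tightening them, with arbitrary values:

    from sanic import Sanic

    app = Sanic(__name__)

    app.config.WEBSOCKET_MAX_SIZE = 64 * 1024   # reject frames larger than 64 KiB
    app.config.WEBSOCKET_MAX_QUEUE = 8          # buffer at most 8 incoming messages

    @app.websocket('/feed')
    async def feed(request, ws):
        while True:
            await ws.send(await ws.recv())      # simple echo loop
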

View File

@@ -3,13 +3,18 @@ import sys
import signal
import asyncio
import logging
import traceback
try:
import ssl
except ImportError:
ssl = None
import uvloop
try:
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
pass
import gunicorn.workers.base as base
from sanic.server import trigger_events, serve, HttpProtocol, Signal
@@ -25,7 +30,7 @@ class GunicornWorker(base.Worker):
self.ssl_context = self._create_ssl_context(cfg)
else:
self.ssl_context = None
self.servers = []
self.servers = {}
self.connections = set()
self.exit_code = 0
self.signal = Signal()
@@ -34,7 +39,6 @@ class GunicornWorker(base.Worker):
# create new event_loop after fork
asyncio.get_event_loop().close()
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
self.loop = asyncio.new_event_loop()
asyncio.set_event_loop(self.loop)
@@ -45,8 +49,6 @@ class GunicornWorker(base.Worker):
protocol = (WebSocketProtocol if self.app.callable.websocket_enabled
else HttpProtocol)
self._server_settings = self.app.callable._helper(
host=None,
port=None,
loop=self.loop,
debug=is_debug,
protocol=protocol,
@@ -68,9 +70,15 @@ class GunicornWorker(base.Worker):
trigger_events(self._server_settings.get('before_stop', []),
self.loop)
self.loop.run_until_complete(self.close())
except:
traceback.print_exc()
finally:
try:
trigger_events(self._server_settings.get('after_stop', []),
self.loop)
except:
traceback.print_exc()
finally:
self.loop.close()
sys.exit(self.exit_code)
@@ -90,16 +98,37 @@ class GunicornWorker(base.Worker):
for conn in self.connections:
conn.close_if_idle()
while self.connections:
# graceful shutdown timeout
start_shutdown = 0
graceful_shutdown_timeout = self.cfg.graceful_timeout
while self.connections and \
(start_shutdown < graceful_shutdown_timeout):
await asyncio.sleep(0.1)
start_shutdown = start_shutdown + 0.1
# Force close non-idle connection after waiting for
# graceful_shutdown_timeout
coros = []
for conn in self.connections:
if hasattr(conn, "websocket") and conn.websocket:
coros.append(conn.websocket.close_connection(force=True))
else:
conn.close()
_shutdown = asyncio.gather(*coros, loop=self.loop)
await _shutdown
async def _run(self):
for sock in self.sockets:
self.servers.append(await serve(
state = dict(requests_count=0)
self._server_settings["host"] = None
self._server_settings["port"] = None
server = await serve(
sock=sock,
connections=self.connections,
state=state,
**self._server_settings
))
)
self.servers[server] = state
async def _check_alive(self):
# If our parent changed then we shut down.
@@ -108,7 +137,15 @@ class GunicornWorker(base.Worker):
while self.alive:
self.notify()
if pid == os.getpid() and self.ppid != os.getppid():
req_count = sum(
self.servers[srv]["requests_count"] for srv in self.servers
)
if self.max_requests and req_count > self.max_requests:
self.alive = False
self.log.info(
"Max requests exceeded, shutting down: %s", self
)
elif pid == os.getpid() and self.ppid != os.getppid():
self.alive = False
self.log.info("Parent changed, shutting down: %s", self)
else:
@@ -158,9 +195,11 @@ class GunicornWorker(base.Worker):
def handle_quit(self, sig, frame):
self.alive = False
self.app.callable.is_running = False
self.cfg.worker_int(self)
def handle_abort(self, sig, frame):
self.alive = False
self.exit_code = 1
self.cfg.worker_abort(self)
sys.exit(1)

View File

@@ -1,16 +1,42 @@
import asyncio
import inspect
import pytest
from sanic import Sanic
from sanic.blueprints import Blueprint
from sanic.response import json, text
from sanic.exceptions import NotFound, ServerError, InvalidUsage
from sanic.constants import HTTP_METHODS
# ------------------------------------------------------------ #
# GET
# ------------------------------------------------------------ #
@pytest.mark.parametrize('method', HTTP_METHODS)
def test_versioned_routes_get(method):
app = Sanic('test_shorhand_routes_get')
bp = Blueprint('test_text')
method = method.lower()
func = getattr(bp, method)
if callable(func):
@func('/{}'.format(method), version=1)
def handler(request):
return text('OK')
else:
print(func)
raise
app.blueprint(bp)
client_method = getattr(app.test_client, method)
request, response = client_method('/v1/{}'.format(method))
assert response.status == 200
def test_bp():
app = Sanic('test_text')
bp = Blueprint('test_text')
@@ -21,6 +47,7 @@ def test_bp():
app.blueprint(bp)
request, response = app.test_client.get('/')
assert app.is_request_stream is False
assert response.text == 'Hello'
@@ -40,6 +67,7 @@ def test_bp_strict_slash():
request, response = app.test_client.get('/get')
assert response.text == 'OK'
assert response.json == None
request, response = app.test_client.get('/get/')
assert response.status == 404
@@ -268,38 +296,48 @@ def test_bp_shorthand():
@blueprint.get('/get')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.put('/put')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.post('/post')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.head('/head')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.options('/options')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.patch('/patch')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.delete('/delete')
def handler(request):
assert request.stream is None
return text('OK')
@blueprint.websocket('/ws')
async def handler(request, ws):
assert request.stream is None
ev.set()
app.blueprint(blueprint)
assert app.is_request_stream is False
request, response = app.test_client.get('/get')
assert response.text == 'OK'

View File

@@ -3,7 +3,8 @@ from bs4 import BeautifulSoup
from sanic import Sanic
from sanic.response import text
from sanic.exceptions import InvalidUsage, ServerError, NotFound
from sanic.exceptions import InvalidUsage, ServerError, NotFound, Unauthorized
from sanic.exceptions import Forbidden, abort
class SanicExceptionTestException(Exception):
@@ -26,10 +27,37 @@ def exception_app():
def handler_404(request):
raise NotFound("OK")
@app.route('/403')
def handler_403(request):
raise Forbidden("Forbidden")
@app.route('/401/basic')
def handler_401_basic(request):
raise Unauthorized("Unauthorized", "Basic", realm="Sanic")
@app.route('/401/digest')
def handler_401_digest(request):
raise Unauthorized("Unauthorized",
"Digest",
realm="Sanic",
qop="auth, auth-int",
algorithm="MD5",
nonce="abcdef",
opaque="zyxwvu")
@app.route('/401/bearer')
def handler_401_bearer(request):
raise Unauthorized("Unauthorized", "Bearer")
@app.route('/invalid')
def handler_invalid(request):
raise InvalidUsage("OK")
@app.route('/abort')
def handler_invalid(request):
abort(500)
return text("OK")
@app.route('/divide_by_zero')
def handle_unhandled_exception(request):
1 / 0
@@ -44,8 +72,10 @@ def exception_app():
return app
def test_catch_exception_list():
app = Sanic('exception_list')
@app.exception([SanicExceptionTestException, NotFound])
def exception_list(request, exception):
return text("ok")
@@ -60,6 +90,7 @@ def test_catch_exception_list():
request, response = app.test_client.get('/')
assert response.text == 'ok'
def test_no_exception(exception_app):
"""Test that a route works without an exception"""
request, response = exception_app.test_client.get('/')
@@ -85,6 +116,35 @@ def test_not_found_exception(exception_app):
assert response.status == 404
def test_forbidden_exception(exception_app):
"""Test the built-in Forbidden exception"""
request, response = exception_app.test_client.get('/403')
assert response.status == 403
def test_unauthorized_exception(exception_app):
"""Test the built-in Unauthorized exception"""
request, response = exception_app.test_client.get('/401/basic')
assert response.status == 401
assert response.headers.get('WWW-Authenticate') is not None
assert response.headers.get('WWW-Authenticate') == "Basic realm='Sanic'"
request, response = exception_app.test_client.get('/401/digest')
assert response.status == 401
auth_header = response.headers.get('WWW-Authenticate')
assert auth_header is not None
assert auth_header.startswith('Digest')
assert "qop='auth, auth-int'" in auth_header
assert "algorithm='MD5'" in auth_header
assert "nonce='abcdef'" in auth_header
assert "opaque='zyxwvu'" in auth_header
request, response = exception_app.test_client.get('/401/bearer')
assert response.status == 401
assert response.headers.get('WWW-Authenticate') == "Bearer"
def test_handled_unhandled_exception(exception_app):
"""Test that an exception not built into sanic is handled"""
request, response = exception_app.test_client.get('/divide_by_zero')
@@ -97,6 +157,7 @@ def test_handled_unhandled_exception(exception_app):
"The server encountered an internal error and "
"cannot complete your request.")
def test_exception_in_exception_handler(exception_app):
"""Test that an exception thrown in an error handler is handled"""
request, response = exception_app.test_client.get(
@@ -121,3 +182,9 @@ def test_exception_in_exception_handler_debug_off(exception_app):
debug=True)
assert response.status == 500
assert response.body.startswith(b'Exception raised in exception ')
def test_abort(exception_app):
"""Test the abort function"""
request, response = exception_app.test_client.get('/abort')
assert response.status == 500

View File

@@ -24,7 +24,7 @@ def handler_3(request):
@exception_handler_app.route('/4')
def handler_4(request):
foo = bar
foo = bar # noqa -- F821 undefined name 'bar' is done to throw exception
return text(foo)

View File

@@ -1,5 +1,7 @@
import asyncio
import uuid
from importlib import reload
from sanic.config import LOGGING
from sanic.response import text
from sanic import Sanic
from io import StringIO
@@ -10,6 +12,11 @@ function: %(funcName)s(); \
message: %(message)s'''
def reset_logging():
logging.shutdown()
reload(logging)
def test_log():
log_stream = StringIO()
for handler in logging.root.handlers[:]:
@@ -32,5 +39,19 @@ def test_log():
log_text = log_stream.getvalue()
assert rand_string in log_text
def test_default_log_fmt():
reset_logging()
Sanic()
for fmt in [h.formatter for h in logging.getLogger('sanic').handlers]:
assert fmt._fmt == LOGGING['formatters']['simple']['format']
reset_logging()
Sanic(log_config=None)
for fmt in [h.formatter for h in logging.getLogger('sanic').handlers]:
assert fmt._fmt == "%(asctime)s: %(levelname)s: %(message)s"
if __name__ == "__main__":
test_log()

View File

@@ -2,7 +2,11 @@ import random
from sanic import Sanic
from sanic.response import json
from ujson import loads
try:
from ujson import loads
except ImportError:
from json import loads
def test_storage():

View File

@@ -0,0 +1,463 @@
import asyncio
from sanic import Sanic
from sanic.blueprints import Blueprint
from sanic.views import CompositionView
from sanic.views import HTTPMethodView
from sanic.views import stream as stream_decorator
from sanic.response import stream, text
data = "abc" * 100000
def test_request_stream_method_view():
'''for self.is_request_stream = True'''
app = Sanic('test_request_stream_method_view')
class SimpleView(HTTPMethodView):
def get(self, request):
assert request.stream is None
return text('OK')
@stream_decorator
async def post(self, request):
assert isinstance(request.stream, asyncio.Queue)
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
app.add_route(SimpleView.as_view(), '/method_view')
assert app.is_request_stream is True
request, response = app.test_client.get('/method_view')
assert response.status == 200
assert response.text == 'OK'
request, response = app.test_client.post('/method_view', data=data)
assert response.status == 200
assert response.text == data
def test_request_stream_app():
'''for self.is_request_stream = True and decorators'''
app = Sanic('test_request_stream_app')
@app.get('/get')
async def get(request):
assert request.stream is None
return text('GET')
@app.head('/head')
async def head(request):
assert request.stream is None
return text('HEAD')
@app.delete('/delete')
async def delete(request):
assert request.stream is None
return text('DELETE')
@app.options('/options')
async def options(request):
assert request.stream is None
return text('OPTIONS')
@app.post('/_post/<id>')
async def _post(request, id):
assert request.stream is None
return text('_POST')
@app.post('/post/<id>', stream=True)
async def post(request, id):
assert isinstance(request.stream, asyncio.Queue)
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
response.write(body.decode('utf-8'))
return stream(streaming)
@app.put('/_put')
async def _put(request):
assert request.stream is None
return text('_PUT')
@app.put('/put', stream=True)
async def put(request):
assert isinstance(request.stream, asyncio.Queue)
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
response.write(body.decode('utf-8'))
return stream(streaming)
@app.patch('/_patch')
async def _patch(request):
assert request.stream is None
return text('_PATCH')
@app.patch('/patch', stream=True)
async def patch(request):
assert isinstance(request.stream, asyncio.Queue)
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
response.write(body.decode('utf-8'))
return stream(streaming)
assert app.is_request_stream is True
request, response = app.test_client.get('/get')
assert response.status == 200
assert response.text == 'GET'
request, response = app.test_client.head('/head')
assert response.status == 200
assert response.text == ''
request, response = app.test_client.delete('/delete')
assert response.status == 200
assert response.text == 'DELETE'
request, response = app.test_client.options('/options')
assert response.status == 200
assert response.text == 'OPTIONS'
request, response = app.test_client.post('/_post/1', data=data)
assert response.status == 200
assert response.text == '_POST'
request, response = app.test_client.post('/post/1', data=data)
assert response.status == 200
assert response.text == data
request, response = app.test_client.put('/_put', data=data)
assert response.status == 200
assert response.text == '_PUT'
request, response = app.test_client.put('/put', data=data)
assert response.status == 200
assert response.text == data
request, response = app.test_client.patch('/_patch', data=data)
assert response.status == 200
assert response.text == '_PATCH'
request, response = app.test_client.patch('/patch', data=data)
assert response.status == 200
assert response.text == data
def test_request_stream_handle_exception():
'''for handling exceptions properly'''
app = Sanic('test_request_stream_exception')
@app.post('/post/<id>', stream=True)
async def post(request, id):
assert isinstance(request.stream, asyncio.Queue)
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
response.write(body.decode('utf-8'))
return stream(streaming)
# 404
request, response = app.test_client.post('/in_valid_post', data=data)
assert response.status == 404
assert response.text == 'Error: Requested URL /in_valid_post not found'
# 405
request, response = app.test_client.get('/post/random_id', data=data)
assert response.status == 405
assert response.text == 'Error: Method GET not allowed for URL /post/random_id'
def test_request_stream_blueprint():
'''for self.is_request_stream = True'''
app = Sanic('test_request_stream_blueprint')
bp = Blueprint('test_blueprint_request_stream_blueprint')
@app.get('/get')
async def get(request):
assert request.stream is None
return text('GET')
@bp.head('/head')
async def head(request):
assert request.stream is None
return text('HEAD')
@bp.delete('/delete')
async def delete(request):
assert request.stream is None
return text('DELETE')
@bp.options('/options')
async def options(request):
assert request.stream is None
return text('OPTIONS')
@bp.post('/_post/<id>')
async def _post(request, id):
assert request.stream is None
return text('_POST')
@bp.post('/post/<id>', stream=True)
async def post(request, id):
assert isinstance(request.stream, asyncio.Queue)
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
response.write(body.decode('utf-8'))
return stream(streaming)
@bp.put('/_put')
async def _put(request):
assert request.stream is None
return text('_PUT')
@bp.put('/put', stream=True)
async def put(request):
assert isinstance(request.stream, asyncio.Queue)
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
response.write(body.decode('utf-8'))
return stream(streaming)
@bp.patch('/_patch')
async def _patch(request):
assert request.stream is None
return text('_PATCH')
@bp.patch('/patch', stream=True)
async def patch(request):
assert isinstance(request.stream, asyncio.Queue)
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
response.write(body.decode('utf-8'))
return stream(streaming)
app.blueprint(bp)
assert app.is_request_stream is True
request, response = app.test_client.get('/get')
assert response.status == 200
assert response.text == 'GET'
request, response = app.test_client.head('/head')
assert response.status == 200
assert response.text == ''
request, response = app.test_client.delete('/delete')
assert response.status == 200
assert response.text == 'DELETE'
request, response = app.test_client.options('/options')
assert response.status == 200
assert response.text == 'OPTIONS'
request, response = app.test_client.post('/_post/1', data=data)
assert response.status == 200
assert response.text == '_POST'
request, response = app.test_client.post('/post/1', data=data)
assert response.status == 200
assert response.text == data
request, response = app.test_client.put('/_put', data=data)
assert response.status == 200
assert response.text == '_PUT'
request, response = app.test_client.put('/put', data=data)
assert response.status == 200
assert response.text == data
request, response = app.test_client.patch('/_patch', data=data)
assert response.status == 200
assert response.text == '_PATCH'
request, response = app.test_client.patch('/patch', data=data)
assert response.status == 200
assert response.text == data
def test_request_stream_composition_view():
'''for self.is_request_stream = True'''
app = Sanic('test_request_stream__composition_view')
def get_handler(request):
assert request.stream is None
return text('OK')
async def post_handler(request):
assert isinstance(request.stream, asyncio.Queue)
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
view = CompositionView()
view.add(['GET'], get_handler)
view.add(['POST'], post_handler, stream=True)
app.add_route(view, '/composition_view')
assert app.is_request_stream is True
request, response = app.test_client.get('/composition_view')
assert response.status == 200
assert response.text == 'OK'
request, response = app.test_client.post('/composition_view', data=data)
assert response.status == 200
assert response.text == data
def test_request_stream():
'''test for complex application'''
bp = Blueprint('test_blueprint_request_stream')
app = Sanic('test_request_stream')
class SimpleView(HTTPMethodView):
def get(self, request):
assert request.stream is None
return text('OK')
@stream_decorator
async def post(self, request):
assert isinstance(request.stream, asyncio.Queue)
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
@app.post('/stream', stream=True)
async def handler(request):
assert isinstance(request.stream, asyncio.Queue)
async def streaming(response):
while True:
body = await request.stream.get()
if body is None:
break
response.write(body.decode('utf-8'))
return stream(streaming)
@app.get('/get')
async def get(request):
assert request.stream is None
return text('OK')
@bp.post('/bp_stream', stream=True)
async def bp_stream(request):
assert isinstance(request.stream, asyncio.Queue)
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
@bp.get('/bp_get')
async def bp_get(request):
assert request.stream is None
return text('OK')
def get_handler(request):
assert request.stream is None
return text('OK')
async def post_handler(request):
assert isinstance(request.stream, asyncio.Queue)
result = ''
while True:
body = await request.stream.get()
if body is None:
break
result += body.decode('utf-8')
return text(result)
app.add_route(SimpleView.as_view(), '/method_view')
view = CompositionView()
view.add(['GET'], get_handler)
view.add(['POST'], post_handler, stream=True)
app.blueprint(bp)
app.add_route(view, '/composition_view')
assert app.is_request_stream is True
request, response = app.test_client.get('/method_view')
assert response.status == 200
assert response.text == 'OK'
request, response = app.test_client.post('/method_view', data=data)
assert response.status == 200
assert response.text == data
request, response = app.test_client.get('/composition_view')
assert response.status == 200
assert response.text == 'OK'
request, response = app.test_client.post('/composition_view', data=data)
assert response.status == 200
assert response.text == data
request, response = app.test_client.get('/get')
assert response.status == 200
assert response.text == 'OK'
request, response = app.test_client.post('/stream', data=data)
assert response.status == 200
assert response.text == data
request, response = app.test_client.get('/bp_get')
assert response.status == 200
assert response.text == 'OK'
request, response = app.test_client.post('/bp_stream', data=data)
assert response.status == 200
assert response.text == data

View File

@@ -8,7 +8,7 @@ import pytest
from sanic import Sanic
from sanic.exceptions import ServerError
from sanic.response import json, text
from sanic.request import DEFAULT_HTTP_CONTENT_TYPE
from sanic.testing import HOST, PORT
@@ -175,13 +175,79 @@ def test_token():
token = 'a1d895e0-553a-421a-8e22-5ff8ecb48cbf'
headers = {
'content-type': 'application/json',
'Authorization': 'Bearer Token {}'.format(token)
'Authorization': 'Bearer {}'.format(token)
}
request, response = app.test_client.get('/', headers=headers)
assert request.token == token
# no Authorization headers
headers = {
'content-type': 'application/json'
}
request, response = app.test_client.get('/', headers=headers)
assert request.token is None
def test_content_type():
app = Sanic('test_content_type')
@app.route('/')
async def handler(request):
return text(request.content_type)
request, response = app.test_client.get('/')
assert request.content_type == DEFAULT_HTTP_CONTENT_TYPE
assert response.text == DEFAULT_HTTP_CONTENT_TYPE
headers = {
'content-type': 'application/json',
}
request, response = app.test_client.get('/', headers=headers)
assert request.content_type == 'application/json'
assert response.text == 'application/json'
def test_remote_addr():
app = Sanic('test_content_type')
@app.route('/')
async def handler(request):
return text(request.remote_addr)
headers = {
'X-Forwarded-For': '127.0.0.1, 127.0.1.2'
}
request, response = app.test_client.get('/', headers=headers)
assert request.remote_addr == '127.0.0.1'
assert response.text == '127.0.0.1'
request, response = app.test_client.get('/')
assert request.remote_addr == ''
assert response.text == ''
headers = {
'X-Forwarded-For': '127.0.0.1, , ,,127.0.1.2'
}
request, response = app.test_client.get('/', headers=headers)
assert request.remote_addr == '127.0.0.1'
assert response.text == '127.0.0.1'
def test_match_info():
    app = Sanic('test_match_info')

    @app.route('/api/v1/user/<user_id>/')
    async def handler(request, user_id):
        return json(request.match_info)

    request, response = app.test_client.get('/api/v1/user/sanic_user/')
    assert request.match_info == {"user_id": "sanic_user"}
    assert json_loads(response.text) == {"user_id": "sanic_user"}
# ------------------------------------------------------------ #
@@ -220,19 +286,26 @@ def test_post_form_urlencoded():
    assert request.form.get('test') == 'OK'


def test_post_form_multipart_form_data():
@pytest.mark.parametrize(
    'payload', [
        '------sanic\r\n' \
        'Content-Disposition: form-data; name="test"\r\n' \
        '\r\n' \
        'OK\r\n' \
        '------sanic--\r\n',
        '------sanic\r\n' \
        'content-disposition: form-data; name="test"\r\n' \
        '\r\n' \
        'OK\r\n' \
        '------sanic--\r\n',
    ])
def test_post_form_multipart_form_data(payload):
    app = Sanic('test_post_form_multipart_form_data')

    @app.route('/', methods=['POST'])
    async def handler(request):
        return text('OK')

    payload = '------sanic\r\n' \
              'Content-Disposition: form-data; name="test"\r\n' \
              '\r\n' \
              'OK\r\n' \
              '------sanic--\r\n'

    headers = {'content-type': 'multipart/form-data; boundary=----sanic'}

    request, response = app.test_client.post(data=payload, headers=headers)
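The two parametrized payloads differ only in the casing of the Content-Disposition header, so the test now also covers case-insensitive parsing of multipart part headers. If similar payloads were needed elsewhere, a hypothetical helper could build them (boundary, field name and value below are just examples):

def multipart_payload(name, value, boundary='----sanic'):
    # The part delimiter is '--' + boundary, hence the six leading dashes in the tests.
    return ('--{b}\r\n'
            'Content-Disposition: form-data; name="{n}"\r\n'
            '\r\n'
            '{v}\r\n'
            '--{b}--\r\n').format(b=boundary, n=name, v=value)

payload = multipart_payload('test', 'OK')
headers = {'content-type': 'multipart/form-data; boundary=----sanic'}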


@@ -1,13 +1,22 @@
import asyncio
import inspect
import os
from aiofiles import os as async_os
from mimetypes import guess_type
from urllib.parse import unquote
import pytest
from random import choice
from sanic import Sanic
from sanic.response import HTTPResponse, stream, StreamingHTTPResponse
from sanic.response import HTTPResponse, stream, StreamingHTTPResponse, file, file_stream, json
from sanic.testing import HOST, PORT
from unittest.mock import MagicMock
JSON_DATA = {'ok': True}
def test_response_body_not_a_string():
    """Test when a response body sent from the application is not a string"""
    app = Sanic('response_body_not_a_string')
@@ -27,6 +36,24 @@ async def sample_streaming_fn(response):
    response.write('bar')


@pytest.fixture
def json_app():
    app = Sanic('json')

    @app.route("/")
    async def test(request):
        return json(JSON_DATA)

    return app


def test_json_response(json_app):
    from sanic.response import json_dumps
    request, response = json_app.test_client.get('/')
    assert response.status == 200
    assert response.text == json_dumps(JSON_DATA)
    assert response.json == JSON_DATA


@pytest.fixture
def streaming_app():
    app = Sanic('streaming')
@@ -94,3 +121,107 @@ def test_stream_response_writes_correct_content_to_transport(streaming_app):
        app.stop()

    streaming_app.run(host=HOST, port=PORT)


@pytest.fixture
def static_file_directory():
    """The static directory to serve"""
    current_file = inspect.getfile(inspect.currentframe())
    current_directory = os.path.dirname(os.path.abspath(current_file))
    static_directory = os.path.join(current_directory, 'static')
    return static_directory


def get_file_content(static_file_directory, file_name):
    """The content of the static file to check"""
    with open(os.path.join(static_file_directory, file_name), 'rb') as file:
        return file.read()


@pytest.mark.parametrize('file_name', ['test.file', 'decode me.txt', 'python.png'])
def test_file_response(file_name, static_file_directory):
    app = Sanic('test_file_helper')

    @app.route('/files/<filename>', methods=['GET'])
    def file_route(request, filename):
        file_path = os.path.join(static_file_directory, filename)
        file_path = os.path.abspath(unquote(file_path))
        return file(file_path, mime_type=guess_type(file_path)[0] or 'text/plain')

    request, response = app.test_client.get('/files/{}'.format(file_name))
    assert response.status == 200
    assert response.body == get_file_content(static_file_directory, file_name)


@pytest.mark.parametrize('file_name', ['test.file', 'decode me.txt'])
def test_file_head_response(file_name, static_file_directory):
    app = Sanic('test_file_helper')

    @app.route('/files/<filename>', methods=['GET', 'HEAD'])
    async def file_route(request, filename):
        file_path = os.path.join(static_file_directory, filename)
        file_path = os.path.abspath(unquote(file_path))
        stats = await async_os.stat(file_path)
        headers = dict()
        headers['Accept-Ranges'] = 'bytes'
        headers['Content-Length'] = str(stats.st_size)
        if request.method == "HEAD":
            return HTTPResponse(
                headers=headers,
                content_type=guess_type(file_path)[0] or 'text/plain')
        else:
            return file(file_path, headers=headers,
                        mime_type=guess_type(file_path)[0] or 'text/plain')

    request, response = app.test_client.head('/files/{}'.format(file_name))
    assert response.status == 200
    assert 'Accept-Ranges' in response.headers
    assert 'Content-Length' in response.headers
    assert int(response.headers['Content-Length']) == len(
        get_file_content(static_file_directory, file_name))


@pytest.mark.parametrize('file_name', ['test.file', 'decode me.txt', 'python.png'])
def test_file_stream_response(file_name, static_file_directory):
    app = Sanic('test_file_helper')

    @app.route('/files/<filename>', methods=['GET'])
    def file_route(request, filename):
        file_path = os.path.join(static_file_directory, filename)
        file_path = os.path.abspath(unquote(file_path))
        return file_stream(file_path, chunk_size=32,
                           mime_type=guess_type(file_path)[0] or 'text/plain')

    request, response = app.test_client.get('/files/{}'.format(file_name))
    assert response.status == 200
    assert response.body == get_file_content(static_file_directory, file_name)


@pytest.mark.parametrize('file_name', ['test.file', 'decode me.txt'])
def test_file_stream_head_response(file_name, static_file_directory):
    app = Sanic('test_file_helper')

    @app.route('/files/<filename>', methods=['GET', 'HEAD'])
    async def file_route(request, filename):
        file_path = os.path.join(static_file_directory, filename)
        file_path = os.path.abspath(unquote(file_path))
        headers = dict()
        headers['Accept-Ranges'] = 'bytes'
        if request.method == "HEAD":
            # Return a normal HTTPResponse, not a
            # StreamingHTTPResponse for a HEAD request
            stats = await async_os.stat(file_path)
            headers['Content-Length'] = str(stats.st_size)
            return HTTPResponse(
                headers=headers,
                content_type=guess_type(file_path)[0] or 'text/plain')
        else:
            return file_stream(file_path, chunk_size=32, headers=headers,
                               mime_type=guess_type(file_path)[0] or 'text/plain')

    request, response = app.test_client.head('/files/{}'.format(file_name))
    assert response.status == 200
    # A HEAD request should never be streamed/chunked.
    if 'Transfer-Encoding' in response.headers:
        assert response.headers['Transfer-Encoding'] != "chunked"
    assert 'Accept-Ranges' in response.headers
    # A HEAD request should get the Content-Length too
    assert 'Content-Length' in response.headers
    assert int(response.headers['Content-Length']) == len(
        get_file_content(static_file_directory, file_name))
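The helpers exercised above, file() and file_stream(), return a whole file or send it in fixed-size chunks respectively. A condensed sketch of how a route might use them, mirroring the tests (directory layout, route paths and the 32-byte chunk size are illustrative):

import os
from mimetypes import guess_type
from sanic import Sanic
from sanic.response import file, file_stream

app = Sanic('file_helpers_sketch')

@app.route('/download/<name>')
def download(request, name):
    # Returning the helper's result directly from a plain handler works, as in the tests above.
    path = os.path.abspath(os.path.join('static', name))
    return file(path, mime_type=guess_type(path)[0] or 'text/plain')

@app.route('/download-stream/<name>')
def download_stream(request, name):
    path = os.path.abspath(os.path.join('static', name))
    return file_stream(path, chunk_size=32,
                       mime_type=guess_type(path)[0] or 'text/plain')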


@@ -4,12 +4,33 @@ import pytest
from sanic import Sanic
from sanic.response import text
from sanic.router import RouteExists, RouteDoesNotExist
from sanic.constants import HTTP_METHODS


# ------------------------------------------------------------ #
# UTF-8
# ------------------------------------------------------------ #

@pytest.mark.parametrize('method', HTTP_METHODS)
def test_versioned_routes_get(method):
    app = Sanic('test_shorhand_routes_get')
    method = method.lower()
    func = getattr(app, method)
    if callable(func):
        @func('/{}'.format(method), version=1)
        def handler(request):
            return text('OK')
    else:
        print(func)
        raise

    client_method = getattr(app.test_client, method)

    request, response = client_method('/v1/{}'.format(method))
    assert response.status == 200
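The parametrized test above exercises the new version keyword on the shorthand route decorators: version=1 mounts the handler under a /v1 prefix. In isolation, a sketch with an illustrative path and app name:

from sanic import Sanic
from sanic.response import text

app = Sanic('versioned_sketch')

@app.get('/users', version=1)
def list_users(request):
    return text('OK')

# The route answers at /v1/users, not /users.
request, response = app.test_client.get('/v1/users')
assert response.status == 200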
def test_shorthand_routes_get():
    app = Sanic('test_shorhand_routes_get')
@@ -28,12 +49,16 @@ def test_route_strict_slash():
    @app.get('/get', strict_slashes=True)
    def handler(request):
        assert request.stream is None
        return text('OK')

    @app.post('/post/', strict_slashes=True)
    def handler(request):
        assert request.stream is None
        return text('OK')

    assert app.is_request_stream is False

    request, response = app.test_client.get('/get')
    assert response.text == 'OK'
@@ -77,21 +102,43 @@ def test_shorthand_routes_put():
    @app.put('/put')
    def handler(request):
        assert request.stream is None
        return text('OK')

    assert app.is_request_stream is False

    request, response = app.test_client.put('/put')
    assert response.text == 'OK'

    request, response = app.test_client.get('/put')
    assert response.status == 405


def test_shorthand_routes_delete():
    app = Sanic('test_shorhand_routes_delete')

    @app.delete('/delete')
    def handler(request):
        assert request.stream is None
        return text('OK')

    assert app.is_request_stream is False

    request, response = app.test_client.delete('/delete')
    assert response.text == 'OK'

    request, response = app.test_client.get('/delete')
    assert response.status == 405


def test_shorthand_routes_patch():
    app = Sanic('test_shorhand_routes_patch')

    @app.patch('/patch')
    def handler(request):
        assert request.stream is None
        return text('OK')

    assert app.is_request_stream is False

    request, response = app.test_client.patch('/patch')
    assert response.text == 'OK'
@@ -103,8 +150,11 @@ def test_shorthand_routes_head():
    @app.head('/head')
    def handler(request):
        assert request.stream is None
        return text('OK')

    assert app.is_request_stream is False

    request, response = app.test_client.head('/head')
    assert response.status == 200
@@ -116,8 +166,11 @@ def test_shorthand_routes_options():
    @app.options('/options')
    def handler(request):
        assert request.stream is None
        return text('OK')

    assert app.is_request_stream is False

    request, response = app.test_client.options('/options')
    assert response.status == 200


@@ -16,6 +16,7 @@ def test_methods(method):
    class DummyView(HTTPMethodView):

        async def get(self, request):
            assert request.stream is None
            return text('', headers={'method': 'GET'})

        def post(self, request):
@@ -37,6 +38,7 @@ def test_methods(method):
            return text('', headers={'method': 'DELETE'})

    app.add_route(DummyView.as_view(), '/')
    assert app.is_request_stream is False
    request, response = getattr(app.test_client, method.lower())('/')
    assert response.headers['method'] == method
@@ -79,6 +81,7 @@ def test_with_bp():
    class DummyView(HTTPMethodView):

        def get(self, request):
            assert request.stream is None
            return text('I am get method')

    bp.add_route(DummyView.as_view(), '/')
@@ -86,6 +89,7 @@ def test_with_bp():
    app.blueprint(bp)
    request, response = app.test_client.get('/')

    assert app.is_request_stream is False
    assert response.text == 'I am get method'
@@ -227,10 +231,15 @@ def test_composition_view_runs_methods_as_expected(method):
    app = Sanic('test_composition_view')
    view = CompositionView()

    view.add(['GET', 'POST', 'PUT'], lambda x: text('first method'))

    def first(request):
        assert request.stream is None
        return text('first method')

    view.add(['GET', 'POST', 'PUT'], first)
    view.add(['DELETE', 'PATCH'], lambda x: text('second method'))
    app.add_route(view, '/')

    assert app.is_request_stream is False

    if method in ['GET', 'POST', 'PUT']:
        request, response = getattr(app.test_client, method.lower())('/')

tests/test_worker.py (new file, 135 lines)

@@ -0,0 +1,135 @@
import time
import json
import shlex
import subprocess
import urllib.request
from unittest import mock
from sanic.worker import GunicornWorker
from sanic.app import Sanic
import asyncio
import logging
import pytest


@pytest.fixture(scope='module')
def gunicorn_worker():
    command = 'gunicorn --bind 127.0.0.1:1337 --worker-class sanic.worker.GunicornWorker examples.simple_server:app'
    worker = subprocess.Popen(shlex.split(command))
    time.sleep(3)
    yield
    worker.kill()


def test_gunicorn_worker(gunicorn_worker):
    with urllib.request.urlopen('http://localhost:1337/') as f:
        res = json.loads(f.read(100).decode())
    assert res['test']
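The fixture above boots examples.simple_server:app behind Gunicorn using the new sanic.worker.GunicornWorker class, and the smoke test only requires the root route to return JSON with a truthy 'test' key. An app compatible with that check would look roughly like this (illustrative; the real example module may differ):

from sanic import Sanic
from sanic.response import json

app = Sanic('simple_server_sketch')

@app.route('/')
async def index(request):
    return json({'test': True})

# launched as in the fixture:
#   gunicorn --bind 127.0.0.1:1337 --worker-class sanic.worker.GunicornWorker module:app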
class GunicornTestWorker(GunicornWorker):

    def __init__(self):
        self.app = mock.Mock()
        self.app.callable = Sanic("test_gunicorn_worker")
        self.servers = {}
        self.exit_code = 0
        self.cfg = mock.Mock()
        self.notify = mock.Mock()


@pytest.fixture
def worker():
    return GunicornTestWorker()


def test_worker_init_process(worker):
    with mock.patch('sanic.worker.asyncio') as mock_asyncio:
        try:
            worker.init_process()
        except TypeError:
            pass

        assert mock_asyncio.get_event_loop.return_value.close.called
        assert mock_asyncio.new_event_loop.called
        assert mock_asyncio.set_event_loop.called


def test_worker_init_signals(worker):
    worker.loop = mock.Mock()
    worker.init_signals()
    assert worker.loop.add_signal_handler.called


def test_handle_abort(worker):
    with mock.patch('sanic.worker.sys') as mock_sys:
        worker.handle_abort(object(), object())
        assert not worker.alive
        assert worker.exit_code == 1
        mock_sys.exit.assert_called_with(1)


def test_handle_quit(worker):
    worker.handle_quit(object(), object())
    assert not worker.alive
    assert worker.exit_code == 0


def test_run_max_requests_exceeded(worker):
    loop = asyncio.new_event_loop()
    worker.ppid = 1
    worker.alive = True
    sock = mock.Mock()
    sock.cfg_addr = ('localhost', 8080)
    worker.sockets = [sock]
    worker.wsgi = mock.Mock()
    worker.connections = set()
    worker.log = mock.Mock()
    worker.loop = loop
    worker.servers = {
        "server1": {"requests_count": 14},
        "server2": {"requests_count": 15},
    }
    worker.max_requests = 10
    worker._run = mock.Mock(wraps=asyncio.coroutine(lambda *a, **kw: None))

    # exceeding request count
    _runner = asyncio.ensure_future(worker._check_alive(), loop=loop)
    loop.run_until_complete(_runner)

    assert worker.alive == False
    worker.notify.assert_called_with()
    worker.log.info.assert_called_with("Max requests exceeded, shutting down: %s",
                                       worker)


def test_worker_close(worker):
    loop = asyncio.new_event_loop()
    asyncio.sleep = mock.Mock(wraps=asyncio.coroutine(lambda *a, **kw: None))
    worker.ppid = 1
    worker.pid = 2
    worker.cfg.graceful_timeout = 1.0
    worker.signal = mock.Mock()
    worker.signal.stopped = False
    worker.wsgi = mock.Mock()
    conn = mock.Mock()
    conn.websocket = mock.Mock()
    conn.websocket.close_connection = mock.Mock(
        wraps=asyncio.coroutine(lambda *a, **kw: None)
    )
    worker.connections = set([conn])
    worker.log = mock.Mock()
    worker.loop = loop
    server = mock.Mock()
    server.close = mock.Mock(wraps=lambda *a, **kw: None)
    server.wait_closed = mock.Mock(wraps=asyncio.coroutine(lambda *a, **kw: None))
    worker.servers = {
        server: {"requests_count": 14},
    }
    worker.max_requests = 10

    # close worker
    _close = asyncio.ensure_future(worker.close(), loop=loop)
    loop.run_until_complete(_close)

    assert worker.signal.stopped == True
    conn.websocket.close_connection.assert_called_with(force=True)
    assert len(worker.servers) == 0

tox.ini (28 changed lines)

@@ -1,22 +1,22 @@
[tox]
envlist = py35, py36, flake8, check
[travis]
python =
    3.5: py35, flake8, check
    3.6: py36, flake8, check
envlist = py35, py36, {py35,py36}-no-ext, flake8, check

[testenv]
usedevelop = True
setenv =
    {py35,py36}-no-ext: SANIC_NO_UJSON=1
    {py35,py36}-no-ext: SANIC_NO_UVLOOP=1
deps =
    -rrequirements-dev.txt
    coverage
    pytest
    pytest-cov
    pytest-sugar
    aiohttp==1.3.5
    chardet<=2.3.0
    beautifulsoup4
    gunicorn
commands =
    pytest tests {posargs}
    coverage erase
    coverage run -m sanic.app
    coverage report
    pytest tests --cov sanic --cov-report term-missing {posargs}
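The reworked [testenv] adds {py35,py36}-no-ext environments that set SANIC_NO_UJSON=1 and SANIC_NO_UVLOOP=1, so the suite also runs against the pure-Python fallbacks; tox expands the braces, so a single entry can be selected locally with, for example, tox -e py36-no-ext. Coverage appears to be collected through pytest (--cov sanic --cov-report term-missing) rather than separate coverage commands.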
[testenv:flake8]
deps =