Compare commits

...

651 Commits

Author SHA1 Message Date
Eli Uriegas
dab802fbf4 Merge pull request #1530 from seemethere/bump_19031
Bump version to 19.03.1
2019-03-22 19:48:50 -07:00
Eli Uriegas
7bca95205d Bump version to 19.03.1
Couldn't delete the release on github so we go with the next best thing
which is to just bump the patch version

Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2019-03-22 16:44:28 -07:00
Eli Uriegas
669e2ed5b0 Merge pull request #1529 from huge-success/pypi-credentials
Update PyPI credentials
2019-03-22 16:41:12 -07:00
Eli Uriegas
783eb1a6e8 Merge pull request #1527 from seemethere/bump_1903
Bump version to 19.03.0
2019-03-22 16:40:54 -07:00
Adam Hopkins
9b9599b12f Update PyPI credentials 2019-03-22 13:44:13 +02:00
Eli Uriegas
6ed0d3def7 Bump version to 19.03.0
Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2019-03-21 16:24:57 -07:00
Amit Garu
c42731a55c await keyword missing fix in response doc (#1520) 2019-03-19 07:15:28 -05:00
Sam Havens
abf8534ea9 fix typo in Asyncio example (#1510)
* fix typo

* args to kwargs
2019-03-15 12:28:15 -05:00
Moshe Zada
773a66bc5b Fix typo (#1516) 2019-03-15 11:49:18 -05:00
Harsha Narayana
269100eac1 format: fix linter issue causing travis build failures (fix #1514) (#1515)
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2019-03-14 12:18:47 -05:00
Serge Fedoruk
2a15583b87 add Request.not_grouped_args, deprecation warning Request.raw_args (#1476)
* add Request.not_grouped_args, deprecation warning Request.raw_args

* add 1 more test for coverage

* custom parser for Request.args and Request.query_args, some additional tests

* add docs for custom queryset parsing

* fix import sorting

* docstrings for get_query_args and get_args methods

* lost import
2019-03-14 09:04:05 -05:00
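As a rough illustration of the query-string attributes touched by #1476 above, the sketch below reads both the grouped `Request.args` mapping and the flat `Request.query_args` pairs in a handler; the route and parameter names are made up for the example, and the custom parser hooks (`get_args`/`get_query_args`) are known only from the commit message.

```
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route("/search")
async def search(request):
    # For a request like /search?q=sanic&q=web the two views of the
    # query string look roughly like this:
    grouped = request.args        # e.g. {"q": ["sanic", "web"]}
    flat = request.query_args     # e.g. [("q", "sanic"), ("q", "web")]
    return json({"grouped": grouped, "flat": flat})
```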
Daniel Thorn
d5813152ab Allow sanic test client to bind to a random port (#1376) 2019-03-04 15:23:03 -06:00
Harsha Narayana
348964fe12 Enable Middleware Support for Blueprint Groups (#1399)
* enable blueprint group middleware support

This commit enables users to implement middleware at the
blueprint group level, automatically enforcing the middleware on each of
the Blueprints that are part of the group (see the sketch after this entry).

This enables a simple way in which a common set of features and
criteria, e.g. authentication and authorization, can be enforced on a
Blueprint group.

This commit addresses the feature request raised as part of Issue #1386

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* enable indexing of BlueprintGroup object

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* rename blueprint group file to fix spelling error

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* add documentation and additional unit tests

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* cleanup and optimize headers in unit test file

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* fix Blueprint Group iterable method

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* add additional unit test to check StopIteration condition

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* cleanup iter protocol implementation for blueprint group and add slots

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* fix blueprint group middleware invocation identification

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* feat: enable list behavior on blueprint group object and use append instead of property to add blueprint to group

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2019-03-03 16:26:05 -06:00
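A minimal sketch of the blueprint-group middleware described in #1399 above: middleware registered on the group is applied to every blueprint inside it. The decorator form and `url_prefix` usage are assumptions based on the commit message and the documentation it adds; the blueprint and route names are illustrative.

```
from sanic import Blueprint, Sanic
from sanic.response import text

bp1 = Blueprint("bp1", url_prefix="/bp1")
bp2 = Blueprint("bp2", url_prefix="/bp2")

@bp1.route("/")
async def bp1_root(request):
    return text("bp1")

@bp2.route("/")
async def bp2_root(request):
    return text("bp2")

group = Blueprint.group(bp1, bp2, url_prefix="/api")

# Middleware attached to the group runs for every route in bp1 and bp2.
@group.middleware("request")
async def group_middleware(request):
    print("group middleware fired for", request.path)

app = Sanic(__name__)
app.blueprint(group)
```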
Markus Lång
e5c7589fc0 Remove update_current_time refresh (#1502) 2019-03-03 11:22:26 -06:00
Ashley Sommer
4260528645 Fix the auto_reloader to work when the executable was launched with a module, rather than a script. (#1501) 2019-03-03 11:03:26 -06:00
Harsha Narayana
34fe26e51b Add Route Resolution Benchmarking to Unit Test (#1499)
* feat: add benchmark tester for route resolution and cleanup test warnings

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* feat: refactor sanic benchmark test util into fixtures

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2019-02-28 08:56:41 -06:00
PWZER
8a59907319 Recognizes non-ASCII filenames per RFC 2231, and supports zero-length filenames for multipart/form-data. (#1497)
* support filename length of 0

* 1. Support zero-length filenames for multipart/form-data.
2. Now recognizes non-ASCII filenames in the RFC 2231 "filename*" format.
3. Add some test cases in tests/test_requests.py::test_request_multipart_files.

* reformat sanic/request.py
2019-02-28 08:55:32 -06:00
7
52deebaf65 Merge pull request #1490 from chenjr0719/fix_doc_build
Fix python version in doc build
2019-02-19 16:26:56 -08:00
jacob
1e05b22fbc Fix python version in environment.yml 2019-02-18 14:02:45 +08:00
7
ab56af5d15 Merge pull request #1489 from tomchristie/patch-1
Added "databases"
2019-02-15 16:43:38 -08:00
Tom Christie
123f00eee6 Added "databases"
Adds https://github.com/encode/databases to the "Database Integration" section.
2019-02-14 13:44:18 +00:00
Mykhailo Kushchenko
42bf103269 Remove deleted repo (#1487)
https://github.com/Sniedes722/Sanic-OAuth  (Sanic-OAuth: OAuth Library for connecting to & creating your own token providers.) returns  404
2019-02-08 08:43:43 -06:00
0xflotus
c8d2a462e3 did you mean specific? (#1486) 2019-02-06 16:28:32 -06:00
Leonardo Teixeira Menezes
08794ae1cf Enforce Datetime Type for Expires on Set-Cookie (#1484)
* Enforce Datetime Type for Expires on Set-Cookie

* Fix lint issues

* Format code and improve error type

* Fix import order
2019-02-06 12:29:33 -06:00
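A small sketch of the behaviour enforced by #1484 above: the `expires` attribute of a cookie must now be a `datetime`. The route and cookie names are illustrative.

```
from datetime import datetime, timedelta

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/login")
async def login(request):
    response = text("logged in")
    response.cookies["session"] = "abc123"
    # A datetime is now required here; other value types raise a
    # type error instead of producing a malformed Set-Cookie header.
    response.cookies["session"]["expires"] = datetime.utcnow() + timedelta(days=1)
    return response
```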
Kevin ZHANG Qing
4f70dba935 sanic-zipkin (#1483) 2019-02-05 07:59:33 -06:00
Enda Farrell
b926a2c9b0 sanic#1480 Allow negative int/number in path (#1481)
* sanic#1480 Allow negative int/number

* Rerun ``make beautify`` on this change.
2019-02-05 07:54:48 -06:00
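To show what #1481 above enables, here is a sketch of a route whose `int` parameter now also matches negative values (e.g. `/temperature/-5`); the route itself is made up for the example.

```
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

# Before this fix, /temperature/-5 would 404 because the int matcher
# did not accept a leading minus sign.
@app.route("/temperature/<celsius:int>")
async def temperature(request, celsius):
    return text("{} degrees".format(celsius))
```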
Jacob
52bdd1d5a2 Add stream support for bp.add_route() (#1482)
* Fix #1454

* Update doc

* Fix F632 in response.py
2019-02-05 07:47:46 -06:00
7
bc7d0f0da5 Merge pull request #1478 from chenjr0719/fix_doc_build
Upgrade setuptools version and use native docutils in doc build
2019-01-21 22:42:51 -08:00
jacob
6a8e9c9e95 Add deps based on docs extras require, Remove unnecessary deps 2019-01-22 14:05:29 +08:00
jacob
211a922f3c Upgrade setuptools version and use native docutils 2019-01-21 10:16:57 +08:00
7
2758a3ade6 Merge pull request #1472 from xxNB/dev
Remove unwanted None check for __repr__ in `Request` class
2019-01-20 14:21:57 -08:00
7
ef3c9eae73 Merge pull request #1477 from kyb3r/patch-2
Fix grammar in README.md
2019-01-20 14:21:27 -08:00
7
9cf2e1b519 Merge pull request #1470 from denismakogon/create-server
make Sanic.create_server return an asyncio.Server
2019-01-20 14:21:11 -08:00
Kyber
51c2f7a599 Use backticks 2019-01-19 20:10:44 +11:00
Kyber
5bdd046b11 Fix grammar in README.md
>  It allows usage the async and await syntax 

Doesn't make sense.
2019-01-19 20:08:47 +11:00
章昕
af7ad0a621 Remove unwanted None check for __repr__ in class 2019-01-17 00:24:11 +08:00
Denis Makogon
1473753d43 linter fix 2019-01-15 17:48:26 +02:00
Denis Makogon
b36bd21813 fix uvloop check 2019-01-15 17:45:47 +02:00
Denis Makogon
f8f0241c27 refactor uvloop detection in its own method 2019-01-15 17:33:53 +02:00
Denis Makogon
1af16836d4 make tests dependent on uvloop 2019-01-15 17:30:32 +02:00
Denis Makogon
757974714e skip tests if python version is not 3.7 at least 2019-01-15 17:27:41 +02:00
Denis Makogon
eed22a7a24 Fix app.create_server calls 2019-01-15 15:47:35 +02:00
Denis Makogon
0242bc999f Fix type asserting 2019-01-15 15:11:38 +02:00
Denis Makogon
b89c533818 Adding doc 2019-01-15 15:04:30 +02:00
Denis Makogon
2cb05ab865 More tests, attempting to fix CI 2019-01-15 14:52:53 +02:00
Denis Makogon
391639210e make Sanic.create_server return an asyncio.Server
- adding 2 new parameters to Sanic.create_server:
   * return_asyncio_server=False - defines whether there's
     a need to return an asyncio.Server or run it right away
   * asyncio_server_kwargs=None - for python 3.7, uvloop doesn't
     support all necessary features like "start_serving",
     so, in order to make sanic work well with asyncio from 3.7,
     there's a need to introduce a generic way of passing
     kwargs to "loop.create_server"

Closes: #1469
2019-01-15 13:38:47 +02:00
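A sketch of how the `return_asyncio_server` flag described in the commit above might be used to obtain the low-level server object and drive the loop manually; the exact return type and any extra `asyncio_server_kwargs` are taken on faith from the commit message.

```
import asyncio

from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/")
async def index(request):
    return text("ok")

loop = asyncio.get_event_loop()
# With return_asyncio_server=True, create_server hands back the
# asyncio server instead of serving immediately.
server = loop.run_until_complete(
    app.create_server(host="0.0.0.0", port=8000, return_asyncio_server=True)
)
try:
    loop.run_forever()
finally:
    server.close()
    loop.close()
```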
7
99f34c9f50 Merge pull request #1457 from huge-success/max-age-integer
enforce integer for max-age cookie
2019-01-13 13:15:10 -08:00
Raphael Deem
d418cc9950 formatting 2019-01-12 20:41:35 -08:00
Raphael Deem
6dfafb0787 test float handling 2019-01-12 20:41:35 -08:00
Raphael Deem
7067295e67 enforce integer for max-age cookie 2019-01-12 20:41:35 -08:00
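A sketch of the cookie attribute affected by #1457 above: `max-age` is now expected to be an integer number of seconds (the tests above cover how floats are handled). The route and cookie names are illustrative.

```
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/")
async def index(request):
    response = text("ok")
    response.cookies["session"] = "abc123"
    # max-age should be an int; a value like 3600.5 is no longer
    # written into the Set-Cookie header verbatim.
    response.cookies["session"]["max-age"] = 3600
    return response
```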
Eli Uriegas
2af229eb1a Merge pull request #1445 from huge-success/r0fls-977
add handler name to request as endpoint
2019-01-08 16:12:25 -08:00
7
8dd8e9916e upgrade pytest version that compatible with pytest-cov, fixes some caplog unit tests (#1464) 2019-01-08 09:15:23 -06:00
7
96af1fe7cf Merge pull request #1460 from huge-success/18.12-changelog
18.12 Changelog
2019-01-06 22:33:37 -08:00
Yun Xu
cb3a03356b added changelogs to README and readthedocs 2019-01-06 13:50:40 -08:00
Yun Xu
68aa2ae3ce added changelog for 18.12 release 2019-01-06 13:44:18 -08:00
7
52de354e24 Merge pull request #1442 from Amanit/feature/gunicorn-logging
add an option to change access_log using gunicorn
2019-01-05 11:40:55 -08:00
7
f4f90cada4 Merge pull request #1449 from chenjr0719/add_amending_request_object_example
Add example of amending request object
2019-01-02 18:32:24 -08:00
Sergey Fedoruk
102e651741 refactor typing imports 2019-01-02 23:28:06 +01:00
Sergey Fedoruk
65daaaf64b linter fix and delete old tests 2019-01-02 23:28:05 +01:00
Sergey Fedoruk
b7a6f36e95 add type annotations in run and create_server 2019-01-02 23:28:05 +01:00
Sergey Fedoruk
a86a10b128 add control of access_log argument type 2019-01-02 23:28:05 +01:00
Sergey Fedoruk
0b728ade3a change Config.__init__ 2019-01-02 23:28:05 +01:00
Sergey Fedoruk
74f05108d7 async test for access_log in create_server 2019-01-02 23:28:05 +01:00
Sergey Fedoruk
9d4d15ddc7 add config tests 2019-01-02 23:28:05 +01:00
Sergey Fedoruk
0c5c6dff8f fix linting 2019-01-02 23:28:05 +01:00
Sergey Fedoruk
391fcdc83d fix access_log in run server and fix bool in env variables 2019-01-02 23:28:05 +01:00
Sergey Fedoruk
d76d5e2c5f add an option to change access_log using gunicorn 2019-01-02 23:28:05 +01:00
jacob
f0ada573bb Fix a grammar error 2019-01-02 20:37:26 +08:00
jacob
ec5b790b51 Extend example of modifying the request in middleware document 2019-01-02 17:29:01 +08:00
jacob
613b23748d Add example of amending request object 2019-01-02 14:52:25 +08:00
Adam Hopkins
cea1547e08 Merge pull request #1446 from huge-success/ahopkins-patch-1
Update README.rst
2019-01-01 14:51:05 +02:00
7
fd5ae01e1d Merge pull request #1444 from ja8zyjits/master
Updated README.md
2018-12-31 11:49:03 -08:00
Adam Hopkins
9b6b93d467 Update README.rst 2018-12-31 21:41:35 +02:00
Adam Hopkins
ca179c12a1 Update README.rst 2018-12-31 18:47:27 +02:00
Adam Hopkins
4d527035ae Add dotted endpoint notation and additional tests 2018-12-31 13:40:07 +02:00
Jitesh Nair
19b42830ea Merge pull request #1 from ja8zyjits/ja8zyjits-patch-1-readme
Update README.rst
2018-12-31 16:01:01 +05:30
Jitesh Nair
f5162f8ab1 Update README.rst
Made the optional environment variable declaration for installation clearer.
2018-12-31 16:00:34 +05:30
7
ff38a3c6b6 Merge pull request #1443 from huge-success/new-readme
Update README with new logo, change Config.LOGO, run linter
2018-12-30 13:23:12 -08:00
Adam Hopkins
94e85686b5 Ignore first row of logs when no uvloop 2018-12-30 14:07:21 +02:00
Adam Hopkins
aea4a8ed33 Modify test_logo runner 2018-12-30 13:46:08 +02:00
Adam Hopkins
05dd3b2e9d Run linter 2018-12-30 13:18:06 +02:00
Adam Hopkins
040468755c Change ASCII Logo
Update logo text

Reformat app.py
2018-12-30 12:49:23 +02:00
Adam Hopkins
50b359fdb2 Update README.rst
Add new logo and update contents of README.

Update README.rst

Fix image syntax.

Update README.rst

Fix README links.
2018-12-30 11:48:59 +02:00
7
72f2e18a84 Merge pull request #1440 from harshanarayana/fix/Contribution_Guide_Pip_Install
fix minor typo and pip install instruction mismatch
2018-12-28 18:43:41 -08:00
Harsha Narayana
15b1c875f5 fix minor typo and pip install instruction mismatch 2018-12-28 11:32:30 +05:30
7
13804dc380 Merge pull request #1424 from harshanarayana/enh/Documentation_Update
Documentation Enhancements
2018-12-27 21:30:02 -08:00
Harsha Narayana
9bea23da29 fix makefile phony targets
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:03 +05:30
Harsha Narayana
7005fabd4d add code beautification task to makefile
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:03 +05:30
Harsha Narayana
de8c37ad00 fix pip install typo in contribution page
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:03 +05:30
Harsha Narayana
a80499c4b7 update installation steps to be consistent across documentation and readme
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:03 +05:30
Harsha Narayana
82f7f847ba cleanup requirements and move dependency inside setup.py
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:03 +05:30
Harsha Narayana
4880761fe0 add setuputil based test running and makefile support
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:02 +05:30
Harsha Narayana
87ab0b386d fix current version in setup.cfg for release script
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:02 +05:30
Harsha Narayana
c42c274002 update manifest configuration
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:02 +05:30
Harsha Narayana
2d82b8951f make release script black compliant and tweak documentation with indexing and format
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:24:02 +05:30
Harsha Narayana
b7702bc3e8 add monitoring examples and documents 2018-12-28 10:22:28 +05:30
Harsha Narayana
9c6b83501f add release note change log generation
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:28 +05:30
Harsha Narayana
5189d8b14c fix string formatting error in git commands for release script
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:28 +05:30
Harsha Narayana
e13053ed89 add automated calendar version manager
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:28 +05:30
Harsha Narayana
efa77cf5ec add api documentation for router and server
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:28 +05:30
Harsha Narayana
f6355bd075 add additional examples to documentation
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:28 +05:30
Harsha Narayana
e3dfce88ff fix linter issues
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:28 +05:30
Harsha Narayana
939b5ea095 update copyright date and add example section with category
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:28 +05:30
Harsha Narayana
e6fba01682 add documentation for cookies, exception, blueprint and handlers
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:28 +05:30
Harsha Narayana
1623d397be categorize the sanic extensions list
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:27 +05:30
Harsha Narayana
09678d601d add sanic app module documentations
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-12-28 10:22:27 +05:30
7
67d51f7e1b Merge pull request #1423 from yunstanford/request-streaming-support
basic request streaming support with flow control
2018-12-27 18:06:02 -08:00
7
aa7f2759a6 Merge pull request #1438 from yunstanford/master
18.12 Release
2018-12-27 11:15:07 -08:00
Yun Xu
9b9dd67797 adopt CalVer: MM.YY.MICRO, 18.12.0 release 2018-12-27 11:00:38 -08:00
7
3f73bc075a Merge pull request #1437 from FlouieInCl/master
Fix typo in exceptions.md
2018-12-27 09:59:00 -08:00
Yun Xu
56989a017b 18.12 release 2018-12-27 08:55:17 -08:00
JeongKyungSeo
ada5918bc8 Fix typo in exceptions.md 2018-12-27 16:11:37 +09:00
Jacob
4efd450b32 Add tests (#1433)
* Add tests for remove_route()

* Add tests for sanic/router.py

* Add tests for sanic/cookies.py

* Disable reset logging in test_logging.py

* Add tests for sanic/request.py

* Add tests for ContentRangeHandler

* Add tests for exception at response middleware

* Fix cached_handlers for ErrorHandler.lookup()

* Add test for websocket request timeout

* Add tests for getting cookies of StreamResponse, Remove some unused variables in tests/test_cookies.py

* Add tests for nested error handle
2018-12-22 09:21:45 -06:00
Omar Ryhan
d2670664ba Update exceptions.md (#1431)
Documented error handling from ``app.error_handler.add``
Documented custom error handling by subclassing.
2018-12-22 09:21:03 -06:00
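A sketch of the two error-handling styles the documentation change above describes: registering a handler via `app.error_handler.add` and subclassing `ErrorHandler`; the handler bodies are illustrative.

```
from sanic import Sanic
from sanic.handlers import ErrorHandler
from sanic.response import text

app = Sanic(__name__)

# Style 1: register a handler for a specific exception type.
def handle_value_error(request, exception):
    return text("bad value", status=400)

app.error_handler.add(ValueError, handle_value_error)

# Style 2: subclass ErrorHandler to customise fallback behaviour.
class CustomErrorHandler(ErrorHandler):
    def default(self, request, exception):
        # Fall back to Sanic's stock handling for anything unhandled.
        return super().default(request, exception)

app.error_handler = CustomErrorHandler()
```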
7
fa7405fe9c Merge pull request #1422 from ashleysommer/server_slots
Add in some server.py __slots__ attribute names that are missing.
2018-12-15 13:58:22 -08:00
Jacob
33297f48a5 Add tests (#1430) 2018-12-13 11:50:50 -06:00
Yun Xu
956793e066 address review feedback, small code refactoring 2018-12-09 15:18:33 -08:00
Yun Xu
1bfbc67c62 expose request_buffer_queue_size to be configurable and update documentation
fix StreamBuffer buffer_size
2018-12-04 20:21:00 -08:00
Yun Xu
b5287184e9 fix lint
fix isort
2018-12-03 23:25:41 -08:00
Yun Xu
7c9957e058 update README.rst (clean up badges) 2018-12-03 23:03:14 -08:00
Yun Xu
fca7cb9fb0 update request streaming doc 2018-12-03 22:51:09 -08:00
Yun Xu
268d254d85 fix unit tests 2018-12-03 22:28:22 -08:00
Yun Xu
181adebf82 add StreamBuffer for request flow control 2018-12-03 22:19:26 -08:00
Ashley Sommer
06297a1918 Add in some server.py __slots__ property names that are missing. 2018-12-03 11:22:17 +10:00
Harsha Narayana
aa0874b6d8 100% Coverage for Sanic Blueprint (#1419)
* add unit tests to completely cover blueprints

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>

* fix typo in the unit test code

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-11-25 13:56:34 -06:00
7
822ced6294 Merge pull request #1416 from chenjr0719/add_tests_for_static
Add tests for static and update document
2018-11-21 23:01:37 +08:00
jacob
1a59614f79 Add stream_large_files and host examples in static_file document 2018-11-20 14:28:08 +08:00
jacob
f2d528e52a Add tests for static 2018-11-20 12:28:00 +08:00
Hasan Ramezani
f7adc5f84c Fix remove_entity_headers helper function (#1415)
* Fix `remove_entity_headers` helper function

* Add test for `remove_entity_headers` helper function
2018-11-19 09:30:53 -06:00
7
e955e833c4 Merge pull request #33 from huge-success/master
Merge upstream master branch
2018-11-16 13:02:16 +08:00
Tim&Anna
096c44b910 Update extensions.md (#1263)
* Update extensions.md

add an extension: sanic-script

* Update extensions.md
2018-11-14 07:16:43 -06:00
Nir Galon
efb9a42045 Change deprecated verify_ssl to ssl (#1155) 2018-11-14 07:16:14 -06:00
7
296cda7801 Merge pull request #1411 from devArtoria/patch-2
ADD: Sanic-JWT-Extended extension to extension docs
2018-11-13 13:49:35 +08:00
Lewis
90b9d73244 ADD: Sanic-JWT-Extended extension 2018-11-13 14:39:29 +09:00
Richard K
c8b0e7f2a7 Created methods to append and finish body content on Request (#1379)
* created methods to append and finish body content on request.py so the underlying body instance can have certain flexibility; modified server.py to reflect these changes

* - made some adjustments (including the Request.body_init method) as requested by @ahopkins;
- created a new test with a custom Request class implementation of the flexibility provided by the new methods;
2018-11-12 09:11:41 -06:00
7
6ce88ab5a4 Merge pull request #1400 from chenjr0719/add_tests_for_log
Add test for sanic.root logger and update the docs of logging
2018-11-12 20:45:05 +08:00
7
e13ab805df Merge pull request #1409 from yunstanford/windows-ci
CI Support for Windows
2018-11-12 20:05:21 +08:00
Yun Xu
e58ea8c7b4 fix unit test for windows ci
fix unit tests for windows ci

add appveyor build status badge

add readthedoc build status badge
2018-11-12 01:04:53 -08:00
jacob
dd5bac61cb Update document for logging 2018-11-12 16:09:12 +08:00
Jacob
6270b27a97 Merge branch 'master' into add_tests_for_log 2018-11-12 09:53:44 +08:00
Hasan Ramezani
f89ba1d39f Add tests for is_entity_header and is_hop_by_hop_header helper functions (#1410) 2018-11-11 10:57:57 -06:00
Yun Xu
8b5d137d8f fix .appveyor.yml 2018-11-10 06:11:01 -08:00
Yun Xu
2629fab649 add .appveyor.yml for windows ci support 2018-11-10 05:50:22 -08:00
7
92cd10c6a8 Merge pull request #32 from huge-success/master
merge upstream master branch
2018-11-10 21:26:37 +08:00
7
cc3edb90dc Merge pull request #1408 from harshanarayana/feature/Unit_Test_Enhancements
Additional Unit Tests
2018-11-10 20:46:51 +08:00
Harsha Narayana
c60ba81984 cleanup stale test for cookie object
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-11-10 16:54:24 +05:30
Harsha Narayana
ece3cdaa2e add unit tests for App Config, Cookies and Request handler
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-11-10 16:50:30 +05:30
7
4cb40f2042 Merge pull request #1403 from harshanarayana/fix/GIT-1398-Http_Response_Content_Length_Mismatch
Fix Content-Length Mismatch while using json and ujson
2018-11-10 00:14:03 +08:00
7
0e9f350982 Merge pull request #1405 from hramezani/test_has_message_body
Add test for has_message_body helper function.
2018-11-08 22:20:07 +08:00
Hasan Ramezani
cf439f01f8 Add test for has_message_body helper function. 2018-11-07 21:29:12 +01:00
Harsha Narayana
f1f1b8a630 add additional test cases to validate Content-Length header
Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-11-07 22:07:28 +05:30
Harsha Narayana
d4d1df03c9 fix content length mismatch in windows and other platform
The current implementation of `sanic` attempts to make use of `ujson` if
it's available on the system and, if not, falls back to the built-in
`json` module provided by python.

The current implementation of `ujson` does not provide a mechanism to
pass a custom `separators` parameter as part of the `dumps` method
invocation, and the default behavior of the module is to strip all the
spaces around separators such as `:` and `,`. This leads to an
inconsistency in the response length depending on whether the response is
generated using `ujson` or the built-in `json` module provided by python.

To maintain consistency, this commit overrides the default behavior
of the `dumps` method provided by the `json` module to add a `separators`
argument that strips the whitespace around these characters, matching
the default behavior of `ujson`

This addresses the issue referenced in #1398

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-11-07 21:38:32 +05:30
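The mismatch described above comes down to separator padding: the stdlib `json` module inserts spaces after `:` and `,` by default while `ujson` does not, so the serialized bodies (and thus Content-Length) differ. Passing an explicit `separators` argument, as the fix does, makes the two byte-identical.

```
import json

payload = {"hello": "world", "count": 1}

# Default stdlib formatting pads the separators:
print(json.dumps(payload))                         # {"hello": "world", "count": 1}

# ujson-style compact output, which the fix standardises on:
print(json.dumps(payload, separators=(",", ":")))  # {"hello":"world","count":1}
```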
Harsha Narayana
92b73a6f4f fix Range header handling for static files (#1402)
This commit fixes the issue in the `Range` header handling that was done
while serving the file contents.

As per the HTTP standard, a status code of 206 is used when the Range
header results in partial content being returned, and the default of
200 in other cases.

Signed-off-by: Harsha Narayana <harsha2k4@gmail.com>
2018-11-07 07:36:56 -06:00
Meng Wang
b63c06c75a fix the logger and make it work (#1397)
* fix the logger and make it work

* modify test_logging parameters and add a new unit test
2018-11-06 08:39:38 -06:00
jacob
3e3bce422e Add test for sanic.root logger and update the docs of logging 2018-11-06 21:27:01 +08:00
Stephen Sadowski
e3a27c2cc4 Merge pull request #1391 from AndresSan6/loop_exception
Handle "loop" exception in app.py
2018-11-05 08:19:01 -06:00
Stephen Sadowski
f13f451084 Merge pull request #1385 from lixxu/master
update doc for latest blueprint code
2018-11-05 07:40:12 -06:00
Stephen Sadowski
df0e3de911 Merge pull request #1393 from ashleysommer/pickleable-app-blueprint
Fix pickling blueprints Fixes #1392
2018-11-05 07:24:15 -06:00
Ashley Sommer
8466be8728 Fix typo pikcle->pickle in multiprocessing test 2018-11-04 15:27:25 +10:00
Ashley Sommer
5cf2144b3f Fix pickling blueprints
Change the string passed in the "name" section of the namedtuples in Blueprint to match the Blueprint module attribute name.
This allows blueprints to be pickled and unpickled without errors, which is a requirement for running Sanic in multiprocessing mode on Windows.
Added a test for pickling and unpickling blueprints
Added a test for pickling and unpickling sanic itself
Added a test for enabling multiprocessing on an app with a blueprint (only useful to catch this bug if the tests are run on Windows).
2018-11-04 15:04:12 +10:00
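A quick sketch of the round-trip the fix above makes possible: a blueprint (routes included) can be pickled and restored, which Windows multiprocessing needs when spawning workers. The blueprint and route names are illustrative.

```
import pickle

from sanic import Blueprint, Sanic
from sanic.response import text

bp = Blueprint("api", url_prefix="/api")

@bp.route("/ping")
async def ping(request):
    return text("pong")

app = Sanic(__name__)
app.blueprint(bp)

# With the namedtuple names matching their module attributes, this
# pickle round-trip no longer raises.
restored = pickle.loads(pickle.dumps(bp))
assert restored.name == "api"
```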
Andres Sanchez
7c182f63c8 Indentation fix 2018-11-01 10:59:45 -06:00
Andres Sanchez
056180782c Removed unnecessary changes to request and router files, changes to fix lint test 2018-11-01 10:53:53 -06:00
Andres Sanchez
ff0d5870e9 Merge branch 'lintfix' into loop_exception
Made unnecessary changes in the request and router files, went back to the previous commit and made the correct changes to fix lint
2018-11-01 10:40:47 -06:00
Andres Sanchez
b70176f8c7 Fixed character limit per line in requested changes for app.py 2018-11-01 10:36:34 -06:00
Andres Sanchez
e3655b525d Modifications to request and router files to fix linting issues. 2018-11-01 10:04:40 -06:00
Andres Sanchez
e63d0091af Assert was changed to an if and error messages were updated 2018-10-31 15:23:29 -06:00
Andres Sanchez
7b0af2d80d Handle loop exception in app.py 2018-10-31 13:35:03 -06:00
Stephen Sadowski
7d79a86d4d Merge pull request #1387 from huge-success/docbuild
Resolve build of latex documentation relating to markdown lists
2018-10-30 16:13:29 -05:00
Adam Hopkins
ba46aff069 Resolve build of latex documentation relating to markdown lists 2018-10-30 22:39:17 +02:00
lixxu
7a65471ba5 update doc for latest blueprint code 2018-10-29 16:54:34 +08:00
Stephen Sadowski
c7c46da975 Merge pull request #1383 from huge-success/docbuild
Fix documentation build errors
2018-10-26 08:19:10 -05:00
Adam Hopkins
c708e8425f Fix documentation build errors 2018-10-26 11:57:28 +03:00
Eli Uriegas
905c51bef0 Merge pull request #1371 from yunstanford/integrate-isort
codestyle: Integrate isort
2018-10-23 16:05:36 -07:00
Eli Uriegas
bd87098b7e Merge pull request #1368 from yunstanford/fix-redirect
Add '%' to quote_plus's `safe` parameter in response.redirect
2018-10-23 15:12:02 -07:00
Eli Uriegas
5f486cc25f Merge pull request #1378 from hramezani/fix_some_lint_error
Fix some test files lint errors.
2018-10-23 15:10:15 -07:00
Eli Uriegas
f79fb72a33 Merge pull request #1377 from yunstanford/fix-readthedoc-build
Fix readthedoc build
2018-10-23 15:07:25 -07:00
Yun Xu
0505aa2dda refactor import 2018-10-23 14:53:39 -07:00
Hasan Ramezani
485ff32e42 Fix all test files lint errors. 2018-10-23 11:04:17 +02:00
Stephen Sadowski
5ead67972f Merge pull request #1375 from sjsadowski/master
Added documentation for AF_INET6 and AF_UNIX socket usage
2018-10-21 15:28:40 -05:00
Yun Xu
9c860dbff3 fix readthedoc build 2018-10-21 01:56:48 -07:00
Stephen Sadowski
a20ad99638 Added documentation for AF_INET6 and AF_UNIX socket usage 2018-10-19 13:33:01 -05:00
Yun Xu
8ef7bf8e7b integrate with isort 2018-10-17 21:20:16 -07:00
7
0d5be1969a Merge pull request #31 from huge-success/master
Merge Upstream master branch
2018-10-17 21:02:44 -07:00
Adam Hopkins
d06ea9bfc3 Merge pull request #1370 from huge-success/ahopkins-patch-1
Update issue templates
2018-10-17 09:47:22 +03:00
Adam Hopkins
57e79882e1 Update issue templates 2018-10-16 15:42:52 +03:00
Yun Xu
20d1ab60c7 remove unused json import 2018-10-15 22:13:42 -07:00
Yun Xu
277c2ce2d2 fix redirect with quoted param 2018-10-15 21:53:11 -07:00
7
34e51f01d1 Merge pull request #30 from huge-success/master
Merge Upstream Master Branch
2018-10-15 20:04:57 -07:00
7
f4b4e3a58c Merge pull request #1366 from hramezani/lint_test_blueprints
Fix some lint errors and warnings in `tests/test_blueprints.py`
2018-10-14 21:02:48 -07:00
7
def2e033c8 Merge pull request #1365 from yunstanford/codestyle-black
Codestyle black
2018-10-14 10:07:09 -07:00
Hasan Ramezani
dfec18278b Fix some lint errors and warnings in tests/test_blueprints.py. 2018-10-14 16:09:47 +02:00
Yun Xu
cd5bdecda3 add codestyle badge in README 2018-10-13 18:33:02 -07:00
Yun Xu
9b6217ba41 fix travisci 2018-10-13 18:19:08 -07:00
Yun Xu
272f6e195d added black for lint check 2018-10-13 18:10:43 -07:00
Yun Xu
aa9bf04dfe run black against sanic module 2018-10-13 17:55:33 -07:00
7
9ae6dfb6d2 Merge pull request #29 from huge-success/master
merge upstream master branch
2018-10-13 17:28:32 -07:00
7
619bb79a2f Merge pull request #1336 from untitaker/logging-refactor
Try not to stringify exception in logging messages
2018-10-13 16:54:57 -07:00
7
0cad831eca Merge pull request #1364 from yunstanford/raise-exception-when-param-conflicts
Raise exception when param conflicts
2018-10-13 16:28:59 -07:00
Yun Xu
f15a7fb588 fix flake8 2018-10-12 23:06:43 -07:00
Yun Xu
1bdf9ca057 add py37 in setup.py 2018-10-12 22:58:49 -07:00
Yun Xu
c8c370b784 raise exception when param conflicts in route 2018-10-12 22:57:56 -07:00
7
63182f55f7 Merge pull request #28 from huge-success/master
Merge upstream master branch
2018-10-12 22:38:37 -07:00
Stephen Sadowski
41759248e2 Merge pull request #1361 from yunstanford/cancel-request-when-connection-lost
Cancel request when connection lost
2018-10-12 07:25:10 -05:00
Yun Xu
3149d5a66d add unit test for request_stream 2018-10-11 23:12:33 -07:00
Yun Xu
8b13597099 add unit tests for verifying 2018-10-11 23:02:21 -07:00
Yun Xu
36032cc26e cancel task when connection_lost 2018-10-11 22:38:26 -07:00
7
4cb107aedc Merge pull request #27 from huge-success/master
Merge upstream master branch
2018-10-11 22:34:09 -07:00
7
176f8d1981 Merge pull request #1358 from hramezani/fix_config_tests
Change the config test to remove `NamedTemporaryFile`
2018-10-11 21:39:48 -07:00
Hasan Ramezani
9a26030bd5 Change the config test to remove NamedTemporaryFile 2018-10-11 17:34:46 +02:00
Stephen Sadowski
6778f4d9e0 Merge pull request #1342 from hramezani/load_config_file_syntax_error
Handle syntax error in load config file.
2018-10-11 08:56:48 -05:00
Stephen Sadowski
fd61b9e3e2 Merge pull request #1327 from hatarist/fix-1323
Rename the `http` module to `helpers`
2018-10-11 07:56:51 -05:00
Stephen Sadowski
298d5cdf24 Merge pull request #1334 from chenjr0719/master
Fix TypeError when use Blueprint.group() to group blueprint with defa…
2018-10-11 07:28:10 -05:00
7
1bf1c9d006 Merge pull request #26 from huge-success/master
Merge upstream master branch
2018-10-10 20:33:57 -07:00
7
7dc62be5cf Merge pull request #1335 from abuckenheimer/fix_windows_unittests
unittests passing on windows again
2018-10-10 20:15:35 -07:00
jacob
be580a6a5b Clean up files created by pytest-html 2018-10-11 10:06:05 +08:00
7
8ce519668b Merge pull request #1353 from abn/fix-unhandled-exception
Simplify request ip and port retrieval logic
2018-10-09 23:33:51 -07:00
jacob
801258c46a Merge branch 'master' of github.com:chenjr0719/sanic 2018-10-10 14:04:45 +08:00
jacob
32a1db3622 Remove normpath 2018-10-10 14:04:21 +08:00
7
ed1f3daacc Merge pull request #1352 from devArtoria/patch-1
Fix missing quotes in decorator example
2018-10-08 21:57:34 -07:00
Alec Buckenheimer
b7d74c82ba simplified aiohttp version diffs, reverted worker import policy 2018-10-08 22:48:21 -04:00
Arun Babu Neelicattu
c3b31a6fb0 Simplify request ip and port retrieval logic
This change also ensures that cases where the transport stream is
already closed are handled gracefully.
2018-10-08 21:25:47 +02:00
Hasan Ramezani
f4c55bbc07 Handle config error in load config file. 2018-10-08 19:17:06 +02:00
Lewis
a16842f7bc Fix missing quotes in decorator example 2018-10-08 18:59:15 +09:00
7
439a38664f Merge pull request #25 from huge-success/master
Merge upstream master branch
2018-10-07 20:32:52 -07:00
7
5cc12fd945 Merge pull request #1348 from hramezani/add_config_test
Add test for `config.from_object`.
2018-10-07 19:53:58 -07:00
7
fe116fff5a Merge pull request #1350 from hramezani/config_documentation
Add missed documentation for config section.
2018-10-07 13:58:06 -07:00
Stephen Sadowski
06aaaf4727 Merge pull request #1351 from yunstanford/integrate-with-codecov
Integrate with codecov
2018-10-07 10:13:31 -05:00
Yun Xu
6deb9b49b2 correct Codecov badge url 2018-10-06 21:39:04 -07:00
Yun Xu
d59e92d3e5 integrate with codecov 2018-10-06 21:31:04 -07:00
7
cc83c1f0cf Merge pull request #24 from huge-success/master
merge upstream master branch
2018-10-06 21:22:54 -07:00
Hasan Ramezani
1fe7306af8 Add missed documentation for config section. 2018-10-07 01:32:36 +02:00
Hasan Ramezani
c796d73fc3 Add test for config.from_object. 2018-10-07 00:14:37 +02:00
Markus Unterwaditzer
eb93f884f3 fix: Missing import 2018-10-05 16:47:12 +02:00
Markus Unterwaditzer
3673feb256 fix: typo 2018-10-05 16:33:46 +02:00
Markus Unterwaditzer
7c9c783e9d deprecate Handler.log 2018-10-05 16:31:01 +02:00
Stephen Sadowski
74a4b9efaa Merge pull request #1345 from huge-success/ahopkins-patch-1
Update README.rst
2018-10-04 18:45:47 -05:00
Stephen Sadowski
4466e8cce1 Merge pull request #1304 from ignatenkobrain/fedora
Switch to websockets 6.0
2018-10-04 18:45:22 -05:00
Adam Hopkins
b689037984 Update README.rst 2018-10-04 12:31:57 +03:00
Stephen Sadowski
db1ba21d88 Merge pull request #1343 from vltr/httptools_pinned
pinned httptools requirement to version 0.0.10+
2018-10-03 19:27:25 -05:00
Eli Uriegas
50d270ef7c Merge pull request #1316 from sjsadowski/master
Updated changelog.md for 0.8.x
2018-10-03 15:19:21 -07:00
Richard Kuesters
d1a578b555 pinned httptools requirement to version 0.0.10+ 2018-10-03 12:22:29 -03:00
Stephen Sadowski
76e9859cf8 Merge branch 'master' into master 2018-10-03 09:56:29 -05:00
Stephen Sadowski
add9d363c5 Merge branch 'master' into logging-refactor 2018-10-03 09:55:01 -05:00
Stephen Sadowski
1498baab0f Merge pull request #1338 from hramezani/improve_config_test
Check error message and fix some lint error in test config.
2018-10-03 09:18:46 -05:00
Stephen Sadowski
df7f63d45d Merge branch 'master' into improve_config_test 2018-10-03 06:30:44 -05:00
Stephen Sadowski
f7425126a1 Merge pull request #1341 from ashleysommer/unnecessary_code
Fixes #1340
2018-10-03 06:30:22 -05:00
Ashley Sommer
790047e450 Fixes #1340 2018-10-03 10:59:24 +10:00
Stephen Sadowski
9198b5b0be Merge branch 'master' into improve_config_test 2018-10-02 13:21:23 -05:00
Stephen Sadowski
d534acb79d Merge branch 'master' into logging-refactor 2018-10-01 15:41:07 -05:00
Hasan Ramezani
d100f54551 Check error message and fix some lint error in test config. 2018-10-01 20:36:21 +02:00
Stephen Sadowski
7a9e100b0f Merge branch 'master' into fix_windows_unittests 2018-10-01 10:10:48 -05:00
Stephen Sadowski
fafe23d7c2 Merge pull request #1337 from cmcaine/fix-error-msg
Fix whitespace in error message
2018-10-01 09:31:45 -05:00
Alec Buckenheimer
9a08bdae4a fix flake8 linelength errors 2018-10-01 09:46:18 -04:00
Colin Caine
bcc11fa7fe Fix whitespace in error message 2018-09-30 09:36:55 +01:00
Markus Unterwaditzer
7d0c0fdf7c fix: Namespacing of sanic logger 2018-09-29 22:40:05 +02:00
Markus Unterwaditzer
0e33d46ead Try not to stringify exception in logging messages
This just fixes the worst offenders that trip up error reporting tools
like Sentry.io
2018-09-29 22:32:51 +02:00
Alec Buckenheimer
efbacc17cf unittests passing on windows again 2018-09-29 13:54:47 -04:00
jacob
bd6dbd9090 Fix TypeError when using Blueprint.group() to group blueprints with a default url_prefix; use os.path.normpath to avoid invalid url_prefix values like api//v1 2018-09-29 18:23:16 +08:00
Eli Uriegas
076cf51fb2 Merge pull request #1305 from Stranger6667/app-fixture
Reuse app fixture in tests
2018-09-26 18:30:46 -07:00
Igor Hatarist
f8a6af1e28 Rename the http module to helpers to prevent conflicts with the built-in Python http library (fixes #1323) 2018-09-25 20:46:40 +03:00
Stephen Sadowski
96912f436d Corrected Raphael Deem's name in changelog - sorry @r0fls! 2018-09-24 09:05:58 -05:00
Raphael Deem
f0e162442f Merge branch 'master' into app-fixture 2018-09-21 15:16:00 -07:00
Eli Uriegas
04b8dd989f Merge pull request #1315 from seemethere/multidocs
Add multidict to readthedocs environment.yml
2018-09-15 19:03:56 +02:00
Stephen Sadowski
5851c8bd91 revised formatting for CHANGELOG.md 2018-09-14 13:30:57 -05:00
Stephen Sadowski
78efcf93f8 Updated changelog for all accepted PRs from 0.7.0 to Current 2018-09-14 10:56:32 -05:00
Eli Uriegas
bb35bc3898 Add multidict to readthedocs environment.yml
Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2018-09-14 16:00:29 +02:00
Stephen Sadowski
f38783bdef Merge pull request #1 from huge-success/master
Merge from head
2018-09-14 08:20:37 -05:00
Channel Cat
d8f9986089 Re-releasing with updated credentials 2018-09-13 02:24:31 -07:00
Channel Cat
3e616b599a update encrypted creds for new org 2018-09-13 02:17:27 -07:00
Channel Cat
d38fc17191 Update version to test pypi 2018-09-13 01:50:32 -07:00
Channel Cat
7ae0eb0dc3 Transfer ownership 2018-09-13 01:39:24 -07:00
Channel Cat
9082eb56a7 Update version to circumvent pypi upload errors 2018-09-06 13:51:31 -07:00
Igor Gnatenko
c578974246 Switch to websockets 6.0
Signed-off-by: Igor Gnatenko <i.gnatenko.brain@gmail.com>
2018-09-02 09:23:30 +02:00
dmitry.dygalo
fec81ffe73 Reuse app fixture in tests 2018-08-26 16:43:14 +02:00
Ashley Sommer
30e6a310f1 Pausable response streams (#1179)
* This commit adds handlers for the asyncio/uvloop protocol callbacks for pause_writing and resume_writing.
These are needed for the correct functioning of built-in tcp flow-control provided by uvloop and asyncio.
This is somewhat of a breaking change, because the `write` function in user streaming callbacks now must be `await`ed.
This is necessary because the http protocol may now be paused, and any call to write may need to wait on an async event before it becomes unpaused.

Updated examples and tests to reflect this change.

This change does not apply to websocket connections. A change to websocket connections may be required to match this change.

* Fix a couple of PEP8 errors caused by previous rebase.

* update docs

add await syntax to response.write in response-streaming docs.

* remove commented out code from a test file
2018-08-18 18:12:13 -07:00
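The breaking change described above means streaming callbacks must now await `response.write`; a minimal sketch of the post-change handler shape (route and payload are illustrative):

```
from sanic import Sanic
from sanic.response import stream

app = Sanic(__name__)

@app.route("/csv")
async def csv_stream(request):
    async def sample_streaming_fn(response):
        # write() is awaited so the protocol can pause the handler
        # when the transport applies back-pressure.
        await response.write("foo,")
        await response.write("bar")

    return stream(sample_streaming_fn, content_type="text/csv")
```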
Eli Uriegas
a87934d434 Merge pull request #1292 from seemethere/increment_080
Increment to 0.8.0
2018-08-17 11:52:47 -07:00
Eli Uriegas
b398c1fe72 Increment to 0.8.0
Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2018-08-17 11:43:15 -07:00
Eli Uriegas
6f813f940e Merge pull request #1278 from ashleysommer/graceful_cancel
Gracefully handle when the request_handler_task is cancelled.
2018-08-17 11:41:39 -07:00
Eli Uriegas
d52498b787 Merge pull request #1284 from ashleysommer/aiohttp_update
Fix broken tests when aiohttp >= 3.3.0
2018-08-17 11:40:49 -07:00
Ashley Sommer
79e35bbdf6 Fix auto_reload in Linux (#1286)
* Fix two problems with the auto_reloader in Linux.
1) Change 'posix' to 'linux' in the sys.platform check, because 'posix' is an invalid value and 'linux' is the correct value to use here.
2) In kill_process_children, don't just kill the 2nd level procs, also kill the 1st level procs.
   Also in kill_process_children, catch and ignore errors in the case that the child proc is already killed.

* Fix flake8 formatting on PR
2018-08-16 23:30:03 -07:00
Innokenty Lebedev
1814ff05f4 Add sse extension (#1288) 2018-08-16 11:59:58 -07:00
Ashley Sommer
ec226e33cb Pin aiohttp <= 3.2.1 in requirements-dev.txt (fixes errors for new contributors checking out the code and setting up a dev environment)
Future-proof some test cases so they work with aiohttp >= 3.3.0, in case we bump the aiohttp version in the future.
2018-08-16 15:00:23 +10:00
hqy
6abdf9f9c1 fixed #1143 (#1276)
* fixed #1143

* fixed build failure where the create_server call to _helper failed
2018-08-15 10:23:04 -07:00
abuckenheimer
212da1029e disabled auto_reload by default in windows (#1280) 2018-08-07 11:48:18 -07:00
Ashley Sommer
afea15e4a7 Add a test for the graceful CancelledError handling. The user app should _never_ see a CancelledError bubble up, nor should they be able to catch it, because the response is already sent at that point. 2018-08-06 15:02:12 +10:00
Ashley Sommer
39ff02b6e4 Modifications to the handle_request function to detect and gracefully handle the case where the request_handler Task is cancelled by the sanic server while it is handling the request. One common occurrence of this is when the server issues a ResponseTimeout error and also cancels the response_handler Task.
The Canceled exception handler purposely sets `response` to `None` to drop references to the handler coroutine, in an attempt to preemptively release resources.
This commit also fixes a possible reference-before-assignment of the `response` variable in the `handle_request` function.
Finally, another byproduct of this change is that ResponseMiddleware will no longer run if the `response` is `None`.
2018-08-06 14:12:30 +10:00
Cosmo Borsky
b238be54a4 Add content_type flag to Sanic.static (#1267)
* Add content_type flag to Sanic.static

Fixes #1266

* Fix flake8 error in travis

Add line to document `content_type` arg

* Fix content_type for file streams

Update tests

herp derp

* Remove content_type as an arg to HTTPResponse

`response.HTTPResponse` will default to `headers['Content-Type']` instead of `content_type`
https://github.com/channelcat/sanic/pull/1267#discussion_r204190913
2018-07-20 22:31:15 -07:00
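A short sketch of the `content_type` flag added in #1267 above; the file path and MIME type here are illustrative.

```
from sanic import Sanic

app = Sanic(__name__)

# Without content_type the type is guessed from the file extension.
app.static(
    "/versions.json",
    "/srv/data/versions.json",
    content_type="application/json; charset=utf-8",
)
```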
Cosmo Borsky
377c9890a3 Support status code for file response (#1269)
Fixes #1268
2018-07-20 13:39:10 -07:00
ciscorn
599834b0e1 Add subprotocols param to add_websocket_route (#1261) 2018-07-16 12:20:26 -07:00
John Doe
a39a7ca9d5 Add url_bytes to Request (#1258)
We need to have access to the raw unparsed URL.
2018-07-16 12:13:27 -07:00
Ave
cd22745e6b Sanitize the URL before redirecting (#1260)
* URL Quote the URL before redirecting

* Use safe url instead of unsafe one

* Fix query params

* fix build

* Whitelist all reserved characters from rfc3986

* Add tests for redirect url sanitizing

* Remove check for resulting URL on header injection test

The thing the tests are testing for can be implemented in other
ways that don't redirect to 100% the same address, but they'll all have
to match the remaining parts of the test to succeed.
2018-07-12 21:31:33 -07:00
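A sketch of the behaviour #1260 above gives `redirect()`: the target is percent-quoted (keeping RFC 3986 reserved characters) before being written to the Location header, so stray characters such as spaces cannot break the header. The routes are illustrative.

```
from sanic import Sanic
from sanic.response import redirect

app = Sanic(__name__)

@app.route("/old")
async def old(request):
    # The space below is emitted as %20 in the Location header rather
    # than splitting it.
    return redirect("/new path?q=hello world")
```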
7
334649dfd4 Fix response ci header (#1244)
* add unit tests, which should fail

* fix CIDict

* moving CIDict to avoid circular imports

* fix unit tests

* use multidict for headers

* fix cookie

* add version constraint for multidict

* omit test coverage for __main__.py

* make flake8 happy

* consolidate check in for loop

* travisci retry build
2018-07-11 01:44:21 -07:00
fanjindong
becbc5f9ef fix one example and add one example (#1257) 2018-07-11 01:42:34 -07:00
7
a7dd73c657 Merge pull request #23 from channelcat/master
py37 (#1256)
2018-07-03 22:12:02 -07:00
7
f9b29fd7e7 py37 (#1256)
* add py37 to travisci

* use dist:xenial for py37

* sudo: true in .travici

* bump websockets version for py37 support and fix unit tests
2018-07-03 22:07:08 -07:00
7
f770e16f6d Merge pull request #22 from channelcat/master
merge upstream master branch
2018-06-26 23:33:35 -07:00
Arnulfo Solís
9092ee9f0e HTTP Entity Headers (#1127)
* introduced basic entity and hopbyhop header identification

* removed entity headers

* coding style fixes

* remove unneeded header check

* moved from bytes to unicode in headers

* changed list to tuple in empty response statuses
2018-06-26 22:25:25 -07:00
GaryO
01257f65a6 Make auto reloader work on Mac (#1249) 2018-06-18 15:16:10 -07:00
7
c1222175b3 Merge pull request #21 from channelcat/master
remote tracking
2018-06-10 20:17:27 -07:00
Volodymyr Maksymiv
5ff481952d add UUID support (#1241) 2018-06-09 01:16:17 -07:00
7
baa689ad43 Fix failed build and add websockets version specifier (#1239)
* add websockets version constraint

* fix failed build
2018-06-07 10:07:26 -07:00
Philip Xu
2f30f4f69f Fixed #1231 - release resource no matter what (#1232) 2018-06-06 14:43:57 -07:00
Raphael Deem
202a4c6525 make request truthy if has transport (#1222) 2018-05-16 14:12:12 -07:00
7
7928b9b3a2 Merge pull request #20 from channelcat/master
merge upstream master branch
2018-04-29 21:50:07 -07:00
Adam Hopkins
e1c9020268 Update extensions.md (#1205)
Changing the description of [Sanic JWT](https://github.com/ahopkins/sanic-jwt) to include permission scoping
2018-04-29 18:41:17 -07:00
Philip Xu
04a12b436e Added Sanic-Auth, Sanic-CookieSession and Sanic-WTF to Extensions doc (#1210) 2018-04-29 18:40:18 -07:00
Fantix King
818a8c2196 Added GINO to Extensions doc (#1200) 2018-04-21 21:02:49 -07:00
Arnulfo Solís
b6715464fd added init docs (#1167) 2018-04-01 20:53:08 -07:00
Raphael Deem
8f2d543d9f default to auto_reload in debug mode (#1159)
* default to auto_reload in debug mode

* disable auto-reload in testing client
2018-04-01 20:52:56 -07:00
Raphael Deem
6cf320bedb Merge pull request #1181 from kot83/patch-1
rename function in examples to post_json
2018-03-29 20:13:48 -07:00
kot83
a850ce5086 rename function to something else
function already defined
2018-03-29 15:57:10 -07:00
Raphael Deem
ef3bdf5408 Merge pull request #1180 from ashleysommer/fix_aiohttp_breakages
Fix failing tests when aiohttp>=3.1.0
2018-03-29 01:05:50 -07:00
Ashley Sommer
94b9bc7950 Some of the tests in Sanic (test_request_timout, test_response_timeout, test_keep_alive_timeout) use a custom SanicClient with modified methods. This relies on overriding internal aiohttp Client classes.
In aiohttp 3.1.0 there were some breaking changes that caused the custom methods to no longer be compatible with the latest upstream aiohttp Client class.
See: 903073283f
and: b42e0ced46

This commit adds aiohttp version checks to adapt to these changes.
2018-03-29 11:54:59 +10:00
Raphael Deem
8a07463a67 Merge pull request #1175 from PyManiacGR/patch-1
Fix try_everything example.
2018-03-28 00:41:07 -07:00
PyManiac
2995b23929 Update try_everything.py 2018-03-24 15:55:15 +02:00
TheRubyDoggy
eb4276373b Fix try_everything example. 2018-03-24 15:34:41 +02:00
Raphael Deem
79df52e519 Merge pull request #1169 from charlax/patch-1
Clarify arguments to request/response middleware
2018-03-21 10:46:09 -07:00
Charles-Axel Dein
3dfb31b1b9 Clarify arguments to request/response middleware 2018-03-21 12:07:26 +01:00
Raphael Deem
c4c4ed70d9 Merge pull request #1163 from vopankov/master
Add __weakref__ to Request slots
2018-03-17 14:52:25 -07:00
Raphael Deem
45422df1b7 Merge pull request #1162 from yunstanford/fix-hang-build
Fix hang build and failed builds
2018-03-16 11:14:44 -07:00
Yun Xu
e0b7624414 fix hang build 2018-03-15 22:06:58 -07:00
Yun Xu
b0ecb3170f fix hang build 2018-03-15 22:03:36 -07:00
Yun Xu
fc8b5f378a migrate to trusty 2018-03-15 21:39:21 -07:00
Yun Xu
d42cb7ddb3 fix hang build 2018-03-15 21:28:52 -07:00
Панков Василий
6454ac0944 Add __weakref__ to Request slots 2018-03-14 13:37:15 +03:00
7
31cf83f10b Merge pull request #19 from channelcat/master
merge upstream master branch
2018-03-13 22:11:40 -07:00
Raphael Deem
cc84005593 Merge pull request #1157 from kinware/feature/add-route-streams
Allow streaming handlers in app.add_route()
2018-03-13 00:08:25 -07:00
Kinware
915d2732a1 Allow streaming handlers in add_route 2018-03-12 20:21:59 +01:00
Raphael Deem
44bc47361e Merge pull request #1149 from channelcat/travis-retry
use travis_retry on tox
2018-03-06 15:54:19 -08:00
Raphael Deem
3619b07843 Merge pull request #1146 from yunstanford/upgrade-test-client
Upgrade test client
2018-03-01 23:18:20 -08:00
Raphael Deem
ad3f588c79 use travis_retry on tox 2018-03-01 23:16:49 -08:00
Yun Xu
a2fc37121b migrating all to async syntax 2018-03-01 22:35:58 -08:00
Raphael Deem
7f36d20123 Merge pull request #1145 from yingshaoxo/patch-1
add a necessary import for better understanding
2018-02-28 01:19:29 -08:00
Yun Xu
d1a8e8b042 fixed unit tests 2018-02-27 22:25:38 -08:00
Yun Xu
c39ddd00d3 workaround fix for an issue in aiohttp.Client 2018-02-27 21:42:41 -08:00
Yun Xu
d55e453bd5 cleaning up 2018-02-27 20:26:49 -08:00
Raphael Deem
bffed27bdb Merge pull request #1142 from clarksun/patch-1
exception.md code sample missing 'async' prefix
2018-02-27 01:03:47 -08:00
7
fffcb158f1 Merge pull request #18 from channelcat/master
Merge upstream master branch
2018-02-26 22:19:30 -08:00
Yun Xu
eca98a54eb fixed all unit tests 2018-02-26 22:18:21 -08:00
Yun Xu
46ed2c5270 upgrade aiohttp for test_client 2018-02-26 22:08:05 -08:00
Sun Wei
23ea0b7ec9 exception.md code sample missing 'async' prefix 2018-02-26 16:09:26 +08:00
yingshaoxo
ef26cb283b add a necessary import for better understanding
add `from sanic.response import redirect`
2018-02-26 11:24:54 +08:00
Eli Uriegas
b8bb77eff6 Merge pull request #1137 from Julien00859/1136
sanic.handlers.ErrorHandler.response handler call was too restrictive
2018-02-21 09:55:52 -06:00
Julien00859
9c75ad3de1 close #1136 2018-02-21 00:50:27 +01:00
Eli Uriegas
0b38dea613 Merge pull request #1117 from abuckenheimer/env_dependent_ujson_uvloop
only install ujson and uvloop with cpython on non windows machines
2018-02-20 13:13:21 -06:00
Raphael Deem
7e4a9e3bc2 Merge pull request #1047 from Yaser-Amiri/master
Add auto reloading.
2018-02-16 11:11:49 -08:00
Raphael Deem
36f12c822f Merge pull request #1122 from knowsuchagency/master
add app.register_listener method
2018-02-15 16:58:27 -08:00
Raphael Deem
0cbea0f5d3 Merge pull request #1129 from cloudship/patch-1
raw requires a bytes-like object
2018-02-14 20:44:40 -08:00
panxb
e735fe54c3 raw requires a bytes-like object
raw requires a bytes-like object, or an object that implements __bytes__, not 'str'
2018-02-15 00:11:37 +08:00
Stephan Fitzpatrick
e911e2e1df updated doc 2018-02-13 23:58:03 -08:00
Stephan Fitzpatrick
1d75f6c2be changed docstring spacing 2018-02-13 10:15:16 -08:00
Raphael Deem
ad8a168469 Merge pull request #1121 from tandalf/issue-1120
Fixed bug when passing a list into route decorator's host argument #1120
2018-02-12 12:48:13 -08:00
Raphael Deem
74fc502089 Merge pull request #1124 from yunstanford/add-doc
Expose WebSocket Param and Add Doc
2018-02-11 02:02:13 -08:00
Yun Xu
dfc2166d8b add websocket.rst to index.rst 2018-02-10 12:21:23 -08:00
Yun Xu
2b70346db4 fix doc 2018-02-09 21:32:09 -08:00
Yun Xu
090df6f224 add websocket section in doc 2018-02-09 21:26:39 -08:00
Yun Xu
745a1d6e94 document websocket args 2018-02-09 21:03:21 -08:00
Yun Xu
0fe0796870 expose websocket protocol arguments 2018-02-09 20:44:02 -08:00
7
224b56bd3a Merge pull request #17 from channelcat/master
merge from upstream sanic
2018-02-09 20:12:29 -08:00
Stephan Fitzpatrick
571b5b544d added app.register_listener method w/test 2018-02-09 14:01:17 -08:00
Timothy Ebiuwhe
220b40f7f4 Added regression tests for issue #1120 2018-02-09 22:33:34 +01:00
Timothy Ebiuwhe
60774c5a49 Fixed bug that occurs on calling @app.route or any of its variants
Setting strict_slashes to false when a route does not end with a slash
causes the route to be added twice, one without the slash, the other with the
slash. This is ok if the Router._add method runs linearly, but problematic
when it runs recursively. Unfortunately recursion is triggered when
the host param to the Router._add function is a list of hosts.
2018-02-09 22:27:20 +01:00
Raphael Deem
6d37ef7256 Merge pull request #1109 from DirkGuijt/master
fixed bug in multipart/form-data parser
2018-02-08 00:11:20 -08:00
Dirk Guijt
e083224df1 changed newline formatting 2018-02-07 09:29:44 +01:00
Alec Buckenheimer
5ef567405f fixed platform from windows to win32 2018-02-06 19:56:25 -05:00
Raphael Deem
ea2521f430 Merge pull request #1112 from boboldehampsink/extend_websocketprotocol_arguments
Extend WebSocketProtocol arguments
2018-02-06 15:05:58 -08:00
Alec Buckenheimer
82cb182fe7 added pip requirement to only install ujson and uvloop with cpython on non windows machines 2018-02-06 09:57:16 -05:00
Raphael Deem
3fe31ff551 Merge pull request #1104 from arnulfojr/minor/keep-alive-timeout-log-level
KeepAlive Timeout log level change to debug
2018-02-02 18:54:24 -08:00
Dirk Guijt
48d45f1ca4 sorry, style issue again 2018-02-03 03:14:04 +01:00
Dirk Guijt
ddf2a604d1 changed 'file' variable to 'form_file' to prevent overwriting the reserved word 2018-02-03 03:07:07 +01:00
Raphael Deem
8b920d9d56 Merge pull request #1113 from arnulfojr/bugfix/content-length-header-on-X04
Content Length header on 204 and 304 responses
2018-02-02 13:18:08 -08:00
Arnulfo Solis
f2c0489452 replaced comparison for in operator 2018-02-02 20:19:15 +01:00
Arnulfo Solis
f5a2d19199 touch commit 2018-02-02 14:13:14 +01:00
Arnulfo Solis
86fed12d91 less flake8 warnings in response test 2018-02-02 14:05:57 +01:00
Arnulfo Solis
7ca3ad5d4c no body and content length to 0 when 304 response is returned 2018-02-02 13:24:51 +01:00
Dirk Guijt
1eecffce97 fixed minor flake8 style problem 2018-02-02 09:57:06 +01:00
Dirk Guijt
5c341a2b00 made field name mandatory in multipart/form-data headers
A field name in the Content-Disposition header is required by the multipart/form-data spec. If one field/part does not have it, it will be omitted from the request. When this happens, we log it to DEBUG.
2018-02-02 09:43:42 +01:00
Arnulfo Solis
0ab64e9803 simplified logic when handling the body 2018-02-02 09:29:54 +01:00
Dirk Guijt
27108334f1 Merge branch 'master' of https://github.com/DirkGuijt/sanic 2018-02-02 00:55:58 +01:00
Dirk Guijt
788253cbe8 changes based on discussion on PR #1109 2018-02-02 00:55:51 +01:00
Arnulfo Solis
4b6e89a526 added one more test 2018-02-01 20:00:32 +01:00
Arnulfo Solis
68fd1b66b5 Response model now handles the 204 no content 2018-02-01 17:51:51 +01:00
Bob Olde Hampsink
5806666949 Extend WebSocketProtocol arguments to accept all arguments of websockets.protocol.WebSocketCommonProtocol 2018-02-01 16:23:10 +01:00
DirkGuijt
a76d8108fe small code style change
changed double quotes to single quotes to match the coding style
2018-02-01 11:55:30 +01:00
Arnulfo Solis
2135294e2e changed None to return empty string instead of null string 2018-02-01 11:52:55 +01:00
Dirk Guijt
ed1c563d1f fixed bug in multipart/form-data parser
Sanic automatically assumes that a form field is a file if it has a Content-Type header, even when the header is text/plain or application/json. This is a fix for it, taking into account the RFC 7578 specification regarding the defaults.
2018-02-01 11:30:24 +01:00
Raphael Deem
74efa3a108 Merge pull request #1105 from manisenkov/upgrade-status-to-beta
Upgrade development status to beta
2018-01-31 14:36:06 -08:00
manisenkov
49c29e6862 Upgrade development status to beta 2018-01-31 23:25:50 +01:00
Raphael Deem
17d7f24825 Merge pull request #1108 from manisenkov/pin-pytest-version
Pin pytest version to 3.3.2
2018-01-31 14:24:04 -08:00
manisenkov
f23c3da4ff Pin pytest version to 3.3.2 2018-01-31 22:58:48 +01:00
Arnulfo Solis
cabcf50fbe KeepAlive Timeout log level change to debug 2018-01-30 11:26:15 +01:00
Raphael Deem
8b23b5d389 Merge pull request #1101 from SirEdvin/master
Provide information about sanic-oauth extension
2018-01-29 15:00:11 -08:00
SirEdvin
37eb2c1db6 Provide information about sanic-oauth extension 2018-01-27 10:28:53 +02:00
Eli Uriegas
9751a37343 Merge pull request #1098 from shahinism/refactor/docker
Install Python 3.5 and 3.6 on docker container
2018-01-26 14:01:21 -08:00
Eli Uriegas
ed3bdd3443 Merge pull request #1100 from NyanKiyoshi/master
No longer raising a missing parameter when value is null
2018-01-26 13:56:52 -08:00
NyanKiyoshi
285ad9bdc1 No longer raising a missing parameter when value is null
When passing a null value as a parameter (e.g. 0, None or False), Sanic said "Error: Required parameter `param` was not passed to url_for"

Example:

```
@app.route("/<idx>")
def route(rq, idx):
    pass
```

```
url_for("route", idx=0)
```

No longer raises: `Error: Required parameter `idx` was not passed to url_for`
2018-01-26 21:13:43 +01:00
Shahin
16f5914c90 Install Python 3.5 and 3.6 on docker container
To cover all supported versions of Python using Tox
2018-01-24 16:46:45 +03:30
Raphael Deem
72d56a89a2 Merge pull request #1094 from caitinggui/master
update class_based_views
2018-01-23 17:25:59 -08:00
Raphael Deem
1135c8c1b1 Merge pull request #1097 from howie6879/master
Add parameter check
2018-01-23 17:25:25 -08:00
caitinggui
ec4339bd47 update description 2018-01-24 09:02:07 +08:00
howie6879
6c0fbef843 Add parameter check 2018-01-24 08:17:55 +08:00
howie6879
040c85a43b Add parameter check 2018-01-24 08:11:47 +08:00
Raphael Deem
420554c737 Merge pull request #1096 from kyb3r/patch-1
Typo in readme?
2018-01-22 23:56:16 -08:00
howie6879
f20b854dd2 Add parameter check 2018-01-22 14:52:30 +08:00
howie6879
3844cec7a4 Add parameter check 2018-01-22 14:12:41 +08:00
howie.hu
0db49f7520 Merge pull request #4 from channelcat/master
Update
2018-01-22 13:36:48 +08:00
Kyber
f8dedcaa1e Typo in readme? 2018-01-22 15:55:29 +11:00
Yaser Amiri
f8b1122467 Revert "Change parsing cookies mechanism. (like Django instead of http.cookies.SimpleCookie)"
This reverts commit ba1dbacd35.
2018-01-21 09:10:15 +03:30
Raphael Deem
f3bf5e9a5c Merge pull request #1090 from yunstanford/patch-signal-handling
Patch signal handling
2018-01-20 14:03:23 -08:00
Yaser Amiri
ba1dbacd35 Change parsing cookies mechanism. (like Django instead of http.cookies.SimpleCookie) 2018-01-20 12:49:16 +03:30
caitinggui
4036f1c121 update class_based_views 2018-01-19 16:20:07 +08:00
Raphael Deem
22ad697d1f Merge pull request #1078 from eltrhn/master
Add support for blueprint groups and nesting
2018-01-18 17:26:52 -08:00
Eli
a10d7469cd Add blueprint groups + nesting 2018-01-18 17:20:51 -08:00
Raphael Deem
06c3153d22 Merge pull request #1092 from mattfox/patch-1
Add request.method to documentation
2018-01-17 16:10:42 -08:00
Matt Fox
9677158b75 Add request.method to documentation 2018-01-17 07:31:39 -08:00
Raphael Deem
226a73141b Merge pull request #1091 from yunstanford/patch-router-fix
Patch router fix
2018-01-17 00:15:33 -08:00
Raphael Deem
da3201bf35 Merge pull request #1088 from cosven/master
use single quote in readme.rst
2018-01-16 13:34:39 -08:00
Yun Xu
7daebc6aea fix Router.check_dynamic_route_exists 2018-01-15 17:53:37 -08:00
Yun Xu
d9002769cf fix a typo 2018-01-15 17:49:11 -08:00
Yun Xu
6d0b30953a add unit test which should fail on original code 2018-01-15 17:40:44 -08:00
Yun Xu
09d6452475 fixed unit test 2018-01-15 15:15:08 -08:00
Yun Xu
6a61fce84e worker process should ignore SIGINT when run_multiple 2018-01-15 11:53:15 -08:00
Yun Xu
11017902be signal handling 2018-01-15 11:23:49 -08:00
7
bd7333723e Merge pull request #16 from channelcat/master
Merge upstream master branch
2018-01-15 10:48:34 -08:00
cosven
6648250fb9 Merge pull request #1 from cosven/cosven-patch-1
use single quote in readme.rst
2018-01-15 14:56:52 +08:00
cosven
a94a2d46d0 use single quote in readme.rst
As we use single quote in sanic package, we may be supposed to use single quote in readme also?
2018-01-15 14:55:36 +08:00
Raphael Deem
ab97018c78 Merge pull request #1082 from channelcat/1042
fix exception handling
2018-01-13 17:06:46 -08:00
Raphael Deem
be702b0924 Merge pull request #1087 from Stranger6667/fix-get_socket
Fix typo
2018-01-13 17:06:03 -08:00
Dmitry Dygalo
c5c10cfb50 Fix typo 2018-01-13 17:56:29 +01:00
howie.hu
5682d642a6 Merge pull request #3 from channelcat/master
Update
2018-01-10 09:23:43 +08:00
Raphael Deem
62c6d7274c Merge pull request #1083 from bow/fix/log_response_host
Fix log_response to correctly output request ip and port
2018-01-09 13:55:14 -08:00
bow
4f8633375d Fix log_response to correctly output request ip and port 2018-01-09 13:47:01 +01:00
Raphael Deem
9f559818e5 Merge pull request #1081 from howie6879/master
Fix: the Chinese URI
2018-01-08 15:54:46 -08:00
howie6879
5f329f72ee Update test_routes.py 2018-01-08 08:38:54 +08:00
howie6879
7303a06f83 Fix: the Chinese URI 2018-01-07 12:07:18 +08:00
howie6879
e34de96b24 Fix: the Chinese URI 2018-01-07 12:06:21 +08:00
howie6879
42cd424274 Fix: the Chinese URI 2018-01-07 10:59:12 +08:00
howie.hu
9426e94314 Merge pull request #2 from channelcat/master
Update
2018-01-07 09:57:51 +08:00
r0fls
7a1dab3319 fix exception handling 2018-01-05 14:12:22 -08:00
Raphael Deem
d63ec84745 Merge pull request #1037 from youknowone/xdist
Boost test speed by pytest-xdist
2018-01-05 12:10:27 -08:00
Raphael Deem
0e7e2f4e5b Merge pull request #1080 from channelcat/1079
fix timeout bug when self.transport is None
2018-01-04 15:01:40 -08:00
r0fls
46521240a9 fix timeout bug when self.transport is None 2018-01-03 23:33:22 -08:00
Jeong YunWon
a8827a5d95 Boost test speed by pytest-xdist 2018-01-03 18:52:25 +09:00
Raphael Deem
ca0bc1cb7d Merge pull request #1076 from channelcat/1074
fix strict_slashes bug when route has slash
2018-01-01 12:47:48 -08:00
r0fls
8c28ce7d79 fix strict_slashes bug when route has slash 2018-01-01 02:24:48 -08:00
r0fls
5b051f0891 add test 2018-01-01 02:14:55 -08:00
Raphael Deem
c93de9450a Merge pull request #1073 from Kurlov/master
remove uvloop for windows setup
2017-12-29 14:56:38 -08:00
Aleksandr Kurlov
96976fa892 remove uvloop for windows setup 2017-12-29 23:04:22 +05:00
Raphael Deem
c14e99cef0 Merge pull request #1071 from r0fls/1062-docs
update docs with add_task app injection
2017-12-28 02:01:23 -08:00
Raphael Deem
c91a806774 update docs with add_task app injection 2017-12-28 01:59:16 -08:00
Raphael Deem
2f0076f429 Merge pull request #1063 from r0fls/1062
try to inject the app in add_task method
2017-12-27 11:40:05 -08:00
Raphael Deem
a1ffc6d55b try to inject the app in add_task method 2017-12-27 01:06:43 -08:00
Yaser Amiri
9bdf7a9980 Revert files that were fixed for flake problems. 2017-12-26 23:35:54 +03:30
Yaser Amiri
81494453b0 Remove dependency on requests library.
Change the auto reloader environment variable name to SANIC_SERVER_RUNNING
Fix some typos, flake incompatibilities and similar problems.
Raise NotImplementedError for auto reloading on non-POSIX operating systems.
2017-12-26 19:17:13 +03:30
Raphael Deem
008cbe5ce7 Merge pull request #1069 from r0fls/1050
add samesite cookie to cookie keys
2017-12-24 02:50:37 -08:00
Raphael Deem
5ee35e7eeb add samesite cookie to cookie keys 2017-12-24 02:33:52 -08:00
Raphael Deem
1a98e70281 Merge pull request #1066 from r0fls/1065
allow add_task after server starts
2017-12-21 23:44:18 -08:00
Raphael Deem
8e3f3977bd allow add_task after server starts 2017-12-21 23:37:42 -08:00
Raphael Deem
04b04f094c Merge pull request #1064 from r0fls/1061
double quotes in unauthorized exception per rfc7230
2017-12-21 18:59:41 -08:00
Raphael Deem
9c02cdbad9 double quotes in unauthorized exception per rfc7230 2017-12-21 18:05:05 -08:00
Raphael Deem
c30e805623 Merge pull request #1060 from seemethere/fix_readthedocs_builds
Add sphinxcontrib-asyncio to environment.yml
2017-12-20 23:40:04 -08:00
Eli Uriegas
cd81538ce3 Add sphinxcontrib-asyncio to environment.yml
readthedocs builds were broken because they were missing this dependency

Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2017-12-19 22:55:50 -06:00
howie.hu
68cb280513 Merge pull request #1 from channelcat/master
Update
2017-12-20 09:08:52 +08:00
Raphael Deem
19466a15b4 Merge pull request #1055 from youknowone/cancel-timeout
Cancel request tasks when response timeout is triggered
2017-12-17 16:46:29 -08:00
Jeong YunWon
d54b406cba Cancel request tasks when response timeout is triggered
Before: even after raising ResponseTimeout, the server kept processing the
remaining task until it was done.
After: the server cancels the working task before raising ResponseTimeout.
2017-12-14 18:43:52 +09:00
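For context, a minimal sketch of the behaviour this changes; RESPONSE_TIMEOUT is a standard Sanic config value, while the route and the sleep duration are illustrative:

```
from asyncio import sleep
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)
app.config.RESPONSE_TIMEOUT = 5  # seconds before ResponseTimeout is raised

@app.route("/slow")
async def slow(request):
    # With this change, the handler task is cancelled when the response timeout
    # fires, instead of continuing to run after ResponseTimeout has been raised.
    await sleep(60)
    return text("done")
```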
Raphael Deem
72254a7af9 Merge pull request #1054 from r0fls/rfc7231
fix issues with method not allowed response
2017-12-13 23:42:37 -08:00
Raphael Deem
52feff266e fix edge case with methods as None 2017-12-13 23:23:04 -08:00
Raphael Deem
2c3f50e34a fix stream handling 2017-12-13 23:06:18 -08:00
Raphael Deem
2b0258c13a fix issues with method not allowed response 2017-12-11 20:12:26 -08:00
Raphael Deem
2585900692 Merge pull request #1053 from danpalmer/master
Fix ip and socket data format on V6
2017-12-11 20:24:52 -06:00
Dan Palmer
48c2dcb110 Fix ip and socket data format on V6 2017-12-11 22:16:03 +00:00
Dan Palmer
10a378bd46 Typo 2017-12-11 14:33:43 +00:00
Yaser Amiri
3fe3c2c79f Add test for auto reloading. 2017-12-07 20:19:40 +03:30
Yaser Amiri
52c2a8484e Add auto reloader. 2017-12-07 16:30:54 +03:30
Eli Uriegas
21435c1863 Merge pull request #1045 from seemethere/increment_070
Increment to 0.7.0
2017-12-05 19:27:11 -08:00
Eli Uriegas
1ea3ab7fe8 Increment to 0.7.0
Signed-off-by: Eli Uriegas <seemethere101@gmail.com>
2017-12-05 19:13:16 -08:00
Raphael Deem
1b0ad2c3cd Merge pull request #1035 from yunstanford/patch-N
Adopt new websockets interface
2017-12-02 01:27:09 -08:00
Raphael Deem
aa4821864a Merge pull request #1039 from lixxu/master
check request.ip before using it
2017-11-28 19:47:34 -08:00
lixxu
283762224c clean codes 2017-11-28 14:47:43 +08:00
lixxu
f50a37fc88 ignore error if request.ip is None 2017-11-28 14:44:32 +08:00
Yun Xu
076f0515ca Fix flake8 2017-11-25 21:14:18 -08:00
Yun Xu
049f12096d fix unit tests 2017-11-25 21:07:38 -08:00
Yun Xu
f09c0393ba adopt new websockets interface 2017-11-25 21:01:22 -08:00
7
472bbcf293 Merge pull request #15 from channelcat/master
Merge upstream master branch
2017-11-25 20:49:09 -08:00
Raphael Deem
7a3f9daccf Merge pull request #1025 from nkoshell/route-version-params
Route version params
2017-11-20 23:43:22 -08:00
Nikita Koshelev
76511d61e0 Remove a duplicate 'v' from the Router.add() version parameter
Fix sanic/router.py:123:80: E501 line too long (80 > 79 characters)
2017-11-18 01:39:00 +03:00
Nikita Koshelev
8e7475ccf6 Added regex escaping for Router.add() version parameter 2017-11-18 01:22:42 +03:00
Raphael Deem
820d8c7bf5 Merge pull request #1021 from EdwardBetts/spelling
Correct spelling mistakes.
2017-11-16 16:26:40 -08:00
Edward Betts
cfc75b4f1a Correct spelling mistakes. 2017-11-15 15:46:39 +00:00
Raphael Deem
98567fe5a8 Merge pull request #1008 from youknowone/pytest-xdist
Let SanicTestClient have its own port
2017-11-10 10:50:01 -08:00
Raphael Deem
05bb812e2b Merge pull request #1010 from Yaser-Amiri/master
Change unit tests names with repeated names.
2017-11-08 20:25:45 -08:00
Yaser Amiri
c9876a6c88 Change unit tests names with repeated names. 2017-11-08 14:14:57 +03:30
Raphael Deem
979b5a52d3 Merge pull request #1005 from joar/feature/static-strict-slashes
Add strict_slashes to {app, blueprint}.static()
2017-11-07 07:49:32 -08:00
Joar Wandborg
e70535e8d7 Use .get instead of .pop 2017-11-07 10:34:17 +01:00
Jeong YunWon
ed8725bf6c Let SanicTestClient have its own port
For parallel test runs, the servers must use different ports.
See examples/pytest_xdist.py for an example.
2017-11-06 17:29:32 +09:00
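A minimal sketch of what this enables, assuming SanicTestClient accepts an explicit port as the commit describes; the port number is illustrative:

```
from sanic import Sanic
from sanic.response import text
from sanic.testing import SanicTestClient

app = Sanic(__name__)

@app.route("/")
async def index(request):
    return text("ok")

# Each pytest-xdist worker can use its own port so the test servers don't collide.
test_client = SanicTestClient(app, port=42101)
request, response = test_client.get("/")
assert response.status == 200
```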
Raphael Deem
098cd70e82 Merge pull request #1007 from furious-luke/master
Call connection_open after websocket handshake
2017-11-05 14:26:19 -08:00
Raphael Deem
969dac2033 Merge pull request #1004 from Stibbons/optionalize_log_config
Optionalize app.run dictConfig (fix #1000)
2017-11-04 12:35:38 -07:00
Gaetan Semet
49b1d667f1 Optionalize app.run dictConfig (fix #1000)
Signed-off-by: Gaetan Semet <gaetan@xeberon.net>
2017-11-04 15:58:27 +01:00
Luke Hodkinson
bca1e08411 Call connection_open after websocket handshake
It seems that due to (recent?) changes in the websocket library, we
now need to call "connection_open" to flag that the websocket is now
ready to use. I've added that call just after the call to
"connection_made".
2017-11-04 22:04:59 +11:00
Raphael Deem
bf6ed217c2 Merge pull request #1006 from r0fls/routing-fix
check if method is added in strict slash logic
2017-11-04 00:09:52 -07:00
Raphael Deem
bb8e9c6438 check if method is added in strict slash logic 2017-11-03 18:36:06 -07:00
Joar Wandborg
f128ed5b1f Set threshold to 1MiB instead of 0.97MiB
Reference: https://en.wikipedia.org/wiki/Mebibyte#Definition
2017-11-03 14:37:01 +01:00
Joar Wandborg
ff5786d61b pep8 2017-11-03 14:33:24 +01:00
Joar Wandborg
ca596c8ecd Add strict_slashes to {Sanic, Blueprint}().static() 2017-11-02 15:44:36 +01:00
Raphael Deem
c3bcafb514 Merge pull request #997 from ignatenkobrain/localhost
tests: do not assume that localhost == 127.0.0.1
2017-11-01 00:22:14 -07:00
Igor Gnatenko
a9c7d95e9b tests: do not assume that localhost == 127.0.0.1
Signed-off-by: Igor Gnatenko <ignatenkobrain@fedoraproject.org>
2017-10-31 09:39:09 +01:00
Raphael Deem
63bbcb5152 Merge branch 'master' into 977 2017-10-25 22:18:25 -07:00
Raphael Deem
01042c1d98 Merge pull request #992 from r0fls/968
remove port from ip
2017-10-25 22:15:07 -07:00
Raphael Deem
5bf722c7ae remove bare exceptions 2017-10-25 21:58:31 -07:00
Raphael Deem
c2191153cf remove port from ip 2017-10-23 21:37:59 -07:00
davidtgq
5bcbc5a337 Replaced COMMON_STATUS_CODES with a simple 200 check for more fast (#982)
* Replaced COMMON_STATUS_CODES with a simple 200 check for more fast

* Added IPware algorithm

* Remove HTTP prefix from Django-style headers
Remove right_most_proxy because it's outside spec

* Remove obvious docstrings

* Revert "Replaced COMMON_STATUS_CODES with a simple 200 check for more fast"

This reverts commit 15b6980

* Revert "Added IPware algorithm"

This reverts commit bdf66cb

WTF HOW DO I GIT

* Revert "Revert "Replaced COMMON_STATUS_CODES with a simple 200 check for more fast""

This reverts commit d8df095

* Revert "Added IPware algorithm"

This reverts commit bdf66cb

* Delete ip.py
2017-10-19 16:43:07 -07:00
Eli Uriegas
f721f90add Merge pull request #820 from youknowone/worker-protocol
Protocol configurable gunicorn worker
2017-10-19 16:21:28 -07:00
Eli Uriegas
0e92d8ce2c Merge branch 'master' into worker-protocol 2017-10-19 16:21:18 -07:00
Eli Uriegas
727d6a1b61 Merge pull request #972 from lanfon72/patch-2
to fix the condition check used in `log_response`
2017-10-19 16:16:57 -07:00
Raphael Deem
666c0847b7 Merge pull request #976 from ashleysommer/fix_websocket_timeout
Fix Websocket protocol timeouts after #939
2017-10-18 21:20:52 -07:00
Raphael Deem
0a411f9bba Merge pull request #985 from ashleysommer/ashleysommer-docs-spf
Add Sanic-Plugins-Framework library to Extensions doc
2017-10-18 21:20:13 -07:00
Ashley Sommer
49f3ba39f9 Add Sanic-Plugins-Framework library to Extensions doc
I made a new tool that lets developers quickly and easily create Sanic plugins (extensions), and lets application builders easily use those plugins in their apps.
2017-10-18 17:52:03 +10:00
Raphael Deem
794128a053 Merge pull request #981 from kszucs/manifest
Include LICENSE file in manifest
2017-10-17 10:31:19 -07:00
Krisztián Szűcs
e6be3b2313 include LICENSE file in manifest 2017-10-17 16:05:24 +02:00
Raphael Deem
9150767574 add blueprint name to request.endpoint 2017-10-16 23:25:37 -07:00
Raphael Deem
75f2180cb1 add handler name to request as endpoint 2017-10-16 22:43:40 -07:00
Raphael Deem
c5cdcf0f95 Merge pull request #975 from ashleysommer/timeouts_documentation
Add documentation for new Timeout values, after #939
2017-10-16 09:13:49 -07:00
Ashley Sommer
ea5b07f636 Update websocket protocol to accommodate changes in HTTP protocol from https://github.com/channelcat/sanic/pull/939
Fixes https://github.com/channelcat/sanic/issues/969
2017-10-16 11:06:33 +10:00
Ashley Sommer
477e6b8663 Add documentation for REQUEST_TIMEOUT, RESPONSE_TIMEOUT and KEEP_ALIVE_TIMEOUT config values.
Fixed some inconsistent default values.
2017-10-16 10:53:45 +10:00
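The three values read roughly like this; a sketch with illustrative numbers, since the exact defaults vary between releases:

```
from sanic import Sanic

app = Sanic(__name__)
app.config.REQUEST_TIMEOUT = 60     # seconds allowed to receive the whole request
app.config.RESPONSE_TIMEOUT = 60    # seconds allowed to produce the response
app.config.KEEP_ALIVE_TIMEOUT = 5   # seconds an idle keep-alive connection is held open
```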
Raphael Deem
a0d8418b40 Merge pull request #965 from samael500/master
fix issue #959
2017-10-13 14:58:46 -07:00
Raphael Deem
006fb08024 Merge pull request #966 from yunstanford/patch-M
Sanic routes should not pass angled params with empty names
2017-10-13 02:18:20 -07:00
lanf0n
4578f6016b to fix the condition check used in log_response
`request` class is derived from `dict`, so it will never be `True`.
2017-10-13 16:48:02 +08:00
Raphael Deem
5b06bcc57d Merge pull request #967 from samael500/custom_filename
Custom filename
2017-10-13 01:35:11 -07:00
Raphael Deem
d4bb14a511 Merge pull request #971 from pcinkh/socket_disconnects_speedup
Critical speedup websocket disconnects from O(N) to O(1)
2017-10-13 01:25:14 -07:00
pcinkh
6d2f5da506 Speedup websocket disconnects. 2017-10-11 14:02:26 +03:00
Yun Xu
c96df86111 make flake8 happy 2017-10-09 07:58:04 -07:00
Maks Skorokhod
86f87cf4ac 🔧 no use f'string' 2017-10-09 17:55:35 +03:00
Yun Xu
770a8fb288 raise exception for invalid param syntax 2017-10-09 07:54:39 -07:00
Maks Skorokhod
c4e3a98ea7 add test for custom filename 2017-10-09 17:45:42 +03:00
Maks Skorokhod
07e95dba4f 🔁 customize filename in file response 2017-10-09 17:45:22 +03:00
7
9bc1abcd00 Merge pull request #14 from channelcat/master
merge upstream master branch
2017-10-09 07:19:57 -07:00
Maks Skorokhod
4d515b05f3 fix missed assertion 2017-10-09 17:18:04 +03:00
Maks Skorokhod
64edf7ad9c upd test for connection lost error 2017-10-09 16:00:32 +03:00
Maks Skorokhod
7610c0fb2e 🔧 log Connection lost only if debug 2017-10-09 15:50:36 +03:00
Raphael Deem
0189e4ed59 Merge pull request #962 from ProstoMaxim/fix_logs
Fix logs
2017-10-08 20:16:11 -07:00
Raphael Deem
8018c9b91d Merge pull request #961 from r0fls/fix-920
fix false cookie encoding and output
2017-10-06 23:57:30 -07:00
Max Murashov
4b3920daba Fix logs 2017-10-06 16:53:30 +03:00
Raphael Deem
d876e3ed5c fix false cookie encoding and output 2017-10-05 22:20:50 -07:00
Raphael Deem
086b5daa53 Merge pull request #960 from piotrbulinski/refactor_server_access_log
Refactor access log for server.HttpProtocol
2017-10-05 20:20:28 -07:00
Piotr Buliński
4b877e3f6b Update server.py 2017-10-05 09:28:13 +02:00
Piotr Buliński
8ce749e339 Update server.py 2017-10-05 09:27:18 +02:00
Piotr Buliński
752ddfa7fc Merge branch 'master' into refactor_server_access_log 2017-10-05 09:26:19 +02:00
Raphael Deem
8700c96c4d Merge pull request #942 from yunstanford/patch-logging-refactor
Patch logging refactor
2017-10-05 00:22:02 -07:00
Piotr Bulinski
e3852ceeca Refactor access log for server 2017-10-04 12:50:57 +02:00
Yun Xu
225ea49b6f resolve conflicts again 2017-10-01 01:22:27 -07:00
Raphael Deem
15fd49037f Merge pull request #939 from ashleysommer/keepalive_timeout
Split RequestTimeout, ResponseTimeout, and KeepAliveTimeout into different timeouts
2017-09-30 22:15:50 -07:00
Raphael Deem
2fb4697e12 Merge pull request #952 from ahopkins/patch-1
Update extensions.md
2017-09-29 18:33:30 -07:00
Eli Uriegas
1a9f770317 Merge pull request #957 from lanfon72/master
add sphinx extension to add asyncio-specific markups
2017-09-29 11:17:29 -07:00
lanf0n
62871ec9b3 add sphinx extension to add asyncio-specific markups 2017-09-30 01:16:26 +08:00
Raphael Deem
39c64214ee Merge pull request #953 from r0fls/949
support vhosts in static routes
2017-09-27 01:28:43 -07:00
Raphael Deem
9aec5febb8 support vhosts in static routes 2017-09-27 01:24:49 -07:00
Adam Hopkins
91b2167eba Update extensions.md
Add - [JWT](https://github.com/ahopkins/sanic-jwt): Authentication extension for JSON Web Tokens (JWT).
2017-09-27 11:07:06 +03:00
Raphael Deem
00d40a35cd Merge pull request #951 from lixxu/master
fix bug and set scheme to http if not provided
2017-09-26 21:57:54 -07:00
lixxu
f96ab02767 set scheme to http if not provided 2017-09-27 09:59:49 +08:00
Raphael Deem
4ce699e57f Merge pull request #944 from blazehu/master
add __repr__ for sanic request
2017-09-25 13:58:09 -07:00
Raphael Deem
4ee042c330 Merge pull request #948 from chiuczek/json-dependency-injection
Use dependency injection to allow alternative json parser or encoder
2017-09-24 21:05:39 -07:00
Yun Xu
0b23f4ff81 resolve conflicts 2017-09-23 06:19:09 -07:00
Hugh McNamara
5cef1634ed use json_loads function in json property of request 2017-09-22 10:19:15 +01:00
Eli Uriegas
1b0286916e Merge pull request #947 from lanfon72/patch-1
to fix if platform is windows.
2017-09-19 10:15:35 -07:00
Hugh McNamara
a8f764c161 make method instead of property for alternative json decoding of request 2017-09-19 18:12:53 +01:00
Hugh McNamara
1d719252cb use dependency injection to allow alternative json parser or encoder 2017-09-19 14:58:49 +01:00
lanf0n
d8cebe1188 to fix if platform is windows. 2017-09-19 18:14:25 +08:00
Eli Uriegas
329ebf6a5d Merge pull request #946 from trthhrtz/patch-1
Update getting_started.md
2017-09-18 11:13:48 -07:00
Kuzma Leshakov
c836441a75 Update getting_started.md
The Hello World example in the main README (https://github.com/channelcat/sanic/blob/master/README.rst) is different: it returns JSON, while here text is returned. The following examples, such as Routing (http://sanic.readthedocs.io/en/latest/sanic/routing.html), again use JSON. I therefore suggest making the examples consistent, with JSON as the output.
2017-09-18 11:37:32 +03:00
huyuhan
074d36eeba add __repr__ for sanic request 2017-09-15 21:15:05 +08:00
huyuhan
f6eb35f67d add __repr__ for sanic request 2017-09-15 21:05:25 +08:00
huyuhan
77f70a0792 add __repr__ for sanic request 2017-09-15 20:56:44 +08:00
huyuhan
12dafd07b8 add __repr__ for sanic request 2017-09-15 18:34:56 +08:00
Eli Uriegas
9fb8bec715 Merge pull request #943 from crvv/master
fix #763, sanic can't decode latin1 encoded header value
2017-09-14 13:38:44 -07:00
Wèi Cōngruì
eb1146c6b6 fix #763, sanic can't decode latin1 encoded header value 2017-09-14 19:23:02 +08:00
Yun Xu
730f7c5e41 add doc for customizing logging config 2017-09-13 18:30:38 -07:00
Yun Xu
5cabc9cff2 update doc 2017-09-13 18:16:58 -07:00
Yun Xu
ddc039ed2e update doc 2017-09-13 18:14:46 -07:00
Eli Uriegas
a146ebd856 Merge pull request #941 from aiosin/master
add status codes and teapot example
2017-09-13 11:39:14 -07:00
Yun Xu
5ee7b6caeb fixing small issue 2017-09-13 10:35:34 -07:00
Yun Xu
9c4b0f7b15 fix flake8 2017-09-13 07:40:42 -07:00
aiosin
2e5d1ddff9 add status codes and teapot example 2017-09-13 14:08:29 +02:00
Yun Xu
24bdb1ce98 add unit tests/refactoring 2017-09-12 23:42:42 -07:00
Ashley Sommer
8eb59ad4dc Fixed error where the RequestTimeout test wasn't actually testing the correct behaviour
Fixed error where KeepAliveTimeout wasn't being triggered in the test suite, when using uvloop
Fixed test cases when using other asyncio loops such as uvloop
Fixed Flake8 linting errors
2017-09-13 10:18:36 +10:00
Raphael Deem
d8c8ccd180 Merge pull request #932 from lixxu/master
static files url building using url_for
2017-09-12 12:59:04 -07:00
Yun Xu
a46e004f07 apply new loggers 2017-09-11 22:12:49 -07:00
Ashley Sommer
173f94216a Fixed the delays, and expected responses, in the keepalive_timeout tests 2017-09-12 13:40:43 +10:00
Ashley Sommer
1a74accd65 finished the keepalive_timeout tests 2017-09-12 13:09:42 +10:00
Ashley Sommer
2979e03148 WIP - Split RequestTimeout, ResponseTimeout, and KeepAliveTimeout into different timeouts, with different callbacks. 2017-09-11 17:17:33 +10:00
Yun Xu
4bdb9a2c8e prototype 2017-09-10 23:19:09 -07:00
Yun Xu
8f6fa5e9ff old logging cleanup 2017-09-10 18:44:54 -07:00
Yun Xu
986135ff76 remove DefaultFilter 2017-09-10 18:39:42 -07:00
Yun Xu
c9cbc00e36 use access_log as param 2017-09-10 18:38:52 -07:00
Yun Xu
c9a40c180a remove some logging stuff 2017-09-10 11:11:16 -07:00
7
125cb17fcb Merge pull request #13 from channelcat/master
sync from upstream master branch
2017-09-09 16:52:13 -07:00
Raphael Deem
53a5bd2319 Merge pull request #936 from yunstanford/patch-debug-logging
Patch debug logging
2017-09-08 20:49:16 -07:00
Raphael Deem
4fd68f9af3 Merge pull request #935 from iad42/patch-1
Added information on request.token
2017-09-08 20:49:01 -07:00
Yun Xu
c4417b399b fixing debug logging 2017-09-08 17:47:05 -07:00
7
c2a3e42a53 Merge pull request #12 from channelcat/master
merge upstream master branch
2017-09-08 17:39:50 -07:00
Anatoly Ivanov
73c04f5a89 Added information on request.token
The manual lacked info about request.token, which keeps authorization data. See https://github.com/channelcat/sanic/blob/master/sanic/request.py#L84 for details
2017-09-08 14:21:49 +03:00
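A minimal sketch of how request.token is typically used; the route and the 401 handling are illustrative:

```
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

@app.route("/secret")
async def secret(request):
    token = request.token  # value carried by the Authorization header, or None
    if token is None:
        return json({"error": "no credentials"}, status=401)
    return json({"token": token})
```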
lixxu
195f707f14 missing '/' in doc 2017-09-06 19:19:59 +08:00
lixxu
bc20dc5c62 use url_for for url building for static files 2017-09-06 19:17:52 +08:00
Raphael Deem
8b4ca51805 Merge pull request #931 from Tim-Erwin/envvar_prefix
make the prefix for environment variables alterable
2017-09-05 11:22:56 -07:00
Tim Mundt
e2e25eb751 fixed flake convention 2017-09-05 11:05:31 +02:00
Tim Mundt
9572ecc5ea test for env var prefix 2017-09-05 10:58:48 +02:00
Tim Mundt
97d8b9e908 documentation for env var prefix; allow passing in the prefix through the app constructor 2017-09-05 10:41:55 +02:00
Tim Mundt
c59a8a60eb make the prefix for environment variables alterable 2017-09-05 09:53:33 +02:00
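A minimal sketch of the feature, assuming the prefix is passed through the constructor's load_env argument as described above; the MYAPP_ prefix and the variable are illustrative:

```
import os
from sanic import Sanic

# By default Sanic loads SANIC_-prefixed environment variables into app.config;
# this change makes the prefix alterable.
os.environ["MYAPP_REQUEST_TIMEOUT"] = "30"
app = Sanic(__name__, load_env="MYAPP_")
print(app.config.REQUEST_TIMEOUT)  # 30
```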
Raphael Deem
158da0927a Merge pull request #901 from lixxu/master
add name option for route building
2017-08-31 15:29:03 -07:00
Eli Uriegas
78a7338346 Merge pull request #922 from timka/patch-1
Example logging X-Request-Id transparently
2017-08-31 10:35:48 -07:00
Eli Uriegas
90e5c8d39b Merge pull request #904 from jiaxiaolei/master
feat(examples): add `authorized_sanic.py`
2017-08-31 10:35:23 -07:00
Raphael Deem
7a6f2d8336 Merge pull request #926 from manisenkov/patch-1
Fix LICENSE date and name
2017-08-30 17:49:59 -07:00
Maksim Anisenkov
f49554aa57 Fix LICENSE date and name 2017-08-30 15:30:22 +02:00
Timur
0a72168f8f Example logging X-Request-Id transparently 2017-08-29 23:05:57 +03:00
Raphael Deem
5011bfef55 Merge pull request #917 from CharAct3/bugfix/fix_unauthorized
fix #914, change arguments of Unauthorized.__init__
2017-08-24 13:45:58 -07:00
Darren
6038813d03 fix #914, change arguments of Unauthorized.__init__ 2017-08-24 22:59:25 +08:00
Raphael Deem
fee9de96de Merge pull request #908 from Ezi4Zy/master
fix: error param
2017-08-23 16:22:13 -07:00
xmsun
35e028cd99 fix: error param 2017-08-22 16:40:42 +08:00
lixxu
145cdd5c1b Merge branch 'use-route-name-for-method' 2017-08-22 14:02:56 +08:00
lixxu
762b2782ee use name to define route name for different methods on same url 2017-08-22 14:02:38 +08:00
jiaxiaolei
91f031b661 feat(examples): add authorized_sanic.py
You can check whether the client of a request is authorized
to access a resource by using the `authorized` decorator (see the sketch below)
2017-08-21 22:40:07 +08:00
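A sketch of what such a decorator can look like; the token comparison is purely illustrative, a real application would verify credentials properly:

```
from functools import wraps
from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)

def authorized():
    def decorator(f):
        @wraps(f)
        async def decorated_function(request, *args, **kwargs):
            # Hypothetical check; inspect request.token, a session cookie, etc.
            is_authorized = request.token == "secret-token"
            if is_authorized:
                return await f(request, *args, **kwargs)
            return json({"status": "not_authorized"}, status=403)
        return decorated_function
    return decorator

@app.route("/protected")
@authorized()
async def protected(request):
    return json({"status": "authorized"})
```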
lixxu
eab809d410 add name option for route building 2017-08-21 18:05:34 +08:00
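A minimal sketch of the name option; the route path and name are illustrative:

```
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)

@app.route("/posts", name="posts_index")
async def handler(request):
    return text("ok")

# url_for can now reference the route by its explicit name
assert app.url_for("posts_index") == "/posts"
```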
Raphael Deem
826f1b4713 Merge pull request #898 from jiaxiaolei/master
feat(examples): add `add_task_sanic.py`
2017-08-21 00:32:38 -07:00
Raphael Deem
fa1a95ae91 Merge pull request #900 from yunstanford/patch-default-strict-slashes
Patch default strict slashes
2017-08-21 00:31:42 -07:00
Yun Xu
63babae63d add doc 2017-08-21 00:28:01 -07:00
Raphael Deem
db9924a399 Merge pull request #899 from hatarist/patch-2
Add a line on headers in the "Request Data" docs
2017-08-20 23:51:07 -07:00
Yun Xu
5d23c7644b add unit tests 2017-08-20 23:37:22 -07:00
Yun Xu
ef81a9f547 make strict_slashes default value configurable 2017-08-20 23:11:38 -07:00
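A minimal sketch, assuming the default is supplied through the Sanic constructor as the commit describes:

```
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__, strict_slashes=True)

@app.route("/exact")  # inherits strict_slashes=True from the app-level default
async def exact(request):
    return text("only /exact matches, not /exact/")
```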
7
747c21da70 Merge pull request #11 from channelcat/master
merge upstream master branch
2017-08-20 22:51:19 -07:00
Igor Hatarist
439ff11d13 Added a line on headers in the "Request Data" docs 2017-08-20 19:28:09 +03:00
jiaxiaolei
947364e15f feat(examples): add add_task_sanic.py 2017-08-20 11:11:14 +08:00
Eli Uriegas
750115b727 Merge pull request #894 from pkuphy/patch-1
fix typo
2017-08-18 10:44:35 -07:00
pkuphy
a55efc832d fix typo 2017-08-19 01:03:54 +08:00
Raphael Deem
c96bd21389 Merge pull request #892 from jiaxiaolei/master
docs(README): Make it clear and easy to read.
2017-08-18 02:09:19 -07:00
jiaxiaolei
dd241bd6fa docs(README): Make it clear and easy to read. 2017-08-18 17:00:34 +08:00
Raphael Deem
0dbde7400f Merge pull request #889 from dongweiming/doc
Fix blueprint doc
2017-08-16 00:19:58 -07:00
dongweiming
2587f6753d Fix blueprint doc 2017-08-15 22:04:25 +08:00
Raphael Deem
4155e76a81 Merge pull request #886 from yunstanford/fix-cov-report
Fix cov report
2017-08-14 16:21:50 -07:00
Yun Xu
756bd19181 do not fail if no files for coverage combine 2017-08-10 08:39:02 -07:00
Yun Xu
fbb2344895 fix cov report 2017-08-10 07:55:38 -07:00
7
bda6c85638 Merge pull request #10 from channelcat/master
merge upstream master branch
2017-08-10 07:47:45 -07:00
Raphael Deem
df4a149cd0 Merge pull request #885 from yunstanford/master
add triggers events when async create_server
2017-08-09 15:59:44 -07:00
Yun Xu
80f27b1db9 add unit tests and make flake8 happy 2017-08-08 22:21:40 -07:00
Yun Xu
d5d1d3b45a add trigger before_start events in create_server 2017-08-08 21:58:10 -07:00
Eli Uriegas
c797c3f22d Merge pull request #883 from miguelgrinberg/websocket-subprotocols
Websocket subprotocol negotiation
2017-08-08 11:49:21 -07:00
Miguel Grinberg
375ed23216 Websocket subprotocol negotiation
Fixes #874
2017-08-08 11:40:44 -07:00
Raphael Deem
7b66a56cad Merge pull request #870 from MichaelYusko/small-amendment
Made small changes for better readability
2017-08-03 18:39:26 -07:00
MichaelYusko
7216bf7835 merge master into local branch 2017-08-03 12:11:47 +03:00
Yun Xu
f99a723627 fixed small doc issue 2017-08-02 09:05:33 -07:00
7
181ffb00a7 Merge pull request #9 from channelcat/master
merging upstream master branch
2017-08-02 09:04:31 -07:00
MichaelYusko
429f7377cb Made small changes for better readability 2017-07-26 19:32:23 +03:00
Jeong YunWon
47abf83960 Protocol configurable gunicorn worker 2017-07-12 22:30:13 +09:00
147 changed files with 13068 additions and 3939 deletions

32
.appveyor.yml Normal file
View File

@@ -0,0 +1,32 @@
version: "{branch}.{build}"
environment:
matrix:
- TOXENV: py35-no-ext
PYTHON: "C:\\Python35-x64"
PYTHON_VERSION: "3.5.x"
PYTHON_ARCH: "64"
- TOXENV: py36-no-ext
PYTHON: "C:\\Python36-x64"
PYTHON_VERSION: "3.6.x"
PYTHON_ARCH: "64"
- TOXENV: py37-no-ext
PYTHON: "C:\\Python37-x64"
PYTHON_VERSION: "3.7.x"
PYTHON_ARCH: "64"
init: SET "PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%"
install:
- pip install tox
build: off
test_script: tox
notifications:
- provider: Email
on_build_success: false
on_build_status_changed: false

.coveragerc
View File

@@ -1,7 +1,7 @@
[run]
branch = True
source = sanic
omit = site-packages, sanic/utils.py
omit = site-packages, sanic/utils.py, sanic/__main__.py
[html]
directory = coverage

25
.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
View File

@@ -0,0 +1,25 @@
---
name: Bug report
about: Create a report to help us improve
---
**Describe the bug**
A clear and concise description of what the bug is; make sure to paste any exceptions and tracebacks.
**Code snippet**
Relevant source code; make sure to remove what is not necessary.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Environment (please complete the following information):**
- OS: [e.g. iOS]
- Version [e.g. 0.8.3]
**Additional context**
Add any other context about the problem here.

.github/ISSUE_TEMPLATE/feature_request.md vendored Normal file
View File

@@ -0,0 +1,16 @@
---
name: Feature request
about: Suggest an idea for Sanic
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Additional context**
Add any other context or sample code about the feature request here.

13
.github/ISSUE_TEMPLATE/help-wanted.md vendored Normal file
View File

@@ -0,0 +1,13 @@
---
name: Help wanted
about: Do you need help? Try community.sanicframework.org
---
*DELETE ALL BEFORE POSTING*
*Post your HELP WANTED questions on [the community forum](https://community.sanicframework.org/)*.
Check out the community forum before posting any question here.
We prefer that you post these kinds of questions here:
https://community.sanicframework.org/c/questions-and-help

1
.gitignore vendored
View File

@@ -15,3 +15,4 @@ docs/_build/
docs/_api/
build/*
.DS_Store
dist/*

.travis.yml
View File

@@ -1,5 +1,4 @@
sudo: false
dist: precise
language: python
cache:
directories:
@@ -14,17 +13,29 @@ matrix:
python: 3.6
- env: TOX_ENV=py36-no-ext
python: 3.6
- env: TOX_ENV=flake8
- env: TOX_ENV=py37
python: 3.7
dist: xenial
sudo: true
- env: TOX_ENV=py37-no-ext
python: 3.7
dist: xenial
sudo: true
- env: TOX_ENV=lint
python: 3.6
- env: TOX_ENV=check
python: 3.6
install: pip install -U tox
script: tox -e $TOX_ENV
install:
- pip install -U tox
- pip install codecov
script: travis_retry tox -e $TOX_ENV
after_success:
- codecov
deploy:
provider: pypi
user: channelcat
user: brewmaster
password:
secure: OgADRQH3+dTL5swGzXkeRJDNbLpFzwqYnXB4iLD0Npvzj9QnKyQVvkbaeq6VmV9dpEFb5ULaAKYQq19CrXYDm28yanUSn6jdJ4SukaHusi7xt07U6H7pmoX/uZ2WZYqCSLM8cSp8TXY/3oV3rY5Jfj/AibE5XTbim5/lrhsvW6NR+ALzxc0URRPAHDZEPpojTCjSTjpY0aDsaKWg4mXVRMFfY3O68j6KaIoukIZLuoHfePLKrbZxaPG5VxNhMHEaICdxVxE/dO+7pQmQxXuIsEOHK1QiVJ9YrSGcNqgEqhN36kYP8dqMeVB07sv8Xa6o/Uax2/wXS2HEJvuwP1YD6WkoZuo9ZB85bcMdg7BV9jJDbVFVPJwc75BnTLHrMa3Q1KrRlKRDBUXBUsQivPuWhFNwUgvEayq2qSI3aRQR4Z0O+DfboEhXYojSoD64/EWBTZ7vhgbvOTGEdukUQSYrKj9P8jc1s8exomTsAiqdFxTUpzfiammUSL+M93lP4urtahl1jjXFX7gd3DzdEEb0NsGkx5lm/qdsty8/TeAvKUmC+RVU6T856W6MqN0P+yGbpWUARcSE7fwztC3SPxwAuxvIN3BHmRhOUHoORPNG2VpfbnscIzBKJR4v0JKzbpi0IDa66K+tCGsCEvQuL4cxVOtoUySPWNSUAyUWWUrGM2k=
secure: "GoawLwmbtJOgKB6AJ0ZSYUUnNwIoonseHBxaAUH3zu79TS/Afrq+yB3lsVaMSG0CbyDgN4FrfD1phT1NzbvZ1VcLIOTDtCrmpQ1kLDw+zwgF40ab8sp8fPkKVHHHfCCs1mjltHIpxQa5lZTJcAs6Bpi/lbUWWwYxFzSV8pHw4W4hY09EHUd2o+evLTSVxaploetSt725DJUYKICUr2eAtCC11IDnIW4CzBJEx6krVV3uhzfTJW0Ls17x0c6sdZ9icMnV/G9xO/eQH6RIHe4xcrWJ6cmLDNKoGAkJp+BKr1CeVVg7Jw/MzPjvZKL2/ki6Beue1y6GUIy7lOS7jPVaOEhJ23b0zQwFcLMZw+Tt+E3v6QfHk+B/WBBBnM3zUZed9UI+QyW8+lqLLt39sQX0FO0P3eaDh8qTXtUuon2jTyFMMAMTFRTNpJmpAzuBH9yeMmDeALPTh0HphI+BkoUl5q1QbWFYjjnZMH2CatApxpLybt9A7rwm//PbOG0TSI93GEKNQ4w5DYryKTfwHzRBptNSephJSuxZYEfJsmUtas5es1D7Fe0PkyjxNNSU+eO+8wsTlitLUsJO4k0jAgy+cEKdU7YJ3J0GZVXocSkrNnUfd2hQPcJ3UtEJx3hLqqr8EM7EZBAasc1yGHh36NFetclzFY24YPih0G1+XurhTys="
on:
tags: true
distributions: "sdist bdist_wheel"

View File

@@ -1,3 +1,116 @@
Version 18.12
-------------
18.12.0
- Changes:
- Improved codebase test coverage from 81% to 91%.
- Added stream_large_files and host examples in static_file document
- Added methods to append and finish body content on Request (#1379)
- Integrated with .appveyor.yml for windows ci support
- Added documentation for AF_INET6 and AF_UNIX socket usage
- Adopt black/isort for codestyle
- Cancel task when connection_lost
- Simplify request ip and port retrieval logic
- Handle config errors when loading a config file.
- Integrate with codecov for CI
- Add missing documentation for the config section.
- Deprecate Handler.log
- Pinned httptools requirement to version 0.0.10+
- Fixes:
- Fix `remove_entity_headers` helper function (#1415)
- Fix TypeError when using Blueprint.group() to group blueprints with the default url_prefix; use os.path.normpath to avoid an invalid url_prefix like api//v1
f8a6af1 Rename the `http` module to `helpers` to prevent conflicts with the built-in Python http library (fixes #1323)
- Fix unittests on windows
- Fix Namespacing of sanic logger
- Fix missing quotes in decorator example
- Fix redirect with quoted param
- Fix doc for latest blueprint code
- Fix build of latex documentation relating to markdown lists
- Fix loop exception handling in app.py
- Fix content length mismatch in Windows and other platforms
- Fix Range header handling for static files (#1402)
- Fix the logger and make it work (#1397)
- Fix typo pikcle->pickle in multiprocessing test
- Fix pickling blueprints: change the string passed in the "name" section of the namedtuples in Blueprint to match the name of the Blueprint module attribute name. This allows blueprints to be pickled and unpickled without errors, which is a requirement for running Sanic in multiprocessing mode on Windows. Added a test for pickling and unpickling blueprints, a test for pickling and unpickling Sanic itself, and a test for enabling multiprocessing on an app with a blueprint (only useful to catch this bug if the tests are run on Windows).
- Fix documentation for logging
Version 0.8
-----------
0.8.3
- Changes:
- Ownership changed to org 'huge-success'
0.8.0
- Changes:
- Add Server-Sent Events extension (Innokenty Lebedev)
- Graceful handling of request_handler_task cancellation (Ashley Sommer)
- Sanitize URL before redirection (aveao)
- Add url_bytes to request (johndoe46)
- py37 support for travisci (yunstanford)
- Auto reloader support for OSX (garyo)
- Add UUID route support (Volodymyr Maksymiv)
- Add pausable response streams (Ashley Sommer)
- Add weakref to request slots (vopankov)
- remove ubuntu 12.04 from test fixture due to deprecation (yunstanford)
- Allow streaming handlers in add_route (kinware)
- use travis_retry for tox (Raphael Deem)
- update aiohttp version for test client (yunstanford)
- add redirect import for clarity (yingshaoxo)
- Update HTTP Entity headers (Arnulfo Solís)
- Add register_listener method (Stephan Fitzpatrick)
- Remove uvloop/ujson dependencies for Windows (abuckenheimer)
- Content-length header on 204/304 responses (Arnulfo Solís)
- Extend WebSocketProtocol arguments and add docs (Bob Olde Hampsink, yunstanford)
- Update development status from pre-alpha to beta (Maksim Anisenkov)
- KeepAlive Timeout log level changed to debug (Arnulfo Solís)
- Pin pytest to 3.3.2 because of pytest-dev/pytest#3170 (Maksim Anisenkov)
- Install Python 3.5 and 3.6 on docker container for tests (Shahin Azad)
- Add support for blueprint groups and nesting (Elias Tarhini)
- Remove uvloop for windows setup (Aleksandr Kurlov)
- Auto Reload (Yaser Amiri)
- Documentation updates/fixups (multiple contributors)
- Fixes:
- Fix: auto_reload in Linux (Ashley Sommer)
- Fix: broken tests for aiohttp >= 3.3.0 (Ashley Sommer)
- Fix: disable auto_reload by default on windows (abuckenheimer)
- Fix (1143): Turn off access log with gunicorn (hqy)
- Fix (1268): Support status code for file response (Cosmo Borsky)
- Fix (1266): Add content_type flag to Sanic.static (Cosmo Borsky)
- Fix: subprotocols parameter missing from add_websocket_route (ciscorn)
- Fix (1242): Responses for CI header (yunstanford)
- Fix (1237): add version constraint for websockets (yunstanford)
- Fix (1231): memory leak - always release resource (Phillip Xu)
- Fix (1221): make request truthy if transport exists (Raphael Deem)
- Fix failing tests for aiohttp>=3.1.0 (Ashley Sommer)
- Fix try_everything examples (PyManiacGR, kot83)
- Fix (1158): default to auto_reload in debug mode (Raphael Deem)
- Fix (1136): ErrorHandler.response handler call too restrictive (Julien Castiaux)
- Fix: raw requires bytes-like object (cloudship)
- Fix (1120): passing a list in to a route decorator's host arg (Timothy Ebiuwhe)
- Fix: Bug in multipart/form-data parser (DirkGuijt)
- Fix: Exception for missing parameter when value is null (NyanKiyoshi)
- Fix: Parameter check (Howie Hu)
- Fix (1089): Routing issue with named parameters and different methods (yunstanford)
- Fix (1085): Signal handling in multi-worker mode (yunstanford)
- Fix: single quote in readme.rst (Cosven)
- Fix: method typos (Dmitry Dygalo)
- Fix: log_response correct output for ip and port (Wibowo Arindrarto)
- Fix (1042): Exception Handling (Raphael Deem)
- Fix: Chinese URIs (Howie Hu)
- Fix (1079): timeout bug when self.transport is None (Raphael Deem)
- Fix (1074): fix strict_slashes when route has slash (Raphael Deem)
- Fix (1050): add samesite cookie to cookie keys (Raphael Deem)
- Fix (1065): allow add_task after server starts (Raphael Deem)
- Fix (1061): double quotes in unauthorized exception (Raphael Deem)
- Fix (1062): inject the app in add_task method (Raphael Deem)
- Fix: update environment.yml for readthedocs (Eli Uriegas)
- Fix: Cancel request task when response timeout is triggered (Jeong YunWon)
- Fix (1052): Method not allowed response for RFC7231 compliance (Raphael Deem)
- Fix: IPv6 Address and Socket Data Format (Dan Palmer)
Note: Changelog was unmaintained between 0.1 and 0.7
Version 0.1
-----------
- 0.1.7
@@ -5,18 +118,18 @@ Version 0.1
- 0.1.6
- Static files
- Lazy Cookie Loading
- 0.1.5
- 0.1.5
- Cookies
- Blueprint listeners and ordering
- Faster Router
- Fix: Incomplete file reads on medium+ sized post requests
- Breaking: after_start and before_stop now pass sanic as their first argument
- 0.1.4
- 0.1.4
- Multiprocessing
- 0.1.3
- Blueprint support
- Faster Response processing
- 0.1.1 - 0.1.2
- 0.1.1 - 0.1.2
- Struggling to update pypi via CI
- 0.1.0
- Released to public
- 0.1.0
- Released to public

CONTRIBUTING.md
View File

@@ -18,9 +18,22 @@ So assume you have already cloned the repo and are in the working directory with
a virtual environment already set up, then run:
```bash
python setup.py develop && pip install -r requirements-dev.txt
pip3 install -e ".[dev]"
```
# Dependency Changes
`Sanic` doesn't use `requirements*.txt` files to manage its dependencies, in order to simplify the
effort required to manage them. Please make sure you have read and understood the following section of
the document, which explains the way `sanic` manages dependencies inside the `setup.py` file.
| Dependency Type | Usage | Installation |
| ------------------------------------------| -------------------------------------------------------------------------- | --------------------------- |
| requirements                              | Bare minimum dependencies required for sanic to function                    | pip3 install -e .           |
| tests_require / extras_require['test']    | Dependencies required to run the Unit Tests for `sanic`                     | pip3 install -e '.[test]'   |
| extras_require['dev']                     | Additional development requirements for contributing                        | pip3 install -e '.[dev]'    |
| extras_require['docs']                    | Dependencies required to enable building and enhancing sanic documentation  | pip3 install -e '.[docs]'   |
## Running tests
To run the tests for sanic it is recommended to use tox like so:

Dockerfile
View File

@@ -1,6 +0,0 @@
FROM python:3.6
ADD . /app
WORKDIR /app
RUN pip install tox

LICENSE
View File

@@ -1,6 +1,6 @@
MIT License
Copyright (c) [year] [fullname]
Copyright (c) 2016-present Channel Cat
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
@@ -18,4 +18,4 @@ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
SOFTWARE.

MANIFEST.in
View File

@@ -1,4 +1,15 @@
# Non Code related contents
include LICENSE
include README.rst
include pyproject.toml
recursive-exclude * __pycache__
recursive-exclude * *.py[co]
# Setup
include setup.py
include Makefile
# Tests
include .coveragerc
graft tests
global-exclude __pycache__
global-exclude *.py[co]

Makefile
View File

@@ -1,4 +1,58 @@
test:
find . -name "*.pyc" -delete
docker build -t sanic/test-image .
.PHONY: help test test-coverage install docker-test black fix-import beautify
.DEFAULT: help
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo "test"
@echo " Run Sanic Unit Tests"
@echo "test-coverage"
@echo " Run Sanic Unit Tests with Coverage"
@echo "install"
@echo " Install Sanic"
@echo "docker-test"
@echo " Run Sanic Unit Tests using Docker"
@echo "black"
@echo " Analyze and fix linting issues using Black"
@echo "fix-import"
@echo " Analyze and fix import order using isort"
@echo "beautify [sort_imports=1] [include_tests=1]"
@echo " Analyze and fix linting issue using black and optionally fix import sort using isort"
@echo ""
clean:
find . ! -path "./.eggs/*" -name "*.pyc" -exec rm {} \;
find . ! -path "./.eggs/*" -name "*.pyo" -exec rm {} \;
find . ! -path "./.eggs/*" -name ".coverage" -exec rm {} \;
rm -rf build/* > /dev/null 2>&1
rm -rf dist/* > /dev/null 2>&1
test: clean
python setup.py test
test-coverage: clean
python setup.py test --pytest-args="--cov sanic --cov-report term --cov-append "
install:
python setup.py install
docker-test: clean
docker build -t sanic/test-image -f docker/Dockerfile .
docker run -t sanic/test-image tox
beautify: black
ifdef sort_imports
ifdef include_tests
$(warning It is suggested that you do not run sort import on tests)
isort -rc sanic tests
else
$(info Sorting Imports)
isort -rc sanic
endif
endif
black:
black --config ./pyproject.toml sanic tests
fix-import: black
isort -rc sanic

README.rst
View File

@@ -1,15 +1,71 @@
Sanic
=====
.. image:: https://raw.githubusercontent.com/huge-success/sanic-assets/master/png/sanic-framework-logo-400x97.png
:alt: Sanic | Build fast. Run fast.
|Join the chat at https://gitter.im/sanic-python/Lobby| |Build Status| |PyPI| |PyPI version|
Sanic | Build fast. Run fast.
=============================
Sanic is a Flask-like Python 3.5+ web server that's written to go fast. It's based on the work done by the amazing folks at magicstack, and was inspired by `this article <https://magic.io/blog/uvloop-blazing-fast-python-networking/>`_.
.. start-badges
On top of being Flask-like, Sanic supports async request handlers. This means you can use the new shiny async/await syntax from Python 3.5, making your code non-blocking and speedy.
.. list-table::
:stub-columns: 1
Sanic is developed `on GitHub <https://github.com/channelcat/sanic/>`_. Contributions are welcome!
* - Build
- | |Build Status| |AppVeyor Build Status| |Codecov|
* - Docs
- |Documentation|
* - Package
- | |PyPI| |PyPI version| |Wheel| |Supported implementations| |Code style black|
* - Support
- | |Forums| |Join the chat at https://gitter.im/sanic-python/Lobby|
.. |Forums| image:: https://img.shields.io/badge/forums-community-ff0068.svg
:target: https://community.sanicframework.org/
.. |Join the chat at https://gitter.im/sanic-python/Lobby| image:: https://badges.gitter.im/sanic-python/Lobby.svg
:target: https://gitter.im/sanic-python/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
.. |Codecov| image:: https://codecov.io/gh/huge-success/sanic/branch/master/graph/badge.svg
:target: https://codecov.io/gh/huge-success/sanic
.. |Build Status| image:: https://travis-ci.org/huge-success/sanic.svg?branch=master
:target: https://travis-ci.org/huge-success/sanic
.. |AppVeyor Build Status| image:: https://ci.appveyor.com/api/projects/status/d8pt3ids0ynexi8c/branch/master?svg=true
:target: https://ci.appveyor.com/project/huge-success/sanic
.. |Documentation| image:: https://readthedocs.org/projects/sanic/badge/?version=latest
:target: http://sanic.readthedocs.io/en/latest/?badge=latest
.. |PyPI| image:: https://img.shields.io/pypi/v/sanic.svg
:target: https://pypi.python.org/pypi/sanic/
.. |PyPI version| image:: https://img.shields.io/pypi/pyversions/sanic.svg
:target: https://pypi.python.org/pypi/sanic/
.. |Code style black| image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/ambv/black
.. |Wheel| image:: https://img.shields.io/pypi/wheel/sanic.svg
:alt: PyPI Wheel
:target: https://pypi.python.org/pypi/sanic
.. |Supported implementations| image:: https://img.shields.io/pypi/implementation/sanic.svg
:alt: Supported implementations
:target: https://pypi.python.org/pypi/sanic
.. end-badges
Sanic is a Python web server and web framework that's written to go fast. It allows the usage of the ``async/await`` syntax added in Python 3.5, which makes your code non-blocking and speedy.
`Source code on GitHub <https://github.com/huge-success/sanic/>`_ | `Help and discussion board <https://community.sanicframework.org/>`_.
The project is maintained by the community, for the community. **Contributions are welcome!**
The goal of the project is to provide a simple way to get a highly performant HTTP server up and running that is easy to build, to expand, and ultimately to scale.
Installation
------------
``pip3 install sanic``
Sanic makes use of ``uvloop`` and ``ujson`` to help with performance. If you do not want to use those packages, simply add an environment variable ``SANIC_NO_UVLOOP=true`` or ``SANIC_NO_UJSON=true`` at install time.
.. code:: shell
$ export SANIC_NO_UVLOOP=true
$ export SANIC_NO_UJSON=true
$ pip3 install sanic
If you have a project that utilizes Sanic make sure to comment on the `issue <https://github.com/channelcat/sanic/issues/396>`_ that we use to track those projects!
Hello World Example
-------------------
@@ -21,23 +77,33 @@ Hello World Example
app = Sanic()
@app.route("/")
@app.route('/')
async def test(request):
return json({"hello": "world"})
return json({'hello': 'world'})
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)
if __name__ == '__main__':
app.run(host='0.0.0.0', port=8000)
Sanic can now be easily run using ``python3 hello.py``.
Installation
------------
.. code::
- ``python -m pip install sanic``
[2018-12-30 11:37:41 +0200] [13564] [INFO] Goin' Fast @ http://0.0.0.0:8000
[2018-12-30 11:37:41 +0200] [13564] [INFO] Starting worker [13564]
To install sanic without uvloop or ujson using bash, you can provide either or both of these environment variables
using any truthy string like `'y', 'yes', 't', 'true', 'on', '1'`; setting the NO_X variable to true will skip that feature's
installation.
And, we can verify it is working: ``curl localhost:8000 -i``
- ``SANIC_NO_UVLOOP=true SANIC_NO_UJSON=true python -m pip install sanic``
.. code::
HTTP/1.1 200 OK
Connection: keep-alive
Keep-Alive: 5
Content-Length: 17
Content-Type: application/json
{"hello":"world"}
**Now, let's go build something fast!**
Documentation
@@ -45,56 +111,18 @@ Documentation
`Documentation on Readthedocs <http://sanic.readthedocs.io/>`_.
.. |Join the chat at https://gitter.im/sanic-python/Lobby| image:: https://badges.gitter.im/sanic-python/Lobby.svg
:target: https://gitter.im/sanic-python/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
.. |Build Status| image:: https://travis-ci.org/channelcat/sanic.svg?branch=master
:target: https://travis-ci.org/channelcat/sanic
.. |Documentation| image:: https://readthedocs.org/projects/sanic/badge/?version=latest
:target: http://sanic.readthedocs.io/en/latest/?badge=latest
.. |PyPI| image:: https://img.shields.io/pypi/v/sanic.svg
:target: https://pypi.python.org/pypi/sanic/
.. |PyPI version| image:: https://img.shields.io/pypi/pyversions/sanic.svg
:target: https://pypi.python.org/pypi/sanic/
Changelog
---------
`Release Changelogs <https://github.com/huge-success/sanic/blob/master/CHANGELOG.md>`_.
Questions and Discussion
------------------------
Examples
--------
`Non-Core examples <https://github.com/channelcat/sanic/wiki/Examples/>`_. Examples of plugins and Sanic that are outside the scope of Sanic core.
`Ask a question or join the conversation <https://community.sanicframework.org/>`_.
`Extensions <https://github.com/channelcat/sanic/wiki/Extensions/>`_. Sanic extensions created by the community.
Contribution
------------
`Projects <https://github.com/channelcat/sanic/wiki/Projects/>`_. Sanic in production use.
TODO
----
* http2
Limitations
-----------
* No wheels for uvloop and httptools on Windows :(
Final Thoughts
--------------
::
▄▄▄▄▄
▀▀▀██████▄▄▄ _______________
▄▄▄▄▄ █████████▄ / \
▀▀▀▀█████▌ ▀▐▄ ▀▐█ | Gotta go fast! |
▀▀█████▄▄ ▀██████▄██ | _________________/
▀▄▄▄▄▄ ▀▀█▄▀█════█▀ |/
▀▀▀▄ ▀▀███ ▀ ▄▄
▄███▀▀██▄████████▄ ▄▀▀▀▀▀▀█▌
██▀▄▄▄██▀▄███▀ ▀▀████ ▄██
▄▀▀▀▄██▄▀▀▌████▒▒▒▒▒▒███ ▌▄▄▀
▌ ▐▀████▐███▒▒▒▒▒▐██▌
▀▄▄▄▄▀ ▀▀████▒▒▒▒▄██▀
▀▀█████████▀
▄▄██▀██████▀█
▄██▀ ▀▀▀ █
▄█ ▐▌
▄▄▄▄█▌ ▀█▄▄▄▄▀▀▄
▌ ▐ ▀▀▄▄▄▀
▀▀▄▄▀
We are always happy to have new contributions. We have `marked issues good for anyone looking to get started <https://github.com/huge-success/sanic/issues?q=is%3Aopen+is%3Aissue+label%3Abeginner>`_, and welcome `questions on the forums <https://community.sanicframework.org/>`_. Please take a look at our `Contribution guidelines <https://github.com/huge-success/sanic/blob/master/CONTRIBUTING.md>`_.

28
docker/Dockerfile Normal file
View File

@@ -0,0 +1,28 @@
FROM alpine:3.7
RUN apk add --no-cache --update \
curl \
bash \
build-base \
ca-certificates \
git \
bzip2-dev \
linux-headers \
ncurses-dev \
openssl \
openssl-dev \
readline-dev \
sqlite-dev
RUN update-ca-certificates
RUN rm -rf /var/cache/apk/*
ENV PYENV_ROOT="/root/.pyenv"
ENV PATH="$PYENV_ROOT/bin:$PATH"
ADD . /app
WORKDIR /app
RUN /app/docker/bin/install_python.sh 3.5.4 3.6.4
ENTRYPOINT ["./docker/bin/entrypoint.sh"]

11
docker/bin/entrypoint.sh Executable file
View File

@@ -0,0 +1,11 @@
#!/bin/bash
set -e
eval "$(pyenv init -)"
eval "$(pyenv virtualenv-init -)"
source /root/.pyenv/completions/pyenv.bash
pip install tox
exec $@

17
docker/bin/install_python.sh Executable file
View File

@@ -0,0 +1,17 @@
#!/bin/bash
set -e
export CFLAGS='-O2'
export EXTRA_CFLAGS="-DTHREAD_STACK_SIZE=0x100000"
curl -L https://raw.githubusercontent.com/pyenv/pyenv-installer/master/bin/pyenv-installer | bash
eval "$(pyenv init -)"
for ver in $@
do
pyenv install $ver
done
pyenv global $@
pip install --upgrade pip
pyenv rehash

0
docs/_static/.gitkeep vendored Normal file
View File

docs/conf.py
View File

@@ -25,7 +25,7 @@ import sanic
# -- General configuration ------------------------------------------------
extensions = ['sphinx.ext.autodoc']
extensions = ['sphinx.ext.autodoc', 'sphinxcontrib.asyncio']
templates_path = ['_templates']
@@ -38,7 +38,7 @@ master_doc = 'index'
# General information about the project.
project = 'Sanic'
copyright = '2016, Sanic contributors'
copyright = '2018, Sanic contributors'
author = 'Sanic contributors'
# The version info for the project you're documenting, acts as replacement for

docs/index.rst
View File

@@ -7,26 +7,33 @@ Guides
:maxdepth: 2
sanic/getting_started
sanic/routing
sanic/config
sanic/logging
sanic/request_data
sanic/response
sanic/cookies
sanic/routing
sanic/blueprints
sanic/static_files
sanic/versioning
sanic/exceptions
sanic/middleware
sanic/blueprints
sanic/config
sanic/cookies
sanic/websocket
sanic/decorators
sanic/streaming
sanic/class_based_views
sanic/custom_protocol
sanic/sockets
sanic/ssl
sanic/logging
sanic/debug_mode
sanic/testing
sanic/deploying
sanic/extensions
sanic/examples
sanic/changelog
sanic/contributing
sanic/api_reference
sanic/asyncio_python37
Module Documentation

docs/sanic/api_reference.rst
View File

@@ -20,6 +20,15 @@ sanic.blueprints module
:undoc-members:
:show-inheritance:
sanic.blueprint_group module
----------------------------
.. automodule:: sanic.blueprint_group
:members:
:undoc-members:
:show-inheritance:
sanic.config module
-------------------

docs/sanic/asyncio_python37.rst
View File

@@ -0,0 +1,58 @@
Python 3.7 AsyncIO examples
###########################
With Python 3.7, asyncio got a major update for the following types:
- asyncio.AbstractEventLoop
- asyncio.AbstractServer
This example shows how to use Sanic with Python 3.7; more precisely, how to retrieve an asyncio server instance:
.. code:: python
import asyncio
import socket
import os

from sanic import Sanic
from sanic.response import json

app = Sanic(__name__)


@app.route("/")
async def test(request):
    return json({"hello": "world"})


server_socket = '/tmp/sanic.sock'
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
try:
    os.remove(server_socket)
except FileNotFoundError:
    pass
sock.bind(server_socket)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    srv_coro = app.create_server(
        sock=sock,
        return_asyncio_server=True,
        asyncio_server_kwargs=dict(start_serving=False)
    )
    srv = loop.run_until_complete(srv_coro)
    try:
        assert srv.is_serving() is False
        loop.run_until_complete(srv.start_serving())
        assert srv.is_serving() is True
        loop.run_until_complete(srv.serve_forever())
    except KeyboardInterrupt:
        srv.close()
        loop.close()
Please note that uvloop does not support these features yet.
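For completeness, one way to exercise a server bound to a unix socket like the one above is an HTTP client with unix-socket support; the sketch below uses ``aiohttp``, which is not part of the example above and is purely illustrative.

.. code:: python

    import asyncio

    import aiohttp

    async def check():
        # Talk to the server through the unix socket it is bound to.
        conn = aiohttp.UnixConnector(path='/tmp/sanic.sock')
        async with aiohttp.ClientSession(connector=conn) as session:
            async with session.get('http://localhost/') as resp:
                print(await resp.json())

    asyncio.get_event_loop().run_until_complete(check())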

docs/sanic/blueprints.md
View File

@@ -48,19 +48,86 @@ by that blueprint. In this example, the registered routes in the `app.router`
will look like:
```python
[Route(handler=<function bp_root at 0x7f908382f9d8>, methods=None, pattern=re.compile('^/$'), parameters=[])]
[Route(handler=<function bp_root at 0x7f908382f9d8>, methods=frozenset({'GET'}), pattern=re.compile('^/$'), parameters=[], name='my_blueprint.bp_root', uri='/')]
```
## Using blueprints
## Blueprint groups and nesting
Blueprints have much the same functionality as an application instance.
Blueprints may also be registered as part of a list or tuple, where the registrar will recursively cycle through any sub-sequences of blueprints and register them accordingly. The `Blueprint.group` method is provided to simplify this process, allowing a 'mock' backend directory structure mimicking what's seen from the front end. Consider this (quite contrived) example:
```
api/
├──content/
│ ├──authors.py
│ ├──static.py
│ └──__init__.py
├──info.py
└──__init__.py
app.py
```
Initialization of this app's blueprint hierarchy could go as follows:
```python
# api/content/authors.py
from sanic import Blueprint
authors = Blueprint('content_authors', url_prefix='/authors')
```
```python
# api/content/static.py
from sanic import Blueprint
static = Blueprint('content_static', url_prefix='/static')
```
```python
# api/content/__init__.py
from sanic import Blueprint
from .static import static
from .authors import authors
content = Blueprint.group(static, authors, url_prefix='/content')
```
```python
# api/info.py
from sanic import Blueprint
info = Blueprint('info', url_prefix='/info')
```
```python
# api/__init__.py
from sanic import Blueprint
from .content import content
from .info import info
api = Blueprint.group(content, info, url_prefix='/api')
```
And registering these blueprints in `app.py` can now be done like so:
```python
# app.py
from sanic import Sanic
from .api import api
app = Sanic(__name__)
app.blueprint(api)
```
## Using Blueprints
Blueprints have almost the same functionality as an application instance.
### WebSocket routes
WebSocket handlers can be registered on a blueprint using the `@bp.websocket`
decorator or `bp.add_websocket_route` method.
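As a minimal sketch (the blueprint name, paths and handlers below are illustrative, not part of the original docs), both registration styles look like this:
```python
from sanic import Sanic, Blueprint

app = Sanic(__name__)
bp = Blueprint('my_blueprint')

@bp.websocket('/feed')
async def feed(request, ws):
    # echo every received message back to the client
    while True:
        data = await ws.recv()
        await ws.send(data)

async def echo_once(request, ws):
    await ws.send(await ws.recv())

bp.add_websocket_route(echo_once, '/echo')

app.blueprint(bp)
```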
### Middleware
### Blueprint Middleware
Using blueprints allows you to also register middleware globally.
@@ -78,6 +145,36 @@ async def halt_response(request, response):
return text('I halted the response')
```
### Blueprint Group Middleware
Registering middleware on a blueprint group applies that middleware to every blueprint that is part of the group.
```python
bp1 = Blueprint('bp1', url_prefix='/bp1')
bp2 = Blueprint('bp2', url_prefix='/bp2')

@bp1.middleware('request')
async def bp1_only_middleware(request):
    print('applied on Blueprint : bp1 Only')

@bp1.route('/')
async def bp1_route(request):
    return text('bp1')

@bp2.route('/<param>')
async def bp2_route(request, param):
    return text(param)

group = Blueprint.group(bp1, bp2)

@group.middleware('request')
async def group_middleware(request):
    print('common middleware applied for both bp1 and bp2')

# Register Blueprint group under the app
app.blueprint(group)
```
### Exceptions
Exception handlers can be registered on a blueprint, applying only to the routes of that blueprint.
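For example, a handler registered with `@bp.exception` only fires for routes registered on that blueprint; the blueprint name and handler below are an illustrative sketch:
```python
from sanic import Sanic, Blueprint
from sanic.response import text
from sanic.exceptions import NotFound

app = Sanic(__name__)
bp = Blueprint('my_blueprint')

@bp.exception(NotFound)
async def ignore_404s(request, exception):
    # only handles NotFound raised from this blueprint's routes
    return text("Yep, I totally found the page: {}".format(request.url))

app.blueprint(bp)
```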
@@ -93,7 +190,14 @@ def ignore_404s(request, exception):
Static files can be served globally, under the blueprint prefix.
```python
bp.static('/folder/to/serve', '/web/path')
# suppose bp.name == 'bp'
bp.static('/web/path', '/folder/to/serve')
# you can also pass a name parameter to it for url_for
bp.static('/web/path', '/folder/to/serve', name='uploads')
app.url_for('static', name='bp.uploads', filename='file.txt') == '/bp/web/path/file.txt'
```
## Start and stop
@@ -127,7 +231,7 @@ async def close_connection(app, loop):
Blueprints can be very useful for API versioning, where one blueprint may point
at `/v1/<routes>`, and another pointing at `/v2/<routes>`.
When a blueprint is initialised, it can take an optional `url_prefix` argument,
When a blueprint is initialised, it can take an optional `version` argument,
which will be prepended to all routes defined on the blueprint. This feature
can be used to implement our API versioning scheme.
@@ -136,8 +240,8 @@ can be used to implement our API versioning scheme.
from sanic.response import text
from sanic import Blueprint
blueprint_v1 = Blueprint('v1', url_prefix='/v1')
blueprint_v2 = Blueprint('v2', url_prefix='/v2')
blueprint_v1 = Blueprint('v1', url_prefix='/api', version="v1")
blueprint_v2 = Blueprint('v2', url_prefix='/api', version="v2")
@blueprint_v1.route('/')
async def api_v1_root(request):
@@ -148,7 +252,7 @@ async def api_v2_root(request):
return text('Welcome to version 2 of our documentation')
```
When we register our blueprints on the app, the routes `/v1` and `/v2` will now
When we register our blueprints on the app, the routes `/v1/api` and `/v2/api` will now
point to the individual blueprints, which allows the creation of *sub-sites*
for each API version.
@@ -158,8 +262,8 @@ from sanic import Sanic
from blueprints import blueprint_v1, blueprint_v2
app = Sanic(__name__)
app.blueprint(blueprint_v1, url_prefix='/v1')
app.blueprint(blueprint_v2, url_prefix='/v2')
app.blueprint(blueprint_v1)
app.blueprint(blueprint_v2)
app.run(host='0.0.0.0', port=8000, debug=True)
```
@@ -172,7 +276,7 @@ takes the format `<blueprint_name>.<handler_name>`. For example:
```python
@blueprint_v1.route('/')
async def root(request):
url = app.url_for('v1.post_handler', post_id=5) # --> '/v1/post/5'
url = request.app.url_for('v1.post_handler', post_id=5) # --> '/v1/api/post/5'
return redirect(url)
@@ -180,5 +284,3 @@ async def root(request):
async def post_handler(request, post_id):
return text('Post {} in Blueprint V1'.format(post_id))
```

135
docs/sanic/changelog.md Normal file
View File

@@ -0,0 +1,135 @@
Version 18.12
-------------
18.12.0
- Changes:
- Improved codebase test coverage from 81% to 91%.
- Added stream_large_files and host examples in static_file document
- Added methods to append and finish body content on Request (#1379)
- Integrated with .appveyor.yml for windows ci support
- Added documentation for AF_INET6 and AF_UNIX socket usage
- Adopt black/isort for codestyle
- Cancel task when connection_lost
- Simplify request ip and port retrieval logic
- Handle config error in load config file.
- Integrate with codecov for CI
- Add missed documentation for config section.
- Deprecate Handler.log
- Pinned httptools requirement to version 0.0.10+
- Fixes:
- Fix `remove_entity_headers` helper function (#1415)
- Fix TypeError when using Blueprint.group() to group blueprints with the default url_prefix; use os.path.normpath to avoid an invalid url_prefix like api//v1
f8a6af1 Rename the `http` module to `helpers` to prevent conflicts with the built-in Python http library (fixes #1323)
- Fix unittests on windows
- Fix Namespacing of sanic logger
- Fix missing quotes in decorator example
- Fix redirect with quoted param
- Fix doc for latest blueprint code
- Fix build of latex documentation relating to markdown lists
- Fix loop exception handling in app.py
- Fix content length mismatch in windows and other platform
- Fix Range header handling for static files (#1402)
- Fix the logger and make it work (#1397)
- Fix type pikcle->pickle in multiprocessing test
- Fix pickling blueprints: change the string passed in the "name" section of the namedtuples in Blueprint to match the name of the Blueprint module attribute. This allows blueprints to be pickled and unpickled without errors, which is a requirement for running Sanic in multiprocessing mode on Windows. Added a test for pickling and unpickling blueprints, a test for pickling and unpickling sanic itself, and a test for enabling multiprocessing on an app with a blueprint (only useful to catch this bug if the tests are run on Windows).
- Fix document for logging
Version 0.8
-----------
0.8.3
- Changes:
- Ownership changed to org 'huge-success'
0.8.0
- Changes:
- Add Server-Sent Events extension (Innokenty Lebedev)
- Graceful handling of request_handler_task cancellation (Ashley Sommer)
- Sanitize URL before redirection (aveao)
- Add url_bytes to request (johndoe46)
- py37 support for travisci (yunstanford)
- Auto reloader support for OSX (garyo)
- Add UUID route support (Volodymyr Maksymiv)
- Add pausable response streams (Ashley Sommer)
- Add weakref to request slots (vopankov)
- remove ubuntu 12.04 from test fixture due to deprecation (yunstanford)
- Allow streaming handlers in add_route (kinware)
- use travis_retry for tox (Raphael Deem)
- update aiohttp version for test client (yunstanford)
- add redirect import for clarity (yingshaoxo)
- Update HTTP Entity headers (Arnulfo Solís)
- Add register_listener method (Stephan Fitzpatrick)
- Remove uvloop/ujson dependencies for Windows (abuckenheimer)
- Content-length header on 204/304 responses (Arnulfo Solís)
- Extend WebSocketProtocol arguments and add docs (Bob Olde Hampsink, yunstanford)
- Update development status from pre-alpha to beta (Maksim Anisenkov)
- KeepAlive Timeout log level changed to debug (Arnulfo Solís)
- Pin pytest to 3.3.2 because of pytest-dev/pytest#3170 (Maksim Aniskenov)
- Install Python 3.5 and 3.6 on docker container for tests (Shahin Azad)
- Add support for blueprint groups and nesting (Elias Tarhini)
- Remove uvloop for windows setup (Aleksandr Kurlov)
- Auto Reload (Yaser Amari)
- Documentation updates/fixups (multiple contributors)
- Fixes:
- Fix: auto_reload in Linux (Ashley Sommer)
- Fix: broken tests for aiohttp >= 3.3.0 (Ashley Sommer)
- Fix: disable auto_reload by default on windows (abuckenheimer)
- Fix (1143): Turn off access log with gunicorn (hqy)
- Fix (1268): Support status code for file response (Cosmo Borsky)
- Fix (1266): Add content_type flag to Sanic.static (Cosmo Borsky)
- Fix: subprotocols parameter missing from add_websocket_route (ciscorn)
- Fix (1242): Responses for CI header (yunstanford)
- Fix (1237): add version constraint for websockets (yunstanford)
- Fix (1231): memory leak - always release resource (Phillip Xu)
- Fix (1221): make request truthy if transport exists (Raphael Deem)
- Fix failing tests for aiohttp>=3.1.0 (Ashley Sommer)
- Fix try_everything examples (PyManiacGR, kot83)
- Fix (1158): default to auto_reload in debug mode (Raphael Deem)
- Fix (1136): ErrorHandler.response handler call too restrictive (Julien Castiaux)
- Fix: raw requires bytes-like object (cloudship)
- Fix (1120): passing a list in to a route decorator's host arg (Timothy Ebiuwhe)
- Fix: Bug in multipart/form-data parser (DirkGuijt)
- Fix: Exception for missing parameter when value is null (NyanKiyoshi)
- Fix: Parameter check (Howie Hu)
- Fix (1089): Routing issue with named parameters and different methods (yunstanford)
- Fix (1085): Signal handling in multi-worker mode (yunstanford)
- Fix: single quote in readme.rst (Cosven)
- Fix: method typos (Dmitry Dygalo)
- Fix: log_response correct output for ip and port (Wibowo Arindrarto)
- Fix (1042): Exception Handling (Raphael Deem)
- Fix: Chinese URIs (Howie Hu)
- Fix (1079): timeout bug when self.transport is None (Raphael Deem)
- Fix (1074): fix strict_slashes when route has slash (Raphael Deem)
- Fix (1050): add samesite cookie to cookie keys (Raphael Deem)
- Fix (1065): allow add_task after server starts (Raphael Deem)
- Fix (1061): double quotes in unauthorized exception (Raphael Deem)
- Fix (1062): inject the app in add_task method (Raphael Deem)
- Fix: update environment.yml for readthedocs (Eli Uriegas)
- Fix: Cancel request task when response timeout is triggered (Jeong YunWon)
- Fix (1052): Method not allowed response for RFC7231 compliance (Raphael Deem)
- Fix: IPv6 Address and Socket Data Format (Dan Palmer)
Note: Changelog was unmaintained between 0.1 and 0.7
Version 0.1
-----------
- 0.1.7
- Reversed static url and directory arguments to meet spec
- 0.1.6
- Static files
- Lazy Cookie Loading
- 0.1.5
- Cookies
- Blueprint listeners and ordering
- Faster Router
- Fix: Incomplete file reads on medium+ sized post requests
- Breaking: after_start and before_stop now pass sanic as their first argument
- 0.1.4
- Multiprocessing
- 0.1.3
- Blueprint support
- Faster Response processing
- 0.1.1 - 0.1.2
- Struggling to update pypi via CI
- 0.1.0
- Released to public

View File

@@ -92,10 +92,27 @@ class ViewWithDecorator(HTTPMethodView):
    def get(self, request, name):
        return text('Hello I have a decorator')

    def post(self, request, name):
        return text("Hello I also have a decorator")

app.add_route(ViewWithDecorator.as_view(), '/url')
```
#### URL Building
But if you just want to decorate some functions and not all functions, you can do as follows:
```python
class ViewWithSomeDecorator(HTTPMethodView):

    @staticmethod
    @some_decorator_here
    def get(request, name):
        return text("Hello I have a decorator")

    def post(self, request, name):
        return text("Hello I don't have any decorators")
```
## URL Building
If you wish to build a URL for an HTTPMethodView, remember that the class name will be the endpoint
that you will pass into `url_for`. For example:

View File

@@ -29,9 +29,15 @@ In general the convention is to only have UPPERCASE configuration parameters. Th
There are several ways how to load configuration.
### From environment variables.
### From Environment Variables
Any variables defined with the `SANIC_` prefix will be applied to the sanic config. For example, setting `SANIC_REQUEST_TIMEOUT` will be loaded by the application automatically. You can pass the `load_env` boolean to the Sanic constructor to override that:
Any variables defined with the `SANIC_` prefix will be applied to the sanic config. For example, setting `SANIC_REQUEST_TIMEOUT` will be loaded by the application automatically and fed into the `REQUEST_TIMEOUT` config variable. You can pass a different prefix to Sanic:
```python
app = Sanic(load_env='MYAPP_')
```
Then the above variable would be `MYAPP_REQUEST_TIMEOUT`. If you want to disable loading from environment variables you can set it to `False` instead:
```python
app = Sanic(load_env=False)
@@ -79,8 +85,61 @@ DB_USER = 'appuser'
Out of the box there are just a few predefined values which can be overwritten when creating the application.
| Variable | Default | Description |
| ----------------- | --------- | --------------------------------- |
| REQUEST_MAX_SIZE | 100000000 | How big a request may be (bytes) |
| REQUEST_TIMEOUT | 60 | How long a request can take (sec) |
| KEEP_ALIVE | True | Disables keep-alive when False |
| Variable | Default | Description |
| ------------------------- | --------- | --------------------------------------------------------- |
| REQUEST_MAX_SIZE | 100000000 | How big a request may be (bytes) |
| REQUEST_BUFFER_QUEUE_SIZE | 100 | Request streaming buffer queue size |
| REQUEST_TIMEOUT | 60 | How long a request can take to arrive (sec) |
| RESPONSE_TIMEOUT | 60 | How long a response can take to process (sec) |
| KEEP_ALIVE | True | Disables keep-alive when False |
| KEEP_ALIVE_TIMEOUT | 5 | How long to hold a TCP connection open (sec) |
| GRACEFUL_SHUTDOWN_TIMEOUT | 15.0 | How long to wait to force close non-idle connection (sec) |
| ACCESS_LOG | True | Disable or enable access log |
### The different Timeout variables:
#### `REQUEST_TIMEOUT`
A request timeout measures the duration of time between the instant when a new open TCP connection is passed to the
Sanic backend server, and the instant when the whole HTTP request is received. If the time taken exceeds the
`REQUEST_TIMEOUT` value (in seconds), this is considered a Client Error so Sanic generates an `HTTP 408` response
and sends that to the client. Set this parameter's value higher if your clients routinely pass very large request payloads
or upload requests very slowly.
#### `RESPONSE_TIMEOUT`
A response timeout measures the duration of time between the instant the Sanic server passes the HTTP request to the
Sanic App, and the instant a HTTP response is sent to the client. If the time taken exceeds the `RESPONSE_TIMEOUT`
value (in seconds), this is considered a Server Error so Sanic generates an `HTTP 503` response and sends that to the
client. Set this parameter's value higher if your application is likely to have long-running processes that delay the
generation of a response.
#### `KEEP_ALIVE_TIMEOUT`
##### What is Keep Alive? And what does the Keep Alive Timeout value do?
`Keep-Alive` is an HTTP feature introduced in `HTTP 1.1`. When sending an HTTP request, the client (usually a web browser application)
can set a `Keep-Alive` header to tell the HTTP server (Sanic) not to close the TCP connection after it has sent the response.
This allows the client to reuse the existing TCP connection to send subsequent HTTP requests, and ensures more efficient
network traffic for both the client and the server.
The `KEEP_ALIVE` config variable is set to `True` in Sanic by default. If you don't need this feature in your application,
set it to `False` to cause all client connections to close immediately after a response is sent, regardless of
the `Keep-Alive` header on the request.
The amount of time the server holds the TCP connection open is decided by the server itself.
In Sanic, that value is configured using the `KEEP_ALIVE_TIMEOUT` value. By default, it is set to 5 seconds.
This is the same default setting as the Apache HTTP server and is a good balance between allowing enough time for
the client to send a new request, and not holding open too many connections at once. Do not exceed 75 seconds unless
you know your clients are using a browser which supports TCP connections held open for that long.
For reference:
```
Apache httpd server default keepalive timeout = 5 seconds
Nginx server default keepalive timeout = 75 seconds
Nginx performance tuning guidelines uses keepalive = 15 seconds
IE (5-9) client hard keepalive limit = 60 seconds
Firefox client hard keepalive limit = 115 seconds
Opera 11 client hard keepalive limit = 120 seconds
Chrome 13+ client keepalive limit > 300+ seconds
```
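All of the timeout values above are ordinary config entries, so they can be tuned on `app.config` before the server starts. The numbers below are only illustrative:
```python
from sanic import Sanic

app = Sanic(__name__)

# illustrative values; tune them for your own workload
app.config.REQUEST_TIMEOUT = 120    # allow slow clients / large uploads
app.config.RESPONSE_TIMEOUT = 120   # allow longer-running handlers
app.config.KEEP_ALIVE_TIMEOUT = 10  # hold idle connections a bit longer
```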

View File

@@ -1,62 +0,0 @@
# Contributing
Thank you for your interest! Sanic is always looking for contributors. If you
don't feel comfortable contributing code, adding docstrings to the source files
is very appreciated.
## Installation
To develop on sanic (and mainly to just run the tests) it is highly recommended to
install from sources.
So assume you have already cloned the repo and are in the working directory with
a virtual environment already set up, then run:
```bash
python setup.py develop && pip install -r requirements-dev.txt
```
## Running tests
To run the tests for sanic it is recommended to use tox like so:
```bash
tox
```
See it's that simple!
## Pull requests!
So the pull request approval rules are pretty simple:
1. All pull requests must pass unit tests
* All pull requests must be reviewed and approved by at least
one current collaborator on the project
* All pull requests must pass flake8 checks
* If you decide to remove/change anything from any common interface
a deprecation message should accompany it.
* If you implement a new feature you should have at least one unit
test to accompany it.
## Documentation
Sanic's documentation is built
using [sphinx](http://www.sphinx-doc.org/en/1.5.1/). Guides are written in
Markdown and can be found in the `docs` folder, while the module reference is
automatically generated using `sphinx-apidoc`.
To generate the documentation from scratch:
```bash
sphinx-apidoc -fo docs/_api/ sanic
sphinx-build -b html docs docs/_build
```
The HTML documentation will be created in the `docs/_build` folder.
## Warning
One of the main goals of Sanic is speed. Code that lowers the performance of
Sanic without significant gains in usability, security, or features may not be
merged. Please don't let this intimidate you! If you have any concerns about an
idea, open an issue for discussion and help.

View File

@@ -0,0 +1,89 @@
Contributing
============
Thank you for your interest! Sanic is always looking for contributors.
If you don't feel comfortable contributing code, adding docstrings to
the source files is very appreciated.
Installation
------------
To develop on sanic (and mainly to just run the tests) it is highly
recommended to install from sources.
So assume you have already cloned the repo and are in the working
directory with a virtual environment already set up, then run:
.. code:: bash
    pip3 install -e '.[dev]'
Dependency Changes
------------------
``Sanic`` doesn't use ``requirements*.txt`` files to manage its dependencies; instead, all dependencies are managed inside the
``setup.py`` file in order to simplify the effort of managing them. Please make sure you have read and understood the following
section of the document, which explains the way ``sanic`` manages dependencies inside the ``setup.py`` file.
+------------------------+-----------------------------------------------+--------------------------------+
| Dependency Type | Usage | Installation |
+========================+===============================================+================================+
| requirements | Bare minimum dependencies required for sanic | ``pip3 install -e .`` |
| | to function | |
+------------------------+-----------------------------------------------+--------------------------------+
| tests_require / | Dependencies required to run the Unit Tests | ``pip3 install -e '.[test]'`` |
| extras_require['test'] | for ``sanic`` | |
+------------------------+-----------------------------------------------+--------------------------------+
| extras_require['dev'] | Additional Development requirements to add | ``pip3 install -e '.[dev]'`` |
| | for contributing | |
+------------------------+-----------------------------------------------+--------------------------------+
| extras_require['docs'] | Dependencies required to enable building and | ``pip3 install -e '.[docs]'`` |
| | enhancing sanic documentation | |
+------------------------+-----------------------------------------------+--------------------------------+
Running tests
-------------
To run the tests for sanic it is recommended to use tox like so:
.. code:: bash
    tox
See, it's that simple!
Pull requests!
--------------
So the pull request approval rules are pretty simple:
* All pull requests must pass unit tests
* All pull requests must be reviewed and approved by at least one current collaborator on the project
* All pull requests must pass flake8 checks
* If you decide to remove/change anything from any common interface a deprecation message should accompany it.
* If you implement a new feature you should have at least one unit test to accompany it.
Documentation
-------------
Sanic's documentation is built using `sphinx`_. Guides are written in
Markdown and can be found in the ``docs`` folder, while the module
reference is automatically generated using ``sphinx-apidoc``.
To generate the documentation from scratch:
.. code:: bash
    sphinx-apidoc -fo docs/_api/ sanic
    sphinx-build -b html docs docs/_build
The HTML documentation will be created in the ``docs/_build`` folder.
.. warning::
    One of the main goals of Sanic is speed. Code that lowers the
    performance of Sanic without significant gains in usability, security,
    or features may not be merged. Please don't let this intimidate you! If
    you have any concerns about an idea, open an issue for discussion and
    help.
.. _sphinx: http://www.sphinx-doc.org/en/1.5.1/

View File

@@ -1,72 +0,0 @@
# Custom Protocols
*Note: this is advanced usage, and most readers will not need such functionality.*
You can change the behavior of Sanic's protocol by specifying a custom
protocol, which should be a subclass
of
[asyncio.protocol](https://docs.python.org/3/library/asyncio-protocol.html#protocol-classes).
This protocol can then be passed as the keyword argument `protocol` to the `sanic.run` method.
The constructor of the custom protocol class receives the following keyword
arguments from Sanic.
- `loop`: an `asyncio`-compatible event loop.
- `connections`: a `set` to store protocol objects. When Sanic receives
`SIGINT` or `SIGTERM`, it executes `protocol.close_if_idle` for all protocol
objects stored in this set.
- `signal`: a `sanic.server.Signal` object with the `stopped` attribute. When
Sanic receives `SIGINT` or `SIGTERM`, `signal.stopped` is assigned `True`.
- `request_handler`: a coroutine that takes a `sanic.request.Request` object
and a `response` callback as arguments.
- `error_handler`: a `sanic.exceptions.Handler` which is called when exceptions
are raised.
- `request_timeout`: the number of seconds before a request times out.
- `request_max_size`: an integer specifying the maximum size of a request, in bytes.
## Example
An error occurs in the default protocol if a handler function does not return
an `HTTPResponse` object.
By overriding the `write_response` protocol method, if a handler returns a
string it will be converted to an `HTTPResponse` object.
```python
from sanic import Sanic
from sanic.server import HttpProtocol
from sanic.response import text
app = Sanic(__name__)
class CustomHttpProtocol(HttpProtocol):
def __init__(self, *, loop, request_handler, error_handler,
signal, connections, request_timeout, request_max_size):
super().__init__(
loop=loop, request_handler=request_handler,
error_handler=error_handler, signal=signal,
connections=connections, request_timeout=request_timeout,
request_max_size=request_max_size)
def write_response(self, response):
if isinstance(response, str):
response = text(response)
self.transport.write(
response.output(self.request.version)
)
self.transport.close()
@app.route('/')
async def string(request):
return 'string'
@app.route('/1')
async def response(request):
return text('response')
app.run(host='0.0.0.0', port=8000, protocol=CustomHttpProtocol)
```

View File

@@ -0,0 +1,76 @@
Custom Protocols
================
.. note::
    This is advanced usage, and most readers will not need such functionality.
You can change the behavior of Sanic's protocol by specifying a custom
protocol, which should be a subclass
of `asyncio.protocol <https://docs.python.org/3/library/asyncio-protocol.html#protocol-classes>`_.
This protocol can then be passed as the keyword argument ``protocol`` to the ``sanic.run`` method.
The constructor of the custom protocol class receives the following keyword
arguments from Sanic.
- ``loop``: an ``asyncio``-compatible event loop.
- ``connections``: a ``set`` to store protocol objects. When Sanic receives
``SIGINT`` or ``SIGTERM``, it executes ``protocol.close_if_idle`` for all protocol
objects stored in this set.
- ``signal``: a ``sanic.server.Signal`` object with the ``stopped`` attribute. When
Sanic receives ``SIGINT`` or ``SIGTERM``, ``signal.stopped`` is assigned ``True``.
- ``request_handler``: a coroutine that takes a ``sanic.request.Request`` object
and a ``response`` callback as arguments.
- ``error_handler``: a ``sanic.exceptions.Handler`` which is called when exceptions
are raised.
- ``request_timeout``: the number of seconds before a request times out.
- ``request_max_size``: an integer specifying the maximum size of a request, in bytes.
Example
-------
An error occurs in the default protocol if a handler function does not return
an ``HTTPResponse`` object.
By overriding the ``write_response`` protocol method, if a handler returns a
string it will be converted to an ``HTTPResponse`` object.
.. code:: python
    from sanic import Sanic
    from sanic.server import HttpProtocol
    from sanic.response import text

    app = Sanic(__name__)


    class CustomHttpProtocol(HttpProtocol):
        def __init__(self, *, loop, request_handler, error_handler,
                     signal, connections, request_timeout, request_max_size):
            super().__init__(
                loop=loop, request_handler=request_handler,
                error_handler=error_handler, signal=signal,
                connections=connections, request_timeout=request_timeout,
                request_max_size=request_max_size)

        def write_response(self, response):
            if isinstance(response, str):
                response = text(response)
            self.transport.write(
                response.output(self.request.version)
            )
            self.transport.close()


    @app.route('/')
    async def string(request):
        return 'string'


    @app.route('/1')
    async def response(request):
        return text('response')

    app.run(host='0.0.0.0', port=8000, protocol=CustomHttpProtocol)

53
docs/sanic/debug_mode.rst Normal file
View File

@@ -0,0 +1,53 @@
Debug Mode
=============
When enabling Sanic's debug mode, Sanic will provide a more verbose logging output
and by default will enable the Auto Reload feature.
.. warning::

    Sanic's debug mode will slow down the server's performance
    and it is therefore advised to enable it only in development environments.
Setting the debug mode
----------------------
Setting the ``debug`` mode will produce a more verbose output from Sanic
and will activate the Automatic Reloader.
.. code-block:: python
    from sanic import Sanic
    from sanic.response import json

    app = Sanic()

    @app.route('/')
    async def hello_world(request):
        return json({"hello": "world"})

    if __name__ == '__main__':
        app.run(host="0.0.0.0", port=8000, debug=True)
Manually setting auto reload
----------------------------
Sanic offers a way to enable or disable the Automatic Reloader manually,
the ``auto_reload`` argument will activate or deactivate the Automatic Reloader.
.. code-block:: python
    from sanic import Sanic
    from sanic.response import json

    app = Sanic()

    @app.route('/')
    async def hello_world(request):
        return json({"hello": "world"})

    if __name__ == '__main__':
        app.run(host="0.0.0.0", port=8000, auto_reload=True)

View File

@@ -34,6 +34,6 @@ def authorized():
@app.route("/")
@authorized()
async def test(request):
return json({status: 'authorized'})
return json({'status': 'authorized'})
```

View File

@@ -15,6 +15,7 @@ keyword arguments:
- `protocol` *(default `HttpProtocol`)*: Subclass
of
[asyncio.protocol](https://docs.python.org/3/library/asyncio-protocol.html#protocol-classes).
- `access_log` *(default `True`)*: Enables logging of handled requests (significantly slows the server).
## Workers
@@ -63,6 +64,26 @@ of the memory leak.
See the [Gunicorn Docs](http://docs.gunicorn.org/en/latest/settings.html#max-requests) for more information.
## Disable debug logging
To improve performance, add `debug=False` and `access_log=False` to the `run` arguments.
```python
app.run(host='0.0.0.0', port=1337, workers=4, debug=False, access_log=False)
```
When running via Gunicorn, you can set the environment variable `SANIC_ACCESS_LOG="False"`:
```
env SANIC_ACCESS_LOG="False" gunicorn myapp:app --bind 0.0.0.0:1337 --worker-class sanic.worker.GunicornWorker --log-level warning
```
Or you can override the app config directly:
```python
app.config.ACCESS_LOG = False
```
## Asynchronous support
This is suitable if you *need* to share the sanic process with other applications, in particular the `loop`.
However be advised that this method does not support using multiple processes, and is not the preferred way

167
docs/sanic/examples.rst Normal file
View File

@@ -0,0 +1,167 @@
Examples
========
This section of the documentation is a simple collection of example code that can help you get a quick start
on your application development. Most of these examples are categorized and provide you with a link to the
working code example in the `Sanic Repository <https://github.com/huge-success/sanic/tree/master/examples>`_.
Basic Examples
--------------
This section is a collection of code samples that provide simple, use-case driven examples of a sanic application.
Simple Apps
~~~~~~~~~~~~
A simple sanic application with a single ``async`` method with ``text`` and ``json`` type response.
.. literalinclude:: ../../examples/teapot.py
.. literalinclude:: ../../examples/simple_server.py
Simple App with ``Sanic Views``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Showcasing the simple mechanism of using :class:`sanic.views.HTTPMethodView` as well as a way to extend it
to provide custom ``async`` behavior for a ``view``.
.. literalinclude:: ../../examples/simple_async_view.py
URL Redirect
~~~~~~~~~~~~
.. literalinclude:: ../../examples/redirect_example.py
Named URL redirection
~~~~~~~~~~~~~~~~~~~~~
``Sanic`` provides an easy to use way of redirecting requests via a helper method called ``url_for`` that takes a
unique URL name as an argument and returns the actual route assigned to it. This helps simplify the
effort required in redirecting the user between different sections of the application.
.. literalinclude:: ../../examples/url_for_example.py
Blueprints
~~~~~~~~~~
``Sanic`` provides an amazing feature that lets you group your APIs and routes under a logical collection that can easily be
imported and plugged into any of your sanic applications, called ``blueprints``.
.. literalinclude:: ../../examples/blueprints.py
Logging Enhancements
~~~~~~~~~~~~~~~~~~~~
Even though ``Sanic`` comes with a battery of logging support, it allows end users to customize the way logging
is handled at application runtime.
.. literalinclude:: ../../examples/override_logging.py
The following sample provides an example code that demonstrates the usage of :func:`sanic.app.Sanic.middleware` in order
to provide a mechanism to assign a unique request ID for each of the incoming requests and log them via
`aiotask-context <https://github.com/Skyscanner/aiotask-context>`_.
.. literalinclude:: ../../examples/log_request_id.py
Sanic Streaming Support
~~~~~~~~~~~~~~~~~~~~~~~
The ``Sanic`` framework comes with in-built support for streaming large files, and the following code explains the process
of setting up a ``Sanic`` application with streaming support.
.. literalinclude:: ../../examples/request_stream/server.py
A sample client app that shows how client code consumes the streaming application.
.. literalinclude:: ../../examples/request_stream/client.py
Sanic Concurrency Support
~~~~~~~~~~~~~~~~~~~~~~~~~
``Sanic`` supports starting an app with multiple workers. However, it's important to be able to limit
the concurrency per process/loop in order to ensure efficient execution. The following section of the code provides a
brief example of how to limit the concurrency with the help of :class:`asyncio.Semaphore`.
.. literalinclude:: ../../examples/limit_concurrency.py
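In addition to the included example, here is a minimal sketch of the idea; the handler, the limit of 10 and the ``sleep`` stand-in are illustrative assumptions, not part of the shipped example:

.. code:: python

    import asyncio

    from sanic import Sanic
    from sanic.response import json

    app = Sanic(__name__)

    @app.listener("before_server_start")
    async def setup_semaphore(app, loop):
        # create the semaphore on the server's own loop
        app.sem = asyncio.Semaphore(10)

    @app.route("/")
    async def limited(request):
        async with request.app.sem:
            # at most 10 requests per process reach this block at once
            await asyncio.sleep(1)  # stand-in for an expensive call
        return json({"ok": True})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8000)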
Sanic Deployment via Docker
~~~~~~~~~~~~~~~~~~~~~~~~~~~
Deploying a ``sanic`` app via ``docker`` and ``docker-compose`` is an easy task to achieve, and the following example
provides a deployment of the sample ``simple_server.py``.
.. literalinclude:: ../../examples/Dockerfile
.. literalinclude:: ../../examples/docker-compose.yml
Monitoring and Error Handling
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``Sanic`` provides an extendable bare minimum implementation of a global exception handler via
:class:`sanic.handlers.ErrorHandler`. This example shows how to extend it to enable some custom behaviors.
.. literalinclude:: ../../examples/exception_monitoring.py
Monitoring using external Service Providers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
* `LogDNA <https://logdna.com/>`_
.. literalinclude:: ../../examples/logdna_example.py
* `RayGun <https://raygun.com/>`_
.. literalinclude:: ../../examples/raygun_example.py
* `Rollbar <https://rollbar.com>`_
.. literalinclude:: ../../examples/rollbar_example.py
* `Sentry <http://sentry.io>`_
.. literalinclude:: ../../examples/sentry_example.py
Security
~~~~~~~~
The following sample code shows a simple decorator based authentication and authorization mechanism that can be setup
to secure your ``sanic`` api endpoints.
.. literalinclude:: ../../examples/authorized_sanic.py
Sanic Websocket
~~~~~~~~~~~~~~~
``Sanic`` provides the ability to easily add a route and map it to a ``websocket`` handler.
.. literalinclude:: ../../examples/websocket.html
.. literalinclude:: ../../examples/websocket.py
vhost Support
~~~~~~~~~~~~~~
.. literalinclude:: ../../examples/vhosts.py
Unit Testing With Parallel Test Run Support
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The following example shows you how to get up and running with unit testing a ``sanic`` application with parallel test
execution support provided by the ``pytest-xdist`` plugin.
.. literalinclude:: ../../examples/pytest_xdist.py
Amending Request Object
~~~~~~~~~~~~~~~~~~~~~~~
The ``request`` object in ``Sanic`` is a kind of ``dict`` object; this means that the ``request`` object can be manipulated as a regular ``dict`` object.
.. literalinclude:: ../../examples/amending_request_object.py
For more examples and useful samples please visit the `Huge-Sanic's GitHub Page <https://github.com/huge-success/sanic/tree/master/examples>`_

View File

@@ -13,7 +13,7 @@ To throw an exception, simply `raise` the relevant exception from the
from sanic.exceptions import ServerError
@app.route('/killme')
def i_am_ready_to_die(request):
async def i_am_ready_to_die(request):
raise ServerError("Something bad happened", status_code=500)
```
@@ -24,7 +24,7 @@ from sanic.exceptions import abort
from sanic.response import text
@app.route('/youshallnotpass')
def no_no(request):
async def no_no(request):
abort(401)
# this won't happen
text("OK")
@@ -43,10 +43,40 @@ from sanic.response import text
from sanic.exceptions import NotFound
@app.exception(NotFound)
def ignore_404s(request, exception):
async def ignore_404s(request, exception):
return text("Yep, I totally found the page: {}".format(request.url))
```
You can also add an exception handler as such:
```python
from sanic import Sanic
from sanic.response import text

async def server_error_handler(request, exception):
    return text("Oops, server error", status=500)

app = Sanic()
app.error_handler.add(Exception, server_error_handler)
```
In some cases, you might want to add some more error handling
functionality to what is provided by default. In that case, you
can subclass Sanic's default error handler as such:
```python
from sanic import Sanic
from sanic.handlers import ErrorHandler

class CustomErrorHandler(ErrorHandler):
    def default(self, request, exception):
        ''' handles errors that have no error handlers assigned '''
        # Your custom error handling logic...
        return super().default(request, exception)

app = Sanic()
app.error_handler = CustomErrorHandler()
```
## Useful exceptions
Some of the most useful exceptions are presented below:

View File

@@ -2,26 +2,74 @@
A list of Sanic extensions created by the community.
- [Sessions](https://github.com/subyraman/sanic_session): Support for sessions.
Allows using redis, memcache or an in memory store.
## Extension and Plugin Development
- [Sanic-Plugins-Framework](https://github.com/ashleysommer/sanicpluginsframework): Library for easily creating and using Sanic plugins.
- [sanic-script](https://github.com/tim2anna/sanic-script): An extension for Sanic that adds support for writing commands to your application.
## Security
- [Sanic JWT](https://github.com/ahopkins/sanic-jwt): Authentication, JWT, and permission scoping for Sanic.
- [Secure](https://github.com/cakinney/secure): Secure 🔒 is a lightweight package that adds optional security headers and cookie attributes for Python web frameworks.
- [Sessions](https://github.com/subyraman/sanic_session): Support for sessions. Allows using redis, memcache or an in memory store.
- [CORS](https://github.com/ashleysommer/sanic-cors): A port of flask-cors.
- [Compress](https://github.com/subyraman/sanic_compress): Allows you to easily gzip Sanic responses. A port of Flask-Compress.
- [Jinja2](https://github.com/lixxu/sanic-jinja2): Support for Jinja2 template.
- [OpenAPI/Swagger](https://github.com/channelcat/sanic-openapi): OpenAPI support, plus a Swagger UI.
- [Pagination](https://github.com/lixxu/python-paginate): Simple pagination support.
- [Motor](https://github.com/lixxu/sanic-motor): Simple motor wrapper.
- [Sanic CRUD](https://github.com/Typhon66/sanic_crud): CRUD REST API generation with peewee models.
- [Sanic-JWT-Extended](https://github.com/devArtoria/Sanic-JWT-Extended): Provides extended JWT support for Sanic.
- [UserAgent](https://github.com/lixxu/sanic-useragent): Add `user_agent` to request
- [Limiter](https://github.com/bohea/sanic-limiter): Rate limiting for sanic.
- [Sanic EnvConfig](https://github.com/jamesstidard/sanic-envconfig): Pull environment variables into your sanic config.
- [Babel](https://github.com/lixxu/sanic-babel): Adds i18n/l10n support to Sanic applications with the help of the
`Babel` library
- [Dispatch](https://github.com/ashleysommer/sanic-dispatcher): A dispatcher inspired by `DispatcherMiddleware` in werkzeug. Can act as a Sanic-to-WSGI adapter.
- [Sanic-OAuth](https://github.com/Sniedes722/Sanic-OAuth): OAuth Library for connecting to & creating your own token providers.
- [Sanic-nginx-docker-example](https://github.com/itielshwartz/sanic-nginx-docker-example): Simple and easy to use example of Sanic behind nginx using docker-compose.
- [sanic-graphql](https://github.com/graphql-python/sanic-graphql): GraphQL integration with Sanic
- [sanic-prometheus](https://github.com/dkruchinin/sanic-prometheus): Prometheus metrics for Sanic
- [sanic-oauth](https://gitlab.com/SirEdvin/sanic-oauth): OAuth Library with many provider and OAuth1/OAuth2 support.
- [Sanic-Auth](https://github.com/pyx/sanic-auth): A minimal backend agnostic session-based user authentication mechanism for Sanic.
- [Sanic-CookieSession](https://github.com/pyx/sanic-cookiesession): A client-side only, cookie-based session, similar to the built-in session in Flask.
## Documentation
- [OpenAPI/Swagger](https://github.com/channelcat/sanic-openapi): OpenAPI support, plus a Swagger UI.
- [Sanic-RestPlus](https://github.com/ashleysommer/sanic-restplus): A port of Flask-RestPlus for Sanic. Full-featured REST API with SwaggerUI generation.
- [sanic-transmute](https://github.com/yunstanford/sanic-transmute): A Sanic extension that generates APIs from python function and classes, and also generates Swagger UI/documentation automatically.
## ORM and Database Integration
- [Motor](https://github.com/lixxu/sanic-motor): Simple motor wrapper.
- [Sanic CRUD](https://github.com/Typhon66/sanic_crud): CRUD REST API generation with peewee models.
- [sanic-graphql](https://github.com/graphql-python/sanic-graphql): GraphQL integration with Sanic
- [GINO](https://github.com/fantix/gino): An asyncio ORM on top of SQLAlchemy core, delivered with a Sanic extension. ([Documentation](https://python-gino.readthedocs.io/))
- [Databases](https://github.com/encode/databases): Async database access for SQLAlchemy core, with support for PostgreSQL, MySQL, and SQLite.
## Unit Testing
- [pytest-sanic](https://github.com/yunstanford/pytest-sanic): A pytest plugin for Sanic. It helps you to test your code asynchronously.
- [jinja2-sanic](https://github.com/yunstanford/jinja2-sanic): a jinja2 template renderer for Sanic.([Documentation](http://jinja2-sanic.readthedocs.io/en/latest/))
## Project Creation Template
- [cookiecutter-sanic](https://github.com/harshanarayana/cookiecutter-sanic): Get your sanic application up and running in a matter of seconds in a well-defined project structure.
Batteries included for deployment, unit testing, automated release management and changelog generation.
## Templating
- [Sanic-WTF](https://github.com/pyx/sanic-wtf): Sanic-WTF makes using WTForms with Sanic and CSRF (Cross-Site Request Forgery) protection a little bit easier.
- [Jinja2](https://github.com/lixxu/sanic-jinja2): Support for Jinja2 template.
- [jinja2-sanic](https://github.com/yunstanford/jinja2-sanic): a jinja2 template renderer for Sanic.([Documentation](http://jinja2-sanic.readthedocs.io/en/latest/))
## API Helper Utilities
- [sanic-sse](https://github.com/inn0kenty/sanic_sse): [Server-Sent Events](https://en.wikipedia.org/wiki/Server-sent_events) implementation for Sanic.
- [Compress](https://github.com/subyraman/sanic_compress): Allows you to easily gzip Sanic responses. A port of Flask-Compress.
- [Pagination](https://github.com/lixxu/python-paginate): Simple pagination support.
- [Sanic EnvConfig](https://github.com/jamesstidard/sanic-envconfig): Pull environment variables into your sanic config.
## i18n/l10n Support
- [Babel](https://github.com/lixxu/sanic-babel): Adds i18n/l10n support to Sanic applications with the help of the `Babel` library
## Custom Middlewares
- [Dispatch](https://github.com/ashleysommer/sanic-dispatcher): A dispatcher inspired by `DispatcherMiddleware` in werkzeug. Can act as a Sanic-to-WSGI adapter.
## Monitoring and Reporting
- [sanic-prometheus](https://github.com/dkruchinin/sanic-prometheus): Prometheus metrics for Sanic
- [sanic-zipkin](https://github.com/kevinqqnj/sanic-zipkin): Easily report request/function/RPC traces to zipkin/jaeger, through aiozipkin.
## Sample Applications
- [Sanic-nginx-docker-example](https://github.com/itielshwartz/sanic-nginx-docker-example): Simple and easy to use example of Sanic behind nginx using docker-compose.

View File

@@ -4,24 +4,45 @@ Make sure you have both [pip](https://pip.pypa.io/en/stable/installing/) and at
least version 3.5 of Python before starting. Sanic uses the new `async`/`await`
syntax, so earlier versions of python won't work.
1. Install Sanic: `python3 -m pip install sanic`
2. Create a file called `main.py` with the following code:
## 1. Install Sanic
```
pip3 install sanic
```
To install sanic without `uvloop` or `ujson` using bash, you can provide either or both of the `SANIC_NO_UVLOOP` and
`SANIC_NO_UJSON` environment variables. Setting one of them to any truthy string like `'y', 'yes', 't', 'true', 'on', '1'`
will skip that feature's installation.
```bash
SANIC_NO_UVLOOP=true SANIC_NO_UJSON=true pip3 install sanic
```
## 2. Create a file called `main.py`
```python
from sanic import Sanic
from sanic.response import text
from sanic.response import json
app = Sanic(__name__)
app = Sanic()
@app.route("/")
async def test(request):
return text('Hello world!')
return json({"hello": "world"})
app.run(host="0.0.0.0", port=8000, debug=True)
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)
```
3. Run the server: `python3 main.py`
4. Open the address `http://0.0.0.0:8000` in your web browser. You should see
the message *Hello world!*.
## 3. Run the server
```
python3 main.py
```
## 4. Check your browser
Open the address `http://0.0.0.0:8000` in your web browser. You should see
the message *Hello world!*.
You now have a working Sanic server!

View File

@@ -1,119 +0,0 @@
# Logging
Sanic allows you to do different types of logging (access log, error log) on the requests based on the [python3 logging API](https://docs.python.org/3/howto/logging.html). You should have some basic knowledge on python3 logging if you want to create a new configuration.
### Quick Start
A simple example using default settings would be like this:
```python
from sanic import Sanic
from sanic.config import LOGGING
# The default logging handlers are ['accessStream', 'errorStream']
# but we change it to use other handlers here for demo purpose
LOGGING['loggers']['network']['handlers'] = [
'accessSysLog', 'errorSysLog']
app = Sanic('test')
@app.route('/')
async def test(request):
return response.text('Hello World!')
if __name__ == "__main__":
app.run(log_config=LOGGING)
```
And to close logging, simply assign log_config=None:
```python
if __name__ == "__main__":
app.run(log_config=None)
```
This would skip calling logging functions when handling requests.
And you could even do further in production to gain extra speed:
```python
if __name__ == "__main__":
# disable internal messages
app.run(debug=False, log_config=None)
```
### Configuration
By default, log_config parameter is set to use sanic.config.LOGGING dictionary for configuration. The default configuration provides several predefined `handlers`:
- internal (using [logging.StreamHandler](https://docs.python.org/3/library/logging.handlers.html#logging.StreamHandler))<br>
For internal information console outputs.
- accessStream (using [logging.StreamHandler](https://docs.python.org/3/library/logging.handlers.html#logging.StreamHandler))<br>
For requests information logging in console
- errorStream (using [logging.StreamHandler](https://docs.python.org/3/library/logging.handlers.html#logging.StreamHandler))<br>
For error message and traceback logging in console.
- accessSysLog (using [logging.handlers.SysLogHandler](https://docs.python.org/3/library/logging.handlers.html#logging.handlers.SysLogHandler))<br>
For requests information logging to syslog.
Currently supports Windows (via localhost:514), Darwin (/var/run/syslog),
Linux (/dev/log) and FreeBSD (/dev/log).<br>
You would not be able to access this property if the directory doesn't exist.
(Notice that in Docker you have to enable everything by yourself)
- errorSysLog (using [logging.handlers.SysLogHandler](https://docs.python.org/3/library/logging.handlers.html#logging.handlers.SysLogHandler))<br>
For error message and traceback logging to syslog.
Currently supports Windows (via localhost:514), Darwin (/var/run/syslog),
Linux (/dev/log) and FreeBSD (/dev/log).<br>
You would not be able to access this property if the directory doesn't exist.
(Notice that in Docker you have to enable everything by yourself)
And `filters`:
- accessFilter (using sanic.log.DefaultFilter)<br>
The filter that allows only levels in `DEBUG`, `INFO`, and `NONE(0)`
- errorFilter (using sanic.log.DefaultFilter)<br>
The filter that allows only levels in `WARNING`, `ERROR`, and `CRITICAL`
There are two `loggers` used in sanic, and **must be defined if you want to create your own logging configuration**:
- sanic:<br>
Used to log internal messages.
- network:<br>
Used to log requests from network, and any information from those requests.
#### Log format:
In addition to default parameters provided by python (asctime, levelname, message),
Sanic provides additional parameters for network logger with accessFilter:
- host (str)<br>
request.ip
- request (str)<br>
request.method + " " + request.url
- status (int)<br>
response.status
- byte (int)<br>
len(response.body)
The default access log format is
```python
%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: %(request)s %(message)s %(status)d %(byte)d
```

103
docs/sanic/logging.rst Normal file
View File

@@ -0,0 +1,103 @@
Logging
=======
Sanic allows you to do different types of logging (access log, error
log) on the requests based on the `python3 logging API`_. You should
have some basic knowledge on python3 logging if you want to create a new
configuration.
Quick Start
~~~~~~~~~~~
A simple example using default settings would be like this:
.. code:: python
    from sanic import Sanic
    from sanic.log import logger
    from sanic.response import text

    app = Sanic('test')

    @app.route('/')
    async def test(request):
        logger.info('Here is your log')
        return text('Hello World!')

    if __name__ == "__main__":
        app.run(debug=True, access_log=True)
After the server is running, you will see some log messages like:
::
    [2018-11-06 21:16:53 +0800] [24622] [INFO] Goin' Fast @ http://127.0.0.1:8000
    [2018-11-06 21:16:53 +0800] [24667] [INFO] Starting worker [24667]
You can send a request to the server and it will print the log messages:
::
    [2018-11-06 21:18:53 +0800] [25685] [INFO] Here is your log
    [2018-11-06 21:18:53 +0800] - (sanic.access)[INFO][127.0.0.1:57038]: GET http://localhost:8000/ 200 12
To use your own logging config, simply use
``logging.config.dictConfig``, or pass ``log_config`` when you
initialize ``Sanic`` app:
.. code:: python
    app = Sanic('test', log_config=LOGGING_CONFIG)
And to close logging, simply assign access_log=False:
.. code:: python
    if __name__ == "__main__":
        app.run(access_log=False)
This would skip calling logging functions when handling requests. And
you could even do further in production to gain extra speed:
.. code:: python
    if __name__ == "__main__":
        # disable debug messages
        app.run(debug=False, access_log=False)
Configuration
~~~~~~~~~~~~~
By default, ``log_config`` parameter is set to use
``sanic.log.LOGGING_CONFIG_DEFAULTS`` dictionary for configuration.
There are three ``loggers`` used in sanic, and **must be defined if you
want to create your own logging configuration**:
================ ==============================
Logger Name Usecase
================ ==============================
``sanic.root`` Used to log internal messages.
``sanic.error`` Used to log error logs.
``sanic.access`` Used to log access logs.
================ ==============================
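As a minimal sketch of a custom configuration (the ``MY_LOGGING`` name, the console handler and the format string below are illustrative assumptions, not Sanic defaults), a dict passed as ``log_config`` should define all three loggers:

.. code:: python

    from sanic import Sanic

    MY_LOGGING = dict(
        version=1,
        disable_existing_loggers=False,
        formatters={
            "simple": {
                "format": "%(asctime)s [%(levelname)s] %(name)s: %(message)s"
            },
        },
        handlers={
            "console": {
                "class": "logging.StreamHandler",
                "formatter": "simple",
            },
        },
        loggers={
            "sanic.root": {"level": "INFO", "handlers": ["console"]},
            "sanic.error": {"level": "INFO", "handlers": ["console"]},
            "sanic.access": {"level": "INFO", "handlers": ["console"]},
        },
    )

    app = Sanic('test', log_config=MY_LOGGING)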
Log format:
^^^^^^^^^^^
In addition to default parameters provided by python (``asctime``,
``levelname``, ``message``), Sanic provides additional parameters for
access logger with:
===================== ========================================== ========
Log Context Parameter Parameter Value Datatype
===================== ========================================== ========
``host`` ``request.ip`` str
``request`` ``request.method`` + " " + ``request.url`` str
``status`` ``response.status`` int
``byte`` ``len(response.body)`` int
===================== ========================================== ========
The default access log format is ``%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: %(request)s %(message)s %(status)d %(byte)d``
.. _python3 logging API: https://docs.python.org/3/howto/logging.html

View File

@@ -4,19 +4,20 @@ Middleware are functions which are executed before or after requests to the
server. They can be used to modify the *request to* or *response from*
user-defined handler functions.
Additionally, Sanic providers listeners which allow you to run code at various points of your application's lifecycle.
Additionally, Sanic provides listeners which allow you to run code at various points of your application's lifecycle.
## Middleware
There are two types of middleware: request and response. Both are declared
using the `@app.middleware` decorator, with the decorator's parameter being a
string representing its type: `'request'` or `'response'`. Response middleware
receives both the request and the response as arguments.
string representing its type: `'request'` or `'response'`.
* Request middleware receives only the `request` as argument.
* Response middleware receives both the `request` and `response`.
The simplest middleware doesn't modify the request or response at all:
```python
```
@app.middleware('request')
async def print_on_request(request):
    print("I print when a request is received by the server")
@@ -32,23 +33,34 @@ Middleware can modify the request or response parameter it is given, *as long
as it does not return it*. The following example shows a practical use-case for
this.
```python
```
app = Sanic(__name__)

@app.middleware('request')
async def add_key(request):
    # Add a key to request object like dict object
    request['foo'] = 'bar'

@app.middleware('response')
async def custom_banner(request, response):
    response.headers["Server"] = "Fake-Server"

@app.middleware('response')
async def prevent_xss(request, response):
    response.headers["x-xss-protection"] = "1; mode=block"

app.run(host="0.0.0.0", port=8000)
```
The above code will apply the two middleware in order. First, the middleware
The above code will apply the three middleware in order. The first middleware
**add_key** will add a new key `foo` into `request` object. This worked because
`request` object can be manipulated like `dict` object. Then, the second middleware
**custom_banner** will change the HTTP response header *Server* to
*Fake-Server*, and the second middleware **prevent_xss** will add the HTTP
*Fake-Server*, and the last middleware **prevent_xss** will add the HTTP
header for preventing Cross-Site-Scripting (XSS) attacks. These two functions
are invoked *after* a user function returns a response.
@@ -59,7 +71,7 @@ and the response will be returned. If this occurs to a request before the
relevant user route handler is reached, the handler will never be called.
Returning a response will also prevent any further middleware from running.
```python
```
@app.middleware('request')
async def halt_request(request):
    return text('I halted the request')
@@ -78,11 +90,11 @@ If you want to execute startup/teardown code as your server starts or closes, yo
- `before_server_stop`
- `after_server_stop`
These listeners are implemented as decorators on functions which accept the app object as well as the asyncio loop.
For example:
```python
```
@app.listener('before_server_start')
async def setup_db(app, loop):
    app.db = await db_setup()
@@ -100,13 +112,47 @@ async def close_db(app, loop):
await app.db.close()
```
It's also possible to register a listener using the `register_listener` method.
This may be useful if you define your listeners in another module besides
the one you instantiate your app in.
```
app = Sanic()

async def setup_db(app, loop):
    app.db = await db_setup()

app.register_listener(setup_db, 'before_server_start')
```
If you want to schedule a background task to run after the loop has started,
Sanic provides the `add_task` method to easily do so.
```python
```
async def notify_server_started_after_five_seconds():
    await asyncio.sleep(5)
    print('Server successfully started!')

app.add_task(notify_server_started_after_five_seconds())
```
Sanic will attempt to automatically inject the app, passing it as an argument to the task:
```
async def notify_server_started_after_five_seconds(app):
    await asyncio.sleep(5)
    print(app.name)

app.add_task(notify_server_started_after_five_seconds)
```
Or you can pass the app explicitly for the same effect:
```
async def notify_server_started_after_five_seconds(app):
    await asyncio.sleep(5)
    print(app.name)

app.add_task(notify_server_started_after_five_seconds(app))
```

View File

@@ -19,6 +19,8 @@ The following variables are accessible as properties on `Request` objects:
URL that resembles `?key1=value1&key2=value2`. If that URL were to be parsed,
the `args` dictionary would look like `{'key1': ['value1'], 'key2': ['value2']}`.
The request's `query_string` variable holds the unparsed string value.
This property provides the default parsing strategy. If you would like to change it, look at the section below
(`Changing the default parsing rules of the queryset`).
```python
from sanic.response import json
@@ -28,9 +30,54 @@ The following variables are accessible as properties on `Request` objects:
return json({ "parsed": True, "args": request.args, "url": request.url, "query_string": request.query_string })
```
- `raw_args` (dict) - In many cases you would need to access the url arguments in
a less packed dictionary. For the same previous URL `?key1=value1&key2=value2`, the
`raw_args` dictionary would look like `{'key1': 'value1', 'key2': 'value2'}`.
- `query_args` (list) - In many cases you would need to access the url arguments in
a less packed form. `query_args` is the list of `(key, value)` tuples.
This property provides the default parsing strategy. If you would like to change it, look at the section below
(`Changing the default parsing rules of the queryset`).
For the same previous URL queryset `?key1=value1&key2=value2`, the
`query_args` list would look like `[('key1', 'value1'), ('key2', 'value2')]`.
And in case of the multiple params with the same key like `?key1=value1&key2=value2&key1=value3`
the `query_args` list would look like `[('key1', 'value1'), ('key2', 'value2'), ('key1', 'value3')]`.
The difference between Request.args and Request.query_args
for the queryset `?key1=value1&key2=value2&key1=value3`
```python
from sanic import Sanic
from sanic.response import json
app = Sanic(__name__)
@app.route("/test_request_args")
async def test_request_args(request):
return json({
"parsed": True,
"url": request.url,
"query_string": request.query_string,
"args": request.args,
"raw_args": request.raw_args,
"query_args": request.query_args,
})
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)
```
Output
```
{
"parsed":true,
"url":"http:\/\/0.0.0.0:8000\/test_request_args?key1=value1&key2=value2&key1=value3",
"query_string":"key1=value1&key2=value2&key1=value3",
"args":{"key1":["value1","value3"],"key2":["value2"]},
"raw_args":{"key1":"value1","key2":"value2"},
"query_args":[["key1","value1"],["key2","value2"],["key1","value3"]]
}
```
`raw_args` contains only the first entry of `key1`. It will be deprecated in future versions.
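If you still have code that relies on `raw_args`, a minimal migration sketch (the route path and handler name are illustrative, and an `app` instance is assumed as in the other examples) is to rebuild the same first-value-wins dictionary from `args`:
```python
from sanic.response import json

@app.route("/first_values")
async def first_values(request):
    # Same shape as the deprecated `raw_args`: only the first value per key
    first_values = {key: values[0] for key, values in request.args.items()}
    # dict() over `query_args` would instead keep the *last* value per key
    last_values = dict(request.query_args)
    return json({"first_values": first_values, "last_values": last_values})
```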
- `files` (dictionary of `File` objects) - List of files that have a name, body, and type
@@ -71,8 +118,16 @@ The following variables are accessible as properties on `Request` objects:
return text("You are trying to create a user with the following POST: %s" % request.body)
```
- `headers` (dict) - A case-insensitive dictionary that contains the request headers.
- `method` (str) - HTTP method of the request (ie `GET`, `POST`).
- `ip` (str) - IP address of the requester.
- `port` (str) - Port address of the requester.
- `socket` (tuple) - (IP, port) of the requester.
- `app` - a reference to the Sanic application object that is handling this request. This is useful when inside blueprints or other handlers in modules that do not have access to the global `app` object.
```python
@@ -95,8 +150,54 @@ The following variables are accessible as properties on `Request` objects:
- `path`: The path of the request: `/posts/1/`
- `query_string`: The query string of the request: `foo=bar` or a blank string `''`
- `uri_template`: Template for matching route handler: `/posts/<id>/`
- `token`: The value of Authorization header: `Basic YWRtaW46YWRtaW4=`
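As a quick illustration of the properties above, a handler can simply echo them back (a minimal sketch; the route path and handler name are illustrative, and `app` is assumed to exist as in the earlier examples):
```python
from sanic.response import json

@app.route("/url_details")
async def url_details(request):
    return json({
        "path": request.path,
        "query_string": request.query_string,
        "uri_template": request.uri_template,
        "token": request.token,
    })
```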
## Changing the default parsing rules of the queryset
The default parameters used internally by the `args` and `query_args` properties to parse the queryset are:
- `keep_blank_values` (bool): `False` - flag indicating whether blank values in
percent-encoded queries should be treated as blank strings.
A true value indicates that blanks should be retained as blank
strings. The default false value indicates that blank values
are to be ignored and treated as if they were not included.
- `strict_parsing` (bool): `False` - flag indicating what to do with parsing errors. If
false (the default), errors are silently ignored. If true,
errors raise a ValueError exception.
- `encoding` and `errors` (str): 'utf-8' and 'replace' - specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
If you would like to change these default parameters, you can call the `get_args` and `get_query_args` methods
with the new values.
For the queryset `/?test1=value1&test2=&test3=value3`:
```python
from sanic.response import json
@app.route("/query_string")
def query_string(request):
args_with_blank_values = request.get_args(keep_blank_values=True)
return json({
"parsed": True,
"url": request.url,
"args_with_blank_values": args_with_blank_values,
"query_string": request.query_string
})
```
The output will be:
```
{
"parsed": true,
"url": "http:\/\/0.0.0.0:8000\/query_string?test1=value1&test2=&test3=value3",
"args_with_blank_values": {"test1": ["value1""], "test2": "", "test3": ["value3"]},
"query_string": "test1=value1&test2=&test3=value3"
}
```
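The same keyword arguments are accepted by `get_query_args`, which returns the `(key, value)` tuple form; a minimal sketch for the same queryset (handler name and path are illustrative):
```python
from sanic.response import json

@app.route("/query_args_with_blanks")
async def query_args_with_blanks(request):
    pairs = request.get_query_args(keep_blank_values=True)
    # e.g. [('test1', 'value1'), ('test2', ''), ('test3', 'value3')]
    return json({"query_args_with_blank_values": pairs})
```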
## Accessing values using `get` and `getlist`
The request properties which return a dictionary actually return a subclass of
@@ -117,3 +218,40 @@ args.get('titles') # => 'Post 1'
args.getlist('titles') # => ['Post 1', 'Post 2']
```
## Accessing the handler name with the request.endpoint attribute
The `request.endpoint` attribute holds the handler's name. For instance, the below
route will return "hello".
```python
from sanic.response import text
from sanic import Sanic
app = Sanic()
@app.get("/")
def hello(request):
return text(request.endpoint)
```
Or, with a blueprint, it will include both the blueprint name and the handler name, separated by a period. For example,
the route below would return `foo.bar`:
```python
from sanic import Sanic
from sanic import Blueprint
from sanic.response import text
app = Sanic(__name__)
blueprint = Blueprint('foo')
@blueprint.get('/')
async def bar(request):
return text(request.endpoint)
app.blueprint(blueprint)
app.run(host="0.0.0.0", port=8000, debug=True)
```

View File

@@ -55,8 +55,8 @@ from sanic import response
@app.route("/streaming")
async def index(request):
async def streaming_fn(response):
response.write('foo')
response.write('bar')
await response.write('foo')
await response.write('bar')
return response.stream(streaming_fn, content_type='text/plain')
```
@@ -91,7 +91,7 @@ from sanic import response
@app.route('/raw')
def handle_request(request):
return response.raw('raw data')
return response.raw(b'raw data')
```
## Modify headers or status

View File

@@ -138,13 +138,14 @@ app.add_route(person_handler2, '/person/<name:[A-z]>', methods=['GET'])
Sanic provides a `url_for` method, to generate URLs based on the handler method name. This is useful if you want to avoid hardcoding url paths into your app; instead, you can just reference the handler name. For example:
```python
from sanic.response import redirect
@app.route('/')
async def index(request):
# generate a URL for the endpoint `post_handler`
url = app.url_for('post_handler', post_id=5)
# the URL is `/posts/5`, redirect to it
return redirect(url)
return redirect(url)
@app.route('/posts/<post_id>')
async def post_handler(request, post_id):
@@ -163,24 +164,24 @@ url = app.url_for('post_handler', post_id=5, arg_one='one', arg_two='two')
url = app.url_for('post_handler', post_id=5, arg_one=['one', 'two'])
# /posts/5?arg_one=one&arg_one=two
```
- Also some special arguments (`_anchor`, `_external`, `_scheme`, `_method`, `_server`) passed to `url_for` will have special url building (`_method` is not support now and will be ignored). For example:
- Also some special arguments (`_anchor`, `_external`, `_scheme`, `_method`, `_server`) passed to `url_for` will have special url building (`_method` is not supported now and will be ignored). For example:
```python
url = app.url_for('post_handler', post_id=5, arg_one='one', _anchor='anchor')
# /posts/5?arg_one=one#anchor
url = app.url_for('post_handler', post_id=5, arg_one='one', _external=True)
# //server/posts/5?arg_one=one
# _external requires passed argument _server or SERVER_NAME in app.config or url will be same as no _external
# _external requires you to pass an argument _server or set SERVER_NAME in app.config if not url will be same as no _external
url = app.url_for('post_handler', post_id=5, arg_one='one', _scheme='http', _external=True)
# http://server/posts/5?arg_one=one
# when specifying _scheme, _external must be True
# you can pass all special arguments one time
# you can pass all special arguments at once
url = app.url_for('post_handler', post_id=5, arg_one=['one', 'two'], arg_two=2, _anchor='anchor', _scheme='http', _external=True, _server='another_server:8888')
# http://another_server:8888/posts/5?arg_one=one&arg_one=two&arg_two=2#anchor
```
- All valid parameters must be passed to `url_for` to build a URL. If a parameter is not supplied, or if a parameter does not match the specified type, a `URLBuildError` will be thrown.
- All valid parameters must be passed to `url_for` to build a URL. If a parameter is not supplied, or if a parameter does not match the specified type, a `URLBuildError` will be raised.
## WebSocket routes
@@ -208,9 +209,128 @@ async def feed(request, ws):
app.add_websocket_route(my_websocket_handler, '/feed')
```
Handlers for a WebSocket route are passed the request as first argument, and a
Handlers to a WebSocket route are invoked with the request as first argument, and a
WebSocket protocol object as second argument. The protocol object has `send`
and `recv` methods to send and receive data respectively.
WebSocket support requires the [websockets](https://github.com/aaugustin/websockets)
package by Aymeric Augustin.
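As a minimal sketch of the `send` and `recv` methods, an illustrative echo handler (not part of the routes above) could look like this:
```python
@app.websocket('/echo')
async def echo(request, ws):
    while True:
        # receive a message from the client and send it straight back
        message = await ws.recv()
        await ws.send(message)
```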
## About `strict_slashes`
You can make routes strict about trailing slashes, or not; it's configurable.
```python
# provide default strict_slashes value for all routes
app = Sanic('test_route_strict_slash', strict_slashes=True)
# you can also overwrite strict_slashes value for specific route
@app.get('/get', strict_slashes=False)
def handler(request):
return text('OK')
# It also works for blueprints
bp = Blueprint('test_bp_strict_slash', strict_slashes=True)
@bp.get('/bp/get', strict_slashes=False)
def handler(request):
return text('OK')
app.blueprint(bp)
```
## User defined route name
A custom route name can be used by passing a `name` argument while registering the route which will
override the default route name generated using the `handler.__name__` attribute.
```python
app = Sanic('test_named_route')
@app.get('/get', name='get_handler')
def handler(request):
return text('OK')
# then you need to use `app.url_for('get_handler')`
# instead of `app.url_for('handler')`
# It also works for blueprints
bp = Blueprint('test_named_bp')
@bp.get('/bp/get', name='get_handler')
def handler(request):
return text('OK')
app.blueprint(bp)
# then you need to use `app.url_for('test_named_bp.get_handler')`
# instead of `app.url_for('test_named_bp.handler')`
# different names can be used for same url with different methods
@app.get('/test', name='route_test')
def handler(request):
return text('OK')
@app.post('/test', name='route_post')
def handler2(request):
return text('OK POST')
@app.put('/test', name='route_put')
def handler3(request):
return text('OK PUT')
# the urls below are the same, you can use any of them
# '/test'
app.url_for('route_test')
# app.url_for('route_post')
# app.url_for('route_put')
# for the same handler name with different methods,
# you need to specify the name (it's a url_for limitation)
@app.get('/get')
def handler(request):
return text('OK')
@app.post('/post', name='post_handler')
def handler(request):
return text('OK')
# then
# app.url_for('handler') == '/get'
# app.url_for('post_handler') == '/post'
```
## Build URL for static files
Sanic supports using the `url_for` method to build static file URLs. If the static URL
points to a directory, the `filename` parameter to `url_for` can be omitted.
```python
app = Sanic('test_static')
app.static('/static', './static')
app.static('/uploads', './uploads', name='uploads')
app.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
bp = Blueprint('bp', url_prefix='bp')
bp.static('/static', './static')
bp.static('/uploads', './uploads', name='uploads')
bp.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
app.blueprint(bp)
# then build the url
app.url_for('static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='uploads', filename='file.txt') == '/uploads/file.txt'
app.url_for('static', name='best_png') == '/the_best.png'
# blueprint url building
app.url_for('static', name='bp.static', filename='file.txt') == '/bp/static/file.txt'
app.url_for('static', name='bp.uploads', filename='file.txt') == '/bp/uploads/file.txt'
app.url_for('static', name='bp.best_png') == '/bp/static/the_best.png'
```

66
docs/sanic/sockets.rst Normal file
View File

@@ -0,0 +1,66 @@
Sockets
=======
Sanic can use the python
`socket module <https://docs.python.org/3/library/socket.html>`_ to accommodate
non-IPv4 sockets.
IPv6 example:
.. code:: python
from sanic import Sanic
from sanic.response import json
import socket
sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
sock.bind(('::', 7777))
app = Sanic()
@app.route("/")
async def test(request):
return json({"hello": "world"})
if __name__ == "__main__":
app.run(sock=sock)
to test IPv6 ``curl -g -6 "http://[::1]:7777/"``
UNIX socket example:
.. code:: python
import signal
import sys
import socket
import os
from sanic import Sanic
from sanic.response import json
server_socket = '/tmp/sanic.sock'
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.bind(server_socket)
app = Sanic()
@app.route("/")
async def test(request):
return json({"hello": "world"})
def signal_handler(sig, frame):
print('Exiting')
os.unlink(server_socket)
sys.exit(0)
if __name__ == "__main__":
app.run(sock=sock)
to test UNIX: ``curl -v --unix-socket /tmp/sanic.sock http://localhost/hello``

View File

@@ -1,21 +1,83 @@
# Static Files
Static files and directories, such as an image file, are served by Sanic when
registered with the `app.static` method. The method takes an endpoint URL and a
registered with the `app.static()` method. The method takes an endpoint URL and a
filename. The file specified will then be accessible via the given endpoint.
```python
from sanic import Sanic
from sanic.blueprints import Blueprint
app = Sanic(__name__)
# Serves files from the static folder to the URL /static
app.static('/static', './static')
# use url_for to build the url, name defaults to 'static' and can be ignored
app.url_for('static', filename='file.txt') == '/static/file.txt'
app.url_for('static', name='static', filename='file.txt') == '/static/file.txt'
# Serves the file /home/ubuntu/test.png when the URL /the_best.png
# is requested
app.static('/the_best.png', '/home/ubuntu/test.png')
app.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
# you can use url_for to build the static file url
# you can ignore the name and filename parameters if you don't define them
app.url_for('static', name='best_png') == '/the_best.png'
app.url_for('static', name='best_png', filename='any') == '/the_best.png'
# you need to define the name for other static files
app.static('/another.png', '/home/ubuntu/another.png', name='another')
app.url_for('static', name='another') == '/another.png'
app.url_for('static', name='another', filename='any') == '/another.png'
# also, you can use static for blueprint
bp = Blueprint('bp', url_prefix='/bp')
bp.static('/static', './static')
# serves the file directly
bp.static('/the_best.png', '/home/ubuntu/test.png', name='best_png')
app.blueprint(bp)
app.url_for('static', name='bp.static', filename='file.txt') == '/bp/static/file.txt'
app.url_for('static', name='bp.best_png') == '/bp/test_best.png'
app.run(host="0.0.0.0", port=8000)
```
Note: currently you cannot build a URL for a static file using `url_for`.
> **Note:** Sanic does not provide directory index when you serve a static directory.
## Virtual Host
The `app.static()` method also supports **virtual hosts**. You can serve your static files under a specific **virtual host** with the `host` argument. For example:
```python
from sanic import Sanic
app = Sanic(__name__)
app.static('/static', './static')
app.static('/example_static', './example_static', host='www.example.com')
```
## Streaming Large File
In some cases, you might serve large files (e.g. videos, images, etc.) with Sanic. You can choose to **stream the file** rather than download it directly.
Here is an example:
```python
from sanic import Sanic
app = Sanic(__name__)
app.static('/large_video.mp4', '/home/ubuntu/large_video.mp4', stream_large_files=True)
```
When `stream_large_files` is `True`, Sanic will use `file_stream()` instead of `file()` to serve static files. This uses **1KB** as the default chunk size, but you can also set a custom chunk size if needed. For example:
```python
from sanic import Sanic
app = Sanic(__name__)
chunk_size = 1024 * 1024 * 8 # Set chunk size to 8 MB
app.static('/large_video.mp4', '/home/ubuntu/large_video.mp4', stream_large_files=chunk_size)
```

View File

@@ -2,7 +2,7 @@
## Request Streaming
Sanic allows you to get request data by stream, as below. When the request ends, `request.stream.get()` returns `None`. Only post, put and patch decorator have stream argument.
Sanic allows you to get request data by stream, as below. When the request ends, `await request.stream.read()` returns `None`. Only the post, put and patch decorators have a stream argument.
```python
from sanic import Sanic
@@ -22,7 +22,7 @@ class SimpleView(HTTPMethodView):
async def post(self, request):
result = ''
while True:
body = await request.stream.get()
body = await request.stream.read()
if body is None:
break
result += body.decode('utf-8')
@@ -33,29 +33,42 @@ class SimpleView(HTTPMethodView):
async def handler(request):
async def streaming(response):
while True:
body = await request.stream.get()
body = await request.stream.read()
if body is None:
break
body = body.decode('utf-8').replace('1', 'A')
response.write(body)
await response.write(body)
return stream(streaming)
@bp.put('/bp_stream', stream=True)
async def bp_handler(request):
async def bp_put_handler(request):
result = ''
while True:
body = await request.stream.get()
body = await request.stream.read()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
# You can also use `bp.add_route()` with stream argument
async def bp_post_handler(request):
result = ''
while True:
body = await request.stream.read()
if body is None:
break
result += body.decode('utf-8').replace('1', 'A')
return text(result)
bp.add_route(bp_post_handler, '/bp_stream', methods=['POST'], stream=True)
async def post_handler(request):
result = ''
while True:
body = await request.stream.get()
body = await request.stream.read()
if body is None:
break
result += body.decode('utf-8')
@@ -85,8 +98,8 @@ app = Sanic(__name__)
@app.route("/")
async def test(request):
async def sample_streaming_fn(response):
response.write('foo,')
response.write('bar')
await response.write('foo,')
await response.write('bar')
return stream(sample_streaming_fn, content_type='text/csv')
```
@@ -100,7 +113,7 @@ async def index(request):
conn = await asyncpg.connect(database='test')
async with conn.transaction():
async for record in conn.cursor('SELECT generate_series(0, 10)'):
response.write(record[0])
await response.write(record[0])
return stream(stream_from_db)
```

View File

@@ -20,7 +20,7 @@ def test_index_put_not_allowed():
assert response.status == 405
```
Internally, each time you call one of the `test_client` methods, the Sanic app is run at `127.0.01:42101` and
Internally, each time you call one of the `test_client` methods, the Sanic app is run at `127.0.0.1:42101` and
your test request is executed against your application, using `aiohttp`.
The `test_client` methods accept the following arguments and keyword arguments:
@@ -59,6 +59,23 @@ the available arguments to aiohttp can be found
[in the documentation for ClientSession](https://aiohttp.readthedocs.io/en/stable/client_reference.html#client-session).
## Using a random port
If you need to test using a free unprivileged port chosen by the kernel
instead of the default with `SanicTestClient`, you can do so by specifying
`port=None`. On most systems the port will be in the range 1024 to 65535.
```python
# Import the Sanic app, usually created with Sanic(__name__)
from external_server import app
from sanic.testing import SanicTestClient
def test_index_returns_200():
request, response = SanicTestClient(app, port=None).get('/')
assert response.status == 200
```
## pytest-sanic
[pytest-sanic](https://github.com/yunstanford/pytest-sanic) is a pytest plugin, it helps you to test your code asynchronously.

52
docs/sanic/websocket.rst Normal file
View File

@@ -0,0 +1,52 @@
WebSocket
=========
Sanic provides an easy-to-use abstraction on top of ``websockets``. To set up a WebSocket:
.. code:: python
from sanic import Sanic
from sanic.response import json
from sanic.websocket import WebSocketProtocol
app = Sanic()
@app.websocket('/feed')
async def feed(request, ws):
while True:
data = 'hello!'
print('Sending: ' + data)
await ws.send(data)
data = await ws.recv()
print('Received: ' + data)
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000, protocol=WebSocketProtocol)
Alternatively, the ``app.add_websocket_route`` method can be used instead of the
decorator:
.. code:: python
async def feed(request, ws):
pass
app.add_websocket_route(feed, '/feed')
Handlers for a WebSocket route are invoked with the request as the first argument, and a
WebSocket protocol object as second argument. The protocol object has ``send``
and ``recv`` methods to send and receive data respectively.
You can set up your own WebSocket configuration through ``app.config``, like
.. code:: python
app.config.WEBSOCKET_MAX_SIZE = 2 ** 20
app.config.WEBSOCKET_MAX_QUEUE = 32
app.config.WEBSOCKET_READ_LIMIT = 2 ** 16
app.config.WEBSOCKET_WRITE_LIMIT = 2 ** 16
Find more in the ``Configuration`` section.

View File

@@ -1,19 +1,18 @@
name: py35
name: py36
dependencies:
- openssl=1.0.2g=0
- pip=8.1.1=py35_0
- python=3.5.1=0
- readline=6.2=2
- setuptools=20.3=py35_0
- sqlite=3.9.2=0
- tk=8.5.18=0
- wheel=0.29.0=py35_0
- xz=5.0.5=1
- zlib=1.2.8=0
- pip=18.1=py36_0
- python=3.6=0
- setuptools=40.4.3=py36_0
- pip:
- httptools>=0.0.10
- uvloop>=0.5.3
- httptools>=0.0.9
- ujson>=1.35
- aiofiles>=0.3.0
- websockets>=3.2
- https://github.com/channelcat/docutils-fork/zipball/master
- websockets>=6.0,<7.0
- multidict>=4.0,<5.0
- sphinx==1.8.3
- sphinx_rtd_theme==0.4.2
- recommonmark==0.5.0
- sphinxcontrib-asyncio>=0.2.0
- docutils==0.14
- pygments==2.3.1

View File

@@ -0,0 +1,17 @@
# -*- coding: utf-8 -*-
import asyncio
from sanic import Sanic
app = Sanic()
async def notify_server_started_after_five_seconds():
await asyncio.sleep(5)
print('Server successfully started!')
app.add_task(notify_server_started_after_five_seconds())
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)

View File

@@ -0,0 +1,30 @@
from sanic import Sanic
from sanic.response import text
from random import randint
app = Sanic()
@app.middleware('request')
def append_request(request):
# Add new key with random value
request['num'] = randint(0, 100)
@app.get('/pop')
def pop_handler(request):
# Pop key from request object
num = request.pop('num')
return text(num)
@app.get('/key_exist')
def key_exist_handler(request):
# Check whether the key exists or not
if 'num' in request:
return text('num exist in request')
return text('num does not exist in request')
app.run(host="0.0.0.0", port=8000, debug=True)

View File

@@ -0,0 +1,42 @@
# -*- coding: utf-8 -*-
from sanic import Sanic
from functools import wraps
from sanic.response import json
app = Sanic()
def check_request_for_authorization_status(request):
# Note: Define your check, for instance cookie, session.
flag = True
return flag
def authorized():
def decorator(f):
@wraps(f)
async def decorated_function(request, *args, **kwargs):
# run some method that checks the request
# for the client's authorization status
is_authorized = check_request_for_authorization_status(request)
if is_authorized:
# the user is authorized.
# run the handler method and return the response
response = await f(request, *args, **kwargs)
return response
else:
# the user is not authorized.
return json({'status': 'not_authorized'}, 403)
return decorated_function
return decorator
@app.route("/")
@authorized()
async def test(request):
return json({'status': 'authorized'})
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,7 +1,5 @@
from sanic import Sanic
from sanic import Blueprint
from sanic.response import json
from sanic import Blueprint, Sanic
from sanic.response import file, json
app = Sanic(__name__)
blueprint = Blueprint('name', url_prefix='/my_blueprint')
@@ -19,7 +17,12 @@ async def foo2(request):
return json({'msg': 'hi from blueprint2'})
@blueprint3.websocket('/foo')
@blueprint3.route('/foo')
async def index(request):
return await file('websocket.html')
@app.websocket('/feed')
async def foo3(request, ws):
while True:
data = 'hello!'

View File

@@ -0,0 +1,86 @@
'''
Based on example from https://github.com/Skyscanner/aiotask-context
and `examples/{override_logging,run_async}.py`.
Needs https://github.com/Skyscanner/aiotask-context/tree/52efbc21e2e1def2d52abb9a8e951f3ce5e6f690 or newer
$ pip install git+https://github.com/Skyscanner/aiotask-context.git
'''
import asyncio
import uuid
import logging
from signal import signal, SIGINT
from sanic import Sanic
from sanic import response
import uvloop
import aiotask_context as context
log = logging.getLogger(__name__)
class RequestIdFilter(logging.Filter):
def filter(self, record):
record.request_id = context.get('X-Request-ID')
return True
LOG_SETTINGS = {
'version': 1,
'disable_existing_loggers': False,
'handlers': {
'console': {
'class': 'logging.StreamHandler',
'level': 'DEBUG',
'formatter': 'default',
'filters': ['requestid'],
},
},
'filters': {
'requestid': {
'()': RequestIdFilter,
},
},
'formatters': {
'default': {
'format': '%(asctime)s %(levelname)s %(name)s:%(lineno)d %(request_id)s | %(message)s',
},
},
'loggers': {
'': {
'level': 'DEBUG',
'handlers': ['console'],
'propagate': True
},
}
}
app = Sanic(__name__, log_config=LOG_SETTINGS)
@app.middleware('request')
async def set_request_id(request):
request_id = request.headers.get('X-Request-ID') or str(uuid.uuid4())
context.set("X-Request-ID", request_id)
@app.route("/")
async def test(request):
log.debug('X-Request-ID: %s', context.get('X-Request-ID'))
log.info('Hello from test!')
return response.json({"test": True})
if __name__ == '__main__':
asyncio.set_event_loop(uvloop.new_event_loop())
server = app.create_server(host="0.0.0.0", port=8000)
loop = asyncio.get_event_loop()
loop.set_task_factory(context.task_factory)
task = asyncio.ensure_future(server)
try:
loop.run_forever()
except:
loop.stop()

View File

@@ -0,0 +1,61 @@
import logging
import socket
from os import getenv
from platform import node
from uuid import getnode as get_mac
from logdna import LogDNAHandler
from sanic import Sanic
from sanic.response import json
from sanic.request import Request
log = logging.getLogger('logdna')
log.setLevel(logging.INFO)
def get_my_ip_address(remote_server="google.com"):
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
s.connect((remote_server, 80))
return s.getsockname()[0]
def get_mac_address():
h = iter(hex(get_mac())[2:].zfill(12))
return ":".join(i + next(h) for i in h)
logdna_options = {
"app": __name__,
"index_meta": True,
"hostname": node(),
"ip": get_my_ip_address(),
"mac": get_mac_address()
}
logdna_handler = LogDNAHandler(getenv("LOGDNA_API_KEY"), options=logdna_options)
logdna = logging.getLogger(__name__)
logdna.setLevel(logging.INFO)
logdna.addHandler(logdna_handler)
app = Sanic(__name__)
@app.middleware
def log_request(request: Request):
logdna.info("I was Here with a new Request to URL: {}".format(request.url))
@app.route("/")
def default(request):
return json({
"response": "I was here"
})
if __name__ == "__main__":
app.run(
host="0.0.0.0",
port=getenv("PORT", 8080)
)

49
examples/pytest_xdist.py Normal file
View File

@@ -0,0 +1,49 @@
"""pytest-xdist example for sanic server
Install testing tools:
$ pip install pytest pytest-xdist
Run with xdist params:
$ pytest examples/pytest_xdist.py -n 8 # 8 workers
"""
import re
from sanic import Sanic
from sanic.response import text
from sanic.testing import PORT as PORT_BASE, SanicTestClient
import pytest
@pytest.fixture(scope="session")
def test_port(worker_id):
m = re.search(r'[0-9]+', worker_id)
if m:
num_id = m.group(0)
else:
num_id = 0
port = PORT_BASE + int(num_id)
return port
@pytest.fixture(scope="session")
def app():
app = Sanic()
@app.route('/')
async def index(request):
return text('OK')
return app
@pytest.fixture(scope="session")
def client(app, test_port):
return SanicTestClient(app, test_port)
@pytest.mark.parametrize('run_id', range(100))
def test_index(client, run_id):
request, response = client._sanic_endpoint_test('get', '/')
assert response.status == 200
assert response.text == 'OK'

View File

@@ -0,0 +1,37 @@
from os import getenv
from raygun4py.raygunprovider import RaygunSender
from sanic import Sanic
from sanic.exceptions import SanicException
from sanic.handlers import ErrorHandler
class RaygunExceptionReporter(ErrorHandler):
def __init__(self, raygun_api_key=None):
super().__init__()
if raygun_api_key is None:
raygun_api_key = getenv("RAYGUN_API_KEY")
self.sender = RaygunSender(raygun_api_key)
def default(self, request, exception):
self.sender.send_exception(exception=exception)
return super().default(request, exception)
raygun_error_reporter = RaygunExceptionReporter()
app = Sanic(__name__, error_handler=raygun_error_reporter)
@app.route("/raise")
async def test(request):
raise SanicException('You Broke It!')
if __name__ == '__main__':
app.run(
host="0.0.0.0",
port=getenv("PORT", 8080)
)

View File

@@ -30,7 +30,7 @@ async def handler(request):
if body is None:
break
body = body.decode('utf-8').replace('1', 'A')
response.write(body)
await response.write(body)
return stream(streaming)

View File

@@ -0,0 +1,30 @@
import rollbar
from sanic.handlers import ErrorHandler
from sanic import Sanic
from sanic.exceptions import SanicException
from os import getenv
rollbar.init(getenv("ROLLBAR_API_KEY"))
class RollbarExceptionHandler(ErrorHandler):
def default(self, request, exception):
rollbar.report_message(str(exception))
return super().default(request, exception)
app = Sanic(__name__, error_handler=RollbarExceptionHandler())
@app.route("/raise")
def create_error(request):
raise SanicException("I was here and I don't like where I am")
if __name__ == "__main__":
app.run(
host="0.0.0.0",
port=getenv("PORT", 8080)
)

View File

@@ -0,0 +1,35 @@
from os import getenv
from sentry_sdk import init as sentry_init
from sentry_sdk.integrations.sanic import SanicIntegration
from sanic import Sanic
from sanic.response import json
sentry_init(
dsn=getenv("SENTRY_DSN"),
integrations=[SanicIntegration()],
)
app = Sanic(__name__)
# noinspection PyUnusedLocal
@app.route("/working")
async def working_path(request):
return json({
"response": "Working API Response"
})
# noinspection PyUnusedLocal
@app.route("/raise-error")
async def raise_error(request):
raise Exception("Testing Sentry Integration")
if __name__ == '__main__':
app.run(
host="0.0.0.0",
port=getenv("PORT", 8080)
)

View File

@@ -0,0 +1,42 @@
from sanic import Sanic
from sanic.views import HTTPMethodView
from sanic.response import text
app = Sanic('some_name')
class SimpleView(HTTPMethodView):
def get(self, request):
return text('I am get method')
def post(self, request):
return text('I am post method')
def put(self, request):
return text('I am put method')
def patch(self, request):
return text('I am patch method')
def delete(self, request):
return text('I am delete method')
class SimpleAsyncView(HTTPMethodView):
async def get(self, request):
return text('I am async get method')
async def post(self, request):
return text('I am async post method')
async def put(self, request):
return text('I am async put method')
app.add_route(SimpleView.as_view(), '/')
app.add_route(SimpleAsyncView.as_view(), '/async')
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000, debug=True)

13
examples/teapot.py Normal file
View File

@@ -0,0 +1,13 @@
from sanic import Sanic
from sanic import response as res
app = Sanic(__name__)
@app.route("/")
async def test(req):
return res.text("I\'m a teapot", status=418)
if __name__ == '__main__':
app.run(host="0.0.0.0", port=8000)

View File

@@ -1,7 +1,7 @@
import os
from sanic import Sanic
from sanic.log import log
from sanic.log import logger as log
from sanic import response
from sanic.exceptions import ServerError
@@ -18,7 +18,7 @@ def test_sync(request):
return response.json({"test": True})
@app.route("/dynamic/<name>/<id:int>")
@app.route("/dynamic/<name>/<i:int>")
def test_params(request, name, i):
return response.text("yeehaww {} {}".format(name, i))
@@ -66,7 +66,7 @@ def post_json(request):
@app.route("/form")
def post_json(request):
def post_form_json(request):
return response.json({"received": True, "form_data": request.form, "test": request.form.get('test')})

2
pyproject.toml Normal file
View File

@@ -0,0 +1,2 @@
[tool.black]
line-length = 79

304
release.py Executable file
View File

@@ -0,0 +1,304 @@
#!/usr/bin/env python
from argparse import ArgumentParser, Namespace
from collections import OrderedDict
from configparser import RawConfigParser
from datetime import datetime
from json import dumps
from os import path
from subprocess import Popen, PIPE
from jinja2 import Environment, BaseLoader
from requests import patch
GIT_COMMANDS = {
"get_tag": ["git describe --tags --abbrev=0"],
"commit_version_change": [
"git add . && git commit -m 'Bumping up version from "
"{current_version} to {new_version}'"
],
"create_new_tag": [
"git tag -a {new_version} -m 'Bumping up version from "
"{current_version} to {new_version}'"
],
"push_tag": ["git push origin {new_version}"],
"get_change_log": [
'git log --no-merges --pretty=format:"%h::: %cn::: %s" '
"{current_version}.."
],
}
RELEASE_NOTE_TEMPLATE = """
# {{ release_name }} - {% now 'utc', '%Y-%m-%d' %}
To see the exhaustive list of pull requests included in this release see:
https://github.com/huge-success/sanic/milestone/{{milestone}}?closed=1
# Changelog
{% for row in changelogs %}
* {{ row -}}
{% endfor %}
# Credits
{% for author in authors %}
* {{ author -}}
{% endfor %}
"""
JINJA_RELEASE_NOTE_TEMPLATE = Environment(
loader=BaseLoader, extensions=["jinja2_time.TimeExtension"]
).from_string(RELEASE_NOTE_TEMPLATE)
RELEASE_NOTE_UPDATE_URL = (
"https://api.github.com/repos/huge-success/sanic/releases/tags/"
"{new_version}?access_token={token}"
)
def _run_shell_command(command: list):
try:
process = Popen(
command, stderr=PIPE, stdout=PIPE, stdin=PIPE, shell=True
)
output, error = process.communicate()
return_code = process.returncode
return output.decode("utf-8"), error, return_code
except:
return None, None, -1
def _fetch_default_calendar_release_version():
return datetime.now().strftime("%y.%m.0")
def _fetch_current_version(config_file: str) -> str:
if path.isfile(config_file):
config_parser = RawConfigParser()
with open(config_file) as cfg:
config_parser.read_file(cfg)
return (
config_parser.get("version", "current_version")
or _fetch_default_calendar_release_version()
)
else:
return _fetch_default_calendar_release_version()
def _change_micro_version(current_version: str):
version_string = current_version.split(".")
version_string[-1] = str((int(version_string[-1]) + 1))
return ".".join(version_string)
def _get_new_version(
config_file: str = "./setup.cfg",
current_version: str = None,
micro_release: bool = False,
):
if micro_release:
if current_version:
return _change_micro_version(current_version)
elif config_file:
return _change_micro_version(_fetch_current_version(config_file))
else:
return _fetch_default_calendar_release_version()
else:
return _fetch_default_calendar_release_version()
def _get_current_tag(git_command_name="get_tag"):
global GIT_COMMANDS
command = GIT_COMMANDS.get(git_command_name)
out, err, ret = _run_shell_command(command)
if len(str(out)):
return str(out).split("\n")[0]
else:
return None
def _update_release_version_for_sanic(
current_version, new_version, config_file
):
config_parser = RawConfigParser()
with open(config_file) as cfg:
config_parser.read_file(cfg)
config_parser.set("version", "current_version", new_version)
version_file = config_parser.get("version", "file")
current_version_line = config_parser.get(
"version", "current_version_pattern"
).format(current_version=current_version)
new_version_line = config_parser.get(
"version", "new_version_pattern"
).format(new_version=new_version)
with open(version_file) as init_file:
data = init_file.read()
new_data = data.replace(current_version_line, new_version_line)
with open(version_file, "w") as init_file:
init_file.write(new_data)
with open(config_file, "w") as config:
config_parser.write(config)
command = GIT_COMMANDS.get("commit_version_change")
command[0] = command[0].format(
new_version=new_version, current_version=current_version
)
_, err, ret = _run_shell_command(command)
if int(ret) != 0:
print(
"Failed to Commit Version upgrade changes to Sanic: {}".format(
err.decode("utf-8")
)
)
exit(1)
def _generate_change_log(current_version: str = None):
global GIT_COMMANDS
command = GIT_COMMANDS.get("get_change_log")
command[0] = command[0].format(current_version=current_version)
output, error, ret = _run_shell_command(command=command)
if not len(str(output)):
print("Unable to Fetch Change log details to update the Release Note")
exit(1)
commit_details = OrderedDict()
commit_details["authors"] = dict()
commit_details["commits"] = list()
for line in str(output).split("\n"):
commit, author, description = line.split(":::")
if "GitHub" not in author:
commit_details["authors"][author] = 1
commit_details["commits"].append(" - ".join([commit, description]))
return commit_details
def _generate_markdown_document(
milestone, release_name, current_version, release_version
):
global JINJA_RELEASE_NOTE_TEMPLATE
release_name = release_name or release_version
change_log = _generate_change_log(current_version=current_version)
return JINJA_RELEASE_NOTE_TEMPLATE.render(
release_name=release_name,
milestone=milestone,
changelogs=change_log["commits"],
authors=change_log["authors"].keys(),
)
def _tag_release(new_version, current_version, milestone, release_name, token):
global GIT_COMMANDS
global RELEASE_NOTE_UPDATE_URL
for command_name in ["create_new_tag", "push_tag"]:
command = GIT_COMMANDS.get(command_name)
command[0] = command[0].format(
new_version=new_version, current_version=current_version
)
out, error, ret = _run_shell_command(command=command)
if int(ret) != 0:
print("Failed to execute the command: {}".format(command[0]))
exit(1)
change_log = _generate_markdown_document(
milestone, release_name, current_version, new_version
)
body = {"name": release_name or new_version, "body": change_log}
headers = {"content-type": "application/json"}
response = patch(
RELEASE_NOTE_UPDATE_URL.format(new_version=new_version, token=token),
data=dumps(body),
headers=headers,
)
response.raise_for_status()
def release(args: Namespace):
current_tag = _get_current_tag()
current_version = _fetch_current_version(args.config)
if current_tag and current_version not in current_tag:
print(
"Tag mismatch between what's in git and what was provided by "
"--current-version. Existing: {}, Give: {}".format(
current_tag, current_version
)
)
exit(1)
new_version = args.release_version or _get_new_version(
args.config, current_version, args.micro_release
)
_update_release_version_for_sanic(
current_version=current_version,
new_version=new_version,
config_file=args.config,
)
_tag_release(
current_version=current_version,
new_version=new_version,
milestone=args.milestone,
release_name=args.release_name,
token=args.token,
)
if __name__ == "__main__":
cli = ArgumentParser(description="Sanic Release Manager")
cli.add_argument(
"--release-version",
"-r",
help="New Version to use for Release",
default=_fetch_default_calendar_release_version(),
required=False,
)
cli.add_argument(
"--current-version",
"-cv",
help="Current Version to default in case if you don't want to "
"use the version configuration files",
default=None,
required=False,
)
cli.add_argument(
"--config",
"-c",
help="Configuration file used for release",
default="./setup.cfg",
required=False,
)
cli.add_argument(
"--token",
"-t",
help="Git access token with necessary access to Huge Sanic Org",
required=True,
)
cli.add_argument(
"--milestone",
"-ms",
help="Git Release milestone information to include in relase note",
required=True,
)
cli.add_argument(
"--release-name",
"-n",
help="Release Name to use if any",
required=False,
)
cli.add_argument(
"--micro-release",
"-m",
help="Micro Release with patches only",
default=False,
action="store_true",
required=False,
)
args = cli.parse_args()
release(args)

View File

@@ -1,12 +0,0 @@
aiofiles
aiohttp==1.3.5
chardet<=2.3.0
beautifulsoup4
coverage
httptools
flake8
pytest
tox
ujson
uvloop
gunicorn

View File

@@ -1,3 +0,0 @@
sphinx
sphinx_rtd_theme
recommonmark

View File

@@ -1,5 +0,0 @@
aiofiles
httptools
ujson
uvloop
websockets

View File

@@ -1,6 +1,7 @@
from sanic.app import Sanic
from sanic.blueprints import Blueprint
__version__ = '0.6.0'
__all__ = ['Sanic', 'Blueprint']
__version__ = "19.03.1"
__all__ = ["Sanic", "Blueprint"]

View File

@@ -1,20 +1,23 @@
from argparse import ArgumentParser
from importlib import import_module
from sanic.log import log
from sanic.app import Sanic
from sanic.log import logger
if __name__ == "__main__":
parser = ArgumentParser(prog='sanic')
parser.add_argument('--host', dest='host', type=str, default='127.0.0.1')
parser.add_argument('--port', dest='port', type=int, default=8000)
parser.add_argument('--cert', dest='cert', type=str,
help='location of certificate for SSL')
parser.add_argument('--key', dest='key', type=str,
help='location of keyfile for SSL.')
parser.add_argument('--workers', dest='workers', type=int, default=1, )
parser.add_argument('--debug', dest='debug', action="store_true")
parser.add_argument('module')
parser = ArgumentParser(prog="sanic")
parser.add_argument("--host", dest="host", type=str, default="127.0.0.1")
parser.add_argument("--port", dest="port", type=int, default=8000)
parser.add_argument(
"--cert", dest="cert", type=str, help="location of certificate for SSL"
)
parser.add_argument(
"--key", dest="key", type=str, help="location of keyfile for SSL."
)
parser.add_argument("--workers", dest="workers", type=int, default=1)
parser.add_argument("--debug", dest="debug", action="store_true")
parser.add_argument("module")
args = parser.parse_args()
try:
@@ -25,20 +28,29 @@ if __name__ == "__main__":
module = import_module(module_name)
app = getattr(module, app_name, None)
if not isinstance(app, Sanic):
raise ValueError("Module is not a Sanic app, it is a {}. "
"Perhaps you meant {}.app?"
.format(type(app).__name__, args.module))
raise ValueError(
"Module is not a Sanic app, it is a {}. "
"Perhaps you meant {}.app?".format(
type(app).__name__, args.module
)
)
if args.cert is not None or args.key is not None:
ssl = {'cert': args.cert, 'key': args.key}
ssl = {"cert": args.cert, "key": args.key}
else:
ssl = None
app.run(host=args.host, port=args.port,
workers=args.workers, debug=args.debug, ssl=ssl)
app.run(
host=args.host,
port=args.port,
workers=args.workers,
debug=args.debug,
ssl=ssl,
)
except ImportError as e:
log.error("No module named {} found.\n"
" Example File: project/sanic_server.py -> app\n"
" Example Module: project.sanic_server.app"
.format(e.name))
except ValueError as e:
log.error("{}".format(e))
logger.error(
"No module named {} found.\n"
" Example File: project/sanic_server.py -> app\n"
" Example Module: project.sanic_server.app".format(e.name)
)
except ValueError:
logger.exception("Failed to run app")

File diff suppressed because it is too large

120
sanic/blueprint_group.py Normal file
View File

@@ -0,0 +1,120 @@
from collections import MutableSequence
class BlueprintGroup(MutableSequence):
"""
This class provides a mechanism to implement a Blueprint Group
using the `Blueprint.group` method. To avoid having to re-write
some of the existing implementation, this class provides a custom
iterator implementation that will let you use the object of this
class as a list/tuple inside the existing implementation.
"""
__slots__ = ("_blueprints", "_url_prefix")
def __init__(self, url_prefix=None):
"""
Create a new Blueprint Group
:param url_prefix: URL to be prefixed before each Blueprint's own prefix
"""
self._blueprints = []
self._url_prefix = url_prefix
@property
def url_prefix(self):
"""
Retrieve the URL prefix being used for the Current Blueprint Group
:return: string with url prefix
"""
return self._url_prefix
@property
def blueprints(self):
"""
Retrieve a list of all the available blueprints under this group.
:return: List of Blueprint instance
"""
return self._blueprints
def __iter__(self):
"""Tun the class Blueprint Group into an Iterable item"""
return iter(self._blueprints)
def __getitem__(self, item):
"""
This method returns a blueprint inside the group specified by
an index value. This will enable indexing, splice and slicing
of the blueprint group like we can do with regular list/tuple.
This method is provided to ensure backward compatibility with
any of the pre-existing usage that might break.
:param item: Index of the Blueprint item in the group
:return: Blueprint object
"""
return self._blueprints[item]
def __setitem__(self, index: int, item: object) -> None:
"""
Abstract method implemented to turn the `BlueprintGroup` class
into a list like object to support all the existing behavior.
This method is used to perform the list's indexed setter operation.
:param index: Index to use for inserting a new Blueprint item
:param item: New `Blueprint` object.
:return: None
"""
self._blueprints[index] = item
def __delitem__(self, index: int) -> None:
"""
Abstract method implemented to turn the `BlueprintGroup` class
into a list like object to support all the existing behavior.
This method is used to delete an item from the list of blueprint
groups like it can be done on a regular list with index.
:param index: Index to use for removing a new Blueprint item
:return: None
"""
del self._blueprints[index]
def __len__(self) -> int:
"""
Get the Length of the blueprint group object.
:return: Length of Blueprint group object
"""
return len(self._blueprints)
def insert(self, index: int, item: object) -> None:
"""
The Abstract class `MutableSequence` leverages this insert method to
perform the `BlueprintGroup.append` operation.
:param index: Index to use for removing a new Blueprint item
:param item: New `Blueprint` object.
:return: None
"""
self._blueprints.insert(index, item)
def middleware(self, *args, **kwargs):
"""
A decorator that can be used to implement a Middleware plugin to
all of the Blueprints that belongs to this specific Blueprint Group.
In case of nested Blueprint Groups, the same middleware is applied
across each of the Blueprints recursively.
:param args: Optional positional parameters to be used with the middleware
:param kwargs: Optional Keyword arg to use with Middleware
:return: Partial function to apply the middleware
"""
kwargs["bp_group"] = True
def register_middleware_for_blueprints(fn):
for blueprint in self.blueprints:
blueprint.middleware(fn, *args, **kwargs)
return register_middleware_for_blueprints
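A minimal usage sketch for the blueprint group support above (blueprint, route, and handler names are illustrative, and it assumes `app.blueprint` accepts the group object, which is what `Blueprint.group` is designed for):
```python
from sanic import Blueprint, Sanic
from sanic.response import text

app = Sanic(__name__)

api_v1 = Blueprint("api_v1", url_prefix="/v1")
api_v2 = Blueprint("api_v2", url_prefix="/v2")

@api_v1.route("/ping")
async def ping_v1(request):
    return text("pong from v1")

@api_v2.route("/ping")
async def ping_v2(request):
    return text("pong from v2")

# Blueprint.group returns a BlueprintGroup; "/api" is prepended to each
# blueprint's own prefix, so the routes become /api/v1/ping and /api/v2/ping.
api = Blueprint.group(api_v1, api_v2, url_prefix="/api")

# The group-level middleware is attached to every blueprint in the group.
@api.middleware("request")
async def log_group_request(request):
    print("group middleware saw:", request.path)

app.blueprint(api)
```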

View File

@@ -1,24 +1,55 @@
from collections import defaultdict, namedtuple
from sanic.blueprint_group import BlueprintGroup
from sanic.constants import HTTP_METHODS
from sanic.views import CompositionView
FutureRoute = namedtuple('Route',
['handler', 'uri', 'methods', 'host',
'strict_slashes', 'stream', 'version'])
FutureListener = namedtuple('Listener', ['handler', 'uri', 'methods', 'host'])
FutureMiddleware = namedtuple('Route', ['middleware', 'args', 'kwargs'])
FutureException = namedtuple('Route', ['handler', 'args', 'kwargs'])
FutureStatic = namedtuple('Route',
['uri', 'file_or_directory', 'args', 'kwargs'])
FutureRoute = namedtuple(
"FutureRoute",
[
"handler",
"uri",
"methods",
"host",
"strict_slashes",
"stream",
"version",
"name",
],
)
FutureListener = namedtuple(
"FutureListener", ["handler", "uri", "methods", "host"]
)
FutureMiddleware = namedtuple(
"FutureMiddleware", ["middleware", "args", "kwargs"]
)
FutureException = namedtuple("FutureException", ["handler", "args", "kwargs"])
FutureStatic = namedtuple(
"FutureStatic", ["uri", "file_or_directory", "args", "kwargs"]
)
class Blueprint:
def __init__(self, name, url_prefix=None, host=None, version=None):
"""Create a new blueprint
def __init__(
self,
name,
url_prefix=None,
host=None,
version=None,
strict_slashes=False,
):
"""
In *Sanic* terminology, a **Blueprint** is a logical collection of
URLs that perform a specific set of tasks which can be identified by
a unique name.
:param name: unique name of the blueprint
:param url_prefix: URL to be prefixed before all route URLs
:param host: IP Address or FQDN for the sanic server to use.
:param version: Blueprint Version
:param strict_slashes: Enforce that the API URLs are requested with a
trailing */*
"""
self.name = name
self.url_prefix = url_prefix
@@ -31,11 +62,47 @@ class Blueprint:
self.middlewares = []
self.statics = []
self.version = version
self.strict_slashes = strict_slashes
@staticmethod
def group(*blueprints, url_prefix=""):
"""
Create a list of blueprints, optionally grouping them under a
general URL prefix.
:param blueprints: blueprints to be registered as a group
:param url_prefix: URL route to be prepended to all sub-prefixes
"""
def chain(nested):
"""itertools.chain() but leaves strings untouched"""
for i in nested:
if isinstance(i, (list, tuple)):
yield from chain(i)
elif isinstance(i, BlueprintGroup):
yield from i.blueprints
else:
yield i
bps = BlueprintGroup(url_prefix=url_prefix)
for bp in chain(blueprints):
if bp.url_prefix is None:
bp.url_prefix = ""
bp.url_prefix = url_prefix + bp.url_prefix
bps.append(bp)
return bps
def register(self, app, options):
"""Register the blueprint to the sanic app."""
"""
Register the blueprint to the sanic app.
url_prefix = options.get('url_prefix', self.url_prefix)
:param app: Instance of :class:`sanic.app.Sanic` class
:param options: Options to be used while registering the
blueprint into the app.
*url_prefix* - URL Prefix to override the blueprint prefix
"""
url_prefix = options.get("url_prefix", self.url_prefix)
# Routes
for future in self.routes:
@@ -48,13 +115,14 @@ class Blueprint:
version = future.version or self.version
app.route(
uri=uri[1:] if uri.startswith('//') else uri,
uri=uri[1:] if uri.startswith("//") else uri,
methods=future.methods,
host=future.host or self.host,
strict_slashes=future.strict_slashes,
stream=future.stream,
version=version
)(future.handler)
version=version,
name=future.name,
)(future.handler)
for future in self.websocket_routes:
# attach the blueprint name to the handler so that it can be
@@ -65,15 +133,16 @@ class Blueprint:
app.websocket(
uri=uri,
host=future.host or self.host,
strict_slashes=future.strict_slashes
)(future.handler)
strict_slashes=future.strict_slashes,
name=future.name,
)(future.handler)
# Middleware
for future in self.middlewares:
if future.args or future.kwargs:
app.register_middleware(future.middleware,
*future.args,
**future.kwargs)
app.register_middleware(
future.middleware, *future.args, **future.kwargs
)
else:
app.register_middleware(future.middleware)
@@ -85,75 +154,147 @@ class Blueprint:
for future in self.statics:
# Prepend the blueprint URI prefix if available
uri = url_prefix + future.uri if url_prefix else future.uri
app.static(uri, future.file_or_directory,
*future.args, **future.kwargs)
app.static(
uri, future.file_or_directory, *future.args, **future.kwargs
)
# Event listeners
for event, listeners in self.listeners.items():
for listener in listeners:
app.listener(event)(listener)
def route(self, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False, stream=False, version=None):
def route(
self,
uri,
methods=frozenset({"GET"}),
host=None,
strict_slashes=None,
stream=False,
version=None,
name=None,
):
"""Create a blueprint route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:param methods: list of acceptable HTTP methods.
:param host: IP Address or FQDN for the sanic server to use.
:param strict_slashes: Enforce that the API URLs are requested with a
trailing */*
:param stream: If the route should provide a streaming support
:param version: Blueprint Version
:param name: Unique name to identify the Route
:return a decorated method that when invoked will return an object
of type :class:`FutureRoute`
"""
if strict_slashes is None:
strict_slashes = self.strict_slashes
def decorator(handler):
route = FutureRoute(
handler, uri, methods, host, strict_slashes, stream, version)
handler,
uri,
methods,
host,
strict_slashes,
stream,
version,
name,
)
self.routes.append(route)
return handler
return decorator
def add_route(self, handler, uri, methods=frozenset({'GET'}), host=None,
strict_slashes=False, version=None):
def add_route(
self,
handler,
uri,
methods=frozenset({"GET"}),
host=None,
strict_slashes=None,
version=None,
name=None,
stream=False,
):
"""Create a blueprint route from a function.
:param handler: function for handling uri requests. Accepts function,
or class instance with a view_class method.
:param uri: endpoint at which the route will be accessible.
:param methods: list of acceptable HTTP methods.
:param host: IP Address or FQDN for the sanic server to use.
:param strict_slashes: Enforce that the API URLs are requested with a
trailing */*
:param version: Blueprint Version
:param name: user defined route name for url_for
:param stream: boolean specifying if the handler is a stream handler
:return: function or class instance
"""
# Handle HTTPMethodView differently
if hasattr(handler, 'view_class'):
if hasattr(handler, "view_class"):
methods = set()
for method in HTTP_METHODS:
if getattr(handler.view_class, method.lower(), None):
methods.add(method)
if strict_slashes is None:
strict_slashes = self.strict_slashes
# handle composition view differently
if isinstance(handler, CompositionView):
methods = handler.handlers.keys()
self.route(uri=uri, methods=methods, host=host,
strict_slashes=strict_slashes, version=version)(handler)
self.route(
uri=uri,
methods=methods,
host=host,
strict_slashes=strict_slashes,
stream=stream,
version=version,
name=name,
)(handler)
return handler
def websocket(self, uri, host=None, strict_slashes=False, version=None):
def websocket(
self, uri, host=None, strict_slashes=None, version=None, name=None
):
"""Create a blueprint websocket route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:param host: IP Address or FQDN for the sanic server to use.
:param strict_slashes: Enforce that the API URLs are requested with a
trailing */*
:param version: Blueprint Version
:param name: Unique name to identify the Websocket Route
"""
if strict_slashes is None:
strict_slashes = self.strict_slashes
def decorator(handler):
route = FutureRoute(handler, uri, [], host, strict_slashes,
False, version)
route = FutureRoute(
handler, uri, [], host, strict_slashes, False, version, name
)
self.websocket_routes.append(route)
return handler
return decorator
def add_websocket_route(self, handler, uri, host=None, version=None):
def add_websocket_route(
self, handler, uri, host=None, version=None, name=None
):
"""Create a blueprint websocket route from a function.
:param handler: function for handling uri requests. Accepts function,
or class instance with a view_class method.
:param uri: endpoint at which the route will be accessible.
:param host: IP Address or FQDN for the sanic server to use.
:param version: Blueprint Version
:param name: Unique name to identify the Websocket Route
:return: function or class instance
"""
self.websocket(uri=uri, host=host, version=version)(handler)
self.websocket(uri=uri, host=host, version=version, name=name)(handler)
return handler
def listener(self, event):
@@ -161,13 +302,23 @@ class Blueprint:
:param event: Event to listen to.
"""
def decorator(listener):
self.listeners[event].append(listener)
return listener
return decorator
def middleware(self, *args, **kwargs):
"""Create a blueprint middleware from a decorated function."""
"""
Create a blueprint middleware from a decorated function.
:param args: Positional arguments to be used while invoking the
middleware
:param kwargs: optional keyword args that can be used with the
middleware.
"""
def register_middleware(_middleware):
future_middleware = FutureMiddleware(_middleware, args, kwargs)
self.middlewares.append(future_middleware)
@@ -179,14 +330,32 @@ class Blueprint:
args = []
return register_middleware(middleware)
else:
return register_middleware
if kwargs.get("bp_group") and callable(args[0]):
middleware = args[0]
args = args[1:]
kwargs.pop("bp_group")
return register_middleware(middleware)
else:
return register_middleware
def exception(self, *args, **kwargs):
"""Create a blueprint exception from a decorated function."""
"""
This method enables creating a global exception
handler for the current blueprint.
:param args: List of Python exceptions to be caught by the handler
:param kwargs: Additional optional arguments to be passed to the
exception handler
:return a decorated method to handle global exceptions for any
route registered under this blueprint.
"""
def decorator(handler):
exception = FutureException(handler, args, kwargs)
self.exceptions.append(exception)
return handler
return decorator
def static(self, uri, file_or_directory, *args, **kwargs):
@@ -195,40 +364,197 @@ class Blueprint:
:param uri: endpoint at which the route will be accessible.
:param file_or_directory: Static asset.
"""
name = kwargs.pop("name", "static")
if not name.startswith(self.name + "."):
name = "{}.{}".format(self.name, name)
kwargs.update(name=name)
strict_slashes = kwargs.get("strict_slashes")
if strict_slashes is None and self.strict_slashes is not None:
kwargs.update(strict_slashes=self.strict_slashes)
static = FutureStatic(uri, file_or_directory, args, kwargs)
self.statics.append(static)
# Shorthand method decorators
def get(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=["GET"], host=host,
strict_slashes=strict_slashes, version=version)
def get(
self, uri, host=None, strict_slashes=None, version=None, name=None
):
"""
Add an API URL under the **GET** *HTTP* method
def post(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=["POST"], host=host,
strict_slashes=strict_slashes, stream=stream,
version=version)
:param uri: URL to be tagged to the **GET** *HTTP* method
:param host: Host IP or FQDN for the service to use
:param strict_slashes: Instruct :class:`sanic.app.Sanic` to check
if the request URLs need to terminate with a */*
:param version: API Version
:param name: Unique name that can be used to identify the Route
:return: Object decorated with :func:`route` method
"""
return self.route(
uri,
methods=frozenset({"GET"}),
host=host,
strict_slashes=strict_slashes,
version=version,
name=name,
)
def put(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=["PUT"], host=host,
strict_slashes=strict_slashes, stream=stream,
version=version)
def post(
self,
uri,
host=None,
strict_slashes=None,
stream=False,
version=None,
name=None,
):
"""
Add an API URL under the **POST** *HTTP* method
def head(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=["HEAD"], host=host,
strict_slashes=strict_slashes, version=version)
:param uri: URL to be tagged to the **POST** *HTTP* method
:param host: Host IP or FQDN for the service to use
:param strict_slashes: Instruct :class:`sanic.app.Sanic` to check
if the request URLs need to terminate with a */*
:param version: API Version
:param name: Unique name that can be used to identify the Route
:return: Object decorated with :func:`route` method
"""
return self.route(
uri,
methods=frozenset({"POST"}),
host=host,
strict_slashes=strict_slashes,
stream=stream,
version=version,
name=name,
)
def options(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=["OPTIONS"], host=host,
strict_slashes=strict_slashes, version=version)
def put(
self,
uri,
host=None,
strict_slashes=None,
stream=False,
version=None,
name=None,
):
"""
Add an API URL under the **PUT** *HTTP* method
def patch(self, uri, host=None, strict_slashes=False, stream=False,
version=None):
return self.route(uri, methods=["PATCH"], host=host,
strict_slashes=strict_slashes, stream=stream,
version=version)
:param uri: URL to be tagged to the **PUT** *HTTP* method
:param host: Host IP or FQDN for the service to use
:param strict_slashes: Instruct :class:`sanic.app.Sanic` to check
if the request URLs need to terminate with a */*
:param version: API Version
:param name: Unique name that can be used to identify the Route
:return: Object decorated with :func:`route` method
"""
return self.route(
uri,
methods=frozenset({"PUT"}),
host=host,
strict_slashes=strict_slashes,
stream=stream,
version=version,
name=name,
)
def delete(self, uri, host=None, strict_slashes=False, version=None):
return self.route(uri, methods=["DELETE"], host=host,
strict_slashes=strict_slashes, version=version)
def head(
self, uri, host=None, strict_slashes=None, version=None, name=None
):
"""
Add an API URL under the **HEAD** *HTTP* method
:param uri: URL to be tagged to the **HEAD** *HTTP* method
:param host: Host IP or FQDN for the service to use
:param strict_slashes: Instruct :class:`sanic.app.Sanic` to check
if the request URLs need to terminate with a */*
:param version: API Version
:param name: Unique name that can be used to identify the Route
:return: Object decorated with :func:`route` method
"""
return self.route(
uri,
methods=frozenset({"HEAD"}),
host=host,
strict_slashes=strict_slashes,
version=version,
name=name,
)
def options(
self, uri, host=None, strict_slashes=None, version=None, name=None
):
"""
Add an API URL under the **OPTIONS** *HTTP* method
:param uri: URL to be tagged to the **OPTIONS** *HTTP* method
:param host: Host IP or FQDN for the service to use
:param strict_slashes: Instruct :class:`sanic.app.Sanic` to check
if the request URLs need to terminate with a */*
:param version: API Version
:param name: Unique name that can be used to identify the Route
:return: Object decorated with :func:`route` method
"""
return self.route(
uri,
methods=frozenset({"OPTIONS"}),
host=host,
strict_slashes=strict_slashes,
version=version,
name=name,
)
def patch(
self,
uri,
host=None,
strict_slashes=None,
stream=False,
version=None,
name=None,
):
"""
Add an API URL under the **PATCH** *HTTP* method
:param uri: URL to be tagged to the **PATCH** *HTTP* method
:param host: Host IP or FQDN for the service to use
:param strict_slashes: Instruct :class:`sanic.app.Sanic` to check
if the request URLs need to terminate with a */*
:param version: API Version
:param name: Unique name that can be used to identify the Route
:return: Object decorated with :func:`route` method
"""
return self.route(
uri,
methods=frozenset({"PATCH"}),
host=host,
strict_slashes=strict_slashes,
stream=stream,
version=version,
name=name,
)
def delete(
self, uri, host=None, strict_slashes=None, version=None, name=None
):
"""
Add an API URL under the **DELETE** *HTTP* method
:param uri: URL to be tagged to the **DELETE** *HTTP* method
:param host: Host IP or FQDN for the service to use
:param strict_slashes: Instruct :class:`sanic.app.Sanic` to check
if the request URLs need to terminate with a */*
:param version: API Version
:param name: Unique name that can be used to identify the Route
:return: Object decorated with :func:`route` method
"""
return self.route(
uri,
methods=frozenset({"DELETE"}),
host=host,
strict_slashes=strict_slashes,
version=version,
name=name,
)
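
The shorthand decorators above are thin wrappers over `Blueprint.route`. A minimal usage sketch, not part of the diff; the blueprint name, URL prefix, handlers and middleware are invented for illustration:

from sanic import Blueprint, Sanic
from sanic.response import json

bp = Blueprint("api", url_prefix="/api")        # hypothetical blueprint

@bp.get("/items", name="list_items")            # same as route(..., methods=["GET"])
async def list_items(request):
    return json({"items": []})

@bp.post("/items", version=1)                   # version is applied as a /v1 prefix
async def create_item(request):
    return json(request.json, status=201)

@bp.middleware("request")                       # runs before every handler on this blueprint
async def tag_request(request):
    request["tag"] = "api"                      # Request subclasses dict, so item assignment works

app = Sanic("example")
app.blueprint(bp)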


@@ -1,137 +1,48 @@
import os
import sys
import syslog
import platform
import types
from sanic.log import DefaultFilter
from distutils.util import strtobool
SANIC_PREFIX = 'SANIC_'
from sanic.exceptions import PyFileError
_address_dict = {
'Windows': ('localhost', 514),
'Darwin': '/var/run/syslog',
'Linux': '/dev/log',
'FreeBSD': '/var/run/log'
SANIC_PREFIX = "SANIC_"
BASE_LOGO = """
Sanic
Build Fast. Run Fast.
"""
DEFAULT_CONFIG = {
"REQUEST_MAX_SIZE": 100000000, # 100 megabytes
"REQUEST_BUFFER_QUEUE_SIZE": 100,
"REQUEST_TIMEOUT": 60, # 60 seconds
"RESPONSE_TIMEOUT": 60, # 60 seconds
"KEEP_ALIVE": True,
"KEEP_ALIVE_TIMEOUT": 5, # 5 seconds
"WEBSOCKET_MAX_SIZE": 2 ** 20, # 1 megabytes
"WEBSOCKET_MAX_QUEUE": 32,
"WEBSOCKET_READ_LIMIT": 2 ** 16,
"WEBSOCKET_WRITE_LIMIT": 2 ** 16,
"GRACEFUL_SHUTDOWN_TIMEOUT": 15.0, # 15 sec
"ACCESS_LOG": True,
}
LOGGING = {
'version': 1,
'disable_existing_loggers': False,
'filters': {
'accessFilter': {
'()': DefaultFilter,
'param': [0, 10, 20]
},
'errorFilter': {
'()': DefaultFilter,
'param': [30, 40, 50]
}
},
'formatters': {
'simple': {
'format': '%(asctime)s - (%(name)s)[%(levelname)s]: %(message)s',
'datefmt': '%Y-%m-%d %H:%M:%S'
},
'access': {
'format': '%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: ' +
'%(request)s %(message)s %(status)d %(byte)d',
'datefmt': '%Y-%m-%d %H:%M:%S'
}
},
'handlers': {
'internal': {
'class': 'logging.StreamHandler',
'filters': ['accessFilter'],
'formatter': 'simple',
'stream': sys.stderr
},
'accessStream': {
'class': 'logging.StreamHandler',
'filters': ['accessFilter'],
'formatter': 'access',
'stream': sys.stderr
},
'errorStream': {
'class': 'logging.StreamHandler',
'filters': ['errorFilter'],
'formatter': 'simple',
'stream': sys.stderr
},
# before you use accessSysLog, be sure that log levels
# 0, 10, 20 have been enabled in your syslog configuration,
# otherwise you won't be able to see the output in the syslog
# log file.
'accessSysLog': {
'class': 'logging.handlers.SysLogHandler',
'address': _address_dict.get(platform.system(),
('localhost', 514)),
'facility': syslog.LOG_DAEMON,
'filters': ['accessFilter'],
'formatter': 'access'
},
'errorSysLog': {
'class': 'logging.handlers.SysLogHandler',
'address': _address_dict.get(platform.system(),
('localhost', 514)),
'facility': syslog.LOG_DAEMON,
'filters': ['errorFilter'],
'formatter': 'simple'
},
},
'loggers': {
'sanic': {
'level': 'DEBUG',
'handlers': ['internal', 'errorStream']
},
'network': {
'level': 'DEBUG',
'handlers': ['accessStream', 'errorStream']
}
}
}
# this happens when using containers or systems without syslog
# keeping these handlers in the config would cause a 'file not exists' error
_addr = LOGGING['handlers']['accessSysLog']['address']
if type(_addr) is str and not os.path.exists(_addr):
LOGGING['handlers'].pop('accessSysLog')
LOGGING['handlers'].pop('errorSysLog')
class Config(dict):
def __init__(self, defaults=None, load_env=True, keep_alive=True):
super().__init__(defaults or {})
self.LOGO = """
▄▄▄▄▄
▀▀▀██████▄▄▄ _______________
▄▄▄▄▄ █████████▄ / \\
▀▀▀▀█████▌ ▀▐▄ ▀▐█ | Gotta go fast! |
▀▀█████▄▄ ▀██████▄██ | _________________/
▀▄▄▄▄▄ ▀▀█▄▀█════█▀ |/
▀▀▀▄ ▀▀███ ▀ ▄▄
▄███▀▀██▄████████▄ ▄▀▀▀▀▀▀█▌
██▀▄▄▄██▀▄███▀ ▀▀████ ▄██
▄▀▀▀▄██▄▀▀▌████▒▒▒▒▒▒███ ▌▄▄▀
▌ ▐▀████▐███▒▒▒▒▒▐██▌
▀▄▄▄▄▀ ▀▀████▒▒▒▒▄██▀
▀▀█████████▀
▄▄██▀██████▀█
▄██▀ ▀▀▀ █
▄█ ▐▌
▄▄▄▄█▌ ▀█▄▄▄▄▀▀▄
▌ ▐ ▀▀▄▄▄▀
▀▀▄▄▀
"""
self.REQUEST_MAX_SIZE = 100000000 # 100 megabytes
self.REQUEST_TIMEOUT = 60 # 60 seconds
self.KEEP_ALIVE = keep_alive
self.WEBSOCKET_MAX_SIZE = 2 ** 20  # 1 megabyte
self.WEBSOCKET_MAX_QUEUE = 32
self.GRACEFUL_SHUTDOWN_TIMEOUT = 15.0 # 15 sec
def __init__(self, defaults=None, load_env=True, keep_alive=None):
defaults = defaults or {}
super().__init__({**DEFAULT_CONFIG, **defaults})
self.LOGO = BASE_LOGO
if keep_alive is not None:
self.KEEP_ALIVE = keep_alive
if load_env:
self.load_environment_vars()
prefix = SANIC_PREFIX if load_env is True else load_env
self.load_environment_vars(prefix=prefix)
def __getattr__(self, attr):
try:
@@ -151,9 +62,10 @@ class Config(dict):
"""
config_file = os.environ.get(variable_name)
if not config_file:
raise RuntimeError('The environment variable %r is not set and '
'thus configuration could not be loaded.' %
variable_name)
raise RuntimeError(
"The environment variable %r is not set and "
"thus configuration could not be loaded." % variable_name
)
return self.from_pyfile(config_file)
def from_pyfile(self, filename):
@@ -162,15 +74,20 @@ class Config(dict):
:param filename: an absolute path to the config file
"""
module = types.ModuleType('config')
module = types.ModuleType("config")
module.__file__ = filename
try:
with open(filename) as config_file:
exec(compile(config_file.read(), filename, 'exec'),
module.__dict__)
exec(
compile(config_file.read(), filename, "exec"),
module.__dict__,
)
except IOError as e:
e.strerror = 'Unable to load configuration file (%s)' % e.strerror
e.strerror = "Unable to load configuration file (%s)" % e.strerror
raise
except Exception as e:
raise PyFileError(filename) from e
self.from_object(module)
return True
@@ -195,18 +112,21 @@ class Config(dict):
if key.isupper():
self[key] = getattr(obj, key)
def load_environment_vars(self):
def load_environment_vars(self, prefix=SANIC_PREFIX):
"""
Looks for any ``SANIC_`` prefixed environment variables and applies
Looks for prefixed environment variables and applies
them to the configuration if present.
"""
for k, v in os.environ.items():
if k.startswith(SANIC_PREFIX):
_, config_key = k.split(SANIC_PREFIX, 1)
if k.startswith(prefix):
_, config_key = k.split(prefix, 1)
try:
self[config_key] = int(v)
except ValueError:
try:
self[config_key] = float(v)
except ValueError:
self[config_key] = v
try:
self[config_key] = bool(strtobool(v))
except ValueError:
self[config_key] = v
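
Rough sketch of the new prefix handling and the coercion order (int, then float, then bool via strtobool, then plain string); the MYAPP_ prefix and values are invented for illustration:

import os
from sanic.config import Config

os.environ["MYAPP_REQUEST_TIMEOUT"] = "30"      # coerced to int
os.environ["MYAPP_ACCESS_LOG"] = "false"        # falls through to strtobool -> False

config = Config(load_env="MYAPP_")              # any non-True value is used as the prefix
assert config.REQUEST_TIMEOUT == 30
assert config.ACCESS_LOG is False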


@@ -1 +1 @@
HTTP_METHODS = ('GET', 'POST', 'PUT', 'HEAD', 'OPTIONS', 'PATCH', 'DELETE')
HTTP_METHODS = ("GET", "POST", "PUT", "HEAD", "OPTIONS", "PATCH", "DELETE")


@@ -1,6 +1,11 @@
import re
import string
from datetime import datetime
DEFAULT_MAX_AGE = 0
# ------------------------------------------------------------ #
# SimpleCookie
# ------------------------------------------------------------ #
@@ -8,18 +13,16 @@ import string
# Straight up copied this section of dark magic from SimpleCookie
_LegalChars = string.ascii_letters + string.digits + "!#$%&'*+-.^_`|~:"
_UnescapedChars = _LegalChars + ' ()/<=>?@[]{}'
_UnescapedChars = _LegalChars + " ()/<=>?@[]{}"
_Translator = {n: '\\%03o' % n
for n in set(range(256)) - set(map(ord, _UnescapedChars))}
_Translator.update({
ord('"'): '\\"',
ord('\\'): '\\\\',
})
_Translator = {
n: "\\%03o" % n for n in set(range(256)) - set(map(ord, _UnescapedChars))
}
_Translator.update({ord('"'): '\\"', ord("\\"): "\\\\"})
def _quote(str):
"""Quote a string for use in a cookie header.
r"""Quote a string for use in a cookie header.
If the string does not need to be double-quoted, then just return the
string. Otherwise, surround the string in doublequotes and quote
(with a \) special characters.
@@ -30,7 +33,7 @@ def _quote(str):
return '"' + str.translate(_Translator) + '"'
_is_legal_key = re.compile('[%s]+' % re.escape(_LegalChars)).fullmatch
_is_legal_key = re.compile("[%s]+" % re.escape(_LegalChars)).fullmatch
# ------------------------------------------------------------ #
# Custom SimpleCookie
@@ -47,33 +50,37 @@ class CookieJar(dict):
super().__init__()
self.headers = headers
self.cookie_headers = {}
self.header_key = "Set-Cookie"
def __setitem__(self, key, value):
# If this cookie doesn't exist, add it to the header keys
cookie_header = self.cookie_headers.get(key)
if not cookie_header:
if not self.cookie_headers.get(key):
cookie = Cookie(key, value)
cookie['path'] = '/'
cookie_header = MultiHeader("Set-Cookie")
self.cookie_headers[key] = cookie_header
self.headers[cookie_header] = cookie
cookie["path"] = "/"
self.cookie_headers[key] = self.header_key
self.headers.add(self.header_key, cookie)
return super().__setitem__(key, cookie)
else:
self[key].value = value
def __delitem__(self, key):
if key not in self.cookie_headers:
self[key] = ''
self[key]['max-age'] = 0
self[key] = ""
self[key]["max-age"] = 0
else:
cookie_header = self.cookie_headers[key]
del self.headers[cookie_header]
# remove it from header
cookies = self.headers.popall(cookie_header)
for cookie in cookies:
if cookie.key != key:
self.headers.add(cookie_header, cookie)
del self.cookie_headers[key]
return super().__delitem__(key)
class Cookie(dict):
"""A stripped down version of Morsel from SimpleCookie #gottagofast"""
_keys = {
"expires": "expires",
"path": "Path",
@@ -83,8 +90,9 @@ class Cookie(dict):
"secure": "Secure",
"httponly": "HttpOnly",
"version": "Version",
"samesite": "SameSite",
}
_flags = {'secure', 'httponly'}
_flags = {"secure", "httponly"}
def __init__(self, key, value):
if key in self._keys:
@@ -98,42 +106,45 @@ class Cookie(dict):
def __setitem__(self, key, value):
if key not in self._keys:
raise KeyError("Unknown cookie property")
return super().__setitem__(key, value)
if value is not False:
if key.lower() == "max-age":
if not str(value).isdigit():
value = DEFAULT_MAX_AGE
elif key.lower() == "expires":
if not isinstance(value, datetime):
raise TypeError(
"Cookie 'expires' property must be a datetime"
)
return super().__setitem__(key, value)
def encode(self, encoding):
output = ['%s=%s' % (self.key, _quote(self.value))]
"""
Encode the cookie content in the encoding chosen by the developer.
Leverages the :func:`str.encode` method provided by Python.
This method can be used to encode and embed ``utf-8`` content into
the cookies.
:param encoding: Encoding to be used with the cookie
:return: Cookie encoded in the codec of choice.
:except: UnicodeEncodeError
"""
output = ["%s=%s" % (self.key, _quote(self.value))]
for key, value in self.items():
if key == 'max-age':
if key == "max-age":
try:
output.append('%s=%d' % (self._keys[key], value))
output.append("%s=%d" % (self._keys[key], value))
except TypeError:
output.append('%s=%s' % (self._keys[key], value))
elif key == 'expires':
try:
output.append('%s=%s' % (
self._keys[key],
value.strftime("%a, %d-%b-%Y %T GMT")
))
except AttributeError:
output.append('%s=%s' % (self._keys[key], value))
output.append("%s=%s" % (self._keys[key], value))
elif key == "expires":
output.append(
"%s=%s"
% (self._keys[key], value.strftime("%a, %d-%b-%Y %T GMT"))
)
elif key in self._flags and self[key]:
output.append(self._keys[key])
else:
output.append('%s=%s' % (self._keys[key], value))
output.append("%s=%s" % (self._keys[key], value))
return "; ".join(output).encode(encoding)
# ------------------------------------------------------------ #
# Header Trickery
# ------------------------------------------------------------ #
class MultiHeader:
"""String-holding object which allow us to set a header within response
that has a unique key, but may contain duplicate header names
"""
def __init__(self, name):
self.name = name
def encode(self):
return self.name.encode()
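
For orientation, a short sketch of how the reworked CookieJar/Cookie behave from a handler; the handler and cookie names are hypothetical and not part of the diff:

from datetime import datetime, timedelta
from sanic.response import text

async def handler(request):
    response = text("ok")
    response.cookies["session"] = "abc123"           # adds a Set-Cookie header
    response.cookies["session"]["max-age"] = 3600    # non-numeric values fall back to 0
    response.cookies["session"]["httponly"] = True
    response.cookies["session"]["samesite"] = "Lax"  # key added in this change
    response.cookies["session"]["expires"] = (       # must be a datetime, else TypeError
        datetime.utcnow() + timedelta(days=1)
    )
    # Deleting a cookie that was never set on this response tells the client to drop it:
    del response.cookies["stale"]
    return response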


@@ -1,6 +1,7 @@
from sanic.response import ALL_STATUS_CODES, COMMON_STATUS_CODES
from sanic.helpers import STATUS_CODES
TRACEBACK_STYLE = '''
TRACEBACK_STYLE = """
<style>
body {
padding: 20px;
@@ -61,9 +62,9 @@ TRACEBACK_STYLE = '''
font-size: 14px;
}
</style>
'''
"""
TRACEBACK_WRAPPER_HTML = '''
TRACEBACK_WRAPPER_HTML = """
<html>
<head>
{style}
@@ -78,27 +79,27 @@ TRACEBACK_WRAPPER_HTML = '''
</div>
</body>
</html>
'''
"""
TRACEBACK_WRAPPER_INNER_HTML = '''
TRACEBACK_WRAPPER_INNER_HTML = """
<h1>{exc_name}</h1>
<h3><code>{exc_value}</code></h3>
<div class="tb-wrapper">
<p class="tb-header">Traceback (most recent call last):</p>
{frame_html}
</div>
'''
"""
TRACEBACK_BORDER = '''
TRACEBACK_BORDER = """
<div class="tb-border">
<b><i>
The above exception was the direct cause of the
following exception:
</i></b>
</div>
'''
"""
TRACEBACK_LINE_HTML = '''
TRACEBACK_LINE_HTML = """
<div class="frame-line">
<p class="frame-descriptor">
File {0.filename}, line <i>{0.lineno}</i>,
@@ -106,15 +107,15 @@ TRACEBACK_LINE_HTML = '''
</p>
<p class="frame-code"><code>{0.line}</code></p>
</div>
'''
"""
INTERNAL_SERVER_ERROR_HTML = '''
INTERNAL_SERVER_ERROR_HTML = """
<h1>Internal Server Error</h1>
<p>
The server encountered an internal error and cannot complete
your request.
</p>
'''
"""
_sanic_exceptions = {}
@@ -122,17 +123,18 @@ _sanic_exceptions = {}
def add_status_code(code):
"""
Decorator used for adding exceptions to _sanic_exceptions.
Decorator used for adding exceptions to :class:`SanicException`.
"""
def class_decorator(cls):
cls.status_code = code
_sanic_exceptions[code] = cls
return cls
return class_decorator
class SanicException(Exception):
def __init__(self, message, status_code=None):
super().__init__(message)
@@ -150,18 +152,34 @@ class InvalidUsage(SanicException):
pass
@add_status_code(405)
class MethodNotSupported(SanicException):
def __init__(self, message, method, allowed_methods):
super().__init__(message)
self.headers = dict()
self.headers["Allow"] = ", ".join(allowed_methods)
if method in ["HEAD", "PATCH", "PUT", "DELETE"]:
self.headers["Content-Length"] = 0
@add_status_code(500)
class ServerError(SanicException):
pass
@add_status_code(503)
class ServiceUnavailable(SanicException):
"""The server is currently unavailable (because it is overloaded or
down for maintenance). Generally, this is a temporary state."""
pass
class URLBuildError(ServerError):
pass
class FileNotFound(NotFound):
pass
def __init__(self, message, path, relative_url):
super().__init__(message)
self.path = path
@@ -170,6 +188,14 @@ class FileNotFound(NotFound):
@add_status_code(408)
class RequestTimeout(SanicException):
"""The Web server (running the Web site) thinks that there has been too
long an interval of time between 1) the establishment of an IP
connection (socket) between the client and the server and
2) the receipt of any data on that socket, so the server has dropped
the connection. The socket connection has actually been lost - the Web
server has 'timed out' on that particular socket connection.
"""
pass
@@ -184,13 +210,11 @@ class HeaderNotFound(InvalidUsage):
@add_status_code(416)
class ContentRangeError(SanicException):
pass
def __init__(self, message, content_range):
super().__init__(message)
self.headers = {
'Content-Type': 'text/plain',
"Content-Range": "bytes */%s" % (content_range.total,)
"Content-Type": "text/plain",
"Content-Range": "bytes */%s" % (content_range.total,),
}
@@ -203,12 +227,18 @@ class InvalidRangeType(ContentRangeError):
pass
class PyFileError(Exception):
def __init__(self, file):
super().__init__("could not execute config file %s", file)
@add_status_code(401)
class Unauthorized(SanicException):
"""
Unauthorized exception (401 HTTP status code).
:param message: Message describing the exception.
:param status_code: HTTP Status code.
:param scheme: Name of the authentication scheme to be used.
When present, kwargs is used to complete the WWW-Authentication header.
@@ -216,11 +246,13 @@ class Unauthorized(SanicException):
Examples::
# With a Basic auth-scheme, realm MUST be present:
raise Unauthorized("Auth required.", "Basic", realm="Restricted Area")
raise Unauthorized("Auth required.",
scheme="Basic",
realm="Restricted Area")
# With a Digest auth-scheme, things are a bit more complicated:
raise Unauthorized("Auth required.",
"Digest",
scheme="Digest",
realm="Restricted Area",
qop="auth, auth-int",
algorithm="MD5",
@@ -228,20 +260,25 @@ class Unauthorized(SanicException):
opaque="zyxwvu")
# With a Bearer auth-scheme, realm is optional so you can write:
raise Unauthorized("Auth required.", "Bearer")
raise Unauthorized("Auth required.", scheme="Bearer")
# or, if you want to specify the realm:
raise Unauthorized("Auth required.", "Bearer", realm="Restricted Area")
raise Unauthorized("Auth required.",
scheme="Bearer",
realm="Restricted Area")
"""
def __init__(self, message, scheme, **kwargs):
super().__init__(message)
values = ["{!s}={!r}".format(k, v) for k, v in kwargs.items()]
challenge = ', '.join(values)
def __init__(self, message, status_code=None, scheme=None, **kwargs):
super().__init__(message, status_code)
self.headers = {
"WWW-Authenticate": "{} {}".format(scheme, challenge).rstrip()
}
# if auth-scheme is specified, set "WWW-Authenticate" header
if scheme is not None:
values = ['{!s}="{!s}"'.format(k, v) for k, v in kwargs.items()]
challenge = ", ".join(values)
self.headers = {
"WWW-Authenticate": "{} {}".format(scheme, challenge).rstrip()
}
def abort(status_code, message=None):
@@ -254,9 +291,8 @@ def abort(status_code, message=None):
in response.py for the given status code.
"""
if message is None:
message = COMMON_STATUS_CODES.get(status_code,
ALL_STATUS_CODES.get(status_code))
message = STATUS_CODES.get(status_code)
# These are stored as bytes in the STATUS_CODES dict
message = message.decode('utf8')
message = message.decode("utf8")
sanic_exception = _sanic_exceptions.get(status_code, SanicException)
raise sanic_exception(message=message, status_code=status_code)
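
Illustrative sketch of the updated Unauthorized signature and abort(); the route, token value and realm are placeholders, not part of the diff:

from sanic import Sanic
from sanic.exceptions import NotFound, Unauthorized, abort
from sanic.response import json

app = Sanic("example")

@app.route("/admin")
async def admin(request):
    if request.token is None:
        # scheme is now optional; pass it by name and the remaining kwargs
        # populate the WWW-Authenticate challenge
        raise Unauthorized("Auth required.", scheme="Bearer", realm="admin")
    if request.token != "expected-token":            # hypothetical check
        abort(403)                                   # message defaults to STATUS_CODES[403]
    return json({"ok": True})

@app.exception(NotFound)
async def handle_404(request, exception):
    return json({"error": str(exception)}, status=404)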


@@ -1,22 +1,36 @@
import sys
from traceback import format_exc, extract_tb
from traceback import extract_tb, format_exc
from sanic.exceptions import (
ContentRangeError,
HeaderNotFound,
INTERNAL_SERVER_ERROR_HTML,
InvalidRangeType,
SanicException,
TRACEBACK_BORDER,
TRACEBACK_LINE_HTML,
TRACEBACK_STYLE,
TRACEBACK_WRAPPER_HTML,
TRACEBACK_WRAPPER_INNER_HTML,
TRACEBACK_BORDER)
from sanic.log import log
from sanic.response import text, html
ContentRangeError,
HeaderNotFound,
InvalidRangeType,
SanicException,
)
from sanic.log import logger
from sanic.response import html, text
class ErrorHandler:
"""
Provide :class:`sanic.app.Sanic` application with a mechanism to handle
and process any and all uncaught exceptions in a way the application
developer sees fit.
This error handling framework is built into the core and can be extended
by developers to perform a wide range of tasks, from recording error
stats to reporting them to an external service that can be used for
a realtime alerting system.
"""
handlers = None
cached_handlers = None
_missing = object()
@@ -36,7 +50,8 @@ class ErrorHandler:
return TRACEBACK_WRAPPER_INNER_HTML.format(
exc_name=exception.__class__.__name__,
exc_value=exception,
frame_html=''.join(frame_html))
frame_html="".join(frame_html),
)
def _render_traceback_html(self, exception, request):
exc_type, exc_value, tb = sys.exc_info()
@@ -51,13 +66,39 @@ class ErrorHandler:
exc_name=exception.__class__.__name__,
exc_value=exception,
inner_html=TRACEBACK_BORDER.join(reversed(exceptions)),
path=request.path)
path=request.path,
)
def add(self, exception, handler):
"""
Add a new exception handler to an already existing handler object.
:param exception: Type of exception that needs to be handled
:param handler: Reference to the method that will handle the exception
:type exception: :class:`sanic.exceptions.SanicException` or
:class:`Exception`
:type handler: ``function``
:return: None
"""
self.handlers.append((exception, handler))
def lookup(self, exception):
handler = self.cached_handlers.get(exception, self._missing)
"""
Lookup the existing instance of :class:`ErrorHandler` and fetch the
registered handler for a specific type of exception.
This method leverages a dict lookup to speed up the retrieval process.
:param exception: Type of exception
:type exception: :class:`sanic.exceptions.SanicException` or
:class:`Exception`
:return: Registered function if found, ``None`` otherwise
"""
handler = self.cached_handlers.get(type(exception), self._missing)
if handler is self._missing:
for exception_class, handler in self.handlers:
if isinstance(exception, exception_class):
@@ -71,101 +112,149 @@ class ErrorHandler:
"""Fetches and executes an exception handler and returns a response
object
:param request: Request
:param request: Instance of :class:`sanic.request.Request`
:param exception: Exception to handle
:return: Response object
:type request: :class:`sanic.request.Request`
:type exception: :class:`sanic.exceptions.SanicException` or
:class:`Exception`
:return: Wraps the return value obtained from :func:`default`
or the registered handler for that type of exception.
"""
handler = self.lookup(exception)
response = None
try:
if handler:
response = handler(request=request, exception=exception)
response = handler(request, exception)
if response is None:
response = self.default(request=request, exception=exception)
response = self.default(request, exception)
except Exception:
self.log(format_exc())
try:
url = repr(request.url)
except AttributeError:
url = "unknown"
response_message = (
"Exception raised in exception handler " '"%s" for uri: %s'
)
logger.exception(response_message, handler.__name__, url)
if self.debug:
url = getattr(request, 'url', 'unknown')
response_message = (
'Exception raised in exception handler "{}" '
'for uri: "{}"\n{}').format(
handler.__name__, url, format_exc())
log.error(response_message)
return text(response_message, 500)
return text(response_message % (handler.__name__, url), 500)
else:
return text('An error occurred while handling an error', 500)
return text("An error occurred while handling an error", 500)
return response
def log(self, message, level='error'):
def log(self, message, level="error"):
"""
Override this method in an ErrorHandler subclass to prevent
logging exceptions.
Deprecated, do not use.
"""
getattr(log, level)(message)
def default(self, request, exception):
"""
Provide a default behavior for the objects of :class:`ErrorHandler`.
If a developer chooses to extend the :class:`ErrorHandler` they can
provide a custom implementation for this method to behave in a way
they see fit.
:param request: Incoming request
:param exception: Exception object
:type request: :class:`sanic.request.Request`
:type exception: :class:`sanic.exceptions.SanicException` or
:class:`Exception`
:return:
"""
self.log(format_exc())
try:
url = repr(request.url)
except AttributeError:
url = "unknown"
response_message = "Exception occurred while handling uri: %s"
logger.exception(response_message, url)
if issubclass(type(exception), SanicException):
return text(
'Error: {}'.format(exception),
status=getattr(exception, 'status_code', 500),
headers=getattr(exception, 'headers', dict())
"Error: {}".format(exception),
status=getattr(exception, "status_code", 500),
headers=getattr(exception, "headers", dict()),
)
elif self.debug:
html_output = self._render_traceback_html(exception, request)
response_message = (
'Exception occurred while handling uri: "{}"\n{}'.format(
request.url, format_exc()))
log.error(response_message)
return html(html_output, status=500)
else:
return html(INTERNAL_SERVER_ERROR_HTML, status=500)
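
A minimal sketch of extending ErrorHandler as the docstrings above describe; the class name and JSON shape are made up for illustration:

from sanic import Sanic
from sanic.handlers import ErrorHandler
from sanic.response import json

class JSONErrorHandler(ErrorHandler):
    """Report every otherwise-unhandled exception as JSON."""

    def default(self, request, exception):
        status = getattr(exception, "status_code", 500)
        return json({"error": str(exception)}, status=status)

app = Sanic("example", error_handler=JSONErrorHandler())

# Handlers for specific exception types can still be registered directly:
async def handle_key_error(request, exception):
    return json({"error": "missing key"}, status=400)

app.error_handler.add(KeyError, handle_key_error)
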
class ContentRangeHandler:
"""Class responsible for parsing request header"""
__slots__ = ('start', 'end', 'size', 'total', 'headers')
"""
A mechanism to parse and process the incoming request headers to
extract the content range information.
:param request: Incoming api request
:param stats: Stats related to the content
:type request: :class:`sanic.request.Request`
:type stats: :class:`posix.stat_result`
:ivar start: Content Range start
:ivar end: Content Range end
:ivar size: Length of the content
:ivar total: Total size identified by the :class:`posix.stat_result`
instance
:ivar ContentRangeHandler.headers: Content range header ``dict``
"""
__slots__ = ("start", "end", "size", "total", "headers")
def __init__(self, request, stats):
self.total = stats.st_size
_range = request.headers.get('Range')
_range = request.headers.get("Range")
if _range is None:
raise HeaderNotFound('Range Header Not Found')
unit, _, value = tuple(map(str.strip, _range.partition('=')))
if unit != 'bytes':
raise HeaderNotFound("Range Header Not Found")
unit, _, value = tuple(map(str.strip, _range.partition("=")))
if unit != "bytes":
raise InvalidRangeType(
'%s is not a valid Range Type' % (unit,), self)
start_b, _, end_b = tuple(map(str.strip, value.partition('-')))
"%s is not a valid Range Type" % (unit,), self
)
start_b, _, end_b = tuple(map(str.strip, value.partition("-")))
try:
self.start = int(start_b) if start_b else None
except ValueError:
raise ContentRangeError(
'\'%s\' is invalid for Content Range' % (start_b,), self)
"'%s' is invalid for Content Range" % (start_b,), self
)
try:
self.end = int(end_b) if end_b else None
except ValueError:
raise ContentRangeError(
'\'%s\' is invalid for Content Range' % (end_b,), self)
"'%s' is invalid for Content Range" % (end_b,), self
)
if self.end is None:
if self.start is None:
raise ContentRangeError(
'Invalid for Content Range parameters', self)
"Invalid for Content Range parameters", self
)
else:
# this case represents `Content-Range: bytes 5-`
self.end = self.total
self.end = self.total - 1
else:
if self.start is None:
# this case represents `Content-Range: bytes -5`
self.start = self.total - self.end
self.end = self.total
self.end = self.total - 1
if self.start >= self.end:
raise ContentRangeError(
'Invalid for Content Range parameters', self)
self.size = self.end - self.start
"Invalid for Content Range parameters", self
)
self.size = self.end - self.start + 1
self.headers = {
'Content-Range': "bytes %s-%s/%s" % (
self.start, self.end, self.total)}
"Content-Range": "bytes %s-%s/%s"
% (self.start, self.end, self.total)
}
def __bool__(self):
return self.size > 0
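
Worked example of the corrected inclusive-range math above; the 1000-byte size and the fake request/stat objects are stand-ins, not part of the diff:

from collections import namedtuple
from sanic.handlers import ContentRangeHandler

FakeStat = namedtuple("FakeStat", ["st_size"])       # mimics os.stat() output

class FakeRequest:                                    # only .headers is consulted
    headers = {"Range": "bytes=500-"}

handler = ContentRangeHandler(FakeRequest(), FakeStat(st_size=1000))
# Open-ended range: end becomes total - 1, size is inclusive
assert (handler.start, handler.end, handler.size) == (500, 999, 500)
assert handler.headers["Content-Range"] == "bytes 500-999/1000"

# A suffix range "bytes=-200" would resolve to start=800, end=999, size=200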

sanic/helpers.py Normal file

@@ -0,0 +1,133 @@
"""Defines basics of HTTP standard."""
STATUS_CODES = {
100: b"Continue",
101: b"Switching Protocols",
102: b"Processing",
200: b"OK",
201: b"Created",
202: b"Accepted",
203: b"Non-Authoritative Information",
204: b"No Content",
205: b"Reset Content",
206: b"Partial Content",
207: b"Multi-Status",
208: b"Already Reported",
226: b"IM Used",
300: b"Multiple Choices",
301: b"Moved Permanently",
302: b"Found",
303: b"See Other",
304: b"Not Modified",
305: b"Use Proxy",
307: b"Temporary Redirect",
308: b"Permanent Redirect",
400: b"Bad Request",
401: b"Unauthorized",
402: b"Payment Required",
403: b"Forbidden",
404: b"Not Found",
405: b"Method Not Allowed",
406: b"Not Acceptable",
407: b"Proxy Authentication Required",
408: b"Request Timeout",
409: b"Conflict",
410: b"Gone",
411: b"Length Required",
412: b"Precondition Failed",
413: b"Request Entity Too Large",
414: b"Request-URI Too Long",
415: b"Unsupported Media Type",
416: b"Requested Range Not Satisfiable",
417: b"Expectation Failed",
418: b"I'm a teapot",
422: b"Unprocessable Entity",
423: b"Locked",
424: b"Failed Dependency",
426: b"Upgrade Required",
428: b"Precondition Required",
429: b"Too Many Requests",
431: b"Request Header Fields Too Large",
451: b"Unavailable For Legal Reasons",
500: b"Internal Server Error",
501: b"Not Implemented",
502: b"Bad Gateway",
503: b"Service Unavailable",
504: b"Gateway Timeout",
505: b"HTTP Version Not Supported",
506: b"Variant Also Negotiates",
507: b"Insufficient Storage",
508: b"Loop Detected",
510: b"Not Extended",
511: b"Network Authentication Required",
}
# According to https://tools.ietf.org/html/rfc2616#section-7.1
_ENTITY_HEADERS = frozenset(
[
"allow",
"content-encoding",
"content-language",
"content-length",
"content-location",
"content-md5",
"content-range",
"content-type",
"expires",
"last-modified",
"extension-header",
]
)
# According to https://tools.ietf.org/html/rfc2616#section-13.5.1
_HOP_BY_HOP_HEADERS = frozenset(
[
"connection",
"keep-alive",
"proxy-authenticate",
"proxy-authorization",
"te",
"trailers",
"transfer-encoding",
"upgrade",
]
)
def has_message_body(status):
"""
According to the following RFCs, a message body and length SHOULD NOT
be included in responses with status 1XX, 204 and 304.
https://tools.ietf.org/html/rfc2616#section-4.4
https://tools.ietf.org/html/rfc2616#section-4.3
"""
return status not in (204, 304) and not (100 <= status < 200)
def is_entity_header(header):
"""Checks if the given header is an Entity Header"""
return header.lower() in _ENTITY_HEADERS
def is_hop_by_hop_header(header):
"""Checks if the given header is a Hop By Hop header"""
return header.lower() in _HOP_BY_HOP_HEADERS
def remove_entity_headers(headers, allowed=("content-location", "expires")):
"""
Removes all the entity headers present in the headers given.
According to RFC 2616 Section 10.3.5,
Content-Location and Expires are allowed, as they act as the
"strong cache validator".
https://tools.ietf.org/html/rfc2616#section-10.3.5
Returns the headers without the entity headers.
"""
allowed = set([h.lower() for h in allowed])
headers = {
header: value
for header, value in headers.items()
if not is_entity_header(header) or header.lower() in allowed
}
return headers
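
Quick sketch of the three helpers added here; the header values are arbitrary examples:

from sanic.helpers import (
    has_message_body,
    is_entity_header,
    remove_entity_headers,
)

assert has_message_body(200) is True
assert has_message_body(204) is False               # 1xx, 204 and 304 carry no body

assert is_entity_header("Content-Length") is True

headers = {"Content-Length": "42", "Expires": "0", "X-Custom": "1"}
# Entity headers are stripped except the allowed strong-validator ones:
assert remove_entity_headers(headers) == {"Expires": "0", "X-Custom": "1"}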


@@ -1,18 +1,58 @@
import logging
import sys
class DefaultFilter(logging.Filter):
def __init__(self, param=None):
self.param = param
def filter(self, record):
if self.param is None:
return True
if record.levelno in self.param:
return True
return False
LOGGING_CONFIG_DEFAULTS = dict(
version=1,
disable_existing_loggers=False,
loggers={
"sanic.root": {"level": "INFO", "handlers": ["console"]},
"sanic.error": {
"level": "INFO",
"handlers": ["error_console"],
"propagate": True,
"qualname": "sanic.error",
},
"sanic.access": {
"level": "INFO",
"handlers": ["access_console"],
"propagate": True,
"qualname": "sanic.access",
},
},
handlers={
"console": {
"class": "logging.StreamHandler",
"formatter": "generic",
"stream": sys.stdout,
},
"error_console": {
"class": "logging.StreamHandler",
"formatter": "generic",
"stream": sys.stderr,
},
"access_console": {
"class": "logging.StreamHandler",
"formatter": "access",
"stream": sys.stdout,
},
},
formatters={
"generic": {
"format": "%(asctime)s [%(process)d] [%(levelname)s] %(message)s",
"datefmt": "[%Y-%m-%d %H:%M:%S %z]",
"class": "logging.Formatter",
},
"access": {
"format": "%(asctime)s - (%(name)s)[%(levelname)s][%(host)s]: "
+ "%(request)s %(message)s %(status)d %(byte)d",
"datefmt": "[%Y-%m-%d %H:%M:%S %z]",
"class": "logging.Formatter",
},
},
)
log = logging.getLogger('sanic')
netlog = logging.getLogger('network')
logger = logging.getLogger("sanic.root")
error_logger = logging.getLogger("sanic.error")
access_logger = logging.getLogger("sanic.access")
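
The old `log`/`netlog` loggers are replaced by three named loggers; a short usage sketch (the file handler path is arbitrary, not part of the diff):

import logging

from sanic.log import error_logger, logger

logger.info("goes to sanic.root -> stdout console handler")
error_logger.warning("goes to sanic.error -> stderr console handler")

# They are plain stdlib loggers, so extra handlers can be attached as usual:
error_logger.addHandler(logging.FileHandler("error.log"))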

sanic/reloader_helpers.py Normal file

@@ -0,0 +1,167 @@
import os
import signal
import subprocess
import sys
from multiprocessing import Process
from time import sleep
def _iter_module_files():
"""This iterates over all relevant Python files.
It goes through all
loaded files from modules, all files in folders of already loaded modules
as well as all files reachable through a package.
"""
# The list call is necessary on Python 3 in case the module
# dictionary modifies during iteration.
for module in list(sys.modules.values()):
if module is None:
continue
filename = getattr(module, "__file__", None)
if filename:
old = None
while not os.path.isfile(filename):
old = filename
filename = os.path.dirname(filename)
if filename == old:
break
else:
if filename[-4:] in (".pyc", ".pyo"):
filename = filename[:-1]
yield filename
def _get_args_for_reloading():
"""Returns the executable."""
rv = [sys.executable]
main_module = sys.modules["__main__"]
mod_spec = getattr(main_module, "__spec__", None)
if mod_spec:
# Parent exe was launched as a module rather than a script
rv.extend(["-m", mod_spec.name])
if len(sys.argv) > 1:
rv.extend(sys.argv[1:])
else:
rv.extend(sys.argv)
return rv
def restart_with_reloader():
"""Create a new process and a subprocess in it with the same arguments as
this one.
"""
cwd = os.getcwd()
args = _get_args_for_reloading()
new_environ = os.environ.copy()
new_environ["SANIC_SERVER_RUNNING"] = "true"
cmd = " ".join(args)
worker_process = Process(
target=subprocess.call,
args=(cmd,),
kwargs={"cwd": cwd, "shell": True, "env": new_environ},
)
worker_process.start()
return worker_process
def kill_process_children_unix(pid):
"""Find and kill child processes of a process (maximum two level).
:param pid: PID of parent process (process ID)
:return: Nothing
"""
root_process_path = "/proc/{pid}/task/{pid}/children".format(pid=pid)
if not os.path.isfile(root_process_path):
return
with open(root_process_path) as children_list_file:
children_list_pid = children_list_file.read().split()
for child_pid in children_list_pid:
children_proc_path = "/proc/%s/task/%s/children" % (
child_pid,
child_pid,
)
if not os.path.isfile(children_proc_path):
continue
with open(children_proc_path) as children_list_file_2:
children_list_pid_2 = children_list_file_2.read().split()
for _pid in children_list_pid_2:
try:
os.kill(int(_pid), signal.SIGTERM)
except ProcessLookupError:
continue
try:
os.kill(int(child_pid), signal.SIGTERM)
except ProcessLookupError:
continue
def kill_process_children_osx(pid):
"""Find and kill child processes of a process.
:param pid: PID of parent process (process ID)
:return: Nothing
"""
subprocess.run(["pkill", "-P", str(pid)])
def kill_process_children(pid):
"""Find and kill child processes of a process.
:param pid: PID of parent process (process ID)
:return: Nothing
"""
if sys.platform == "darwin":
kill_process_children_osx(pid)
elif sys.platform == "linux":
kill_process_children_unix(pid)
else:
pass # should signal error here
def kill_program_completly(proc):
"""Kill worker and it's child processes and exit.
:param proc: worker process (process ID)
:return: Nothing
"""
kill_process_children(proc.pid)
proc.terminate()
os._exit(0)
def watchdog(sleep_interval):
"""Watch project files, restart worker process if a change happened.
:param sleep_interval: interval in seconds.
:return: Nothing
"""
mtimes = {}
worker_process = restart_with_reloader()
signal.signal(
signal.SIGTERM, lambda *args: kill_program_completly(worker_process)
)
signal.signal(
signal.SIGINT, lambda *args: kill_program_completly(worker_process)
)
while True:
for filename in _iter_module_files():
try:
mtime = os.stat(filename).st_mtime
except OSError:
continue
old_time = mtimes.get(filename)
if old_time is None:
mtimes[filename] = mtime
continue
elif mtime > old_time:
kill_process_children(worker_process.pid)
worker_process.terminate()
worker_process = restart_with_reloader()
mtimes[filename] = mtime
break
sleep(sleep_interval)
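
The watchdog loop boils down to comparing stored mtimes against fresh os.stat() results; below is a standalone sketch of just that detection step (the helper name is invented and the process restarting is omitted):

import os

def changed_files(paths, mtimes):
    """Yield paths whose mtime advanced since the previous scan."""
    for path in paths:
        try:
            mtime = os.stat(path).st_mtime
        except OSError:
            continue
        previous = mtimes.get(path)
        mtimes[path] = mtime
        if previous is not None and mtime > previous:
            yield path

# e.g. keep one persistent mtimes dict and poll on an interval:
# mtimes = {}
# while True:
#     if any(changed_files(["app.py"], mtimes)):
#         ...restart the worker process...
#     time.sleep(1)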


@@ -1,26 +1,36 @@
import sys
import asyncio
import email.utils
import json
import sys
import warnings
from cgi import parse_header
from collections import namedtuple
from collections import defaultdict, namedtuple
from http.cookies import SimpleCookie
from urllib.parse import parse_qs, parse_qsl, unquote, urlunparse
from httptools import parse_url
from urllib.parse import parse_qs, urlunparse
from sanic.exceptions import InvalidUsage
from sanic.log import error_logger, logger
try:
from ujson import loads as json_loads
except ImportError:
if sys.version_info[:2] == (3, 5):
def json_loads(data):
# on Python 3.5 json.loads only supports str not bytes
return json.loads(data.decode())
else:
json_loads = json.loads
from sanic.exceptions import InvalidUsage
from sanic.log import log
DEFAULT_HTTP_CONTENT_TYPE = "application/octet-stream"
# HTTP/1.1: https://www.w3.org/Protocols/rfc2616/rfc2616-sec7.html#sec7.2.1
# > If the media type remains unknown, the recipient SHOULD treat it
# > as type "application/octet-stream"
@@ -40,15 +50,53 @@ class RequestParameters(dict):
return super().get(name, default)
class StreamBuffer:
def __init__(self, buffer_size=100):
self._queue = asyncio.Queue(buffer_size)
async def read(self):
""" Stop reading when gets None """
payload = await self._queue.get()
self._queue.task_done()
return payload
async def put(self, payload):
await self._queue.put(payload)
def is_full(self):
return self._queue.full()
class Request(dict):
"""Properties of an HTTP request such as URL, headers, etc."""
__slots__ = (
'app', 'headers', 'version', 'method', '_cookies', 'transport',
'body', 'parsed_json', 'parsed_args', 'parsed_form', 'parsed_files',
'_ip', '_parsed_url', 'uri_template', 'stream', '_remote_addr'
"__weakref__",
"_cookies",
"_ip",
"_parsed_url",
"_port",
"_remote_addr",
"_socket",
"app",
"body",
"endpoint",
"headers",
"method",
"parsed_args",
"parsed_not_grouped_args",
"parsed_files",
"parsed_form",
"parsed_json",
"raw_url",
"stream",
"transport",
"uri_template",
"version",
)
def __init__(self, url_bytes, headers, version, method, transport):
self.raw_url = url_bytes
# TODO: Content-Encoding detection
self._parsed_url = parse_url(url_bytes)
self.app = None
@@ -59,24 +107,50 @@ class Request(dict):
self.transport = transport
# Init but do not inhale
self.body = []
self.body_init()
self.parsed_json = None
self.parsed_form = None
self.parsed_files = None
self.parsed_args = None
self.parsed_args = defaultdict(RequestParameters)
self.parsed_not_grouped_args = defaultdict(list)
self.uri_template = None
self._cookies = None
self.stream = None
self.endpoint = None
def __repr__(self):
return "<{0}: {1} {2}>".format(
self.__class__.__name__, self.method, self.path
)
def __bool__(self):
if self.transport:
return True
return False
def body_init(self):
self.body = []
def body_push(self, data):
self.body.append(data)
def body_finish(self):
self.body = b"".join(self.body)
@property
def json(self):
if self.parsed_json is None:
try:
self.parsed_json = json_loads(self.body)
except Exception:
if not self.body:
return None
raise InvalidUsage("Failed when parsing body as json")
self.load_json()
return self.parsed_json
def load_json(self, loads=json_loads):
try:
self.parsed_json = loads(self.body)
except Exception:
if not self.body:
return None
raise InvalidUsage("Failed when parsing body as json")
return self.parsed_json
@@ -86,8 +160,8 @@ class Request(dict):
:return: token related to request
"""
prefixes = ('Bearer', 'Token')
auth_header = self.headers.get('Authorization')
prefixes = ("Bearer", "Token")
auth_header = self.headers.get("Authorization")
if auth_header is not None:
for prefix in prefixes:
@@ -102,19 +176,22 @@ class Request(dict):
self.parsed_form = RequestParameters()
self.parsed_files = RequestParameters()
content_type = self.headers.get(
'Content-Type', DEFAULT_HTTP_CONTENT_TYPE)
"Content-Type", DEFAULT_HTTP_CONTENT_TYPE
)
content_type, parameters = parse_header(content_type)
try:
if content_type == 'application/x-www-form-urlencoded':
if content_type == "application/x-www-form-urlencoded":
self.parsed_form = RequestParameters(
parse_qs(self.body.decode('utf-8')))
elif content_type == 'multipart/form-data':
parse_qs(self.body.decode("utf-8"))
)
elif content_type == "multipart/form-data":
# TODO: Stream this instead of reading to/from memory
boundary = parameters['boundary'].encode('utf-8')
self.parsed_form, self.parsed_files = (
parse_multipart_form(self.body, boundary))
boundary = parameters["boundary"].encode("utf-8")
self.parsed_form, self.parsed_files = parse_multipart_form(
self.body, boundary
)
except Exception:
log.exception("Failed when parsing form")
error_logger.exception("Failed when parsing form")
return self.parsed_form
@@ -125,69 +202,188 @@ class Request(dict):
return self.parsed_files
@property
def args(self):
if self.parsed_args is None:
def get_args(
self,
keep_blank_values: bool = False,
strict_parsing: bool = False,
encoding: str = "utf-8",
errors: str = "replace",
) -> RequestParameters:
"""
Method to parse `query_string` using `urllib.parse.parse_qs`.
This method is used by the `args` property.
Can be used directly if you need to change default parameters.
:param keep_blank_values: flag indicating whether blank values in
percent-encoded queries should be treated as blank strings.
A true value indicates that blanks should be retained as blank
strings. The default false value indicates that blank values
are to be ignored and treated as if they were not included.
:type keep_blank_values: bool
:param strict_parsing: flag indicating what to do with parsing errors.
If false (the default), errors are silently ignored. If true,
errors raise a ValueError exception.
:type strict_parsing: bool
:param encoding: specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
:type encoding: str
:param errors: specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
:type errors: str
:return: RequestParameters
"""
if not self.parsed_args[
(keep_blank_values, strict_parsing, encoding, errors)
]:
if self.query_string:
self.parsed_args = RequestParameters(
parse_qs(self.query_string))
else:
self.parsed_args = RequestParameters()
return self.parsed_args
self.parsed_args[
(keep_blank_values, strict_parsing, encoding, errors)
] = RequestParameters(
parse_qs(
qs=self.query_string,
keep_blank_values=keep_blank_values,
strict_parsing=strict_parsing,
encoding=encoding,
errors=errors,
)
)
return self.parsed_args[
(keep_blank_values, strict_parsing, encoding, errors)
]
args = property(get_args)
@property
def raw_args(self):
def raw_args(self) -> dict:
if self.app.debug: # pragma: no cover
warnings.simplefilter("default")
warnings.warn(
"Use of raw_args will be deprecated in "
"the future versions. Please use args or query_args "
"properties instead",
DeprecationWarning,
)
return {k: v[0] for k, v in self.args.items()}
def get_query_args(
self,
keep_blank_values: bool = False,
strict_parsing: bool = False,
encoding: str = "utf-8",
errors: str = "replace",
) -> list:
"""
Method to parse `query_string` using `urllib.parse.parse_qsl`.
This method is used by the `query_args` property.
Can be used directly if you need to change default parameters.
:param keep_blank_values: flag indicating whether blank values in
percent-encoded queries should be treated as blank strings.
A true value indicates that blanks should be retained as blank
strings. The default false value indicates that blank values
are to be ignored and treated as if they were not included.
:type keep_blank_values: bool
:param strict_parsing: flag indicating what to do with parsing errors.
If false (the default), errors are silently ignored. If true,
errors raise a ValueError exception.
:type strict_parsing: bool
:param encoding: specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
:type encoding: str
:param errors: specify how to decode percent-encoded sequences
into Unicode characters, as accepted by the bytes.decode() method.
:type errors: str
:return: list
"""
if not self.parsed_not_grouped_args[
(keep_blank_values, strict_parsing, encoding, errors)
]:
if self.query_string:
self.parsed_not_grouped_args[
(keep_blank_values, strict_parsing, encoding, errors)
] = parse_qsl(
qs=self.query_string,
keep_blank_values=keep_blank_values,
strict_parsing=strict_parsing,
encoding=encoding,
errors=errors,
)
return self.parsed_not_grouped_args[
(keep_blank_values, strict_parsing, encoding, errors)
]
query_args = property(get_query_args)
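
Sketch of the grouped vs. non-grouped accessors for a hypothetical request to /search?q=sanic&tag=web&tag=fast&empty= (handler and parameter names invented):

from sanic.response import json

async def search(request):
    grouped = request.args                        # {'q': ['sanic'], 'tag': ['web', 'fast']}
    first_q = request.args.get("q")               # 'sanic' -- first value only
    pairs = request.query_args                    # [('q', 'sanic'), ('tag', 'web'), ('tag', 'fast')]
    with_blanks = request.get_args(keep_blank_values=True)   # also keeps {'empty': ['']}
    return json({"q": first_q, "tags": grouped.getlist("tag"), "pairs": pairs})
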
@property
def cookies(self):
if self._cookies is None:
cookie = self.headers.get('Cookie')
cookie = self.headers.get("Cookie")
if cookie is not None:
cookies = SimpleCookie()
cookies.load(cookie)
self._cookies = {name: cookie.value
for name, cookie in cookies.items()}
self._cookies = {
name: cookie.value for name, cookie in cookies.items()
}
else:
self._cookies = {}
return self._cookies
@property
def ip(self):
if not hasattr(self, '_ip'):
self._ip = (self.transport.get_extra_info('peername') or
(None, None))
if not hasattr(self, "_socket"):
self._get_address()
return self._ip
@property
def port(self):
if not hasattr(self, "_socket"):
self._get_address()
return self._port
@property
def socket(self):
if not hasattr(self, "_socket"):
self._get_address()
return self._socket
def _get_address(self):
self._socket = self.transport.get_extra_info("peername") or (
None,
None,
)
self._ip = self._socket[0]
self._port = self._socket[1]
@property
def remote_addr(self):
"""Attempt to return the original client ip based on X-Forwarded-For.
:return: original client ip.
"""
if not hasattr(self, '_remote_addr'):
forwarded_for = self.headers.get('X-Forwarded-For', '').split(',')
if not hasattr(self, "_remote_addr"):
forwarded_for = self.headers.get("X-Forwarded-For", "").split(",")
remote_addrs = [
addr for addr in [
addr.strip() for addr in forwarded_for
] if addr
addr
for addr in [addr.strip() for addr in forwarded_for]
if addr
]
if len(remote_addrs) > 0:
self._remote_addr = remote_addrs[0]
else:
self._remote_addr = ''
self._remote_addr = ""
return self._remote_addr
@property
def scheme(self):
if self.app.websocket_enabled \
and self.headers.get('upgrade') == 'websocket':
scheme = 'ws'
if (
self.app.websocket_enabled
and self.headers.get("upgrade") == "websocket"
):
scheme = "ws"
else:
scheme = 'http'
scheme = "http"
if self.transport.get_extra_info('sslcontext'):
scheme += 's'
if self.transport.get_extra_info("sslcontext"):
scheme += "s"
return scheme
@@ -195,11 +391,11 @@ class Request(dict):
def host(self):
# it appears that httptools doesn't return the host
# so pull it from the headers
return self.headers.get('Host', '')
return self.headers.get("Host", "")
@property
def content_type(self):
return self.headers.get('Content-Type', DEFAULT_HTTP_CONTENT_TYPE)
return self.headers.get("Content-Type", DEFAULT_HTTP_CONTENT_TYPE)
@property
def match_info(self):
@@ -208,27 +404,23 @@ class Request(dict):
@property
def path(self):
return self._parsed_url.path.decode('utf-8')
return self._parsed_url.path.decode("utf-8")
@property
def query_string(self):
if self._parsed_url.query:
return self._parsed_url.query.decode('utf-8')
return self._parsed_url.query.decode("utf-8")
else:
return ''
return ""
@property
def url(self):
return urlunparse((
self.scheme,
self.host,
self.path,
None,
self.query_string,
None))
return urlunparse(
(self.scheme, self.host, self.path, None, self.query_string, None)
)
File = namedtuple('File', ['type', 'body', 'name'])
File = namedtuple("File", ["type", "body", "name"])
def parse_multipart_form(body, boundary):
@@ -244,42 +436,59 @@ def parse_multipart_form(body, boundary):
form_parts = body.split(boundary)
for form_part in form_parts[1:-1]:
file_name = None
file_type = None
content_type = "text/plain"
content_charset = "utf-8"
field_name = None
line_index = 2
line_end_index = 0
while not line_end_index == -1:
line_end_index = form_part.find(b'\r\n', line_index)
form_line = form_part[line_index:line_end_index].decode('utf-8')
line_end_index = form_part.find(b"\r\n", line_index)
form_line = form_part[line_index:line_end_index].decode("utf-8")
line_index = line_end_index + 2
if not form_line:
break
colon_index = form_line.index(':')
colon_index = form_line.index(":")
form_header_field = form_line[0:colon_index].lower()
form_header_value, form_parameters = parse_header(
form_line[colon_index + 2:])
form_line[colon_index + 2 :]
)
if form_header_field == 'content-disposition':
if 'filename' in form_parameters:
file_name = form_parameters['filename']
field_name = form_parameters.get('name')
elif form_header_field == 'content-type':
file_type = form_header_value
if form_header_field == "content-disposition":
field_name = form_parameters.get("name")
file_name = form_parameters.get("filename")
post_data = form_part[line_index:-4]
if file_name or file_type:
file = File(type=file_type, name=file_name, body=post_data)
if field_name in files:
files[field_name].append(file)
# non-ASCII filenames in RFC2231, "filename*" format
if file_name is None and form_parameters.get("filename*"):
encoding, _, value = email.utils.decode_rfc2231(
form_parameters["filename*"]
)
file_name = unquote(value, encoding=encoding)
elif form_header_field == "content-type":
content_type = form_header_value
content_charset = form_parameters.get("charset", "utf-8")
if field_name:
post_data = form_part[line_index:-4]
if file_name is None:
value = post_data.decode(content_charset)
if field_name in fields:
fields[field_name].append(value)
else:
fields[field_name] = [value]
else:
files[field_name] = [file]
form_file = File(
type=content_type, name=file_name, body=post_data
)
if field_name in files:
files[field_name].append(form_file)
else:
files[field_name] = [form_file]
else:
value = post_data.decode('utf-8')
if field_name in fields:
fields[field_name].append(value)
else:
fields[field_name] = [value]
logger.debug(
"Form-data field does not have a 'name' parameter "
"in the Content-Disposition header"
)
return fields, files
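
A small, self-contained sketch of the parser's output for a hand-built multipart body; the boundary string and field names are arbitrary, and the boundary is passed without the leading dashes, as Request.form does:

from sanic.request import parse_multipart_form

boundary = b"boundary1234"
body = (
    b"--boundary1234\r\n"
    b'Content-Disposition: form-data; name="title"\r\n'
    b"\r\n"
    b"hello\r\n"
    b"--boundary1234\r\n"
    b'Content-Disposition: form-data; name="upload"; filename="a.txt"\r\n'
    b"Content-Type: text/plain\r\n"
    b"\r\n"
    b"file body\r\n"
    b"--boundary1234--\r\n"
)

fields, files = parse_multipart_form(body, boundary)
assert fields.get("title") == "hello"
assert files.get("upload").name == "a.txt"
assert files.get("upload").body == b"file body"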


@@ -1,80 +1,23 @@
from functools import partial
from mimetypes import guess_type
from os import path
from urllib.parse import quote_plus
from aiofiles import open as open_async
from multidict import CIMultiDict
from sanic.cookies import CookieJar
from sanic.helpers import STATUS_CODES, has_message_body, remove_entity_headers
try:
from ujson import dumps as json_dumps
except:
from json import dumps as json_dumps
except BaseException:
from json import dumps
from aiofiles import open as open_async
from sanic.cookies import CookieJar
COMMON_STATUS_CODES = {
200: b'OK',
400: b'Bad Request',
404: b'Not Found',
500: b'Internal Server Error',
}
ALL_STATUS_CODES = {
100: b'Continue',
101: b'Switching Protocols',
102: b'Processing',
200: b'OK',
201: b'Created',
202: b'Accepted',
203: b'Non-Authoritative Information',
204: b'No Content',
205: b'Reset Content',
206: b'Partial Content',
207: b'Multi-Status',
208: b'Already Reported',
226: b'IM Used',
300: b'Multiple Choices',
301: b'Moved Permanently',
302: b'Found',
303: b'See Other',
304: b'Not Modified',
305: b'Use Proxy',
307: b'Temporary Redirect',
308: b'Permanent Redirect',
400: b'Bad Request',
401: b'Unauthorized',
402: b'Payment Required',
403: b'Forbidden',
404: b'Not Found',
405: b'Method Not Allowed',
406: b'Not Acceptable',
407: b'Proxy Authentication Required',
408: b'Request Timeout',
409: b'Conflict',
410: b'Gone',
411: b'Length Required',
412: b'Precondition Failed',
413: b'Request Entity Too Large',
414: b'Request-URI Too Long',
415: b'Unsupported Media Type',
416: b'Requested Range Not Satisfiable',
417: b'Expectation Failed',
422: b'Unprocessable Entity',
423: b'Locked',
424: b'Failed Dependency',
426: b'Upgrade Required',
428: b'Precondition Required',
429: b'Too Many Requests',
431: b'Request Header Fields Too Large',
500: b'Internal Server Error',
501: b'Not Implemented',
502: b'Bad Gateway',
503: b'Service Unavailable',
504: b'Gateway Timeout',
505: b'HTTP Version Not Supported',
506: b'Variant Also Negotiates',
507: b'Insufficient Storage',
508: b'Loop Detected',
510: b'Not Extended',
511: b'Network Authentication Required'
}
# This is done in order to ensure that the JSON response is
# kept consistent across both ujson and inbuilt json usage.
json_dumps = partial(dumps, separators=(",", ":"))
class BaseHTTPResponse:
@@ -87,16 +30,18 @@ class BaseHTTPResponse:
return str(data).encode()
def _parse_headers(self):
headers = b''
headers = b""
for name, value in self.headers.items():
try:
headers += (
b'%b: %b\r\n' % (
name.encode(), value.encode('utf-8')))
headers += b"%b: %b\r\n" % (
name.encode(),
value.encode("utf-8"),
)
except AttributeError:
headers += (
b'%b: %b\r\n' % (
str(name).encode(), str(value).encode('utf-8')))
headers += b"%b: %b\r\n" % (
str(name).encode(),
str(value).encode("utf-8"),
)
return headers
@@ -109,18 +54,24 @@ class BaseHTTPResponse:
class StreamingHTTPResponse(BaseHTTPResponse):
__slots__ = (
'transport', 'streaming_fn',
'status', 'content_type', 'headers', '_cookies')
"protocol",
"streaming_fn",
"status",
"content_type",
"headers",
"_cookies",
)
def __init__(self, streaming_fn, status=200, headers=None,
content_type='text/plain'):
def __init__(
self, streaming_fn, status=200, headers=None, content_type="text/plain"
):
self.content_type = content_type
self.streaming_fn = streaming_fn
self.status = status
self.headers = headers or {}
self.headers = CIMultiDict(headers or {})
self._cookies = None
def write(self, data):
async def write(self, data):
"""Writes a chunk of data to the streaming response.
:param data: bytes-ish data to be written.
@@ -128,59 +79,69 @@ class StreamingHTTPResponse(BaseHTTPResponse):
if type(data) != bytes:
data = self._encode_body(data)
self.transport.write(
b"%x\r\n%b\r\n" % (len(data), data))
self.protocol.push_data(b"%x\r\n%b\r\n" % (len(data), data))
await self.protocol.drain()
async def stream(
self, version="1.1", keep_alive=False, keep_alive_timeout=None):
self, version="1.1", keep_alive=False, keep_alive_timeout=None
):
"""Streams headers, runs the `streaming_fn` callback that writes
content to the response body, then finalizes the response body.
"""
headers = self.get_headers(
version, keep_alive=keep_alive,
keep_alive_timeout=keep_alive_timeout)
self.transport.write(headers)
version,
keep_alive=keep_alive,
keep_alive_timeout=keep_alive_timeout,
)
self.protocol.push_data(headers)
await self.protocol.drain()
await self.streaming_fn(self)
self.transport.write(b'0\r\n\r\n')
self.protocol.push_data(b"0\r\n\r\n")
# no need to await drain here after this write, because it is the
# very last thing we write and nothing needs to wait for it.
def get_headers(
self, version="1.1", keep_alive=False, keep_alive_timeout=None):
self, version="1.1", keep_alive=False, keep_alive_timeout=None
):
# This is all returned in a kind-of funky way
# We tried to make this as fast as possible in pure python
timeout_header = b''
timeout_header = b""
if keep_alive and keep_alive_timeout is not None:
timeout_header = b'Keep-Alive: %d\r\n' % keep_alive_timeout
timeout_header = b"Keep-Alive: %d\r\n" % keep_alive_timeout
self.headers['Transfer-Encoding'] = 'chunked'
self.headers.pop('Content-Length', None)
self.headers['Content-Type'] = self.headers.get(
'Content-Type', self.content_type)
self.headers["Transfer-Encoding"] = "chunked"
self.headers.pop("Content-Length", None)
self.headers["Content-Type"] = self.headers.get(
"Content-Type", self.content_type
)
headers = self._parse_headers()
# Try to pull from the common codes first
# Speeds up response rate 6% over pulling from all
status = COMMON_STATUS_CODES.get(self.status)
if not status:
status = ALL_STATUS_CODES.get(self.status)
if self.status == 200:
status = b"OK"
else:
status = STATUS_CODES.get(self.status)
return (b'HTTP/%b %d %b\r\n'
b'%b'
b'%b\r\n') % (
version.encode(),
self.status,
status,
timeout_header,
headers
)
return (b"HTTP/%b %d %b\r\n" b"%b" b"%b\r\n") % (
version.encode(),
self.status,
status,
timeout_header,
headers,
)
class HTTPResponse(BaseHTTPResponse):
__slots__ = ('body', 'status', 'content_type', 'headers', '_cookies')
__slots__ = ("body", "status", "content_type", "headers", "_cookies")
def __init__(self, body=None, status=200, headers=None,
content_type='text/plain', body_bytes=b''):
def __init__(
self,
body=None,
status=200,
headers=None,
content_type="text/plain",
body_bytes=b"",
):
self.content_type = content_type
if body is not None:
@@ -189,42 +150,48 @@ class HTTPResponse(BaseHTTPResponse):
self.body = body_bytes
self.status = status
self.headers = headers or {}
self.headers = CIMultiDict(headers or {})
self._cookies = None
def output(
self, version="1.1", keep_alive=False, keep_alive_timeout=None):
def output(self, version="1.1", keep_alive=False, keep_alive_timeout=None):
# This is all returned in a kind-of funky way
# We tried to make this as fast as possible in pure python
timeout_header = b''
timeout_header = b""
if keep_alive and keep_alive_timeout is not None:
timeout_header = b'Keep-Alive: %d\r\n' % keep_alive_timeout
self.headers['Content-Length'] = self.headers.get(
'Content-Length', len(self.body))
self.headers['Content-Type'] = self.headers.get(
'Content-Type', self.content_type)
timeout_header = b"Keep-Alive: %d\r\n" % keep_alive_timeout
body = b""
if has_message_body(self.status):
body = self.body
self.headers["Content-Length"] = self.headers.get(
"Content-Length", len(self.body)
)
self.headers["Content-Type"] = self.headers.get(
"Content-Type", self.content_type
)
if self.status in (304, 412):
self.headers = remove_entity_headers(self.headers)
headers = self._parse_headers()
# Try to pull from the common codes first
# Speeds up response rate 6% over pulling from all
status = COMMON_STATUS_CODES.get(self.status)
if not status:
status = ALL_STATUS_CODES.get(self.status, b'UNKNOWN RESPONSE')
if self.status == 200:
status = b"OK"
else:
status = STATUS_CODES.get(self.status, b"UNKNOWN RESPONSE")
return (b'HTTP/%b %d %b\r\n'
b'Connection: %b\r\n'
b'%b'
b'%b\r\n'
b'%b') % (
version.encode(),
self.status,
status,
b'keep-alive' if keep_alive else b'close',
timeout_header,
headers,
self.body
)
return (
b"HTTP/%b %d %b\r\n" b"Connection: %b\r\n" b"%b" b"%b\r\n" b"%b"
) % (
version.encode(),
self.status,
status,
b"keep-alive" if keep_alive else b"close",
timeout_header,
headers,
body,
)
@property
def cookies(self):
@@ -233,8 +200,14 @@ class HTTPResponse(BaseHTTPResponse):
return self._cookies
def json(body, status=200, headers=None,
content_type="application/json", **kwargs):
def json(
body,
status=200,
headers=None,
content_type="application/json",
dumps=json_dumps,
**kwargs
):
"""
Returns response object with body in json format.
@@ -243,12 +216,17 @@ def json(body, status=200, headers=None,
:param headers: Custom Headers.
:param kwargs: Remaining arguments that are passed to the json encoder.
"""
return HTTPResponse(json_dumps(body, **kwargs), headers=headers,
status=status, content_type=content_type)
return HTTPResponse(
dumps(body, **kwargs),
headers=headers,
status=status,
content_type=content_type,
)
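A hypothetical usage sketch for the new ``dumps`` parameter: any callable that serializes the body can be passed in, so a handler can opt into pretty-printing without touching the module-level ujson/json fallback (the handler name and options are illustrative):

from functools import partial
from json import dumps as stdlib_dumps

from sanic.response import json

pretty_dumps = partial(stdlib_dumps, indent=2, sort_keys=True)

async def debug_view(request):  # hypothetical handler
    return json({"hello": "world"}, dumps=pretty_dumps)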
def text(body, status=200, headers=None,
content_type="text/plain; charset=utf-8"):
def text(
body, status=200, headers=None, content_type="text/plain; charset=utf-8"
):
"""
Returns response object with body in text format.
@@ -258,12 +236,13 @@ def text(body, status=200, headers=None,
:param content_type: the content type (string) of the response
"""
return HTTPResponse(
body, status=status, headers=headers,
content_type=content_type)
body, status=status, headers=headers, content_type=content_type
)
def raw(body, status=200, headers=None,
content_type="application/octet-stream"):
def raw(
body, status=200, headers=None, content_type="application/octet-stream"
):
"""
Returns response object without encoding the body.
@@ -272,8 +251,12 @@ def raw(body, status=200, headers=None,
:param headers: Custom Headers.
:param content_type: the content type (string) of the response.
"""
return HTTPResponse(body_bytes=body, status=status, headers=headers,
content_type=content_type)
return HTTPResponse(
body_bytes=body,
status=status,
headers=headers,
content_type=content_type,
)
def html(body, status=200, headers=None):
@@ -284,50 +267,85 @@ def html(body, status=200, headers=None):
:param status: Response code.
:param headers: Custom Headers.
"""
return HTTPResponse(body, status=status, headers=headers,
content_type="text/html; charset=utf-8")
return HTTPResponse(
body,
status=status,
headers=headers,
content_type="text/html; charset=utf-8",
)
async def file(location, mime_type=None, headers=None, _range=None):
async def file(
location,
status=200,
mime_type=None,
headers=None,
filename=None,
_range=None,
):
"""Return a response object with file data.
:param location: Location of file on system.
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param filename: Override filename.
:param _range:
"""
filename = path.split(location)[-1]
headers = headers or {}
if filename:
headers.setdefault(
"Content-Disposition", 'attachment; filename="{}"'.format(filename)
)
filename = filename or path.split(location)[-1]
async with open_async(location, mode='rb') as _file:
async with open_async(location, mode="rb") as _file:
if _range:
await _file.seek(_range.start)
out_stream = await _file.read(_range.size)
headers['Content-Range'] = 'bytes %s-%s/%s' % (
_range.start, _range.end, _range.total)
headers["Content-Range"] = "bytes %s-%s/%s" % (
_range.start,
_range.end,
_range.total,
)
status = 206
else:
out_stream = await _file.read()
mime_type = mime_type or guess_type(filename)[0] or 'text/plain'
return HTTPResponse(status=200,
headers=headers,
content_type=mime_type,
body_bytes=out_stream)
mime_type = mime_type or guess_type(filename)[0] or "text/plain"
return HTTPResponse(
status=status,
headers=headers,
content_type=mime_type,
body_bytes=out_stream,
)
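Sketch of a handler using the new ``filename`` argument of ``file()``; the path and names are assumptions:

from sanic.response import file

async def download_report(request):  # hypothetical handler
    # filename= sets a Content-Disposition header so browsers download the file
    return await file("/tmp/report.pdf", filename="quarterly-report.pdf")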
async def file_stream(location, chunk_size=4096, mime_type=None, headers=None,
_range=None):
async def file_stream(
location,
status=200,
chunk_size=4096,
mime_type=None,
headers=None,
filename=None,
_range=None,
):
"""Return a streaming response object with file data.
:param location: Location of file on system.
:param chunk_size: The size of each chunk in the stream (in bytes)
:param mime_type: Specific mime_type.
:param headers: Custom Headers.
:param filename: Override filename.
:param _range:
"""
filename = path.split(location)[-1]
headers = headers or {}
if filename:
headers.setdefault(
"Content-Disposition", 'attachment; filename="{}"'.format(filename)
)
filename = filename or path.split(location)[-1]
_file = await open_async(location, mode='rb')
_file = await open_async(location, mode="rb")
async def _streaming_fn(response):
nonlocal _file, chunk_size
@@ -341,30 +359,39 @@ async def file_stream(location, chunk_size=4096, mime_type=None, headers=None,
if len(content) < 1:
break
to_send -= len(content)
response.write(content)
await response.write(content)
else:
while True:
content = await _file.read(chunk_size)
if len(content) < 1:
break
response.write(content)
await response.write(content)
finally:
await _file.close()
return # Returning from this fn closes the stream
mime_type = mime_type or guess_type(filename)[0] or 'text/plain'
mime_type = mime_type or guess_type(filename)[0] or "text/plain"
if _range:
headers['Content-Range'] = 'bytes %s-%s/%s' % (
_range.start, _range.end, _range.total)
return StreamingHTTPResponse(streaming_fn=_streaming_fn,
status=200,
headers=headers,
content_type=mime_type)
headers["Content-Range"] = "bytes %s-%s/%s" % (
_range.start,
_range.end,
_range.total,
)
status = 206
return StreamingHTTPResponse(
streaming_fn=_streaming_fn,
status=status,
headers=headers,
content_type=mime_type,
)
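``file_stream()`` gains the same ``filename`` handling plus the 206 status when a range is supplied; a rough usage sketch with made-up values:

from sanic.response import file_stream

async def big_download(request):  # hypothetical handler
    return await file_stream(
        "/var/data/huge.bin",     # hypothetical path
        chunk_size=65536,         # stream in 64 KiB chunks
        filename="huge.bin",
    )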
def stream(
streaming_fn, status=200, headers=None,
content_type="text/plain; charset=utf-8"):
streaming_fn,
status=200,
headers=None,
content_type="text/plain; charset=utf-8",
):
"""Accepts an coroutine `streaming_fn` which can be used to
write chunks to a streaming response. Returns a `StreamingHTTPResponse`.
@@ -384,15 +411,13 @@ def stream(
:param headers: Custom Headers.
"""
return StreamingHTTPResponse(
streaming_fn,
headers=headers,
content_type=content_type,
status=status
streaming_fn, headers=headers, content_type=content_type, status=status
)
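Because StreamingHTTPResponse.write is now a coroutine, the callback passed to ``stream()`` has to await each write. A small sketch (handler and payload are assumptions):

from sanic.response import stream

async def csv_view(request):  # hypothetical handler
    async def sample_streaming_fn(response):
        await response.write("foo,")
        await response.write("bar\n")

    return stream(sample_streaming_fn, content_type="text/csv")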
def redirect(to, headers=None, status=302,
content_type="text/html; charset=utf-8"):
def redirect(
to, headers=None, status=302, content_type="text/html; charset=utf-8"
):
"""Abort execution and cause a 302 redirect (by default).
:param to: path or fully qualified URL to redirect to
@@ -403,10 +428,12 @@ def redirect(to, headers=None, status=302,
"""
headers = headers or {}
# URL Quote the URL before redirecting
safe_to = quote_plus(to, safe=":/%#?&=@[]!$&'()*+,;")
# According to RFC 7231, a relative URI is now permitted.
headers['Location'] = to
headers["Location"] = safe_to
return HTTPResponse(
status=status,
headers=headers,
content_type=content_type)
status=status, headers=headers, content_type=content_type
)
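The quoting step above can be checked in isolation; with the safe set used here, path and query delimiters survive while anything unsafe (a space, for instance) is escaped before it reaches the Location header:

from urllib.parse import quote_plus

to = "/login?next=/dash board/#top"                    # made-up redirect target
safe_to = quote_plus(to, safe=":/%#?&=@[]!$&'()*+,;")
print(safe_to)                                         # /login?next=/dash+board/#top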

View File (sanic/router.py)

@@ -1,29 +1,38 @@
import re
import uuid
from collections import defaultdict, namedtuple
from collections.abc import Iterable
from functools import lru_cache
from urllib.parse import unquote
from sanic.exceptions import NotFound, InvalidUsage
from sanic.exceptions import MethodNotSupported, NotFound
from sanic.views import CompositionView
Route = namedtuple(
'Route',
['handler', 'methods', 'pattern', 'parameters', 'name', 'uri'])
Parameter = namedtuple('Parameter', ['name', 'cast'])
"Route", ["handler", "methods", "pattern", "parameters", "name", "uri"]
)
Parameter = namedtuple("Parameter", ["name", "cast"])
REGEX_TYPES = {
'string': (str, r'[^/]+'),
'int': (int, r'\d+'),
'number': (float, r'[0-9\\.]+'),
'alpha': (str, r'[A-Za-z]+'),
'path': (str, r'[^/].*?'),
"string": (str, r"[^/]+"),
"int": (int, r"-?\d+"),
"number": (float, r"-?[0-9\\.]+"),
"alpha": (str, r"[A-Za-z]+"),
"path": (str, r"[^/].*?"),
"uuid": (
uuid.UUID,
r"[A-Fa-f0-9]{8}-[A-Fa-f0-9]{4}-"
r"[A-Fa-f0-9]{4}-[A-Fa-f0-9]{4}-[A-Fa-f0-9]{12}",
),
}
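Routes written against the updated patterns, as a sketch (app and handler names are assumptions; in this release ``Sanic()`` still accepts no name):

from sanic import Sanic
from sanic.response import text

app = Sanic()

@app.route("/offset/<n:int>")            # -?\d+ now also matches "-5"
async def offset(request, n):
    return text(str(n))

@app.route("/items/<item_id:uuid>")      # new uuid parameter type
async def item(request, item_id):
    return text(str(item_id))            # item_id arrives as uuid.UUID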
ROUTER_CACHE_SIZE = 1024
def url_hash(url):
return url.count('/')
return url.count("/")
class RouteExists(Exception):
@@ -34,6 +43,10 @@ class RouteDoesNotExist(Exception):
pass
class ParameterNameConflicts(Exception):
pass
class Router:
"""Router supports basic routing with parameters and method checks
@@ -60,13 +73,16 @@ class Router:
also be passed in as the type. The argument given to the function will
always be a string, independent of the type.
"""
routes_static = None
routes_dynamic = None
routes_always_check = None
parameter_pattern = re.compile(r'<(.+?)>')
parameter_pattern = re.compile(r"<(.+?)>")
def __init__(self):
self.routes_all = {}
self.routes_names = {}
self.routes_static_files = {}
self.routes_static = {}
self.routes_dynamic = defaultdict(list)
self.routes_always_check = []
@@ -88,9 +104,13 @@ class Router:
"""
# We could receive NAME or NAME:PATTERN
name = parameter_string
pattern = 'string'
if ':' in parameter_string:
name, pattern = parameter_string.split(':', 1)
pattern = "string"
if ":" in parameter_string:
name, pattern = parameter_string.split(":", 1)
if not name:
raise ValueError(
"Invalid parameter syntax: {}".format(parameter_string)
)
default = (str, pattern)
# Pull from pre-configured types
@@ -98,8 +118,16 @@ class Router:
return name, _type, pattern
def add(self, uri, methods, handler, host=None, strict_slashes=False,
version=None):
def add(
self,
uri,
methods,
handler,
host=None,
strict_slashes=False,
version=None,
name=None,
):
"""Add a handler to the route list
:param uri: path to match
@@ -113,34 +141,47 @@ class Router:
:return: Nothing
"""
if version is not None:
if uri.startswith('/'):
uri = "/".join(["/v{}".format(str(version)), uri[1:]])
else:
uri = "/".join(["/v{}".format(str(version)), uri])
version = re.escape(str(version).strip("/").lstrip("v"))
uri = "/".join(["/v{}".format(version), uri.lstrip("/")])
# add regular version
self._add(uri, methods, handler, host)
self._add(uri, methods, handler, host, name)
if strict_slashes:
return
if not isinstance(host, str) and host is not None:
# we have gotten back to the top of the recursion tree where the
# host was originally a list. By now, we've processed the strict
# slashes logic on the leaf nodes (the individual host strings in
# the list of host)
return
# Add versions with and without trailing /
slash_is_missing = (
not uri[-1] == '/'
and not self.routes_all.get(uri + '/', False)
)
slashed_methods = self.routes_all.get(uri + "/", frozenset({}))
unslashed_methods = self.routes_all.get(uri[:-1], frozenset({}))
if isinstance(methods, Iterable):
_slash_is_missing = all(
method in slashed_methods for method in methods
)
_without_slash_is_missing = all(
method in unslashed_methods for method in methods
)
else:
_slash_is_missing = methods in slashed_methods
_without_slash_is_missing = methods in unslashed_methods
slash_is_missing = not uri[-1] == "/" and not _slash_is_missing
without_slash_is_missing = (
uri[-1] == '/'
and not self.routes_all.get(uri[:-1], False)
and not uri == '/'
uri[-1] == "/" and not _without_slash_is_missing and not uri == "/"
)
# add version with trailing slash
if slash_is_missing:
self._add(uri + '/', methods, handler, host)
self._add(uri + "/", methods, handler, host, name)
# add version without trailing slash
elif without_slash_is_missing:
self._add(uri[:-1], methods, handler, host)
self._add(uri[:-1], methods, handler, host, name)
def _add(self, uri, methods, handler, host=None):
def _add(self, uri, methods, handler, host=None, name=None):
"""Add a handler to the route list
:param uri: path to match
@@ -148,6 +189,7 @@ class Router:
provided, any method is allowed
:param handler: request handler function.
When executed, it should provide a response object.
:param name: user defined route name for url_for
:return: Nothing
"""
if host is not None:
@@ -157,11 +199,13 @@ class Router:
else:
if not isinstance(host, Iterable):
raise ValueError("Expected either string or Iterable of "
"host strings, not {!r}".format(host))
raise ValueError(
"Expected either string or Iterable of "
"host strings, not {!r}".format(host)
)
for host_ in host:
self.add(uri, methods, handler, host_)
self.add(uri, methods, handler, host_, name)
return
# Dict for faster lookups of if method allowed
@@ -169,40 +213,48 @@ class Router:
methods = frozenset(methods)
parameters = []
parameter_names = set()
properties = {"unhashable": None}
def add_parameter(match):
name = match.group(1)
name, _type, pattern = self.parse_parameter_string(name)
parameter = Parameter(
name=name, cast=_type)
if name in parameter_names:
raise ParameterNameConflicts(
"Multiple parameter named <{name}> "
"in route uri {uri}".format(name=name, uri=uri)
)
parameter_names.add(name)
parameter = Parameter(name=name, cast=_type)
parameters.append(parameter)
# Mark the whole route as unhashable if it has the hash key in it
if re.search(r'(^|[^^]){1}/', pattern):
properties['unhashable'] = True
if re.search(r"(^|[^^]){1}/", pattern):
properties["unhashable"] = True
# Mark the route as unhashable if it matches the hash key
elif re.search(r'/', pattern):
properties['unhashable'] = True
elif re.search(r"/", pattern):
properties["unhashable"] = True
return '({})'.format(pattern)
return "({})".format(pattern)
pattern_string = re.sub(self.parameter_pattern, add_parameter, uri)
pattern = re.compile(r'^{}$'.format(pattern_string))
pattern = re.compile(r"^{}$".format(pattern_string))
def merge_route(route, methods, handler):
# merge to the existing route when possible.
if not route.methods or not methods:
# method-unspecified routes are not mergeable.
raise RouteExists(
"Route already registered: {}".format(uri))
raise RouteExists("Route already registered: {}".format(uri))
elif route.methods.intersection(methods):
# already existing method is not overloadable.
duplicated = methods.intersection(route.methods)
raise RouteExists(
"Route already registered: {} [{}]".format(
uri, ','.join(list(duplicated))))
uri, ",".join(list(duplicated))
)
)
if isinstance(route.handler, CompositionView):
view = route.handler
else:
@@ -210,42 +262,67 @@ class Router:
view.add(route.methods, route.handler)
view.add(methods, handler)
route = route._replace(
handler=view, methods=methods.union(route.methods))
handler=view, methods=methods.union(route.methods)
)
return route
if parameters:
# TODO: This is too complex, we need to reduce the complexity
if properties['unhashable']:
if properties["unhashable"]:
routes_to_check = self.routes_always_check
ndx, route = self.check_dynamic_route_exists(
pattern, routes_to_check)
pattern, routes_to_check, parameters
)
else:
routes_to_check = self.routes_dynamic[url_hash(uri)]
ndx, route = self.check_dynamic_route_exists(
pattern, routes_to_check)
pattern, routes_to_check, parameters
)
if ndx != -1:
# Pop the ndx of the route, no dups of the same route
routes_to_check.pop(ndx)
else:
route = self.routes_all.get(uri)
# prefix the handler name with the blueprint name
# if available
# special prefix for static files
is_static = False
if name and name.startswith("_static_"):
is_static = True
name = name.split("_static_", 1)[-1]
if hasattr(handler, "__blueprintname__"):
handler_name = "{}.{}".format(
handler.__blueprintname__, name or handler.__name__
)
else:
handler_name = name or getattr(handler, "__name__", None)
if route:
route = merge_route(route, methods, handler)
else:
# prefix the handler name with the blueprint name
# if available
if hasattr(handler, '__blueprintname__'):
handler_name = '{}.{}'.format(
handler.__blueprintname__, handler.__name__)
else:
handler_name = getattr(handler, '__name__', None)
route = Route(
handler=handler, methods=methods, pattern=pattern,
parameters=parameters, name=handler_name, uri=uri)
handler=handler,
methods=methods,
pattern=pattern,
parameters=parameters,
name=handler_name,
uri=uri,
)
self.routes_all[uri] = route
if properties['unhashable']:
if is_static:
pair = self.routes_static_files.get(handler_name)
if not (pair and (pair[0] + "/" == uri or uri + "/" == pair[0])):
self.routes_static_files[handler_name] = (uri, route)
else:
pair = self.routes_names.get(handler_name)
if not (pair and (pair[0] + "/" == uri or uri + "/" == pair[0])):
self.routes_names[handler_name] = (uri, route)
if properties["unhashable"]:
self.routes_always_check.append(route)
elif parameters:
self.routes_dynamic[url_hash(uri)].append(route)
@@ -253,9 +330,20 @@ class Router:
self.routes_static[uri] = route
@staticmethod
def check_dynamic_route_exists(pattern, routes_to_check):
def check_dynamic_route_exists(pattern, routes_to_check, parameters):
"""
Check if a URL pattern exists in a list of routes provided based on
the comparison of URL pattern and the parameters.
:param pattern: URL parameter pattern
:param routes_to_check: list of dynamic routes either hashable or
unhashable routes.
:param parameters: List of :class:`Parameter` items
:return: Tuple of index and route if matching route exists else
-1 for index and None for route
"""
for ndx, route in enumerate(routes_to_check):
if route.pattern == pattern:
if route.pattern == pattern and route.parameters == parameters:
return ndx, route
else:
return -1, None
@@ -265,13 +353,25 @@ class Router:
uri = host + uri
try:
route = self.routes_all.pop(uri)
for handler_name, pairs in self.routes_names.items():
if pairs[0] == uri:
self.routes_names.pop(handler_name)
break
for handler_name, pairs in self.routes_static_files.items():
if pairs[0] == uri:
self.routes_static_files.pop(handler_name)
break
except KeyError:
raise RouteDoesNotExist("Route was not registered: {}".format(uri))
if route in self.routes_always_check:
self.routes_always_check.remove(route)
elif url_hash(uri) in self.routes_dynamic \
and route in self.routes_dynamic[url_hash(uri)]:
elif (
url_hash(uri) in self.routes_dynamic
and route in self.routes_dynamic[url_hash(uri)]
):
self.routes_dynamic[url_hash(uri)].remove(route)
else:
self.routes_static.pop(uri)
@@ -280,20 +380,20 @@ class Router:
self._get.cache_clear()
@lru_cache(maxsize=ROUTER_CACHE_SIZE)
def find_route_by_view_name(self, view_name):
def find_route_by_view_name(self, view_name, name=None):
"""Find a route in the router based on the specified view name.
:param view_name: string of view name to search by
:param name: route name, usually used to look up static file routes
:return: tuple containing (uri, Route)
"""
if not view_name:
return (None, None)
for uri, route in self.routes_all.items():
if route.name == view_name:
return uri, route
if view_name == "static" or view_name.endswith(".static"):
return self.routes_static_files.get(name, (None, None))
return (None, None)
return self.routes_names.get(view_name, (None, None))
def get(self, request):
"""Get a request handler based on the URL of the request, or raises an
@@ -304,14 +404,25 @@ class Router:
"""
# No virtual hosts specified; default behavior
if not self.hosts:
return self._get(request.path, request.method, '')
return self._get(request.path, request.method, "")
# virtual hosts specified; try to match route to the host header
try:
return self._get(request.path, request.method,
request.headers.get("Host", ''))
return self._get(
request.path, request.method, request.headers.get("Host", "")
)
# try default hosts
except NotFound:
return self._get(request.path, request.method, '')
return self._get(request.path, request.method, "")
def get_supported_methods(self, url):
"""Get a list of supported methods for a url and optional host.
:param url: URL string (including host)
:return: frozenset of supported methods
"""
route = self.routes_all.get(url)
# if methods are None then this logic will prevent an error
return getattr(route, "methods", None) or frozenset()
@lru_cache(maxsize=ROUTER_CACHE_SIZE)
def _get(self, url, method, host):
@@ -322,12 +433,14 @@ class Router:
:param method: request method
:return: handler, arguments, keyword arguments
"""
url = host + url
url = unquote(host + url)
# Check against known static routes
route = self.routes_static.get(url)
method_not_supported = InvalidUsage(
'Method {} not allowed for URL {}'.format(
method, url), status_code=405)
method_not_supported = MethodNotSupported(
"Method {} not allowed for URL {}".format(method, url),
method=method,
allowed_methods=self.get_supported_methods(url),
)
if route:
if route.methods and method not in route.methods:
raise method_not_supported
@@ -353,13 +466,14 @@ class Router:
# Route was found but the methods didn't match
if route_found:
raise method_not_supported
raise NotFound('Requested URL {} not found'.format(url))
raise NotFound("Requested URL {} not found".format(url))
kwargs = {p.name: p.cast(value)
for value, p
in zip(match.groups(1), route.parameters)}
kwargs = {
p.name: p.cast(value)
for value, p in zip(match.groups(1), route.parameters)
}
route_handler = route.handler
if hasattr(route_handler, 'handlers'):
if hasattr(route_handler, "handlers"):
route_handler = route_handler.handlers[method]
return route_handler, [], kwargs, route.uri
@@ -370,9 +484,10 @@ class Router:
"""
try:
handler = self.get(request)[0]
except (NotFound, InvalidUsage):
except (NotFound, MethodNotSupported):
return False
if (hasattr(handler, 'view_class') and
hasattr(handler.view_class, request.method.lower())):
if hasattr(handler, "view_class") and hasattr(
handler.view_class, request.method.lower()
):
handler = getattr(handler.view_class, request.method.lower())
return hasattr(handler, 'is_stream')
return hasattr(handler, "is_stream")

View File (sanic/server.py)

@@ -1,81 +1,111 @@
import asyncio
import os
import traceback
from functools import partial
from inspect import isawaitable
from multiprocessing import Process
from signal import (
SIGTERM, SIGINT,
signal as signal_func,
Signals
)
from socket import (
socket,
SOL_SOCKET,
SO_REUSEADDR,
)
from signal import SIG_IGN, SIGINT, SIGTERM, Signals
from signal import signal as signal_func
from socket import SO_REUSEADDR, SOL_SOCKET, socket
from time import time
from httptools import HttpRequestParser
from httptools.parser.errors import HttpParserError
from multidict import CIMultiDict
from sanic.exceptions import (
InvalidUsage,
PayloadTooLarge,
RequestTimeout,
ServerError,
ServiceUnavailable,
)
from sanic.log import access_logger, logger
from sanic.request import Request, StreamBuffer
from sanic.response import HTTPResponse
try:
import uvloop as async_loop
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
async_loop = asyncio
from sanic.log import log, netlog
from sanic.response import HTTPResponse
from sanic.request import Request
from sanic.exceptions import (
RequestTimeout, PayloadTooLarge, InvalidUsage, ServerError)
current_time = None
pass
class Signal:
stopped = False
class CIDict(dict):
"""Case Insensitive dict where all keys are converted to lowercase
This does not maintain the inputted case when calling items() or keys()
in favor of speed, since headers are case insensitive
class HttpProtocol(asyncio.Protocol):
"""
This class provides a basic HTTP implementation of the sanic framework.
"""
def get(self, key, default=None):
return super().get(key.casefold(), default)
def __getitem__(self, key):
return super().__getitem__(key.casefold())
def __setitem__(self, key, value):
return super().__setitem__(key.casefold(), value)
def __contains__(self, key):
return super().__contains__(key.casefold())
class HttpProtocol(asyncio.Protocol):
__slots__ = (
# event loop, connection
'loop', 'transport', 'connections', 'signal',
"loop",
"transport",
"connections",
"signal",
# request params
'parser', 'request', 'url', 'headers',
"parser",
"request",
"url",
"headers",
# request config
'request_handler', 'request_timeout', 'request_max_size',
'request_class', 'is_request_stream', 'router',
# enable or disable access log / error log purpose
'has_log',
"request_handler",
"request_timeout",
"response_timeout",
"keep_alive_timeout",
"request_max_size",
"request_buffer_queue_size",
"request_class",
"is_request_stream",
"router",
"error_handler",
# enable or disable access log purpose
"access_log",
# connection management
'_total_request_size', '_timeout_handler', '_last_communication_time',
'_is_stream_handler')
"_total_request_size",
"_request_timeout_handler",
"_response_timeout_handler",
"_keep_alive_timeout_handler",
"_last_request_time",
"_last_response_time",
"_is_stream_handler",
"_not_paused",
"_request_handler_task",
"_request_stream_task",
"_keep_alive",
"_header_fragment",
"state",
"_debug",
)
def __init__(self, *, loop, request_handler, error_handler,
signal=Signal(), connections=set(), request_timeout=60,
request_max_size=None, request_class=None, has_log=True,
keep_alive=True, is_request_stream=False, router=None,
state=None, debug=False, **kwargs):
def __init__(
self,
*,
loop,
request_handler,
error_handler,
signal=Signal(),
connections=None,
request_timeout=60,
response_timeout=60,
keep_alive_timeout=5,
request_max_size=None,
request_buffer_queue_size=100,
request_class=None,
access_log=True,
keep_alive=True,
is_request_stream=False,
router=None,
state=None,
debug=False,
**kwargs
):
self.loop = loop
self.transport = None
self.request = None
@@ -84,33 +114,49 @@ class HttpProtocol(asyncio.Protocol):
self.headers = None
self.router = router
self.signal = signal
self.has_log = has_log
self.connections = connections
self.access_log = access_log
self.connections = connections or set()
self.request_handler = request_handler
self.error_handler = error_handler
self.request_timeout = request_timeout
self.request_buffer_queue_size = request_buffer_queue_size
self.response_timeout = response_timeout
self.keep_alive_timeout = keep_alive_timeout
self.request_max_size = request_max_size
self.request_class = request_class or Request
self.is_request_stream = is_request_stream
self._is_stream_handler = False
self._not_paused = asyncio.Event(loop=loop)
self._total_request_size = 0
self._timeout_handler = None
self._request_timeout_handler = None
self._response_timeout_handler = None
self._keep_alive_timeout_handler = None
self._last_request_time = None
self._last_response_time = None
self._request_handler_task = None
self._request_stream_task = None
self._keep_alive = keep_alive
self._header_fragment = b''
self._header_fragment = b""
self.state = state if state else {}
if 'requests_count' not in self.state:
self.state['requests_count'] = 0
if "requests_count" not in self.state:
self.state["requests_count"] = 0
self._debug = debug
self._not_paused.set()
@property
def keep_alive(self):
"""
Check if the connection needs to be kept alive based on the params
attached to the `_keep_alive` attribute, :attr:`Signal.stopped`
and :func:`HttpProtocol.parser.should_keep_alive`
:return: ``True`` if the connection is to be kept alive, ``False`` otherwise
"""
return (
self._keep_alive and
not self.signal.stopped and
self.parser.should_keep_alive())
self._keep_alive
and not self.signal.stopped
and self.parser.should_keep_alive()
)
# -------------------------------------------- #
# Connection
@@ -118,31 +164,83 @@ class HttpProtocol(asyncio.Protocol):
def connection_made(self, transport):
self.connections.add(self)
self._timeout_handler = self.loop.call_later(
self.request_timeout, self.connection_timeout)
self._request_timeout_handler = self.loop.call_later(
self.request_timeout, self.request_timeout_callback
)
self.transport = transport
self._last_request_time = current_time
self._last_request_time = time()
def connection_lost(self, exc):
self.connections.discard(self)
self._timeout_handler.cancel()
if self._request_handler_task:
self._request_handler_task.cancel()
if self._request_stream_task:
self._request_stream_task.cancel()
if self._request_timeout_handler:
self._request_timeout_handler.cancel()
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
if self._keep_alive_timeout_handler:
self._keep_alive_timeout_handler.cancel()
def connection_timeout(self):
# Check if
time_elapsed = current_time - self._last_request_time
def pause_writing(self):
self._not_paused.clear()
def resume_writing(self):
self._not_paused.set()
def request_timeout_callback(self):
# See the docstring in the RequestTimeout exception, to see
# exactly what this timeout is checking for.
# Check if elapsed time since request initiated exceeds our
# configured maximum request timeout value
time_elapsed = time() - self._last_request_time
if time_elapsed < self.request_timeout:
time_left = self.request_timeout - time_elapsed
self._timeout_handler = (
self.loop.call_later(time_left, self.connection_timeout))
self._request_timeout_handler = self.loop.call_later(
time_left, self.request_timeout_callback
)
else:
if self._request_stream_task:
self._request_stream_task.cancel()
if self._request_handler_task:
self._request_handler_task.cancel()
try:
raise RequestTimeout('Request Timeout')
except RequestTimeout as exception:
self.write_error(exception)
self.write_error(RequestTimeout("Request Timeout"))
def response_timeout_callback(self):
# Check if elapsed time since response was initiated exceeds our
# configured maximum request timeout value
time_elapsed = time() - self._last_request_time
if time_elapsed < self.response_timeout:
time_left = self.response_timeout - time_elapsed
self._response_timeout_handler = self.loop.call_later(
time_left, self.response_timeout_callback
)
else:
if self._request_stream_task:
self._request_stream_task.cancel()
if self._request_handler_task:
self._request_handler_task.cancel()
self.write_error(ServiceUnavailable("Response Timeout"))
def keep_alive_timeout_callback(self):
"""
Check if elapsed time since last response exceeds our configured
maximum keep alive timeout value and if so, close the transport
pipe and let the response writer handle the error.
:return: None
"""
time_elapsed = time() - self._last_response_time
if time_elapsed < self.keep_alive_timeout:
time_left = self.keep_alive_timeout - time_elapsed
self._keep_alive_timeout_handler = self.loop.call_later(
time_left, self.keep_alive_timeout_callback
)
else:
logger.debug("KeepAlive Timeout. Closing connection.")
self.transport.close()
self.transport = None
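The three callbacks above correspond to three separate timeouts that were previously handled by the single request_timeout; at the application level they map onto config keys roughly like this (values are arbitrary):

from sanic import Sanic

app = Sanic()
app.config.REQUEST_TIMEOUT = 60      # receiving the request (request_timeout_callback)
app.config.RESPONSE_TIMEOUT = 60     # producing the response (response_timeout_callback)
app.config.KEEP_ALIVE_TIMEOUT = 5    # idle keep-alive connections (keep_alive_timeout_callback)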
# -------------------------------------------- #
# Parsing
@@ -153,8 +251,7 @@ class HttpProtocol(asyncio.Protocol):
# memory limits
self._total_request_size += len(data)
if self._total_request_size > self.request_max_size:
exception = PayloadTooLarge('Payload Too Large')
self.write_error(exception)
self.write_error(PayloadTooLarge("Payload Too Large"))
# Create parser if this is the first time we're receiving data
if self.parser is None:
@@ -163,17 +260,16 @@ class HttpProtocol(asyncio.Protocol):
self.parser = HttpRequestParser(self)
# requests count
self.state['requests_count'] = self.state['requests_count'] + 1
self.state["requests_count"] = self.state["requests_count"] + 1
# Parse request chunk or close connection
try:
self.parser.feed_data(data)
except HttpParserError:
message = 'Bad Request'
message = "Bad Request"
if self._debug:
message += '\n' + traceback.format_exc()
exception = InvalidUsage(message)
self.write_error(exception)
message += "\n" + traceback.format_exc()
self.write_error(InvalidUsage(message))
def on_url(self, url):
if not self.url:
@@ -185,188 +281,289 @@ class HttpProtocol(asyncio.Protocol):
self._header_fragment += name
if value is not None:
if self._header_fragment == b'Content-Length' \
and int(value) > self.request_max_size:
exception = PayloadTooLarge('Payload Too Large')
self.write_error(exception)
if (
self._header_fragment == b"Content-Length"
and int(value) > self.request_max_size
):
self.write_error(PayloadTooLarge("Payload Too Large"))
try:
value = value.decode()
except UnicodeDecodeError:
value = value.decode("latin_1")
self.headers.append(
(self._header_fragment.decode().casefold(),
value.decode()))
(self._header_fragment.decode().casefold(), value)
)
self._header_fragment = b''
self._header_fragment = b""
def on_headers_complete(self):
self.request = self.request_class(
url_bytes=self.url,
headers=CIDict(self.headers),
headers=CIMultiDict(self.headers),
version=self.parser.get_http_version(),
method=self.parser.get_method().decode(),
transport=self.transport
transport=self.transport,
)
# Remove any existing KeepAlive handler here,
# It will be recreated if required on the new request.
if self._keep_alive_timeout_handler:
self._keep_alive_timeout_handler.cancel()
self._keep_alive_timeout_handler = None
if self.is_request_stream:
self._is_stream_handler = self.router.is_stream_handler(
self.request)
self.request
)
if self._is_stream_handler:
self.request.stream = asyncio.Queue()
self.request.stream = StreamBuffer(
self.request_buffer_queue_size
)
self.execute_request_handler()
def on_body(self, body):
if self.is_request_stream and self._is_stream_handler:
self._request_stream_task = self.loop.create_task(
self.request.stream.put(body))
return
self.request.body.append(body)
self.body_append(body)
)
else:
self.request.body_push(body)
async def body_append(self, body):
if self.request.stream.is_full():
self.transport.pause_reading()
await self.request.stream.put(body)
self.transport.resume_reading()
else:
await self.request.stream.put(body)
def on_message_complete(self):
# Entire request (headers and whole body) is received.
# We can cancel and remove the request timeout handler now.
if self._request_timeout_handler:
self._request_timeout_handler.cancel()
self._request_timeout_handler = None
if self.is_request_stream and self._is_stream_handler:
self._request_stream_task = self.loop.create_task(
self.request.stream.put(None))
self.request.stream.put(None)
)
return
self.request.body = b''.join(self.request.body)
self.request.body_finish()
self.execute_request_handler()
def execute_request_handler(self):
"""
Invoke the request handler defined by the
:func:`sanic.app.Sanic.handle_request` method
:return: None
"""
self._response_timeout_handler = self.loop.call_later(
self.response_timeout, self.response_timeout_callback
)
self._last_request_time = time()
self._request_handler_task = self.loop.create_task(
self.request_handler(
self.request,
self.write_response,
self.stream_response))
self.request, self.write_response, self.stream_response
)
)
# -------------------------------------------- #
# Responding
# -------------------------------------------- #
def log_response(self, response):
"""
Helper method that logs the response when
:attr:`HttpProtocol.access_log` is enabled.
:param response: Response generated for the current request
:type response: :class:`sanic.response.HTTPResponse` or
:class:`sanic.response.StreamingHTTPResponse`
:return: None
"""
if self.access_log:
extra = {"status": getattr(response, "status", 0)}
if isinstance(response, HTTPResponse):
extra["byte"] = len(response.body)
else:
extra["byte"] = -1
extra["host"] = "UNKNOWN"
if self.request is not None:
if self.request.ip:
extra["host"] = "{0}:{1}".format(
self.request.ip, self.request.port
)
extra["request"] = "{0} {1}".format(
self.request.method, self.request.url
)
else:
extra["request"] = "nil"
access_logger.info("", extra=extra)
def write_response(self, response):
"""
Writes response content synchronously to the transport.
"""
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
try:
keep_alive = self.keep_alive
self.transport.write(
response.output(
self.request.version, keep_alive,
self.request_timeout))
if self.has_log:
netlog.info('', extra={
'status': response.status,
'byte': len(response.body),
'host': '{0}:{1}'.format(self.request.ip[0],
self.request.ip[1]),
'request': '{0} {1}'.format(self.request.method,
self.request.url)
})
self.request.version, keep_alive, self.keep_alive_timeout
)
)
self.log_response(response)
except AttributeError:
log.error(
('Invalid response object for url {}, '
'Expected Type: HTTPResponse, Actual Type: {}').format(
self.url, type(response)))
self.write_error(ServerError('Invalid response type'))
logger.error(
"Invalid response object for url %s, "
"Expected Type: HTTPResponse, Actual Type: %s",
self.url,
type(response),
)
self.write_error(ServerError("Invalid response type"))
except RuntimeError:
log.error(
'Connection lost before response written @ {}'.format(
self.request.ip))
if self._debug:
logger.error(
"Connection lost before response written @ %s",
self.request.ip,
)
keep_alive = False
except Exception as e:
self.bail_out(
"Writing response failed, connection closed {}".format(
repr(e)))
"Writing response failed, connection closed {}".format(repr(e))
)
finally:
if not keep_alive:
self.transport.close()
self.transport = None
else:
self._last_request_time = current_time
self._keep_alive_timeout_handler = self.loop.call_later(
self.keep_alive_timeout, self.keep_alive_timeout_callback
)
self._last_response_time = time()
self.cleanup()
async def drain(self):
await self._not_paused.wait()
def push_data(self, data):
self.transport.write(data)
async def stream_response(self, response):
"""
Streams a response to the client asynchronously. Attaches
the transport to the response so the response consumer can
write to the response as needed.
"""
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
try:
keep_alive = self.keep_alive
response.transport = self.transport
response.protocol = self
await response.stream(
self.request.version, keep_alive, self.request_timeout)
if self.has_log:
netlog.info('', extra={
'status': response.status,
'byte': -1,
'host': '{0}:{1}'.format(self.request.ip[0],
self.request.ip[1]),
'request': '{0} {1}'.format(self.request.method,
self.request.url)
})
self.request.version, keep_alive, self.keep_alive_timeout
)
self.log_response(response)
except AttributeError:
log.error(
('Invalid response object for url {}, '
'Expected Type: HTTPResponse, Actual Type: {}').format(
self.url, type(response)))
self.write_error(ServerError('Invalid response type'))
logger.error(
"Invalid response object for url %s, "
"Expected Type: HTTPResponse, Actual Type: %s",
self.url,
type(response),
)
self.write_error(ServerError("Invalid response type"))
except RuntimeError:
log.error(
'Connection lost before response written @ {}'.format(
self.request.ip))
if self._debug:
logger.error(
"Connection lost before response written @ %s",
self.request.ip,
)
keep_alive = False
except Exception as e:
self.bail_out(
"Writing response failed, connection closed {}".format(
repr(e)))
"Writing response failed, connection closed {}".format(repr(e))
)
finally:
if not keep_alive:
self.transport.close()
self.transport = None
else:
self._last_request_time = current_time
self._keep_alive_timeout_handler = self.loop.call_later(
self.keep_alive_timeout, self.keep_alive_timeout_callback
)
self._last_response_time = time()
self.cleanup()
def write_error(self, exception):
# An error _is_ a response.
# Don't throw a response timeout, when a response _is_ given.
if self._response_timeout_handler:
self._response_timeout_handler.cancel()
self._response_timeout_handler = None
response = None
try:
response = self.error_handler.response(self.request, exception)
version = self.request.version if self.request else '1.1'
version = self.request.version if self.request else "1.1"
self.transport.write(response.output(version))
except RuntimeError:
log.error(
'Connection lost before error written @ {}'.format(
self.request.ip if self.request else 'Unknown'))
if self._debug:
logger.error(
"Connection lost before error written @ %s",
self.request.ip if self.request else "Unknown",
)
except Exception as e:
self.bail_out(
"Writing error failed, connection closed {}".format(repr(e)),
from_error=True)
from_error=True,
)
finally:
if self.has_log:
extra = dict()
if isinstance(response, HTTPResponse):
extra['status'] = response.status
extra['byte'] = len(response.body)
else:
extra['status'] = 0
extra['byte'] = -1
if self.request:
extra['host'] = '%s:%d' % self.request.ip,
extra['request'] = '%s %s' % (self.request.method,
self.url)
else:
extra['host'] = 'UNKNOWN'
extra['request'] = 'nil'
if self.parser and not (self.keep_alive
and extra['status'] == 408):
netlog.info('', extra=extra)
self.transport.close()
if self.parser and (
self.keep_alive or getattr(response, "status", 0) == 408
):
self.log_response(response)
try:
self.transport.close()
except AttributeError:
logger.debug("Connection lost before server could close it.")
def bail_out(self, message, from_error=False):
"""
If the transport pipe is closed while the sanic app is writing data
to it, log the error with proper details.
:param message: Error message to display
:param from_error: If the bail out was invoked while handling an
exception scenario.
:type message: str
:type from_error: bool
:return: None
"""
if from_error or self.transport.is_closing():
log.error(
("Transport closed @ {} and exception "
"experienced during error handling").format(
self.transport.get_extra_info('peername')))
log.debug(
'Exception:\n{}'.format(traceback.format_exc()))
logger.error(
"Transport closed @ %s and exception "
"experienced during error handling",
self.transport.get_extra_info("peername"),
)
logger.debug("Exception:", exc_info=True)
else:
exception = ServerError(message)
self.write_error(exception)
log.error(message)
self.write_error(ServerError(message))
logger.error(message)
def cleanup(self):
"""This is called when KeepAlive feature is used,
it resets the connection in order for it to be able
to handle receiving another request on the same connection."""
self.parser = None
self.request = None
self.url = None
@@ -395,18 +592,6 @@ class HttpProtocol(asyncio.Protocol):
self.transport = None
def update_current_time(loop):
"""Cache the current time, since it is needed at the end of every
keep-alive request to update the request timeout time
:param loop:
:return:
"""
global current_time
current_time = time()
loop.call_later(1, partial(update_current_time, loop))
def trigger_events(events, loop):
"""Trigger event callbacks (functions or async)
@@ -419,15 +604,45 @@ def trigger_events(events, loop):
loop.run_until_complete(result)
def serve(host, port, request_handler, error_handler, before_start=None,
after_start=None, before_stop=None, after_stop=None, debug=False,
request_timeout=60, ssl=None, sock=None, request_max_size=None,
reuse_port=False, loop=None, protocol=HttpProtocol, backlog=100,
register_sys_signals=True, run_async=False, connections=None,
signal=Signal(), request_class=None, has_log=True, keep_alive=True,
is_request_stream=False, router=None, websocket_max_size=None,
websocket_max_queue=None, state=None,
graceful_shutdown_timeout=15.0):
def serve(
host,
port,
request_handler,
error_handler,
before_start=None,
after_start=None,
before_stop=None,
after_stop=None,
debug=False,
request_timeout=60,
response_timeout=60,
keep_alive_timeout=5,
ssl=None,
sock=None,
request_max_size=None,
request_buffer_queue_size=100,
reuse_port=False,
loop=None,
protocol=HttpProtocol,
backlog=100,
register_sys_signals=True,
run_multiple=False,
run_async=False,
connections=None,
signal=Signal(),
request_class=None,
access_log=True,
keep_alive=True,
is_request_stream=False,
router=None,
websocket_max_size=None,
websocket_max_queue=None,
websocket_read_limit=2 ** 16,
websocket_write_limit=2 ** 16,
state=None,
graceful_shutdown_timeout=15.0,
asyncio_server_kwargs=None,
):
"""Start asynchronous HTTP Server on an individual process.
:param host: Address to host on
@@ -446,6 +661,8 @@ def serve(host, port, request_handler, error_handler, before_start=None,
`app` instance and `loop`
:param debug: enables debug output (slows server)
:param request_timeout: time in seconds
:param response_timeout: time in seconds
:param keep_alive_timeout: time in seconds
:param ssl: SSLContext
:param sock: Socket for the server to accept connections from
:param request_max_size: size in bytes, `None` for no limit
@@ -453,13 +670,29 @@ def serve(host, port, request_handler, error_handler, before_start=None,
:param loop: asyncio compatible event loop
:param protocol: subclass of asyncio protocol class
:param request_class: Request class to use
:param has_log: disable/enable access log and error log
:param access_log: disable/enable access log
:param websocket_max_size: enforces the maximum size for
incoming messages in bytes.
:param websocket_max_queue: sets the maximum length of the queue
that holds incoming messages.
:param websocket_read_limit: sets the high-water limit of the buffer for
incoming bytes, the low-water limit is half
the high-water limit.
:param websocket_write_limit: sets the high-water limit of the buffer for
outgoing bytes, the low-water limit is a
quarter of the high-water limit.
:param is_request_stream: disable/enable Request.stream
:param request_buffer_queue_size: streaming request buffer queue size
:param router: Router object
:param graceful_shutdown_timeout: how long to wait before force-closing
non-idle connections
:param asyncio_server_kwargs: key-value args for asyncio/uvloop
create_server method
:return: Nothing
"""
if not run_async:
loop = async_loop.new_event_loop()
# create new event_loop after fork
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
if debug:
@@ -474,18 +707,24 @@ def serve(host, port, request_handler, error_handler, before_start=None,
request_handler=request_handler,
error_handler=error_handler,
request_timeout=request_timeout,
response_timeout=response_timeout,
keep_alive_timeout=keep_alive_timeout,
request_max_size=request_max_size,
request_class=request_class,
has_log=has_log,
access_log=access_log,
keep_alive=keep_alive,
is_request_stream=is_request_stream,
router=router,
websocket_max_size=websocket_max_size,
websocket_max_queue=websocket_max_queue,
websocket_read_limit=websocket_read_limit,
websocket_write_limit=websocket_write_limit,
state=state,
debug=debug,
)
asyncio_server_kwargs = (
asyncio_server_kwargs if asyncio_server_kwargs else {}
)
server_coroutine = loop.create_server(
server,
host,
@@ -493,13 +732,10 @@ def serve(host, port, request_handler, error_handler, before_start=None,
ssl=ssl,
reuse_port=reuse_port,
sock=sock,
backlog=backlog
backlog=backlog,
**asyncio_server_kwargs
)
# Instead of pulling time at the end of every request,
# pull it once per minute
loop.call_soon(partial(update_current_time, loop))
if run_async:
return server_coroutine
@@ -507,26 +743,33 @@ def serve(host, port, request_handler, error_handler, before_start=None,
try:
http_server = loop.run_until_complete(server_coroutine)
except:
log.exception("Unable to start server")
except BaseException:
logger.exception("Unable to start server")
return
trigger_events(after_start, loop)
# Ignore SIGINT when run_multiple
if run_multiple:
signal_func(SIGINT, SIG_IGN)
# Register signals for graceful termination
if register_sys_signals:
for _signal in (SIGINT, SIGTERM):
_signals = (SIGTERM,) if run_multiple else (SIGINT, SIGTERM)
for _signal in _signals:
try:
loop.add_signal_handler(_signal, loop.stop)
except NotImplementedError:
log.warn('Sanic tried to use loop.add_signal_handler but it is'
' not implemented on this platform.')
logger.warning(
"Sanic tried to use loop.add_signal_handler "
"but it is not implemented on this platform."
)
pid = os.getpid()
try:
log.info('Starting worker [{}]'.format(pid))
logger.info("Starting worker [%s]", pid)
loop.run_forever()
finally:
log.info("Stopping worker [{}]".format(pid))
logger.info("Stopping worker [%s]", pid)
# Run the on_stop function if provided
trigger_events(before_stop, loop)
@@ -554,7 +797,7 @@ def serve(host, port, request_handler, error_handler, before_start=None,
coros = []
for conn in connections:
if hasattr(conn, "websocket") and conn.websocket:
coros.append(conn.websocket.close_connection(force=True))
coros.append(conn.websocket.close_connection())
else:
conn.close()
@@ -575,28 +818,29 @@ def serve_multiple(server_settings, workers):
:param stop_event: if provided, is used as a stop signal
:return:
"""
server_settings['reuse_port'] = True
server_settings["reuse_port"] = True
server_settings["run_multiple"] = True
# Handling when custom socket is not provided.
if server_settings.get('sock') is None:
if server_settings.get("sock") is None:
sock = socket()
sock.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1)
sock.bind((server_settings['host'], server_settings['port']))
sock.bind((server_settings["host"], server_settings["port"]))
sock.set_inheritable(True)
server_settings['sock'] = sock
server_settings['host'] = None
server_settings['port'] = None
server_settings["sock"] = sock
server_settings["host"] = None
server_settings["port"] = None
def sig_handler(signal, frame):
log.info("Received signal {}. Shutting down.".format(
Signals(signal).name))
logger.info("Received signal %s. Shutting down.", Signals(signal).name)
for process in processes:
os.kill(process.pid, SIGINT)
os.kill(process.pid, SIGTERM)
signal_func(SIGINT, lambda s, f: sig_handler(s, f))
signal_func(SIGTERM, lambda s, f: sig_handler(s, f))
processes = []
for _ in range(workers):
process = Process(target=serve, kwargs=server_settings)
process.daemon = True
@@ -609,4 +853,4 @@ def serve_multiple(server_settings, workers):
# the above processes will block this until they're stopped
for process in processes:
process.terminate()
server_settings.get('sock').close()
server_settings.get("sock").close()

View File (sanic/static.py)

@@ -1,7 +1,7 @@
from mimetypes import guess_type
from os import path
from re import sub
from time import strftime, gmtime
from time import gmtime, strftime
from urllib.parse import unquote
from aiofiles.os import stat
@@ -13,12 +13,22 @@ from sanic.exceptions import (
InvalidUsage,
)
from sanic.handlers import ContentRangeHandler
from sanic.response import file, file_stream, HTTPResponse
from sanic.response import HTTPResponse, file, file_stream
def register(app, uri, file_or_directory, pattern,
use_modified_since, use_content_range,
stream_large_files):
def register(
app,
uri,
file_or_directory,
pattern,
use_modified_since,
use_content_range,
stream_large_files,
name="static",
host=None,
strict_slashes=None,
content_type=None,
):
# TODO: Though sanic is not a file server, I feel like we should at least
# make a good effort here. Modified-since is nice, but we could
# also look into etags, expires, and caching
@@ -39,16 +49,18 @@ def register(app, uri, file_or_directory, pattern,
than the file() handler to send the file
If this is an integer, this represents the
threshold size to switch to file_stream()
:param name: user defined name used for url_for
:param content_type: user defined content type for header
"""
# If we're not trying to match a file directly,
# serve from the folder
if not path.isfile(file_or_directory):
uri += '<file_uri:' + pattern + '>'
uri += "<file_uri:" + pattern + ">"
async def _handler(request, file_uri=None):
# Using this to determine if the URL is trying to break out of the path
# served. os.path.realpath seems to be very slow
if file_uri and '../' in file_uri:
if file_uri and "../" in file_uri:
raise InvalidUsage("Invalid URL")
# Merge served directory and requested file if provided
# Strip all / that in the beginning of the URL to help prevent python
@@ -56,15 +68,16 @@ def register(app, uri, file_or_directory, pattern,
root_path = file_path = file_or_directory
if file_uri:
file_path = path.join(
file_or_directory, sub('^[/]*', '', file_uri))
file_or_directory, sub("^[/]*", "", file_uri)
)
# URL decode the path sent by the browser otherwise we won't be able to
# match filenames which got encoded (filenames with spaces etc)
file_path = path.abspath(unquote(file_path))
if not file_path.startswith(path.abspath(unquote(root_path))):
raise FileNotFound('File not found',
path=file_or_directory,
relative_url=file_uri)
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
try:
headers = {}
# Check if the client has been sent this file before
@@ -73,48 +86,61 @@ def register(app, uri, file_or_directory, pattern,
if use_modified_since:
stats = await stat(file_path)
modified_since = strftime(
'%a, %d %b %Y %H:%M:%S GMT', gmtime(stats.st_mtime))
if request.headers.get('If-Modified-Since') == modified_since:
"%a, %d %b %Y %H:%M:%S GMT", gmtime(stats.st_mtime)
)
if request.headers.get("If-Modified-Since") == modified_since:
return HTTPResponse(status=304)
headers['Last-Modified'] = modified_since
headers["Last-Modified"] = modified_since
_range = None
if use_content_range:
_range = None
if not stats:
stats = await stat(file_path)
headers['Accept-Ranges'] = 'bytes'
headers['Content-Length'] = str(stats.st_size)
if request.method != 'HEAD':
headers["Accept-Ranges"] = "bytes"
headers["Content-Length"] = str(stats.st_size)
if request.method != "HEAD":
try:
_range = ContentRangeHandler(request, stats)
except HeaderNotFound:
pass
else:
del headers['Content-Length']
del headers["Content-Length"]
for key, value in _range.headers.items():
headers[key] = value
if request.method == 'HEAD':
return HTTPResponse(
headers=headers,
content_type=guess_type(file_path)[0] or 'text/plain')
headers["Content-Type"] = (
content_type or guess_type(file_path)[0] or "text/plain"
)
if request.method == "HEAD":
return HTTPResponse(headers=headers)
else:
if stream_large_files:
if isinstance(stream_large_files, int):
if type(stream_large_files) == int:
threshold = stream_large_files
else:
threshold = 1024*1000
threshold = 1024 * 1024
if not stats:
stats = await stat(file_path)
if stats.st_size >= threshold:
return await file_stream(file_path, headers=headers,
_range=_range)
return await file_stream(
file_path, headers=headers, _range=_range
)
return await file(file_path, headers=headers, _range=_range)
except ContentRangeError:
raise
except Exception:
raise FileNotFound('File not found',
path=file_or_directory,
relative_url=file_uri)
raise FileNotFound(
"File not found", path=file_or_directory, relative_url=file_uri
)
app.route(uri, methods=['GET', 'HEAD'])(_handler)
# special prefix for static files
if not name.startswith("_static_"):
name = "_static_{}".format(name)
app.route(
uri,
methods=["GET", "HEAD"],
name=name,
host=host,
strict_slashes=strict_slashes,
)(_handler)
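The new register() keywords surface through app.static(); a short sketch of the calling side, assuming app.static() forwards name, host, strict_slashes, and content_type to register() unchanged (the paths are made up):

from sanic import Sanic

app = Sanic()

# "name" feeds url_for(); the code above prefixes it with "_static_" internally.
app.static("/downloads", "./downloads", name="downloads")

# content_type overrides the guess_type() fallback for a single file.
app.static(
    "/report.pdf",
    "./assets/report.pdf",
    content_type="application/pdf",
)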


@@ -1,73 +1,109 @@
import traceback
from json import JSONDecodeError
from socket import socket
from sanic.log import log
from sanic.exceptions import MethodNotSupported
from sanic.log import logger
from sanic.response import text
HOST = '127.0.0.1'
HOST = "127.0.0.1"
PORT = 42101
class SanicTestClient:
def __init__(self, app):
def __init__(self, app, port=PORT):
"""Use port=None to bind to a random port"""
self.app = app
self.port = port
async def _local_request(self, method, uri, cookies=None, *args, **kwargs):
async def _local_request(self, method, url, cookies=None, *args, **kwargs):
import aiohttp
if uri.startswith(('http:', 'https:', 'ftp:', 'ftps://' '//')):
url = uri
else:
url = 'http://{host}:{port}{uri}'.format(
host=HOST, port=PORT, uri=uri)
log.info(url)
conn = aiohttp.TCPConnector(verify_ssl=False)
logger.info(url)
conn = aiohttp.TCPConnector(ssl=False)
async with aiohttp.ClientSession(
cookies=cookies, connector=conn) as session:
async with getattr(
session, method.lower())(url, *args, **kwargs) as response:
cookies=cookies, connector=conn
) as session:
async with getattr(session, method.lower())(
url, *args, **kwargs
) as response:
try:
response.text = await response.text()
except UnicodeDecodeError as e:
except UnicodeDecodeError:
response.text = None
try:
response.json = await response.json()
except (JSONDecodeError,
UnicodeDecodeError,
aiohttp.ClientResponseError):
except (
JSONDecodeError,
UnicodeDecodeError,
aiohttp.ClientResponseError,
):
response.json = None
response.body = await response.read()
return response
def _sanic_endpoint_test(
self, method='get', uri='/', gather_request=True,
debug=False, server_kwargs={},
*request_args, **request_kwargs):
self,
method="get",
uri="/",
gather_request=True,
debug=False,
server_kwargs={"auto_reload": False},
*request_args,
**request_kwargs
):
results = [None, None]
exceptions = []
if gather_request:
def _collect_request(request):
if results[0] is None:
results[0] = request
self.app.request_middleware.appendleft(_collect_request)
@self.app.listener('after_server_start')
@self.app.exception(MethodNotSupported)
async def error_handler(request, exception):
if request.method in ["HEAD", "PATCH", "PUT", "DELETE"]:
return text(
"", exception.status_code, headers=exception.headers
)
else:
return self.app.error_handler.default(request, exception)
if self.port:
server_kwargs = dict(host=HOST, port=self.port, **server_kwargs)
host, port = HOST, self.port
else:
sock = socket()
sock.bind((HOST, 0))
server_kwargs = dict(sock=sock, **server_kwargs)
host, port = sock.getsockname()
if uri.startswith(("http:", "https:", "ftp:", "ftps://", "//")):
url = uri
else:
url = "http://{host}:{port}{uri}".format(
host=host, port=port, uri=uri
)
@self.app.listener("after_server_start")
async def _collect_response(sanic, loop):
try:
response = await self._local_request(
method, uri, *request_args,
**request_kwargs)
method, url, *request_args, **request_kwargs
)
results[-1] = response
except Exception as e:
log.error(
'Exception:\n{}'.format(traceback.format_exc()))
logger.exception("Exception")
exceptions.append(e)
self.app.stop()
self.app.run(host=HOST, debug=debug, port=PORT, **server_kwargs)
self.app.listeners['after_server_start'].pop()
self.app.run(debug=debug, **server_kwargs)
self.app.listeners["after_server_start"].pop()
if exceptions:
raise ValueError("Exception during request: {}".format(exceptions))
@@ -76,34 +112,37 @@ class SanicTestClient:
try:
request, response = results
return request, response
except:
except BaseException:
raise ValueError(
"Request and response object expected, got ({})".format(
results))
results
)
)
else:
try:
return results[-1]
except:
except BaseException:
raise ValueError(
"Request object expected, got ({})".format(results))
"Request object expected, got ({})".format(results)
)
def get(self, *args, **kwargs):
return self._sanic_endpoint_test('get', *args, **kwargs)
return self._sanic_endpoint_test("get", *args, **kwargs)
def post(self, *args, **kwargs):
return self._sanic_endpoint_test('post', *args, **kwargs)
return self._sanic_endpoint_test("post", *args, **kwargs)
def put(self, *args, **kwargs):
return self._sanic_endpoint_test('put', *args, **kwargs)
return self._sanic_endpoint_test("put", *args, **kwargs)
def delete(self, *args, **kwargs):
return self._sanic_endpoint_test('delete', *args, **kwargs)
return self._sanic_endpoint_test("delete", *args, **kwargs)
def patch(self, *args, **kwargs):
return self._sanic_endpoint_test('patch', *args, **kwargs)
return self._sanic_endpoint_test("patch", *args, **kwargs)
def options(self, *args, **kwargs):
return self._sanic_endpoint_test('options', *args, **kwargs)
return self._sanic_endpoint_test("options", *args, **kwargs)
def head(self, *args, **kwargs):
return self._sanic_endpoint_test('head', *args, **kwargs)
return self._sanic_endpoint_test("head", *args, **kwargs)


@@ -1,5 +1,5 @@
from sanic.exceptions import InvalidUsage
from sanic.constants import HTTP_METHODS
from sanic.exceptions import InvalidUsage
class HTTPMethodView:
@@ -48,6 +48,7 @@ class HTTPMethodView:
"""Return view function for use with the routing system, that
dispatches request to appropriate handler method.
"""
def view(*args, **kwargs):
self = view.view_class(*class_args, **class_kwargs)
return self.dispatch_request(*args, **kwargs)
@@ -94,11 +95,13 @@ class CompositionView:
for method in methods:
if method not in HTTP_METHODS:
raise InvalidUsage(
'{} is not a valid HTTP method.'.format(method))
"{} is not a valid HTTP method.".format(method)
)
if method in self.handlers:
raise InvalidUsage(
'Method {} is already registered.'.format(method))
"Method {} is already registered.".format(method)
)
self.handlers[method] = handler
def __call__(self, request, *args, **kwargs):
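For context, a brief sketch of the two view classes this file provides, using only their public API (routes and handlers are invented):

from sanic import Sanic
from sanic.response import text
from sanic.views import CompositionView, HTTPMethodView

app = Sanic()


class ItemView(HTTPMethodView):
    async def get(self, request):
        return text("fetched")

    async def post(self, request):
        return text("created")


# as_view() builds the dispatching closure shown above.
app.add_route(ItemView.as_view(), "/item")

# CompositionView.add() raises InvalidUsage for unknown HTTP methods and for
# methods registered twice, which is what the two checks above enforce.
view = CompositionView()
view.add(["GET"], lambda request: text("composed"))
app.add_route(view, "/composed")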


@@ -1,22 +1,42 @@
from httptools import HttpParserUpgrade
from websockets import ConnectionClosed # noqa
from websockets import InvalidHandshake, WebSocketCommonProtocol, handshake
from sanic.exceptions import InvalidUsage
from sanic.server import HttpProtocol
from httptools import HttpParserUpgrade
from websockets import handshake, WebSocketCommonProtocol, InvalidHandshake
from websockets import ConnectionClosed # noqa
class WebSocketProtocol(HttpProtocol):
def __init__(self, *args, websocket_max_size=None,
websocket_max_queue=None, **kwargs):
def __init__(
self,
*args,
websocket_timeout=10,
websocket_max_size=None,
websocket_max_queue=None,
websocket_read_limit=2 ** 16,
websocket_write_limit=2 ** 16,
**kwargs
):
super().__init__(*args, **kwargs)
self.websocket = None
self.websocket_timeout = websocket_timeout
self.websocket_max_size = websocket_max_size
self.websocket_max_queue = websocket_max_queue
self.websocket_read_limit = websocket_read_limit
self.websocket_write_limit = websocket_write_limit
def connection_timeout(self):
# timeouts make no sense for websocket routes
# timeouts make no sense for websocket routes
def request_timeout_callback(self):
if self.websocket is None:
super().connection_timeout()
super().request_timeout_callback()
def response_timeout_callback(self):
if self.websocket is None:
super().response_timeout_callback()
def keep_alive_timeout_callback(self):
if self.websocket is None:
super().keep_alive_timeout_callback()
def connection_lost(self, exc):
if self.websocket is not None:
@@ -41,33 +61,45 @@ class WebSocketProtocol(HttpProtocol):
else:
super().write_response(response)
async def websocket_handshake(self, request):
async def websocket_handshake(self, request, subprotocols=None):
# let the websockets package do the handshake with the client
headers = []
def get_header(k):
return request.headers.get(k, '')
def set_header(k, v):
headers.append((k, v))
headers = {}
try:
key = handshake.check_request(get_header)
handshake.build_response(set_header, key)
key = handshake.check_request(request.headers)
handshake.build_response(headers, key)
except InvalidHandshake:
raise InvalidUsage('Invalid websocket request')
raise InvalidUsage("Invalid websocket request")
subprotocol = None
if subprotocols and "Sec-Websocket-Protocol" in request.headers:
# select a subprotocol
client_subprotocols = [
p.strip()
for p in request.headers["Sec-Websocket-Protocol"].split(",")
]
for p in client_subprotocols:
if p in subprotocols:
subprotocol = p
headers["Sec-Websocket-Protocol"] = subprotocol
break
# write the 101 response back to the client
rv = b'HTTP/1.1 101 Switching Protocols\r\n'
for k, v in headers:
rv += k.encode('utf-8') + b': ' + v.encode('utf-8') + b'\r\n'
rv += b'\r\n'
rv = b"HTTP/1.1 101 Switching Protocols\r\n"
for k, v in headers.items():
rv += k.encode("utf-8") + b": " + v.encode("utf-8") + b"\r\n"
rv += b"\r\n"
request.transport.write(rv)
# hook up the websocket protocol
self.websocket = WebSocketCommonProtocol(
timeout=self.websocket_timeout,
max_size=self.websocket_max_size,
max_queue=self.websocket_max_queue
max_queue=self.websocket_max_queue,
read_limit=self.websocket_read_limit,
write_limit=self.websocket_write_limit,
)
self.websocket.subprotocol = subprotocol
self.websocket.connection_made(request.transport)
self.websocket.connection_open()
return self.websocket
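A sketch of how the new protocol arguments and subprotocol negotiation are driven from an application, assuming the app.websocket decorator and the WEBSOCKET_* config keys of this release map onto these constructor arguments (route and values are illustrative):

from sanic import Sanic

app = Sanic()

# Assumed config keys feeding websocket_read_limit / websocket_write_limit;
# the numbers are only examples.
app.config.WEBSOCKET_READ_LIMIT = 2 ** 16
app.config.WEBSOCKET_WRITE_LIMIT = 2 ** 16


@app.websocket("/feed", subprotocols=["json", "msgpack"])
async def feed(request, ws):
    # ws.subprotocol is whichever client-offered value matched the list above.
    await ws.send("negotiated: {}".format(ws.subprotocol))
    while True:
        await ws.send(await ws.recv())


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)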


@@ -1,10 +1,16 @@
import os
import sys
import signal
import asyncio
import logging
import os
import signal
import sys
import traceback
import gunicorn.workers.base as base
from sanic.server import HttpProtocol, Signal, serve, trigger_events
from sanic.websocket import WebSocketProtocol
try:
import ssl
except ImportError:
@@ -12,17 +18,17 @@ except ImportError:
try:
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
pass
import gunicorn.workers.base as base
from sanic.server import trigger_events, serve, HttpProtocol, Signal
from sanic.websocket import WebSocketProtocol
class GunicornWorker(base.Worker):
http_protocol = HttpProtocol
websocket_protocol = WebSocketProtocol
def __init__(self, *args, **kw): # pragma: no cover
super().__init__(*args, **kw)
cfg = self.cfg
@@ -46,37 +52,46 @@ class GunicornWorker(base.Worker):
def run(self):
is_debug = self.log.loglevel == logging.DEBUG
protocol = (WebSocketProtocol if self.app.callable.websocket_enabled
else HttpProtocol)
protocol = (
self.websocket_protocol
if self.app.callable.websocket_enabled
else self.http_protocol
)
self._server_settings = self.app.callable._helper(
loop=self.loop,
debug=is_debug,
protocol=protocol,
ssl=self.ssl_context,
run_async=True)
self._server_settings['signal'] = self.signal
self._server_settings.pop('sock')
trigger_events(self._server_settings.get('before_start', []),
self.loop)
self._server_settings['before_start'] = ()
run_async=True,
)
self._server_settings["signal"] = self.signal
self._server_settings.pop("sock")
trigger_events(
self._server_settings.get("before_start", []), self.loop
)
self._server_settings["before_start"] = ()
self._runner = asyncio.ensure_future(self._run(), loop=self.loop)
try:
self.loop.run_until_complete(self._runner)
self.app.callable.is_running = True
trigger_events(self._server_settings.get('after_start', []),
self.loop)
trigger_events(
self._server_settings.get("after_start", []), self.loop
)
self.loop.run_until_complete(self._check_alive())
trigger_events(self._server_settings.get('before_stop', []),
self.loop)
trigger_events(
self._server_settings.get("before_stop", []), self.loop
)
self.loop.run_until_complete(self.close())
except:
except BaseException:
traceback.print_exc()
finally:
try:
trigger_events(self._server_settings.get('after_stop', []),
self.loop)
except:
trigger_events(
self._server_settings.get("after_stop", []), self.loop
)
except BaseException:
traceback.print_exc()
finally:
self.loop.close()
@@ -86,8 +101,11 @@ class GunicornWorker(base.Worker):
async def close(self):
if self.servers:
# stop accepting connections
self.log.info("Stopping server: %s, connections: %s",
self.pid, len(self.connections))
self.log.info(
"Stopping server: %s, connections: %s",
self.pid,
len(self.connections),
)
for server in self.servers:
server.close()
await server.wait_closed()
@@ -101,8 +119,9 @@ class GunicornWorker(base.Worker):
# gracefully shutdown timeout
start_shutdown = 0
graceful_shutdown_timeout = self.cfg.graceful_timeout
while self.connections and \
(start_shutdown < graceful_shutdown_timeout):
while self.connections and (
start_shutdown < graceful_shutdown_timeout
):
await asyncio.sleep(0.1)
start_shutdown = start_shutdown + 0.1
@@ -111,7 +130,7 @@ class GunicornWorker(base.Worker):
coros = []
for conn in self.connections:
if hasattr(conn, "websocket") and conn.websocket:
coros.append(conn.websocket.close_connection(force=True))
coros.append(conn.websocket.close_connection())
else:
conn.close()
_shutdown = asyncio.gather(*coros, loop=self.loop)
@@ -143,8 +162,8 @@ class GunicornWorker(base.Worker):
if self.max_requests and req_count > self.max_requests:
self.alive = False
self.log.info(
"Max requests exceeded, shutting down: %s", self
)
"Max requests exceeded, shutting down: %s", self
)
elif pid == os.getpid() and self.ppid != os.getppid():
self.alive = False
self.log.info("Parent changed, shutting down: %s", self)
@@ -170,23 +189,29 @@ class GunicornWorker(base.Worker):
def init_signals(self):
# Set up signals through the event loop API.
self.loop.add_signal_handler(signal.SIGQUIT, self.handle_quit,
signal.SIGQUIT, None)
self.loop.add_signal_handler(
signal.SIGQUIT, self.handle_quit, signal.SIGQUIT, None
)
self.loop.add_signal_handler(signal.SIGTERM, self.handle_exit,
signal.SIGTERM, None)
self.loop.add_signal_handler(
signal.SIGTERM, self.handle_exit, signal.SIGTERM, None
)
self.loop.add_signal_handler(signal.SIGINT, self.handle_quit,
signal.SIGINT, None)
self.loop.add_signal_handler(
signal.SIGINT, self.handle_quit, signal.SIGINT, None
)
self.loop.add_signal_handler(signal.SIGWINCH, self.handle_winch,
signal.SIGWINCH, None)
self.loop.add_signal_handler(
signal.SIGWINCH, self.handle_winch, signal.SIGWINCH, None
)
self.loop.add_signal_handler(signal.SIGUSR1, self.handle_usr1,
signal.SIGUSR1, None)
self.loop.add_signal_handler(
signal.SIGUSR1, self.handle_usr1, signal.SIGUSR1, None
)
self.loop.add_signal_handler(signal.SIGABRT, self.handle_abort,
signal.SIGABRT, None)
self.loop.add_signal_handler(
signal.SIGABRT, self.handle_abort, signal.SIGABRT, None
)
# Don't let SIGTERM and SIGUSR1 disturb active requests
# by interrupting system calls
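The new http_protocol / websocket_protocol class attributes let a worker subclass swap protocol implementations without copying run(); a minimal sketch (the subclass and its logging are invented, only GunicornWorker and HttpProtocol come from this diff):

from sanic.server import HttpProtocol
from sanic.worker import GunicornWorker


class LoggingHttpProtocol(HttpProtocol):
    def connection_made(self, transport):
        super().connection_made(transport)
        print("connection from", transport.get_extra_info("peername"))


class LoggingWorker(GunicornWorker):
    # run() above now reads this attribute instead of hard-coding HttpProtocol.
    http_protocol = LoggingHttpProtocol


# Wired up from the command line, e.g.:
#   gunicorn myapp:app --worker-class mymodule.LoggingWorker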

setup.cfg (new file, 20 changes)

@@ -0,0 +1,20 @@
[flake8]
ignore = E203, W503
[isort]
atomic = true
default_section = THIRDPARTY
include_trailing_comma = true
known_first_party = sanic
known_third_party = pytest
line_length = 79
lines_after_imports = 2
lines_between_types = 1
multi_line_output = 3
not_skip = __init__.py
[version]
current_version = 0.8.3
file = sanic/__init__.py
current_version_pattern = __version__ = "{current_version}"
new_version_pattern = __version__ = "{new_version}"

setup.py (135 changes)

@@ -4,77 +4,126 @@ Sanic
import codecs
import os
import re
from distutils.errors import DistutilsPlatformError
import sys
from distutils.util import strtobool
from setuptools import setup
from setuptools.command.test import test as TestCommand
def open_local(paths, mode='r', encoding='utf8'):
path = os.path.join(
os.path.abspath(os.path.dirname(__file__)),
*paths
)
class PyTest(TestCommand):
"""
Provide a Test runner to be used from setup.py to run unit tests
"""
user_options = [("pytest-args=", "a", "Arguments to pass to pytest")]
def initialize_options(self):
TestCommand.initialize_options(self)
self.pytest_args = ""
def run_tests(self):
import shlex
import pytest
errno = pytest.main(shlex.split(self.pytest_args))
sys.exit(errno)
def open_local(paths, mode="r", encoding="utf8"):
path = os.path.join(os.path.abspath(os.path.dirname(__file__)), *paths)
return codecs.open(path, mode, encoding)
with open_local(['sanic', '__init__.py'], encoding='latin1') as fp:
with open_local(["sanic", "__init__.py"], encoding="latin1") as fp:
try:
version = re.findall(r"^__version__ = '([^']+)'\r?$",
fp.read(), re.M)[0]
version = re.findall(
r"^__version__ = \"([^']+)\"\r?$", fp.read(), re.M
)[0]
except IndexError:
raise RuntimeError('Unable to determine version.')
raise RuntimeError("Unable to determine version.")
with open_local(['README.rst']) as rm:
with open_local(["README.rst"]) as rm:
long_description = rm.read()
setup_kwargs = {
'name': 'sanic',
'version': version,
'url': 'http://github.com/channelcat/sanic/',
'license': 'MIT',
'author': 'Channel Cat',
'author_email': 'channelcat@gmail.com',
'description': (
'A microframework based on uvloop, httptools, and learnings of flask'),
'long_description': long_description,
'packages': ['sanic'],
'platforms': 'any',
'classifiers': [
'Development Status :: 2 - Pre-Alpha',
'Environment :: Web Environment',
'License :: OSI Approved :: MIT License',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
"name": "sanic",
"version": version,
"url": "http://github.com/channelcat/sanic/",
"license": "MIT",
"author": "Channel Cat",
"author_email": "channelcat@gmail.com",
"description": (
"A microframework based on uvloop, httptools, and learnings of flask"
),
"long_description": long_description,
"packages": ["sanic"],
"platforms": "any",
"classifiers": [
"Development Status :: 4 - Beta",
"Environment :: Web Environment",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
],
}
ujson = 'ujson>=1.35'
uvloop = 'uvloop>=0.5.3'
env_dependency = (
'; sys_platform != "win32" ' 'and implementation_name == "cpython"'
)
ujson = "ujson>=1.35" + env_dependency
uvloop = "uvloop>=0.5.3" + env_dependency
requirements = [
'httptools>=0.0.9',
"httptools>=0.0.10",
uvloop,
ujson,
'aiofiles>=0.3.0',
'websockets>=3.2',
"aiofiles>=0.3.0",
"websockets>=6.0,<7.0",
"multidict>=4.0,<5.0",
]
tests_require = [
"pytest==4.1.0",
"multidict>=4.0,<5.0",
"gunicorn",
"pytest-cov",
"aiohttp>=2.3.0,<=3.2.1",
"beautifulsoup4",
uvloop,
ujson,
"pytest-sanic",
"pytest-sugar",
]
if strtobool(os.environ.get("SANIC_NO_UJSON", "no")):
print("Installing without uJSON")
requirements.remove(ujson)
tests_require.remove(ujson)
# 'nt' means windows OS
if strtobool(os.environ.get("SANIC_NO_UVLOOP", "no")):
print("Installing without uvLoop")
requirements.remove(uvloop)
tests_require.remove(uvloop)
try:
setup_kwargs['install_requires'] = requirements
setup(**setup_kwargs)
except DistutilsPlatformError as exception:
requirements.remove(ujson)
requirements.remove(uvloop)
print("Installing without uJSON or uvLoop")
setup_kwargs['install_requires'] = requirements
setup(**setup_kwargs)
extras_require = {
"test": tests_require,
"dev": tests_require + ["aiofiles", "tox", "black", "flake8"],
"docs": [
"sphinx",
"sphinx_rtd_theme",
"recommonmark",
"sphinxcontrib-asyncio",
"docutils",
"pygments"
],
}
setup_kwargs["install_requires"] = requirements
setup_kwargs["tests_require"] = tests_require
setup_kwargs["extras_require"] = extras_require
setup_kwargs["cmdclass"] = {"test": PyTest}
setup(**setup_kwargs)


@@ -0,0 +1,53 @@
from random import choice, seed
from pytest import mark
import sanic.router
seed("Pack my box with five dozen liquor jugs.")
# Disable Caching for testing purpose
sanic.router.ROUTER_CACHE_SIZE = 0
class TestSanicRouteResolution:
@mark.asyncio
async def test_resolve_route_no_arg_string_path(
self, sanic_router, route_generator, benchmark
):
simple_routes = route_generator.generate_random_direct_route(
max_route_depth=4
)
router, simple_routes = sanic_router(route_details=simple_routes)
route_to_call = choice(simple_routes)
result = benchmark.pedantic(
router._get,
("/{}".format(route_to_call[-1]), route_to_call[0], "localhost"),
iterations=1000,
rounds=1000,
)
assert await result[0](None) == 1
@mark.asyncio
async def test_resolve_route_with_typed_args(
self, sanic_router, route_generator, benchmark
):
typed_routes = route_generator.add_typed_parameters(
route_generator.generate_random_direct_route(max_route_depth=4),
max_route_depth=8,
)
router, typed_routes = sanic_router(route_details=typed_routes)
route_to_call = choice(typed_routes)
url = route_generator.generate_url_for_template(
template=route_to_call[-1]
)
print("{} -> {}".format(route_to_call[-1], url))
result = benchmark.pedantic(
router._get,
("/{}".format(url), route_to_call[0], "localhost"),
iterations=1000,
rounds=1000,
)
assert await result[0](None) == 1

tests/conftest.py (new file, 130 changes)

@@ -0,0 +1,130 @@
import random
import re
import string
import sys
import uuid
import pytest
from sanic import Sanic
from sanic.router import RouteExists, Router
random.seed("Pack my box with five dozen liquor jugs.")
if sys.platform in ["win32", "cygwin"]:
collect_ignore = ["test_worker.py"]
async def _handler(request):
"""
Dummy placeholder method used for route resolver when creating a new
route into the sanic router. This router is not actually called by the
sanic app. So do not worry about the arguments to this method.
If you change the return value of this method, make sure to propagate the
change to any test case that leverages RouteStringGenerator.
"""
return 1
TYPE_TO_GENERATOR_MAP = {
"string": lambda: "".join(
[random.choice(string.ascii_letters + string.digits) for _ in range(4)]
),
"int": lambda: random.choice(range(1000000)),
"number": lambda: random.random(),
"alpha": lambda: "".join(
[random.choice(string.ascii_letters) for _ in range(4)]
),
"uuid": lambda: str(uuid.uuid1()),
}
class RouteStringGenerator:
ROUTE_COUNT_PER_DEPTH = 100
HTTP_METHODS = ["GET", "PUT", "POST", "PATCH", "DELETE", "OPTION"]
ROUTE_PARAM_TYPES = ["string", "int", "number", "alpha", "uuid"]
def generate_random_direct_route(self, max_route_depth=4):
routes = []
for depth in range(1, max_route_depth + 1):
for _ in range(self.ROUTE_COUNT_PER_DEPTH):
route = "/".join(
[
TYPE_TO_GENERATOR_MAP.get("string")()
for _ in range(depth)
]
)
route = route.replace(".", "", -1)
route_detail = (random.choice(self.HTTP_METHODS), route)
if route_detail not in routes:
routes.append(route_detail)
return routes
def add_typed_parameters(self, current_routes, max_route_depth=8):
routes = []
for method, route in current_routes:
current_length = len(route.split("/"))
new_route_part = "/".join(
[
"<{}:{}>".format(
TYPE_TO_GENERATOR_MAP.get("string")(),
random.choice(self.ROUTE_PARAM_TYPES),
)
for _ in range(max_route_depth - current_length)
]
)
route = "/".join([route, new_route_part])
route = route.replace(".", "", -1)
routes.append((method, route))
return routes
@staticmethod
def generate_url_for_template(template):
url = template
for pattern, param_type in re.findall(
re.compile(r"((?:<\w+:(string|int|number|alpha|uuid)>)+)"),
template,
):
value = TYPE_TO_GENERATOR_MAP.get(param_type)()
url = url.replace(pattern, str(value), -1)
return url
@pytest.fixture(scope="function")
def sanic_router():
# noinspection PyProtectedMember
def _setup(route_details: tuple) -> (Router, tuple):
router = Router()
added_router = []
for method, route in route_details:
try:
router._add(
uri="/{}".format(route),
methods=frozenset({method}),
host="localhost",
handler=_handler,
)
added_router.append((method, route))
except RouteExists:
pass
return router, added_router
return _setup
@pytest.fixture(scope="function")
def route_generator() -> RouteStringGenerator:
return RouteStringGenerator()
@pytest.fixture(scope="function")
def url_param_generator():
return TYPE_TO_GENERATOR_MAP
@pytest.fixture
def app(request):
return Sanic(request.node.name)
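A sketch of a test module consuming the app fixture above together with Sanic's built-in test client (route and assertions are illustrative):

from sanic.response import json


def test_root_returns_json(app):
    @app.route("/")
    async def handler(request):
        return json({"ok": True})

    # app.test_client wraps the SanicTestClient shown earlier in this diff.
    request, response = app.test_client.get("/")
    assert response.status == 200
    assert response.json == {"ok": True}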


@@ -9,10 +9,15 @@ import ujson as json
loop = uvloop.new_event_loop()
asyncio.set_event_loop(loop)
async def handle(request):
return web.Response(body=json.dumps({"test":True}).encode('utf-8'), content_type='application/json')
return web.Response(
body=json.dumps({"test": True}).encode("utf-8"),
content_type="application/json",
)
app = web.Application(loop=loop)
app.router.add_route('GET', '/', handle)
app.router.add_route("GET", "/", handle)
web.run_app(app, port=sys.argv[1], access_log=None)


@@ -4,8 +4,9 @@ from bottle import route, run
import ujson
@route('/')
@route("/")
def index():
return ujson.dumps({'test': True})
return ujson.dumps({"test": True})
app = bottle.default_app()


@@ -3,9 +3,11 @@
import falcon
import ujson as json
class TestResource:
def on_get(self, req, resp):
resp.body = json.dumps({"test": True})
app = falcon.API()
app.add_route('/', TestResource())
app.add_route("/", TestResource())


@@ -13,8 +13,14 @@ kyk = Kyoukai("example_app")
logger = logging.getLogger("Kyoukai")
logger.setLevel(logging.ERROR)
@kyk.route("/")
async def index(ctx: HTTPRequestContext):
return ujson.dumps({"test":True}), 200, {"Content-Type": "application/json"}
return (
ujson.dumps({"test": True}),
200,
{"Content-Type": "application/json"},
)
kyk.run()
kyk.run()


@@ -3,8 +3,10 @@ import sys
import os
import inspect
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
sys.path.insert(0, currentdir + '/../../../')
currentdir = os.path.dirname(
os.path.abspath(inspect.getfile(inspect.currentframe()))
)
sys.path.insert(0, currentdir + "/../../../")
import timeit
@@ -16,7 +18,11 @@ print("Running New 100,000 times")
times = 0
total_time = 0
for n in range(6):
time = timeit.timeit('json({ "test":True }).output()', setup='from sanic.response import json', number=100000)
time = timeit.timeit(
'json({ "test":True }).output()',
setup="from sanic.response import json",
number=100000,
)
print("Took {} seconds".format(time))
total_time += time
times += 1
@@ -26,7 +32,11 @@ print("Running Old 100,000 times")
times = 0
total_time = 0
for n in range(6):
time = timeit.timeit('json({ "test":True }).output_old()', setup='from sanic.response import json', number=100000)
time = timeit.timeit(
'json({ "test":True }).output_old()',
setup="from sanic.response import json",
number=100000,
)
print("Took {} seconds".format(time))
total_time += time
times += 1


@@ -2,8 +2,10 @@ import sys
import os
import inspect
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
sys.path.insert(0, currentdir + '/../../../')
currentdir = os.path.dirname(
os.path.abspath(inspect.getfile(inspect.currentframe()))
)
sys.path.insert(0, currentdir + "/../../../")
from sanic import Sanic
from sanic.response import json
@@ -15,5 +17,6 @@ app = Sanic("test")
async def test(request):
return json({"test": True})
if __name__ == '__main__':
if __name__ == "__main__":
app.run(host="0.0.0.0", port=sys.argv[1])

Some files were not shown because too many files have changed in this diff.