On Wed, Jun 28, 2017 at 12:30 AM, Vinnie Falco via Boost wrote:
> On Tue, Jun 27, 2017 at 1:40 PM, Artyom Beilis via Boost wrote:
>> Looking into the parser/body code I noticed: ... Basically I can exhaust
>> the server's memory and crash it by providing a huge content length from
>> several connections.
>> A reasonable and configurable limit should be provided for the content
>> length.
> That's reasonable, although note that you can put a max buffer size on the
> dynamic buffers that come with Beast, and it will naturally take care of
> limits. For example:
>
>     beast::http::request<beast::http::dynamic_body> req{1024 * 1024};
>
> will create a request that has a 1MB limit on the body. The moment the
> reader goes to resize the dynamic buffer, it will return a
> beast::http::error::buffer_overflow error.
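For reference, here is a minimal sketch of how that overflow surfaces during
a synchronous read. The socket setup and error handling are assumed, and the
header paths and namespaces assume the standalone (pre-Boost) Beast layout of
that time:

    #include <beast/core.hpp>
    #include <beast/http.hpp>
    #include <boost/asio/ip/tcp.hpp>

    // Read a request whose body is capped at 1MB; an oversized body
    // comes back as error::buffer_overflow instead of an allocation.
    void read_capped(boost::asio::ip::tcp::socket& sock)
    {
        beast::flat_buffer buffer;
        beast::http::request<beast::http::dynamic_body> req{1024 * 1024};
        beast::error_code ec;
        beast::http::read(sock, buffer, req, ec);
        if(ec == beast::http::error::buffer_overflow)
        {
            // Body exceeded the 1MB limit; reject the request.
        }
    }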
It does not fix the security flaw of using http::string_body!
> Still, your suggestion to add something like
> `void basic_parser::max_content_length(std::size_t)` is a good idea. Thanks!
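To make that proposal concrete, usage might look something like the sketch
below. The setter name and its behavior are only the ones proposed above, not
an existing Beast API, and the surrounding read code assumes the same
standalone Beast layout as before:

    // Hypothetical: apply the proposed content-length cap on the parser
    // itself, so it also covers bodies such as http::string_body.
    void read_limited(boost::asio::ip::tcp::socket& sock)
    {
        beast::http::request_parser<beast::http::string_body> parser;
        parser.max_content_length(8 * 1024 * 1024); // proposed setter, not an existing API

        beast::flat_buffer buffer;
        beast::error_code ec;
        beast::http::read(sock, buffer, parser, ec);
        // With the proposed limit, an oversized Content-Length would be
        // rejected with an error instead of exhausting memory.
    }

A parser-level limit like this would apply uniformly to every Body type,
which is what the dynamic-buffer cap alone cannot do for http::string_body.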
Note: a reasonable default for max_content_length must be provided. Also,
std::size_t isn't a good type for max_content_length; it should be
unsigned long long or uint64_t, because if you use it for file uploads on a
32-bit system you want to support files above 4GB.

Regards,
Artyom Beilis