Amazon Web Services DDoS/DoS Protection

Discussion in 'Protection Systems' started by Rushy, Feb 8, 2017.

Thread Status:
Not open for further replies.
    Tutorial on the system

    When I was doing an internship, the company I was working for kept getting network attacks from one or more individuals. When this occurred it disrupted their services, as the servers were being hit; things such as website slowdowns or outages can occur because of this.

    I did some research and found a system that can automatically deal with these attacks before the bad traffic hits the infrastructure.


    1. As CloudFront receives requests on behalf of your web application, it sends access logs to an Amazon S3 bucket that contains detailed information about the requests.

    2. For every new access log stored in the Amazon S3 bucket, a Lambda function is triggered.

    3. The Lambda function analyzes which IP addresses have made more requests than the defined threshold and adds those IP addresses to an AWS WAF block list. AWS WAF blocks those IP addresses for a period of time that you define. After this blocking period has expired, AWS WAF allows those IP addresses to access your application again, but continues to monitor the requests from those IP addresses.

    4. The Lambda function publishes execution metrics in CloudWatch, such as the number of requests analyzed and IP addresses blocked.
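    The core of step 3 can be sketched in Python. This is a simplified illustration of the threshold-counting logic only, not the actual Lambda code from the solution; it assumes the standard tab-separated CloudFront access-log layout, where the client IP is the fifth column.

```python
from collections import Counter

REQUEST_THRESHOLD = 400  # max requests per IP per minute (the solution's default)

def ips_over_threshold(log_lines, threshold=REQUEST_THRESHOLD):
    """Count requests per client IP in a batch of CloudFront access-log
    lines and return the IPs that exceed the threshold.

    Assumes the standard CloudFront log layout: tab-separated fields,
    '#' comment lines, client IP in the fifth column (index 4).
    """
    counts = Counter()
    for line in log_lines:
        if line.startswith('#'):      # skip the '#Version'/'#Fields' header lines
            continue
        fields = line.split('\t')
        if len(fields) > 4:
            counts[fields[4]] += 1
    return [ip for ip, n in counts.items() if n > threshold]

# Example: one IP making 500 requests in the log batch, another making 3
sample = (
    ['#Version: 1.0']
    + ['2017-02-08\t01:00:00\tLHR3\t512\t203.0.113.9\tGET'] * 500
    + ['2017-02-08\t01:00:00\tLHR3\t512\t198.51.100.7\tGET'] * 3
)
print(ips_over_threshold(sample))  # → ['203.0.113.9']
```

    The real Lambda function would then push the returned IPs into the AWS WAF block list via the WAF API.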

    This system stops attack traffic before it hits the Elastic Load Balancer (ELB) level, and before it reaches the web servers. (AWS WAF sits in front of the ELB, at the CloudFront level.)

    Another nice thing to note: CloudFront only forwards valid HTTP and HTTPS traffic through to the ELB, and the ELB only accepts TCP connections. This means attacks using other protocols, such as UDP floods, are dropped, and SYN floods are absorbed, at that level before they reach the web servers.

    When the ELB detects the types of attacks above, it will automatically scale to absorb the additional traffic at no additional charge, although you may still be charged the normal service rates (this was quoted in an AWS document).

    See here and here for more information

    This system allows the following configurations:

    Request Threshold

    Type the maximum number of requests that can be made from an IP address per minute without being blocked. The default is 400.

    WAF Block Period

    Specify how long (in minutes) an IP address should be blocked after crossing the threshold. The default is 240 minutes (four hours).

    WAF Quarantine Period

    Specify how long (in minutes) AWS WAF should monitor IP addresses after AWS WAF has stopped blocking them. The default is 240 minutes.

    These IP addresses show up in the Auto Count rule in the ACL list (see the Order of ACLs section below).
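    For reference, these three settings map to stack parameters when the solution is created from a CloudFormation template. The parameter names below are illustrative assumptions to show the shape, not necessarily the exact names used in the template:

```json
{
  "Parameters": {
    "RequestThreshold": {
      "Type": "Number",
      "Default": "400",
      "Description": "Max requests allowed per IP address per minute"
    },
    "WAFBlockPeriod": {
      "Type": "Number",
      "Default": "240",
      "Description": "Minutes an IP address stays blocked"
    },
    "WAFQuarantinePeriod": {
      "Type": "Number",
      "Default": "240",
      "Description": "Minutes an IP address is monitored after being unblocked"
    }
  }
}
```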


    1. The S3 bucket needs to be created manually, as the script fails to create it. Be sure to create the S3 bucket in the same region as the Lambda function; AWS will complain if they are not in the same region
    2. Use this JSON to create the services via a CloudFormation stack. This is the version that allows manual whitelists
    3. After creation, create new string-based rules in a WAF ACL that allow certain user agents (so the allowed crawlers can bypass the rate-limit rules)
    4. Also create a block list in the ACL to deny abusive crawlers
    5. Link CloudFront to the S3 bucket only once all ACL parameters/rules are created. This stops the system from running while you are still setting these up
    6. Testing: if the automated script on Lambda fails when using the test file, copy the code from here to the Lambda instance and try again (nothing bad will happen if Lambda fails the test; it won't add any rules)
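    As a small aid when testing step 6: the event the Lambda receives when a new access log lands in the bucket (step 2 of the architecture) is a standard S3 put-notification payload. The sketch below extracts the bucket/key pairs from it; the bucket and key values are made up for illustration.

```python
def parse_s3_event(event):
    """Extract (bucket, key) pairs from an S3 put-notification event,
    the payload Lambda receives when a new access log is stored."""
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]

# Hypothetical test event, shaped like a real S3 notification
sample_event = {"Records": [{"s3": {
    "bucket": {"name": "my-logs"},
    "object": {"key": "E123ABC.2017-02-08-01.abcd1234.gz"},
}}]}
print(parse_s3_event(sample_event))  # → [('my-logs', 'E123ABC.2017-02-08-01.abcd1234.gz')]
```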

    Example ACL for allowed crawlers



    Order of ACLs


    "When a web request matches all of the conditions in a rule, AWS WAF immediately takes the corresponding action—allow or block—and doesn't evaluate the request against the remaining rules in the web ACL, if any."

    This order allows the following:

    1. Crawlers we want can always access the websites (bypassing the rate-blocking system)
    2. Bad crawlers are blocked
    3. IP addresses can be whitelisted manually by adding them to the whitelist
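    The first-match behaviour quoted above can be sketched as follows. The rule set here is hypothetical, mirroring the list above; the rule order, user agents, and IPs are my own examples, not values from the stack.

```python
def evaluate(request, rules, default_action="ALLOW"):
    """First-match-wins evaluation, as described in the AWS WAF quote:
    the first rule whose condition matches decides ALLOW/BLOCK, and the
    remaining rules are not consulted."""
    for condition, action in rules:
        if condition(request):
            return action
    return default_action  # the web ACL's default action

# Hypothetical rule order matching the list above:
rules = [
    (lambda r: r["user_agent"] in {"Googlebot"}, "ALLOW"),  # 1. allowed crawlers
    (lambda r: r["user_agent"] in {"BadBot"},    "BLOCK"),  # 2. abusive crawlers
    (lambda r: r["ip"] in {"192.0.2.10"},        "ALLOW"),  # 3. manual whitelist
    (lambda r: r["ip"] in {"203.0.113.9"},       "BLOCK"),  # auto-count block list
]

# An allowed crawler gets through even if its IP is in the auto block list:
print(evaluate({"user_agent": "Googlebot", "ip": "203.0.113.9"}, rules))  # → ALLOW
```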