Prowler Pro and Verica Announcement

Hi there!

I’m so happy to announce that I’ve joined Verica. Thanks to their support, we have invested heavily in Prowler, and today we are announcing the availability of Prowler Pro!

As many of you have noticed, Prowler is growing fast and getting better. We are now four full-time engineers: Pepe Fagoaga, Nacho Rivera and Sergio Garcia are the Prowler Pro dream team, along with a vibrant community. We work on Prowler every day to make it better and more comprehensive, and with that we are also launching Prowler Pro.

Our main goal is to stay hands-on with the Open Source version while giving customers a better, enterprise-level experience with Prowler Pro.

Prowler 2.8.0 – Ides of March

The Ides of March is the instrumental song that opens Killers, the second studio album by Iron Maiden. It is a great opener, and March is the month when spring starts in my part of the world, always a time for optimism. The Ides of March also means March 15 in the Roman calendar (and the day of the assassination of Julius Caesar). Enjoy the song here.

We have put our best into this release, with important help from the Prowler community of cloud security engineers around the world. Thank you all! Special thanks to the Prowler full-time engineers @jfagoagas, @n4ch04 and @sergargar! (and Bruce, my dog) ❤️

prowler-team-pic

Important changes in this version (read this!):

Now, if you use AWS Organizations and are scanning multiple accounts with the assume-role functionality, Prowler can get account details such as Account Name, Email, ARN, Organization ID and Tags and add them to the CSV and JSON output formats. More information and usage here.
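As a sketch, a multi-account scan that picks up the Organizations metadata might look like this (the account ID and role name below are placeholders; the -A, -R and -M flags are the documented assume-role and output options):

```shell
# Assume a role in a member account and scan it; when Prowler can reach
# AWS Organizations, the Account Name, Email, ARN, Organization ID and
# Tags are added to each CSV/JSON record.
# 123456789012 and ProwlerExecRole are placeholder values.
./prowler -A 123456789012 -R ProwlerExecRole -M csv,json
```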

New Features

  • 1 new check for S3 buckets with ACLs enabled by @jeffmaley in #1023:
    7.172 [extra7172] Check if S3 buckets have ACLs enabled - s3 [Medium]
  • feat(metadata): Include account metadata in Prowler assessments by @toniblyx in #1049

Enhancements

Fixes

  • Fix issue extra75 reports default SecurityGroups as unused #1001 by @jansepke in #1006
  • Fix issue extra793 filtering out network LBs #1002 by @jansepke in #1007
  • Fix formatting by @lorchda in #1012
  • Fix docker references by @mike-stewart in #1018
  • Fix(check32): filterName base64encoded to avoid space problems in filter names by @n4ch04 in #1020
  • Fix: when prowler exits with a non-zero status, the remainder of the block is not executed by @lorchda in #1015
  • Fix(extra7148): Error handling and include missing policy by @toniblyx in #1021
  • Fix(extra760): Error handling by @lazize in #1025
  • Fix(CODEOWNERS): Rename team by @jfagoagas in #1027
  • Fix(include/outputs): Whitelist logic reformulated to exactly match input by @n4ch04 in #1029
  • Fix CFN CodeBuild example by @mmuller88 in #1030
  • Fix typo CodeBuild template by @dlorch in #1010
  • Fix(extra736): Recover only Customer Managed KMS keys by @jfagoagas in #1036
  • Fix(extra7141): Error handling and include missing policy by @lazize in #1024
  • Fix(extra730): Handle invalid date formats checking ACM certificates by @jfagoagas in #1033
  • Fix(check41/42): Added tcp protocol filter to query by @n4ch04 in #1035
  • Fix(include/outputs):Rolling back whitelist checking to RE check by @n4ch04 in #1037
  • Fix(extra758): Reduce API calls. Print correct instance state. by @lazize in #1057
  • Fix: extra7167 Advanced Shield and CloudFront bug parsing None output without distributions by @NMuee in #1062
  • Fix(extra776): Handle image tag commas and json output by @jfagoagas in #1063
  • Fix(whitelist): Whitelist logic reformulated again by @n4ch04 in #1061
  • Fix: Change lower case from bash variable expansion to tr by @lazize in #1064
  • Fix(check_extra7161): fixed check title by @n4ch04 in #1068
  • Fix(extra760): Improve error handling by @lazize in #1055
  • Fix(check122): Error when policy name contains commas by @plarso in #1067
  • Fix: Remove automatic PR labels by @jfagoagas in #1044
  • Fix(ES): Improve AWS CLI query and add error handling for ElasticSearch/OpenSearch checks by @lazize in #1032
  • Fix(extra771): jq fail when policy action is an array by @lazize in #1031
  • Fix(extra765/776): Add right region to CSV if access is denied by @roman-mueller in #1045
  • Fix: extra7167 Advanced Shield and CloudFront bug parsing None output without distributions by @NMuee in #1053
  • Fix(filter-region): Support comma separated regions by @thetemplateblog in #1071

New Contributors

Full Changelog: 2.7.0…2.8.0

Prowler 2.7.0 – Brave

This release name is in honor of Brave New World, a great song by 🔥Iron Maiden🔥 from their Brave New World album. Dedicated to all of you looking forward to having the world we had before COVID… We hope it is not hitting you too hard. Enjoy the rest of the note below.

Image copyright by Iron Maiden Holdings Ltd.

Important changes in this version (read this!):

  • As you can see, Prowler is now in a new organization called https://github.com/prowler-cloud/.
  • When Prowler doesn’t have permissions to check a resource or service, it now reports INFO instead of FAIL. We have improved error handling in all checks for the cases where the CLI responds with AccessDenied, UnauthorizedOperation or AuthorizationError.
  • From this version on, the master branch will contain the latest available code, and stable code will be kept in each release. If you install or deploy Prowler using git clone against master, take that into account and use the latest release instead, e.g.: git clone --branch 2.7 https://github.com/prowler-cloud/prowler or curl https://github.com/toniblyx/prowler/archive/refs/tags/2.7.0.tar.gz -o prowler-2.7.0.tar.gz
  • For known issues, please see https://github.com/prowler-cloud/prowler/issues, specifically the open ones tagged bug.
  • Discussions are now open in the Prowler repo at https://github.com/prowler-cloud/prowler/discussions; feel free to use them if that works better for you than the current Discord server.
  • 11 new checks!! Thanks to @michael-dickinson-sainsburys, @jonloza, @rustic, @Obiakara, @Daniel-Peladeau, @maisenhe, @7thseraph. Prowler now has a total of 218 checks. See below for details.
  • An issue with the Security Hub integration when resolving closed findings (with either many new findings or many resolved findings) is now fixed thanks to @Kirizan.
  • A failure when credentials are provided via environment variables was fixed by @lazize.
  • See below new features and more details for this version.

New Features

  • 11 New checks for Redshift, EFS, CloudWatch, Secrets Manager, DynamoDB and Shield Advanced:
7.160 [extra7160] Check if Redshift has automatic upgrades enabled - redshift [Medium]
7.161 [extra7161] Check if EFS protects sensitive data with encryption at rest - efs [Medium]
7.162 [extra7162] Check if CloudWatch Log Groups have a retention policy of 365 days - cloudwatch [Medium]
7.163 [extra7163] Check if Secrets Manager key rotation is enabled - secretsmanager [Medium]
7.164 [extra7164] Check if CloudWatch log groups are protected by AWS KMS  - logs [Medium]
7.165 [extra7165] Check if DynamoDB: DAX Clusters are encrypted at rest - dynamodb [Medium]
7.166 [extra7166] Check if Elastic IP addresses with associations are protected by AWS Shield Advanced - shield [Medium]
7.167 [extra7167] Check if Cloudfront distributions are protected by AWS Shield Advanced - shield [Medium]
7.168 [extra7168] Check if Route53 hosted zones are protected by AWS Shield Advanced - shield [Medium]
7.169 [extra7169] Check if global accelerators are protected by AWS Shield Advanced - shield [Medium]
7.170 [extra7170] Check if internet-facing application load balancers are protected by AWS Shield Advanced - shield [Medium]
7.171 [extra7171] Check if classic load balancers are protected by AWS Shield Advanced - shield [Medium]
  • Add -D option to copy to S3 with the initial AWS credentials instead of the assumed as with -B option by @sectoramen in #974
  • Add new functions to backup and restore initial AWS credentials, for better handling chaining role by @sectoramen in #978
  • Add additional action permissions for Glue and Shield Advanced checks by @lazize in #995

Enhancements

  • Update Dockerfile to use Amazon Linux container image by @Kirizan in #972
  • Update Readme: -T option is not mandatory by @jfagoagas in #944
  • Add $PROFILE_OPT to CopyToS3 commands by @sectoramen in #976
  • Remove unneeded package “file” from Dockerfile by @sectoramen in #977
  • Update docs (templates): Improve bug template with more info by @jfagoagas in #982

Fixes

  • Fix in README and multiaccount serverless deployment templates by @dlorch in #939
  • Fix assume-role: check if -T and -A options are set together by @jfagoagas in #945
  • Fix group25 FTR by @lopmoris in #948
  • Fix in README link for group25 FTR by @lopmoris in #949
  • Fix issue #938 assume_role multiple times by @halfluke in #951
  • Fix and clean assume-role to better handle AWS STS CLI errors by @jfagoagas in #946
  • Fix issue with Security Hub integration when resolving closed findings are either a lot of new findings, or a lot of resolved findings by @Kirizan in #953
  • Fix broken link in README.md by @rtcms in #966
  • Fix checks with comma issues in checks by @j2clerck in #975
  • Fix: Credential chaining from environment variables by @lazize in #996

New Contributors

Full Changelog: 2.6.1…2.7

Prowler 2.6.0 – Phantom

This release name is in honor of Phantom of the Opera, one of my favorite songs and a master piece of 🔥Iron Maiden🔥. It starts by “I’ve been lookin’ so long for you now” like looking for security issues, isn’t it? 🤘🏼 Enjoy it here while reading the rest of this note.

PHOTO CREDIT Copyright JOHN McMURTRIE

Important changes in this version:

New Features:

  • 12 new checks for efs, redshift, elb, dynamodb, route53, cloudformation and apigateway:
7.148 [extra7148] Check if EFS File systems have backup enabled - efs [Medium]
7.149 [extra7149] Check if Redshift Clusters have automated snapshots enabled - redshift [Medium]
7.150 [extra7150] Check if Elastic Load Balancers have deletion protection enabled - elb [Medium]
7.151 [extra7151] Check if DynamoDB tables point-in-time recovery (PITR) is enabled - dynamodb [Medium]
7.152 [extra7152] Enable Privacy Protection for a Route53 Domain - route53 [Medium]
7.153 [extra7153] Enable Transfer Lock for a Route53 Domain - route53 [Medium]
7.154 [extra7154] Enable termination protection for Cloudformation Stacks - cloudformation [Medium]
7.155 [extra7155] Check whether the Application Load Balancer is configured with defensive or strictest desync mitigation mode - elb [Medium]
7.156 [extra7156] Checks if API Gateway V2 has Access Logging enabled - apigateway [Medium]
7.157 [extra7157] Check if API Gateway V2 has configured authorizers - apigateway [Medium]
7.158 [extra7158] Check if ELBV2 has listeners underneath - elb [Medium]
7.159 [extra7159] Check if ELB has listeners underneath - elb [Medium]

Enhancements:

Fixes:

New Contributors

Full Changelog: 2.5.0…2.6.0

Thank you all for your contributions, Prowler community is awesome! 🥳

Run Prowler from AWS CloudShell in seconds

Using AWS CloudShell is probably the easiest and quickest way to run Prowler in your AWS account.

Just start AWS CloudShell and run these commands:

git clone https://github.com/toniblyx/prowler
pip3 install detect-secrets --user
cd prowler 
./prowler

If you run Prowler and realize it takes longer than your CloudShell session allows, you can use the screen command-line tool (a screen manager with VT100/ANSI terminal emulation). To install it:

sudo yum install screen -y

Run Prowler in a screen session:

screen -dmS prowler sh -c "./prowler -M html"

Check existing running screen sessions:

screen -ls

Attach to the Prowler session:

screen -r prowler

Use ‘Ctrl+a d’ to detach without terminating.

If you want to run Prowler from CloudShell against multiple accounts, first declare a variable with all the accounts you want to assess:

export AWS_ACCOUNTS='1111111 222222 333333'

Then, make sure you have a role you can assume in each of those accounts. This template (create_role_to_assume_cfn.yaml) may help. Then run this command:

for accountId in $AWS_ACCOUNTS; do screen -dmS prowler sh -c "./prowler -A $accountId -R ProwlerExecRole -M csv,json,html"; done

For more options and details go to: https://github.com/toniblyx/prowler or run ./prowler -h.

Prowler 2.0: New release with improvements and new checks ready for re:Invent and BlackHat EU

Taking advantage of this week’s AWS re:Invent and next week’s BlackHat Europe, I wanted to push out a new version of Prowler.

In case you are new to Prowler:

Prowler is an AWS Security Best Practices Assessment, Auditing, Hardening and Forensics Readiness Tool. It follows the guidelines of the CIS Amazon Web Services Foundations Benchmark and adds DOZENS of additional checks, including GDPR and HIPAA groups. The official CIS benchmark for AWS guide is here.

This new version has more than 20 new extra checks (more than 90 in total), including the GDPR and HIPAA groups of checks, intended as a reference to help organizations check the status of their infrastructure with regard to those regulations. Prowler has also been refactored to allow easier extensibility. Another important feature is the JSON output, which allows Prowler to be integrated with, for example, Splunk or Wazuh (more about that soon!). For all the details about what is new, fixed and improved, please see the release notes here: https://github.com/toniblyx/prowler/releases/tag/2.0
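For instance, the JSON findings are easy to filter before shipping them to a SIEM. A minimal sketch with plain grep, assuming one finding object per line (the field names below are illustrative, not Prowler’s exact schema):

```shell
# Two sample findings in roughly the shape Prowler's JSON output uses
# (field names are illustrative assumptions):
findings='{"Control":"[check21] Ensure CloudTrail is enabled in all regions","Status":"FAIL"}
{"Control":"[check22] Ensure CloudTrail log file validation is enabled","Status":"PASS"}'

# Keep only the failed findings, e.g. before forwarding to Splunk/Wazuh:
printf '%s\n' "$findings" | grep '"Status":"FAIL"'
```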

For me, personally, there are two main benefits to Prowler. First, it helps many organizations and individuals around the world improve their security posture on AWS: with just one easy and simple command, they learn what they have to do and how to get started with their hardening. Second, I’m learning a lot about AWS, its API, features, limitations, differences between services, and AWS security in general.

That said, I’m so happy to present Prowler 2.0 at BlackHat Europe next week in London! It will be at the Arsenal

and I’ll talk about AWS security and show all the new features: how it works, how to take advantage of all the checks and output methods, and some other cool things. If you are around, please come by and say hello; I’ve got a bunch of laptop stickers! Here are all the details. Location: Business Hall, Arsenal Station 2. Date: Wednesday, December 5, 3:15pm-4:50pm. Track: Vulnerability Assessment. Session Type: Arsenal.

BIG THANKS!

I want to thank the Open Source community that has helped out since day one; almost a thousand stars on GitHub and more than 500 commits speak for themselves. Prowler has become pretty popular out there, and all the community support is awesome; it motivates me to keep up with improvements and features. Thanks to you all!!

Prowler future?

The main goals for future versions are to improve speed and reporting, including switching the base code to Python to support existing checks and new ones in any language.

If you are interested in helping out, don’t hesitate to reach out to me. \m/

My arsenal of AWS security tools

I’ve been using and collecting a list of helpful tools for AWS security. This list covers the ones I have tried at least once and think are worth a look, for your own benefit and, most importantly, to make your AWS cloud environment more secure.

They are not in any specific order; I just wanted to group them somehow. I have my favorites depending on the requirements, but you can have yours once you test them.

Feel free to send a pull request with improvements or more tools (open source only in this list) here:

New additions at https://github.com/toniblyx/my-arsenal-of-aws-security-tools

 

Defensive (Hardening, Security Assessment, Inventory)

Offensive:

Continuous Security Auditing:

DFIR:

Development Security:

S3 Buckets Auditing:

Training:

Others:

Prowler 1.6: AWS Security Best Practices Assessment and Forensics Readiness Tool

It looks like Prowler has become a popular tool for those concerned about AWS security. I originally made Prowler to solve an internal requirement we had here at Alfresco. I decided to make it public, and I started getting a lot of feedback, pull requests, comments, advice, bug reports and new ideas, and I keep pushing to make it better and more comprehensive, following what the cloud security community seems to need.
I know Prowler is not the best tool out there, but it does what I wanted it to do: “Take a picture of my AWS account (or accounts) security settings and tell me where to start working to improve it.” Do the basics, at least. And that’s what it does. I would use other tools to track service changes, etc.; I discuss that in my talks as well.
Currently, Prowler performs 74 checks (for the entire list, run `prowler -l`), 52 of which are part of the CIS benchmark.

Digital Forensics readiness capabilities into Prowler 1.6

`prowler -c forensics-ready`
I’m into DFIR; I love it. I read a lot about cloud digital forensics and incident response, and I enjoy investing my R&D time in that subject. I’m also concerned about random or targeted attacks against cloud infrastructure.

For the talk I’m giving today at the SANS Cloud Security Summit 2018 in San Diego, I wanted to show something new, so I added new checks to Prowler related to forensics: how to make sure you have all (or as much as possible) of what you need to perform a proper investigation in case of an incident. By the way, those logs are not enabled by default in any AWS account. Some of these checks are included and well described in the current CIS benchmark for AWS, or even in the CIS benchmark for AWS three-tier web deployments (another hardening guide that is far less popular but pretty interesting too), but others are not included anywhere. For example, I believe it is a good idea to keep a record of your API Gateway logs in your production accounts, or even your ELB logs, among many others.

So when you run `prowler -c forensics-ready`, you now get the status of your resources across all regions, and you can make sure you are logging everything you may eventually need in case of a security incident. Currently these are the supported checks (https://github.com/toniblyx/prowler#forensics-ready-checks):
  • 2.1 Ensure CloudTrail is enabled in all regions (Scored)
  • 2.2 Ensure CloudTrail log file validation is enabled (Scored)
  • 2.3 Ensure the S3 bucket CloudTrail logs to is not publicly accessible (Scored)
  • 2.4 Ensure CloudTrail trails are integrated with CloudWatch Logs (Scored)
  • 2.5 Ensure AWS Config is enabled in all regions (Scored)
  • 2.6 Ensure S3 bucket access logging is enabled on the CloudTrail S3 bucket (Scored)
  • 2.7 Ensure CloudTrail logs are encrypted at rest using KMS CMKs (Scored)
  • 4.3 Ensure VPC Flow Logging is Enabled in all VPCs (Scored)
  • 7.12 Check if Amazon Macie is enabled (Not Scored) (Not part of CIS benchmark)
  • 7.13 Check if GuardDuty is enabled (Not Scored) (Not part of CIS benchmark)
  • 7.14 Check if CloudFront distributions have logging enabled (Not Scored) (Not part of CIS benchmark)
  • 7.15 Check if Elasticsearch Service domains have logging enabled (Not Scored) (Not part of CIS benchmark)
  • 7.17 Check if Elastic Load Balancers have logging enabled (Not Scored) (Not part of CIS benchmark)
  • 7.18 Check if S3 buckets have server access logging enabled (Not Scored) (Not part of CIS benchmark)
  • 7.19 Check if Route53 hosted zones are logging queries to CloudWatch Logs (Not Scored) (Not part of CIS benchmark)
  • 7.20 Check if Lambda functions are being recorded by CloudTrail (Not Scored) (Not part of CIS benchmark)
  • 7.21 Check if Redshift cluster has audit logging enabled (Not Scored) (Not part of CIS benchmark)
  • 7.22 Check if API Gateway has logging enabled (Not Scored) (Not part of CIS benchmark)

Screenshot while running the `forensics-ready` group of checks, showing only the first three checks in the group.
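As a quick sketch, you can keep a report of the run and surface only the failures (the -M csv output mode is as documented; the grep filter is just one illustrative way to triage, and assumes FAIL appears as a comma-separated CSV field):

```shell
# Run the forensics readiness group and save the CSV report
./prowler -c forensics-ready -M csv > forensics-ready.csv

# Surface only the failing checks (assumes a comma-separated FAIL field)
grep ',FAIL,' forensics-ready.csv
```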

I haven’t added an RDS logging check yet, and I’m probably missing many others, so please feel free to open an issue on GitHub and let me know!
If you want to check out the slide deck from my talk at the SANS Cloud Security Summit 2018 in San Diego, it is here: https://github.com/toniblyx/SANSCloudSecuritySummit2018

Getting started with AWS Certificate Manager (and Route53)

…with your own domain not hosted in Amazon Route 53, and a wildcard certificate.
A main premise I follow when deploying or architecting any service in the cloud, whatever vendor I use, is full encryption between layers and the intent to add elasticity to each service (adding them to what AWS calls Auto Scaling Groups).
For a pretty cool new project (wink wink) that we are working on in the Security Operations Group at Alfresco, we need to deploy a bunch of AWS resources, and we want to use HTTPS between all the services. Since we will use Auto Scaling groups and ELB, we want to configure every ELB with HTTPS, and to do so we have to provide the certificates. We can do that manually or automate it with CloudFormation, and the CloudFormation option is what we have chosen, as in many other projects. We also want to use our own certificates.
This article shows you how to create your own wildcard certificate with AWS Certificate Manager and use Route 53 for a subdomain you own. For example, you may have a domain that is not hosted in Route 53, as in my case with blyx.com, which is hosted at joker.com using their own name servers. I’ll use a subdomain called cloud.blyx.com for this example.
Long story short, the whole process is something like this:
  1. Add a hosted zone in Route 53
  2. Configure your DNS server to point the custom subdomain to Route 53
  3. Create the wildcard certificate
So let’s go ahead:
  1. Start by creating a hosted zone in Route 53. This new hosted zone will be a subdomain of our main domain that we can manage entirely in AWS, for our wildcard certificates and, obviously, for our load balancer and URL domain names instead of the default amazonaws.com names. In my case, I create a hosted zone called “cloud.blyx.com”:

  2. Now we have to go to our DNS server and add all the records we got in the previous step. In my case I do it using the joker.com web panel; if you use BIND or another solution, you have to create an NS record set for cloud.blyx.com (something like a third-level domain) and point it at the name servers we got from AWS Route 53 above. Here is an example, easy:

3. Once all the DNS steps are done, let’s create our wildcard certificate with AWS Certificate Manager for *.cloud.blyx.com. Remember that to validate the certificate request you will get an email from AWS, and you will have to approve it by following the instructions in the email:

This is the email to approve the request:

Here is the approval page:

Once it is approved, you will see the certificate as “Issued” and you are ready to use it. Now, from CloudFormation, we can reference Route 53 and use our own certificate so that all communications go through HTTPS where needed.
I hope you find this article helpful. The Trooper is coming!
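The same three steps can also be scripted with the AWS CLI. A sketch, assuming your credentials are configured; the domain names are this article’s examples, and at the time of writing ACM validates certificate requests by email:

```shell
# 1. Create the hosted zone for the delegated subdomain
#    (--caller-reference must be a unique string per request)
aws route53 create-hosted-zone \
    --name cloud.blyx.com \
    --caller-reference "cloud-blyx-$(date +%s)"

# 2. At your registrar (joker.com in this case), add NS records for
#    cloud.blyx.com pointing at the name servers returned above.

# 3. Request the wildcard certificate; AWS emails the domain contacts
#    and the request stays pending until it is approved.
aws acm request-certificate --domain-name "*.cloud.blyx.com"
```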

10 Security Concepts You Must Know (in 5 minutes with GoT)

Here is a lightning talk I recorded recently. I did it internally at Alfresco for an engineering meeting, but I think it is a good idea to share it and take advantage of the coming new season of Game of Thrones 😉

You have links and resources also available below the video.

If you want to use the slides for your own purposes, you can download them from my GitHub:

https://github.com/toniblyx/10-security-concepts-with-got

All references and recommended reads about the subject are here:

http://www.infoworld.com/article/3144362/devops/10-key-security-terms-devops-ninjas-need-to-know.html

https://danielmiessler.com/study/encoding-encryption-hashing-obfuscation/

https://blog.instant2fa.com/how-to-build-a-wall-that-stops-white-walkers-fantastic-threat-models-ef2dfa7864a4