Book info – Hands-On AWS Penetration Testing with Kali Linux
Disclaimer: Working through this book will use AWS, which costs money. Make sure you are doing things to manage your costs. If I remember, I’ll keep up with my costs to help give a general idea. But prices can change at any time, so take these with a major grain of salt.
Disclaimer #2: Jail is bad. Cybercrime is illegal. Educational purposes. IANAL. Don’t do things you shouldn’t. Etc.
Remember that depending on your setup, you may not have to specify a profile in the CLI commands. And make sure your user has the required permissions so you can check things out in the CLI.
Ch 18 – Using Pacu for AWS Pentesting
So now to cover Pacu in more depth. I don’t really agree with the choice to put this chapter this far toward the end – the material would have been a natural fit earlier in the book. Pacu is from Rhino Security Labs and written in Python 3, which we covered previously. Basically it serves as a way to put all the research done on exploiting AWS into one place. It really is a cool project with a ton of value.
The authors go over the installation process again and explain some of the pieces. They go more in depth about sessions (basically a way to keep things isolated) and explain it helps limit API calls. AWS keys are needed for most of Pacu’s functionality.
Pacu commands are also explored in depth – review the chapter or check out the GitHub repo. I’m going to keep this very brief.
ls lists the modules and categories.
search, um, searches.
help does what you expect.
whoami outputs info about the keys. The amount of data available varies with how much has been enumerated. The iam__enum_permissions module will help fill it out.
data gives the data stored for the session.
services provides info about AWS services that have been stored.
regions gives region info.
update_regions shouldn’t be needed but can be used to update regions.
set_regions is important to help limit API calls – it controls what regions the commands are run in.
exec runs the specified module – it does the thing.
set_keys is used to add keys.
swap_keys lets you use a different set of keys.
import_keys pulls keys from the AWS CLI.
exit, quit, and CTRL+C all quit Pacu. And remember you can run AWS CLI commands from within Pacu as well.
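The import_keys command saves some typing by pulling credentials straight out of the AWS CLI config. The idea boils down to something like this sketch (my own simplified illustration, not Pacu’s actual code), parsing the standard ~/.aws/credentials INI format:

```python
import configparser

def read_cli_keys(credentials_text, profile="default"):
    """Pull access keys for a profile out of AWS CLI credentials content.
    Illustrative sketch of what import_keys does, not Pacu's implementation."""
    config = configparser.ConfigParser()
    config.read_string(credentials_text)
    section = config[profile]
    return {
        "AccessKeyId": section["aws_access_key_id"],
        "SecretAccessKey": section["aws_secret_access_key"],
        # Session token is optional (only present for temporary credentials)
        "SessionToken": section.get("aws_session_token"),
    }

sample = """\
[default]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = wJalrExampleKey
"""
keys = read_cli_keys(sample)
print(keys["AccessKeyId"])  # -> AKIAEXAMPLE
```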
The proxy command was interesting because it lets you work with PacuProxy, the built-in command and control structure. Basically it gives you a way to avoid detection by shuttling traffic through a compromised instance. More on this later.
Creating a new module
There’s a template for creating new modules that basically spells out what you need to do. To create your own modules, a better understanding of the API is helpful. So the covered API methods…
The get_active_session API defines the session variable. You can copy and modify session data, but the authors recommend only updating session data at the end to prevent database issues.
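The copy-then-update-once pattern the authors recommend looks roughly like this sketch. The FakeSession class here is a stand-in I made up for illustration; a real module would use the object returned by get_active_session():

```python
import copy

class FakeSession:
    """Hypothetical stand-in for a Pacu session object (illustration only)."""
    def __init__(self):
        self.EC2 = {"Instances": []}

    def update(self, database, **kwargs):
        # Pacu persists these fields to its local database;
        # this stub just sets attributes.
        for key, value in kwargs.items():
            setattr(self, key, value)

session = FakeSession()
ec2_data = copy.deepcopy(session.EC2)   # work on a copy, not the live data
ec2_data["Instances"].append({"InstanceId": "i-0example"})
# ... all enumeration happens against the copy ...
session.update(None, EC2=ec2_data)      # one write, at the very end
print(len(session.EC2["Instances"]))    # -> 1
```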
pacu_main.get_proxy_settings() pulls info about PacuProxy. It’s unlikely to be used unless you are working on a module that uses PacuProxy.
The print and input methods override the native Python versions so you can customize how things are printed and logged. A cool feature: print will detect a SecretAccessKey in a dictionary and redact it to keep the key secure.
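The redaction idea can be sketched like this (my own simplified version of the concept, not Pacu’s actual print override):

```python
def redact_secrets(data):
    """Recursively mask SecretAccessKey values before printing/logging.
    Illustrative sketch only."""
    if isinstance(data, dict):
        return {
            key: "******" if key == "SecretAccessKey" else redact_secrets(value)
            for key, value in data.items()
        }
    if isinstance(data, list):
        return [redact_secrets(item) for item in data]
    return data

creds = {"AccessKeyId": "AKIAEXAMPLE", "SecretAccessKey": "wJalrExampleKey"}
print(redact_secrets(creds))  # SecretAccessKey comes out as ******
```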
key_info method gets info about the active set of AWS keys. Definitely a handy feature.
The fetch_data method lets purpose-built modules assume the data they need is available. Basically, if you are trying to do something that requires data from a different module, fetch_data will run that module to grab the data if it isn’t already there.
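The fetch_data flow boils down to something like this (a simplified sketch; in Pacu the enumeration step is a real prerequisite module run with its own arguments, not a plain callable):

```python
def fetch_data(cache, service, field, enumerate_fn):
    """Return cached data for service/field, running the enumeration
    step first if the data isn't there yet. Illustrative only."""
    if field not in cache.get(service, {}):
        cache[service] = enumerate_fn()
    return cache[service][field]

calls = []
def enum_ec2():
    calls.append(1)                      # track how often we enumerate
    return {"Instances": ["i-0example"]}

cache = {}
first = fetch_data(cache, "EC2", "Instances", enum_ec2)   # triggers enum
second = fetch_data(cache, "EC2", "Instances", enum_ec2)  # cache hit
print(first, len(calls))  # -> ['i-0example'] 1
```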
The get_regions method lets the module check the session’s region settings and run accordingly, intersecting them with the regions where the service is actually available on AWS. This is a good option to limit API calls and takes region handling off the list of things module developers have to deal with.
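The region logic amounts to an intersection, roughly like this sketch (my simplified illustration of the idea, not Pacu’s implementation):

```python
def get_regions(service_regions, session_regions):
    """Regions a module should run in: every region the service supports
    if the session is set to 'all', otherwise the intersection.
    Illustrative sketch only."""
    if session_regions == ["all"]:
        return list(service_regions)
    return [r for r in service_regions if r in session_regions]

supported = ["us-east-1", "us-west-2", "eu-west-1"]
print(get_regions(supported, ["all"]))                      # all supported
print(get_regions(supported, ["us-east-1", "ap-south-1"]))  # -> ['us-east-1']
```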
The authors start off the install_dependencies section by saying it’s basically deprecated, but it can be helpful if you need to pull dependencies. It will likely be removed completely soon, so check before relying on it if you are developing a module.
The get_boto3_resource method takes care of the configuration options needed to interact with the boto3 library. Basically it makes the module developer’s life easier.
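Conceptually, it assembles the per-call configuration you would otherwise write by hand for every boto3 call. A hypothetical sketch of that idea (the real method handles more, like proxy settings; the function name here is mine):

```python
def boto3_session_kwargs(region, key_info):
    """Build the keyword arguments you'd otherwise pass to boto3 by hand.
    Illustrative sketch, not Pacu's actual get_boto3_resource."""
    kwargs = {
        "region_name": region,
        "aws_access_key_id": key_info["AccessKeyId"],
        "aws_secret_access_key": key_info["SecretAccessKey"],
    }
    if key_info.get("SessionToken"):
        # Temporary credentials also need the session token
        kwargs["aws_session_token"] = key_info["SessionToken"]
    return kwargs

kwargs = boto3_session_kwargs(
    "us-east-1",
    {"AccessKeyId": "AKIAEXAMPLE", "SecretAccessKey": "wJalrExampleKey"},
)
print(kwargs["region_name"])  # -> us-east-1
```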
Module structure and implementation
The provided template module basically spells out the module structure. The authors walk through developing a module using one of the S3 scripts developed earlier. The module code is available on the book’s GitHub. The integration into Pacu is pretty straightforward: create a new folder with the desired name in the modules folder, save the module script as main.py in that folder, create an empty __init__.py file in the folder, and run Pacu.
Note: Unfortunately this module has been removed – that’s a bummer since it looked really cool. I’m leaving this in place in case it comes back at some point.
The chapter wraps with a quick intro to PacuProxy. Basically it’s a C2 framework for the cloud that is part of the Pacu workflow. Very nice. It has its own modules that can be used. Probably the biggest takeaway is that PacuProxy will let you proxy through resources you have compromised to help avoid detection. I can see a lot of benefits to using the PacuProxy option once you’ve gotten yourself into an environment. I think if you are going to focus on pentesting AWS, going through the details of PacuProxy would be a very good use of time. The authors do say it’s still in development, so they have provided limited details.
I still think this chapter would have been better situated earlier in the book. That’s not to say I don’t understand it being here – just that the info would have been helpful earlier on. It’s a really great tool that anyone who is responsible for security of an AWS environment would benefit from becoming familiar with. Between Pacu, Scout Suite, and the CyberArk tools SkyArk (which can scan Azure and AWS to look for privileged users) and SkyWrapper (which helps find abuse of temporary tokens), there are a growing number of tools available for those working with cloud environments.

The cost can limit the ability to lab things out fully (my AWS costs have been running about $10.00 US per month), but that cost may be more affordable than building out a physical lab. I’m looking forward to doing a lot of this in Azure, but I will admit I have concerns about costs in that environment. I’ve found the cost explanations and breakdowns from AWS to be very clear, but I’ve not found that to be the case with Azure. I plan on pursuing the Developer Subscription when I start building out my own Azure environment. I’ve done a ton of Azure labs through Cybrary and find the environment very easy to work in, but I’m looking forward to getting my own environment spun up in the future.