python3 thunder.py destroy
Your virtual environment must be active to use thunder.py.
In order for this level to deploy correctly, you must have your git user.name and user.email configuration values set; they don't have to correspond to your actual details.
# Check if "user.email" and "user.name" are set:
git config --list

# If they aren't set, run:
git config --global user.name "John Doe"
git config --global user.email firstname.lastname@example.org
python3 thunder.py create thunder/a2finance
Activate the service account given to you. You MUST do this, or the level will not work as intended.
gcloud auth activate-service-account --key-file=start/a2-access.json
In this level, the secret is the (fake) credit card number of the person assigned to you upon level creation. Use the compromised service account that is given to you to navigate the cloud infrastructure to find the credit card.
Upon level creation, the name of the target is written to start/a2finance.txt, and the service account key file is written to start/a2-access.json.
From this level on, you will need to be able to determine the project-wide permissions of service account credentials. To do this, we have provided a script that repeatedly queries the projects.testIamPermissions REST API method to figure out which permissions the given credentials have. The script is stored at scripts/test-permissions.py. We recommend glancing over the script to understand how it works, but to run it, just set PROJECT_ID and either SERVICE_ACCOUNT_KEY_FILE or ACCESS_TOKEN, then run the script; it will output the permissions of the given credentials. By default, it tests the given service account key file, but if USE_ACCESS_TOKEN is set to True, it tests the given access token.
# If set to True, credentials will be created using ACCESS_TOKEN
# instead of SERVICE_ACCOUNT_KEY_FILE
USE_ACCESS_TOKEN = False

# Only one of the following needs to be set:
SERVICE_ACCOUNT_KEY_FILE = 'path/to/key/file'
ACCESS_TOKEN = ''

# Set the project ID
PROJECT_ID = '[project-id]'
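For intuition, the core of such a permission-testing script can be sketched as follows. This is a hedged illustration, not the actual contents of scripts/test-permissions.py: the helper names are invented, the batch size of 100 reflects the API's per-request limit on candidate permissions, and a real run would attach google-auth credentials and POST each body to the projects.testIamPermissions endpoint for your project.

```python
# Sketch: testIamPermissions accepts a batch of candidate permissions and
# returns the subset the caller actually holds, so a tester script chunks a
# large candidate list into API-sized batches and unions the results.

def chunk(permissions, size=100):
    """Split the candidate permission list into API-sized batches."""
    return [permissions[i:i + size] for i in range(0, len(permissions), size)]

def build_request_body(batch):
    """Request body for POST .../v1/projects/[project-id]:testIamPermissions"""
    return {"permissions": batch}

def merge_granted(responses):
    """Union the 'permissions' arrays returned by each batched call."""
    granted = set()
    for resp in responses:
        granted.update(resp.get("permissions", []))
    return sorted(granted)
```

The batching matters because the full IAM permission catalog is far larger than a single request allows, which is why the script queries "repeatedly" rather than once.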
Run the permissions testing script on the given service account credentials
In scripts/test-permissions.py, set PROJECT_ID to your project ID and SERVICE_ACCOUNT_KEY_FILE to 'start/a2-access.json', then run:
python3 scripts/test-permissions.py
One of the permissions you have is storage.buckets.list. Try using this permission.
The command to do so is:
gsutil ls
There's a bucket in the project. Check out what's inside of it.
Download the bucket:
gsutil cp -r gs://[bucket-name] .
The bucket stores a git repository. There might be something interesting in the git history.
View the previous git commits:
git log
The most recent commit message mentions that a key file was committed in the first commit.
Checkout the previous commit:
git checkout [old-commit-hash]
There's an ssh key file! It could be used to login to a Google Compute Engine instance.
List the compute instances in the project:
gcloud compute instances list
Get more information on the running instance:
gcloud compute instances describe [instance-name]
In the metadata of the instance, there is information about the SSH keys that can be used to log in to the instance, including the username each key is for.
To use the SSH key, you will need to restrict access to the key file:
chmod 400 [key-file]
SSH into the instance:
ssh -i [key-file] clouduser@[instance-external-ip]
Based on the name of the instance, it probably has access to Google Cloud's logging service, Stackdriver Logging.
List the logs on the project:
gcloud logging logs list
Most of the logs are automatically generated records of events from other resources, which may not provide much useful information, but one of the logs is named "transactions".
Read the "transactions" log. You will need to specify its full resource name:
gcloud logging read "logName=projects/[project-id]/logs/[log-name]"
You won't want to look through all the log entries, so try filtering to show only the name you are looking for. The documentation for logging filters can be found here.
The command to get the log entry you want is:
gcloud logging read "logName=projects/[project-id]/logs/[log-name] AND jsonPayload.name=[name]"
Accidentally uploading sensitive information to git repositories is a common mistake. Unfortunately, modern tools allow attackers to detect and copy exposed credentials on sites such as GitHub almost immediately, with such tools detecting thousands of exposed keys uploaded per day (Meli 2019). One should never put credentials in source files, but rather use environment variables or native cloud platform services to manage keys. Regardless of how long a credential has been exposed, once it has been, developers must invalidate it to prevent abuse.
Another issue with applications deployed in the cloud is insecure logging. Since cloud applications often abstract away the machines they run on, logging events are often collected centrally via a service such as Stackdriver. Unfortunately, log entries may contain sensitive information that has not been sanitized, and developers may have granted excessive permissions on such logs to the machines writing to them (e.g. allowing them to read logging information as well). When log data contains information such as PIN codes (Monzo 8/2019) and passwords (GitHub 2018, Instagram 2019, Twitter 2018), data breaches can occur.
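As a minimal illustration of the sanitization this paragraph calls for, sensitive values can be scrubbed from a structured payload before it is handed to any logging service. The field names and the card-number pattern below are invented for the example, not taken from this level's infrastructure, and a real blocklist would be far more thorough:

```python
import logging
import re

# Field names to mask outright, plus a loose pattern for 13-16 digit
# card-like numbers with optional space/hyphen separators. Both are
# illustrative, not an exhaustive defense.
SENSITIVE_KEYS = {"password", "pin", "credit_card", "card_number", "ssn"}
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def sanitize(payload: dict) -> dict:
    """Return a copy of the payload that is safe to write to a log."""
    clean = {}
    for key, value in payload.items():
        if key.lower() in SENSITIVE_KEYS:
            clean[key] = "[REDACTED]"
        elif isinstance(value, str):
            clean[key] = CARD_RE.sub("[REDACTED]", value)
        else:
            clean[key] = value
    return clean

logging.basicConfig(level=logging.INFO)
entry = {"name": "Jane Doe", "card_number": "4111 1111 1111 1111",
         "memo": "paid with 4111111111111111"}
logging.info("transaction: %s", sanitize(entry))
```

Sanitizing at the point of writing, rather than trusting every log reader, is what would have prevented this level's credit card number from being recoverable with nothing more than logging.logEntries read access.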