5 changes: 3 additions & 2 deletions Dockerfile
@@ -1,11 +1,12 @@
 FROM python:3.9-alpine

-COPY aws_push.py gcp_push.py output_report.py requirements.txt run.sh /
+COPY aws_push.py gcp_push.py az_push.py output_report.py requirements.txt run.sh /
 COPY contrib /contrib
 COPY shared /shared

-RUN apk add --no-cache nmap nmap-scripts git && \
+RUN apk add --no-cache nmap nmap-scripts git build-base libffi-dev openssl-dev && \
     pip install --no-cache-dir -r requirements.txt && \
+    apk del build-base && \
     git clone https://github.com/vulnersCom/nmap-vulners \
         /usr/share/nmap/scripts/vulners && \
     nmap --script-updatedb && \
23 changes: 20 additions & 3 deletions README.md
@@ -56,17 +56,17 @@ $ docker run -v $(CURDIR)/shared:/shared flan_scan <Nmap-flags>
 Pushing Results to the Cloud
 ----------------------------

-Flan Scan currently supports pushing Latex reports and raw XML Nmap output files to a GCS Bucket or to an AWS S3 Bucket. Flan Scan requires 2 environment variables to push results to the cloud. The first is `upload` which takes one of two values `gcp` or `aws`. The second is `bucket` and the value is the name of the S3 or GCS Bucket to upload the results to. To set the environment variables, after running `make build` run the container setting the environment variables like so:
+Flan Scan currently supports pushing LaTeX reports and raw XML Nmap output files to a GCS bucket, an AWS S3 bucket, or an Azure Storage account. Flan Scan requires two environment variables to push results to the cloud. The first is `upload`, which takes one of three values: `gcp`, `aws`, or `az`. The second is `bucket`, whose value is the name of the S3 bucket, GCS bucket, or Azure container to upload the results to. To set the environment variables, after running `make build`, run the container like so:
 ```bash
 $ docker run --name <container-name> \
     -v $(CURDIR)/shared:/shared \
-    -e upload=<gcp or aws> \
+    -e upload=<gcp or aws or az> \
     -e bucket=<bucket-name> \
     -e format=<optional, one of: md, html or json> \
     flan_scan
 ```

-Below are some examples for adding the necessary AWS or GCP authentication keys as environment variables in container. However, this can also be accomplished with a secret in Kubernetes that exposes the necessary environment variables or with other secrets management tools.
+Below are some examples of adding the necessary AWS, GCP, or Azure authentication keys as environment variables in the container. However, this can also be accomplished with a Kubernetes secret that exposes the necessary environment variables, or with other secrets-management tools.


 ### Example GCS Bucket Configuration
@@ -103,6 +103,23 @@ docker run --name <container-name> \
   flan_scan


 ```

+### Example Azure Storage Configuration
+
+Set the `AZURE_ACCOUNT_URL` and `AZURE_ACCOUNT_KEY` environment variables to the blob endpoint URL and access key (or SAS string) of your Azure Storage account.
+
+```bash
+docker run --name <container-name> \
+    -v $(pwd)/shared:/shared \
+    -e upload=az \
+    -e bucket=<storage-container-name> \
+    -e AZURE_ACCOUNT_URL=<your-azure-storage-account-url> \
+    -e AZURE_ACCOUNT_KEY=<your-azure-storage-secret-key-or-sas-string> \
+    -e format=<optional, one of: md, html or json> \
+    flan_scan
+```
+
 Deploying on Kubernetes
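The Azure example above takes a full account URL rather than a bare account name. When only the storage account name is at hand, the standard Azure blob service endpoint can be derived from it; the account name below is a placeholder.

```shell
# "mystorageacct" is a hypothetical account name; the
# https://<account>.blob.core.windows.net form is Azure's standard
# blob service endpoint.
AZURE_ACCOUNT_NAME="mystorageacct"
AZURE_ACCOUNT_URL="https://${AZURE_ACCOUNT_NAME}.blob.core.windows.net"
echo "$AZURE_ACCOUNT_URL"
```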
23 changes: 23 additions & 0 deletions az_push.py
@@ -0,0 +1,23 @@
+import sys
+import os
+from azure.storage.blob import BlobServiceClient
+
+filename = sys.argv[1]
+
+account_url = os.getenv('AZURE_ACCOUNT_URL')
+account_key = os.getenv('AZURE_ACCOUNT_KEY')
+container_name = os.getenv('bucket')
+
+try:
+    blob_service_client = BlobServiceClient(
+        account_url=account_url, credential=account_key
+    )
+    blob_client = blob_service_client.get_blob_client(
+        container=container_name, blob=filename
+    )
+
+    with open(filename, "rb") as data:
+        blob_client.upload_blob(data)
+except Exception as e:
+    print('Error uploading to Azure')
+    print(e)
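az_push.py passes whatever the environment provides straight to `BlobServiceClient`, so an unset variable only surfaces later as an upload error. A small pre-flight check in the calling shell is one way to fail earlier; the variable names match what the script reads, and the values below are placeholders.

```shell
# Placeholder credentials; only the variable names (AZURE_ACCOUNT_URL,
# AZURE_ACCOUNT_KEY, bucket) come from az_push.py.
AZURE_ACCOUNT_URL="https://mystorageacct.blob.core.windows.net"
AZURE_ACCOUNT_KEY="placeholder-key"
bucket="scan-results"

ok=1
for v in AZURE_ACCOUNT_URL AZURE_ACCOUNT_KEY bucket; do
    # Indirect lookup: read the value of the variable named in $v.
    eval "val=\${$v}"
    if [ -z "$val" ]; then
        echo "missing required variable: $v" >&2
        ok=0
    fi
done
[ "$ok" -eq 1 ] && echo "azure upload configured"
```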
2 changes: 1 addition & 1 deletion kubernetes_templates/deployment.yaml
@@ -32,7 +32,7 @@ spec:
               mountPath: /shared
           env:
             - name: upload
-              value: <GCP_OR_AWS>
+              value: <GCP_OR_AWS_OR_AZ>
            - name: bucket
              value: <BUCKET_NAME>
            - name: format
1 change: 1 addition & 0 deletions requirements.txt
@@ -1,5 +1,6 @@
 xmltodict==0.12.0
 google-cloud-storage==1.23.0
+azure-storage-blob==12.18.1
 boto3==1.12.15
 Jinja2==2.11.3
 markupsafe==2.0.1
3 changes: 3 additions & 0 deletions run.sh
@@ -30,6 +30,9 @@ function upload {
     elif [ $upload = "gcp" ]
     then
         python /gcp_push.py $1
+    elif [ $upload = "az" ]
+    then
+        python /az_push.py $1
     fi
 }
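The if/elif chain above leaves `$upload` unquoted, so an unset or empty value makes each `[ ... ]` test a syntax error rather than a clean no-op. A quoted `case` is one defensive alternative; this is a sketch only, with a dummy value and `echo` standing in for the real `python /..._push.py` calls.

```shell
# Same dispatch values as run.sh (aws/gcp/az); "az" and "report.xml"
# are dummy inputs for illustration.
upload="az"
file="report.xml"

case "$upload" in
    aws) target="/aws_push.py" ;;
    gcp) target="/gcp_push.py" ;;
    az)  target="/az_push.py" ;;
    *)   target="" ;;          # unset/unknown: skip uploading entirely
esac

if [ -n "$target" ]; then
    echo "python $target $file"
fi
```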
