This Python script allows you to upload files from a local folder to an AWS S3 bucket. It supports dynamic configuration through a YAML file, making it flexible and easy to manage.
Create a `config.yaml` file inside the `config/` folder:
```yaml
aws_access_key_id: "YOUR_ACCESS_KEY_ID"
aws_secret_access_key: "YOUR_SECRET_ACCESS_KEY"
region_name: "us-west-2"
bucket_name: "your-s3-bucket-name"
local_folder: "D:\\Path\\To\\Local\\Folder"
s3_prefix: "optional/prefix/"
```
- `aws_access_key_id`: Your AWS access key.
- `aws_secret_access_key`: Your AWS secret access key.
- `region_name`: AWS region where your bucket is located.
- `bucket_name`: Name of the S3 bucket.
- `local_folder`: Path to the local folder with files to upload.
- `s3_prefix` (optional): Prefix used to organize files within the S3 bucket.
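As a sketch of how these settings might be read (assuming PyYAML is listed in `requirements.txt`; the `load_config` helper name is hypothetical):

```python
import os
import tempfile

import yaml  # PyYAML; assumed to be among the project's dependencies


def load_config(path):
    """Load the uploader settings from a YAML config file into a dict."""
    with open(path, "r", encoding="utf-8") as f:
        return yaml.safe_load(f)


# Quick self-check against a throwaway config file:
sample = 'bucket_name: "my-bucket"\ns3_prefix: "backups/"\n'
with tempfile.NamedTemporaryFile("w", suffix=".yaml", delete=False) as tmp:
    tmp.write(sample)
cfg = load_config(tmp.name)
os.remove(tmp.name)
```

The returned dict can then be passed to `boto3.client("s3", ...)` and the upload routine.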
- Clone or download the project.
- Navigate to the project folder: `cd AWS_Upload`
- Install dependencies: `pip install -r requirements.txt`
Run the script from the project directory:

```shell
python src/aws_uploader.py
```
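The core of such an uploader is walking `local_folder` and mapping each local path to an S3 object key under `s3_prefix`. A minimal sketch of that mapping (the `build_s3_key` helper is hypothetical, not part of the script):

```python
import os


def build_s3_key(local_folder, file_path, s3_prefix=""):
    """Map a local file path to its S3 object key, keeping the folder
    structure under the optional prefix. S3 keys use forward slashes."""
    rel = os.path.relpath(file_path, local_folder)
    return s3_prefix + rel.replace(os.sep, "/")


# The script would then walk the tree and upload each file, roughly:
#   for root, _, files in os.walk(local_folder):
#       for name in files:
#           path = os.path.join(root, name)
#           s3.upload_file(path, bucket_name,
#                          build_s3_key(local_folder, path, s3_prefix))
```

`os.path.relpath` keeps subfolder structure intact, so `D:\Folder\sub\a.txt` becomes `optional/prefix/sub/a.txt` in the bucket.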
- Log files are saved in the `logs/` folder.
- Log format: `LOG_dd_mm_yyyy_hh_mm_ss.log`
- Logs include information about upload progress and any errors.
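A timestamped log file matching that naming scheme can be set up with the standard `logging` module; here is one hedged sketch (the `setup_logger` helper is illustrative, not the script's actual code):

```python
import logging
import os
import tempfile
from datetime import datetime


def setup_logger(log_dir):
    """Create a log file named LOG_dd_mm_yyyy_hh_mm_ss.log in log_dir
    and send log records both to that file and to the console."""
    os.makedirs(log_dir, exist_ok=True)
    log_name = datetime.now().strftime("LOG_%d_%m_%Y_%H_%M_%S.log")
    path = os.path.join(log_dir, log_name)
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
        handlers=[logging.FileHandler(path), logging.StreamHandler()],
    )
    return path


log_path = setup_logger(tempfile.mkdtemp())  # demo: a throwaway directory
log_name = os.path.basename(log_path)
```

Passing both a `FileHandler` and a `StreamHandler` is what makes messages appear in the console and the log file at the same time.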
- Errors during file uploads (e.g., invalid credentials, missing bucket) are logged both to the console and to the log file.
- General exceptions are also captured and logged for debugging.
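One way to get that per-file behavior is to wrap each upload attempt so a failure is logged without aborting the rest of the run. The `safe_upload` wrapper below is a hypothetical sketch; with boto3 the caught exception would typically be `botocore.exceptions.ClientError`, but the sketch catches `Exception` to stay dependency-free:

```python
import logging


def safe_upload(upload_fn, local_path, bucket, key):
    """Run one upload attempt and log the outcome; return True on success.

    upload_fn stands in for something like boto3's s3.upload_file."""
    try:
        upload_fn(local_path, bucket, key)
        logging.info("Uploaded %s to s3://%s/%s", local_path, bucket, key)
        return True
    except Exception as exc:  # broad only to keep this sketch self-contained
        logging.error("Failed to upload %s: %s", local_path, exc)
        return False


# Demo with a stand-in upload function that always fails:
def _failing_upload(local_path, bucket, key):
    raise RuntimeError("NoSuchBucket")


ok = safe_upload(_failing_upload, "file.txt", "my-bucket", "file.txt")
```

Returning a boolean lets the caller keep a count of successes and failures to report at the end of the run.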
Pull requests are welcome. For major changes, please open an issue first to discuss what you'd like to change.
This project is licensed under the MIT License.