
Commit

updates for PyPI release
srastatter committed Mar 6, 2023
1 parent 94ec41d commit 024594d
Showing 16 changed files with 198 additions and 156 deletions.
37 changes: 18 additions & 19 deletions .gitignore
@@ -10,25 +10,24 @@ __pycache__/
*.so

# Distribution / packaging
# Temporarily commenting these out
#.Python
# build/
# develop-eggs/
# dist/
# downloads/
# eggs/
# .eggs/
# lib/
# lib64/
# parts/
# sdist/
# var/
# wheels/
# share/python-wheels/
# *.egg-info/
# .installed.cfg
# *.egg
# MANIFEST
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
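Re-enabling these ignore patterns only affects files that are not yet tracked; artifacts that were already committed (the `AutoMLOps.egg-info/` metadata and `dist/` wheels deleted below) still have to be removed from the index explicitly. A hedged sketch of that cleanup step, not taken from the commit itself:

```bash
# Untrack packaging artifacts that .gitignore now covers; --cached removes
# them from the index only (use plain `git rm -r` to also delete local copies).
git rm -r --cached AutoMLOps.egg-info/ dist/
git commit -m "Remove packaging artifacts from version control"
```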
24 changes: 0 additions & 24 deletions AutoMLOps.egg-info/PKG-INFO

This file was deleted.

16 changes: 0 additions & 16 deletions AutoMLOps.egg-info/SOURCES.txt

This file was deleted.

1 change: 0 additions & 1 deletion AutoMLOps.egg-info/dependency_links.txt

This file was deleted.

7 changes: 0 additions & 7 deletions AutoMLOps.egg-info/requires.txt

This file was deleted.

1 change: 0 additions & 1 deletion AutoMLOps.egg-info/top_level.txt

This file was deleted.

13 changes: 6 additions & 7 deletions AutoMLOps/AutoMLOps.py
@@ -304,7 +304,6 @@ def _create_default_config(af_registry_location: str,
f' cloud_schedule_pattern: {schedule_pattern}\n'
f' cloud_source_repository: {csr_name}\n'
f' cloud_source_repository_branch: {csr_branch_name}\n'
f' gs_bucket_location: {gs_bucket_location}\n'
f' gs_bucket_name: {gs_bucket_name}\n'
f' pipeline_runner_service_account: {pipeline_runner_sa}\n'
f' project_id: {project_id}\n'
@@ -424,7 +423,7 @@ def _create_resources_scripts(run_local: bool):
f' sourcerepo.googleapis.com\n'
f'\n'
f'echo -e "$GREEN Checking for Artifact Registry: $AF_REGISTRY_NAME in project $PROJECT_ID $NC"\n'
f'if ! (gcloud artifacts repositories list --project="$PROJECT_ID" --location=$AF_REGISTRY_LOCATION | grep --fixed-strings "(^|[[:blank:]])$AF_REGISTRY_NAME($|[[:blank:]]))"; then\n'
f'if ! (gcloud artifacts repositories list --project="$PROJECT_ID" --location=$AF_REGISTRY_LOCATION | grep -E "(^|[[:blank:]])$AF_REGISTRY_NAME($|[[:blank:]])"); then\n'
f'\n'
f' echo "Creating Artifact Registry: ${left_bracket}AF_REGISTRY_NAME{right_bracket} in project $PROJECT_ID"\n'
f' gcloud artifacts repositories create "$AF_REGISTRY_NAME" \{newline}'
@@ -441,7 +440,7 @@ def _create_resources_scripts(run_local: bool):
f'\n'
f'\n'
f'echo -e "$GREEN Checking for GS Bucket: $BUCKET_NAME in project $PROJECT_ID $NC"\n'
f'if !(gsutil ls -b gs://$BUCKET_NAME | grep --fixed-strings "(^|[[:blank:]])$BUCKET_NAME($|[[:blank:]]))"; then\n'
f'if !(gsutil ls -b gs://$BUCKET_NAME | grep --fixed-strings "$BUCKET_NAME"); then\n'
f'\n'
f' echo "Creating GS Bucket: ${left_bracket}BUCKET_NAME{right_bracket} in project $PROJECT_ID"\n'
f' gsutil mb -l ${left_bracket}BUCKET_LOCATION{right_bracket} gs://$BUCKET_NAME\n'
@@ -453,7 +452,7 @@ def _create_resources_scripts(run_local: bool):
f'fi\n'
f'\n'
f'echo -e "$GREEN Checking for Service Account: $SERVICE_ACCOUNT_NAME in project $PROJECT_ID $NC"\n'
f'if ! (gcloud iam service-accounts list --project="$PROJECT_ID" | grep --fixed-strings "(^|[[:blank:]])$SERVICE_ACCOUNT_FULL($|[[:blank:]]))"; then\n'
f'if ! (gcloud iam service-accounts list --project="$PROJECT_ID" | grep -E "(^|[[:blank:]])$SERVICE_ACCOUNT_FULL($|[[:blank:]])"); then\n'
f'\n'
f' echo "Creating Service Account: ${left_bracket}SERVICE_ACCOUNT_NAME{right_bracket} in project $PROJECT_ID"\n'
f' gcloud iam service-accounts create $SERVICE_ACCOUNT_NAME \{newline}'
@@ -522,7 +521,7 @@ def _create_resources_scripts(run_local: bool):
f' --no-user-output-enabled\n'
f'\n'
f'echo -e "$GREEN Checking for Cloud Source Repository: $CLOUD_SOURCE_REPO in project $PROJECT_ID $NC"\n'
f'if ! (gcloud source repos list --project="$PROJECT_ID" | grep --fixed-strings "(^|[[:blank:]])$CLOUD_SOURCE_REPO($|[[:blank:]]))"; then\n'
f'if ! (gcloud source repos list --project="$PROJECT_ID" | grep -E "(^|[[:blank:]])$CLOUD_SOURCE_REPO($|[[:blank:]])"); then\n'
f'\n'
f' echo "Creating Cloud Source Repository: ${left_bracket}CLOUD_SOURCE_REPO{right_bracket} in project $PROJECT_ID"\n'
f' gcloud source repos create $CLOUD_SOURCE_REPO\n'
@@ -537,7 +536,7 @@ def _create_resources_scripts(run_local: bool):
f'\n'
f'# Create cloud tasks queue\n'
f'echo -e "$GREEN Checking for Cloud Tasks Queue: $CLOUD_TASKS_QUEUE_NAME in project $PROJECT_ID $NC"\n'
f'if ! (gcloud tasks queues list --location $CLOUD_TASKS_QUEUE_LOCATION | grep --fixed-strings "(^|[[:blank:]])$CLOUD_TASKS_QUEUE_NAME($|[[:blank:]]))"; then\n'
f'if ! (gcloud tasks queues list --location $CLOUD_TASKS_QUEUE_LOCATION | grep -E "(^|[[:blank:]])$CLOUD_TASKS_QUEUE_NAME($|[[:blank:]])"); then\n'
f'\n'
f' echo "Creating Cloud Tasks Queue: ${left_bracket}CLOUD_TASKS_QUEUE_NAME{right_bracket} in project $PROJECT_ID"\n'
f' gcloud tasks queues create $CLOUD_TASKS_QUEUE_NAME \{newline}'
@@ -551,7 +550,7 @@ def _create_resources_scripts(run_local: bool):
f'\n'
f'# Create cloud build trigger\n'
f'echo -e "$GREEN Checking for Cloudbuild Trigger: $CB_TRIGGER_NAME in project $PROJECT_ID $NC"\n'
f'if ! (gcloud beta builds triggers list --project="$PROJECT_ID" --region="$CB_TRIGGER_LOCATION" | grep --fixed-strings "(^|[[:blank:]])name: $CB_TRIGGER_NAME($|[[:blank:]]))"; then\n'
f'if ! (gcloud beta builds triggers list --project="$PROJECT_ID" --region="$CB_TRIGGER_LOCATION" | grep -E "(^|[[:blank:]])name: $CB_TRIGGER_NAME($|[[:blank:]])"); then\n'
f'\n'
f' echo "Creating Cloudbuild Trigger on branch $CLOUD_SOURCE_REPO_BRANCH in project $PROJECT_ID for repo ${left_bracket}CLOUD_SOURCE_REPO{right_bracket}"\n'
f' gcloud beta builds triggers create cloud-source-repositories \{newline}'
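The recurring change in this file replaces `grep --fixed-strings` with `grep -E` in the generated provisioning script and moves a stray closing parenthesis out of the quoted pattern. With `--fixed-strings`, the `(^|[[:blank:]])...($|[[:blank:]])` pattern is treated as a literal string, so the existence checks could never match and the script always fell through to the create branch. A minimal sketch of the difference, using a made-up registry name and listing output rather than real `gcloud` output:

```bash
# Simulated `gcloud artifacts repositories list` output; the name is a placeholder.
AF_REGISTRY_NAME="vertex-mlops-af"
LISTING="vertex-mlops-af  DOCKER  us-central1"

# Old check: --fixed-strings matches the pattern literally (parentheses, anchors
# and all), so this never succeeds even when the repository exists.
echo "$LISTING" | grep --fixed-strings "(^|[[:blank:]])$AF_REGISTRY_NAME($|[[:blank:]])"

# New check: -E interprets the same pattern as an extended regular expression,
# so it matches the repository name bounded by start-of-line or whitespace.
echo "$LISTING" | grep -E "(^|[[:blank:]])$AF_REGISTRY_NAME($|[[:blank:]])"
```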
2 changes: 1 addition & 1 deletion AutoMLOps/__init__.py
@@ -23,6 +23,6 @@
series of directories to support the creation of Vertex Pipelines.
"""
# pylint: disable=invalid-name
__version__ = '1.0.2'
__version__ = '1.0.5'
__author__ = 'Sean Rastatter'
__credits__ = 'Google'
Binary file modified AutoMLOps_Implementation_Guide_External.pdf
Binary file not shown.
13 changes: 13 additions & 0 deletions CHANGELOG.md
@@ -1,6 +1,19 @@
# Change Log
All notable changes to this project will be documented in this file.

## [1.0.5] - 2023-03-06

Official release on PyPI.

## [1.0.3] - 2023-03-06

Staging for PyPI.

### Changed

- Cleaning up wheel and egg files from repo.
- Remove dist/ and build/ directories.

## [1.0.2] - 2023-03-06

Added feature to allow for accelerated/distributed model training.
9 changes: 4 additions & 5 deletions README.md
@@ -8,7 +8,7 @@ In order to use AutoMLOps, the following are required:

- Jupyter (or Jupyter-compatible) notebook environment
- [Notebooks API](https://console.cloud.google.com/marketplace/product/google/notebooks.googleapis.com) enabled
- Python 3.0 - 3.10
- Python 3.7 - 3.10
- [Google Cloud SDK 407.0.0](https://cloud.google.com/sdk/gcloud/reference)
- [beta 2022.10.21](https://cloud.google.com/sdk/gcloud/reference/beta)
- `git` installed
@@ -25,9 +25,9 @@ gcloud config set account <[email protected]>

# Install

Clone the repo and install either via setup.py or wheel (wheel requires less processing):
- setup.py: `pip install .`
- wheel: `pip install dist/AutoMLOps-1.0.2-py2.py3-none-any.whl`
Install AutoMLOps from [PyPI](https://pypi.org/project/google-cloud-automlops/): `pip install google-cloud-automlops`

Or Install locally by cloning the repo and running `pip install .`
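As a quick sanity check of the published package (illustrative only, not part of this commit; it assumes the PyPI project name from the README above and that the import name matches the `AutoMLOps` package whose `__init__.py` is bumped to 1.0.5 in this commit):

```bash
# Install the released package from PyPI, then confirm the version that ships.
pip install google-cloud-automlops
python -c "import AutoMLOps; print(AutoMLOps.__version__)"   # expected: 1.0.5
```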

# Dependencies
- `autoflake==2.0.0`,
@@ -203,7 +203,6 @@ The [example notebook](example/automlops_example_notebook.ipynb) comes with 3 co
- [Google Cloud Pipeline Components](https://github.com/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/official/pipelines/custom_model_training_and_batch_prediction.ipynb)
# Next Steps / Backlog
- PyPI
- Refine unit tests
- Use [terraform](https://github.com/GoogleCloudPlatform/vertex-pipelines-end-to-end-samples/tree/main/terraform) for the creation of resources.
- Allow multiple AutoMLOps pipelines within the same directory
Binary file removed dist/AutoMLOps-1.0.0-py2.py3-none-any.whl
Binary file not shown.
Binary file removed dist/AutoMLOps-1.0.1-py2.py3-none-any.whl
Binary file not shown.
Binary file removed dist/AutoMLOps-1.0.2-py2.py3-none-any.whl
Binary file not shown.
(Remaining changed files not shown.)
