Merge pull request #66 from GoogleCloudPlatform/feature/pipeline_job_location

Feature/pipeline job location
srastatter authored Dec 6, 2024
2 parents 047a53d + 6bacb5c commit 75bd1d9
Showing 12 changed files with 60 additions and 67 deletions.
Binary file modified AutoMLOps_User_Guide.pdf
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -1,6 +1,18 @@
# Change Log
All notable changes to this project will be documented in this file.

## [1.3.3] - 2024-12-02

### Added

- Added new pipeline parameter `pipeline_job_location` which defaults to `'us-central1'`.

### Changed

### Fixed

- Bug where the `CLOUD_SOURCE_REPOSITORIES` enum value was still being used in `utils.py`

## [1.3.2] - 2024-11-21

### Added
46 changes: 24 additions & 22 deletions README.md
@@ -226,28 +226,29 @@ Optional parameters (defaults shown):
8. `deployment_framework: str = 'github-actions'`
9. `naming_prefix: str = 'automlops-default-prefix'`
10. `orchestration_framework: str = 'kfp'`
11. `pipeline_job_runner_service_account: str = f'vertex-pipelines@{project_id}.iam.gserviceaccount.com'`
12. `pipeline_job_submission_service_location: str = 'us-central1'`
13. `pipeline_job_submission_service_name: str = f'{naming_prefix}-job-submission-svc'`
14. `pipeline_job_submission_service_type: str = 'cloud-functions'`
15. `project_number: str = None`
16. `provision_credentials_key: str = None`
17. `provisioning_framework: str = 'gcloud'`
18. `pubsub_topic_name: str = f'{naming_prefix}-queueing-svc'`
19. `schedule_location: str = 'us-central1'`
20. `schedule_name: str = f'{naming_prefix}-schedule'`
21. `schedule_pattern: str = 'No Schedule Specified'`
22. `setup_model_monitoring: Optional[bool] = False`
23. `source_repo_branch: str = 'automlops'`
24. `source_repo_name: str = f'{naming_prefix}-repository'`
25. `source_repo_type: str = 'github'`
26. `storage_bucket_location: str = 'us-central1'`
27. `storage_bucket_name: str = f'{project_id}-{naming_prefix}-bucket'`
28. `use_ci: bool = False`
29. `vpc_connector: str = 'No VPC Specified'`
30. `workload_identity_pool: str = None`
31. `workload_identity_provider: str = None`
32. `workload_identity_service_account: str = None`
11. `pipeline_job_location: str = 'us-central1'`
12. `pipeline_job_runner_service_account: str = f'vertex-pipelines@{project_id}.iam.gserviceaccount.com'`
13. `pipeline_job_submission_service_location: str = 'us-central1'`
14. `pipeline_job_submission_service_name: str = f'{naming_prefix}-job-submission-svc'`
15. `pipeline_job_submission_service_type: str = 'cloud-functions'`
16. `project_number: str = None`
17. `provision_credentials_key: str = None`
18. `provisioning_framework: str = 'gcloud'`
19. `pubsub_topic_name: str = f'{naming_prefix}-queueing-svc'`
20. `schedule_location: str = 'us-central1'`
21. `schedule_name: str = f'{naming_prefix}-schedule'`
22. `schedule_pattern: str = 'No Schedule Specified'`
23. `setup_model_monitoring: Optional[bool] = False`
24. `source_repo_branch: str = 'automlops'`
25. `source_repo_name: str = f'{naming_prefix}-repository'`
26. `source_repo_type: str = 'github'`
27. `storage_bucket_location: str = 'us-central1'`
28. `storage_bucket_name: str = f'{project_id}-{naming_prefix}-bucket'`
29. `use_ci: bool = False`
30. `vpc_connector: str = 'No VPC Specified'`
31. `workload_identity_pool: str = None`
32. `workload_identity_provider: str = None`
33. `workload_identity_service_account: str = None`

Parameter Options:
- `artifact_repo_type=`:
@@ -289,6 +290,7 @@ A description of the parameters is below:
- `deployment_framework`: The CI tool to use (e.g. cloud build, github actions, etc.)
- `naming_prefix`: Unique value used to differentiate pipelines and services across AutoMLOps runs.
- `orchestration_framework`: The orchestration framework to use (e.g. kfp, tfx, etc.)
- `pipeline_job_location`: The location to run the Pipeline Job in.
- `pipeline_job_runner_service_account`: Service Account to run PipelineJobs (specify the full string).
- `pipeline_job_submission_service_location`: The location of the cloud submission service.
- `pipeline_job_submission_service_name`: The name of the cloud submission service.
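To illustrate where the new `pipeline_job_location` parameter fits, here is a minimal, hypothetical call sketch. It assumes `project_id` and `pipeline_params` are among the required arguments not shown in the optional list above, and every value is a placeholder rather than a project default.

```python
# Hypothetical usage sketch -- values are placeholders, and only a few of the
# optional parameters listed above are shown.
from google_cloud_automlops import AutoMLOps

AutoMLOps.generate(
    project_id='my-gcp-project',
    pipeline_params={'bq_table': 'my-gcp-project.my_dataset.my_table'},
    naming_prefix='automlops-default-prefix',
    orchestration_framework='kfp',
    pipeline_job_location='us-central1',              # new in 1.3.3
    pipeline_job_submission_service_location='us-central1',
    pipeline_job_submission_service_type='cloud-functions',
    use_ci=True,
)
```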
5 changes: 5 additions & 0 deletions google_cloud_automlops/AutoMLOps.py
@@ -113,6 +113,7 @@ def launchAll(
deployment_framework: Optional[str] = Deployer.GITHUB_ACTIONS.value,
naming_prefix: Optional[str] = DEFAULT_NAMING_PREFIX,
orchestration_framework: Optional[str] = Orchestrator.KFP.value,
pipeline_job_location: Optional[str] = DEFAULT_RESOURCE_LOCATION,
pipeline_job_runner_service_account: Optional[str] = None,
pipeline_job_submission_service_location: Optional[str] = DEFAULT_RESOURCE_LOCATION,
pipeline_job_submission_service_name: Optional[str] = None,
@@ -153,6 +154,7 @@ def launchAll(
deployment_framework: The CI tool to use (e.g. cloud build, github actions, etc.)
naming_prefix: Unique value used to differentiate pipelines and services across AutoMLOps runs.
orchestration_framework: The orchestration framework to use (e.g. kfp, tfx, etc.)
pipeline_job_location: The location to run the Pipeline Job in.
pipeline_job_runner_service_account: Service Account to run PipelineJobs (specify the full string).
pipeline_job_submission_service_location: The location of the cloud submission service.
pipeline_job_submission_service_name: The name of the cloud submission service.
@@ -191,6 +193,7 @@ def launchAll(
deployment_framework=deployment_framework,
naming_prefix=naming_prefix,
orchestration_framework=orchestration_framework,
pipeline_job_location=pipeline_job_location,
pipeline_job_runner_service_account=pipeline_job_runner_service_account,
pipeline_job_submission_service_location=pipeline_job_submission_service_location,
pipeline_job_submission_service_name=pipeline_job_submission_service_name,
@@ -230,6 +233,7 @@ def generate(
deployment_framework: Optional[str] = Deployer.GITHUB_ACTIONS.value,
naming_prefix: Optional[str] = DEFAULT_NAMING_PREFIX,
orchestration_framework: Optional[str] = Orchestrator.KFP.value,
pipeline_job_location: Optional[str] = DEFAULT_RESOURCE_LOCATION,
pipeline_job_runner_service_account: Optional[str] = None,
pipeline_job_submission_service_location: Optional[str] = DEFAULT_RESOURCE_LOCATION,
pipeline_job_submission_service_name: Optional[str] = None,
@@ -333,6 +337,7 @@ def generate(
deployment_framework=deployment_framework,
naming_prefix=naming_prefix,
orchestration_framework=orchestration_framework,
pipeline_job_location=pipeline_job_location,
pipeline_job_runner_service_account=derived_pipeline_job_runner_service_account,
pipeline_job_submission_service_location=pipeline_job_submission_service_location,
pipeline_job_submission_service_name=derived_pipeline_job_submission_service_name,
2 changes: 1 addition & 1 deletion google_cloud_automlops/__init__.py
@@ -23,6 +23,6 @@
series of directories to support the creation of Vertex Pipelines.
"""
# pylint: disable=invalid-name
__version__ = '1.3.2'
__version__ = '1.3.3'
__author__ = 'Sean Rastatter'
__credits__ = 'Google'
1 change: 1 addition & 0 deletions google_cloud_automlops/orchestration/base.py
@@ -327,6 +327,7 @@ def build(self):
# Extract additional attributes from defaults file
defaults = read_yaml_file(GENERATED_DEFAULTS_FILE)
self.pipeline_storage_path = defaults['pipelines']['pipeline_storage_path']
self.pipeline_job_location = defaults['gcp']['pipeline_job_location']
self.pipeline_job_runner_service_account = defaults['gcp']['pipeline_job_runner_service_account']
self.pipeline_job_submission_service_type = defaults['gcp']['pipeline_job_submission_service_type']
self.project_id = defaults['gcp']['project_id']
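For orientation, here is a rough sketch of what `read_yaml_file(GENERATED_DEFAULTS_FILE)` returns for the keys used above; the values are hypothetical placeholders, and the real generated defaults file contains additional sections and keys.

```python
# Sketch of the relevant portion of the generated defaults -- hypothetical values only.
defaults = {
    'gcp': {
        'project_id': 'my-gcp-project',
        'pipeline_job_location': 'us-central1',  # new in 1.3.3
        'pipeline_job_runner_service_account': 'vertex-pipelines@my-gcp-project.iam.gserviceaccount.com',
        'pipeline_job_submission_service_type': 'cloud-functions',
    },
    'pipelines': {
        'pipeline_storage_path': 'gs://my-gcp-project-automlops-default-prefix-bucket/pipeline_root',
    },
}
```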
1 change: 1 addition & 0 deletions google_cloud_automlops/orchestration/kfp.py
@@ -492,6 +492,7 @@ def _build_submission_services(self):
template_path=import_files(KFP_TEMPLATES_PATH + '.services.submission_service') / 'main.py.j2',
generated_license=GENERATED_LICENSE,
pipeline_root=self.pipeline_storage_path,
pipeline_job_location=self.pipeline_job_location,
pipeline_job_runner_service_account=self.pipeline_job_runner_service_account,
pipeline_job_submission_service_type=self.pipeline_job_submission_service_type,
project_id=self.project_id,
@@ -16,6 +16,7 @@ logger.setLevel(log_level)
def run_pipeline(
project_id: str,
pipeline_root: str,
pipeline_job_location: str,
pipeline_job_runner_service_account: str,
parameter_values_path: str,
pipeline_spec_path: str,
@@ -26,6 +27,7 @@ def run_pipeline(
Args:
project_id: The project_id.
pipeline_root: GCS location of the pipeline runs metadata.
pipeline_job_location: The location to run the Pipeline Job in.
pipeline_job_runner_service_account: Service Account to run PipelineJobs.
parameter_values_path: Location of parameter values JSON.
pipeline_spec_path: Location of the pipeline spec JSON.
@@ -54,6 +56,7 @@
aiplatform.init(project=project_id)
job = aiplatform.PipelineJob(
display_name = display_name,
location = pipeline_job_location,
template_path = pipeline_spec_path,
pipeline_root = pipeline_root,
parameter_values = pipeline_params,
@@ -76,6 +79,7 @@ if __name__ == '__main__':
run_pipeline(
project_id=config['gcp']['project_id'],
pipeline_root=config['pipelines']['pipeline_storage_path'],
pipeline_job_location=config['gcp']['pipeline_job_location'],
pipeline_job_runner_service_account=config['gcp']['pipeline_job_runner_service_account'],
parameter_values_path=config['pipelines']['parameter_values_path'],
pipeline_spec_path=config['pipelines']['pipeline_job_spec_path'])
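To see the new argument in isolation, here is a self-contained sketch of the same submission path using the Vertex AI SDK directly. All names are placeholders, and the final `submit()` call is illustrative rather than copied from the truncated hunk above.

```python
# Standalone sketch: launch a compiled pipeline spec in an explicit location.
# Placeholder names throughout; the generated script reads these from its config.
from google.cloud import aiplatform

aiplatform.init(project='my-gcp-project')
job = aiplatform.PipelineJob(
    display_name='training-pipeline',
    location='us-central1',  # corresponds to pipeline_job_location
    template_path='gs://my-bucket/pipeline_job.json',
    pipeline_root='gs://my-bucket/pipeline_root',
    parameter_values={'bq_table': 'my-gcp-project.my_dataset.my_table'},
)
job.submit(service_account='vertex-pipelines@my-gcp-project.iam.gserviceaccount.com')
```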
@@ -15,6 +15,7 @@ import google.cloud.logging
NAMING_PREFIX = '{{naming_prefix}}'{% endif %}
PROJECT_ID = '{{project_id}}'
PIPELINE_ROOT = '{{pipeline_root}}'
PIPELINE_JOB_LOCATION = '{{pipeline_job_location}}'
PIPELINE_JOB_RUNNER_SERVICE_ACCOUNT = '{{pipeline_job_runner_service_account}}'

{% if pipeline_job_submission_service_type == 'cloud-run' %}app = flask.Flask(__name__){% endif %}
@@ -87,6 +88,7 @@ def process_request({% if pipeline_job_submission_service_type == 'cloud-functio
dashboard_uri, resource_name = submit_pipeline(
project_id=PROJECT_ID,
pipeline_root=PIPELINE_ROOT,
pipeline_job_location=PIPELINE_JOB_LOCATION,
pipeline_job_runner_service_account=PIPELINE_JOB_RUNNER_SERVICE_ACCOUNT,
pipeline_params=data_payload,
pipeline_spec_path=gs_pipeline_spec_path,
@@ -104,6 +106,7 @@ def process_request({% if pipeline_job_submission_service_type == 'cloud-functio
def submit_pipeline(
project_id: str,
pipeline_root: str,
pipeline_job_location: str,
pipeline_job_runner_service_account: str,
pipeline_params: dict,
pipeline_spec_path: str,
@@ -116,6 +119,7 @@ def submit_pipeline(
Args:
project_id: The project_id.
pipeline_root: GCS location of the pipeline runs metadata.
pipeline_job_location: The location to run the Pipeline Job in.
pipeline_job_runner_service_account: Service Account to run PipelineJobs.
pipeline_params: Pipeline parameters values.
pipeline_spec_path: Location of the pipeline spec JSON.
@@ -130,6 +134,7 @@ aiplatform.init(project=project_id)
aiplatform.init(project=project_id)
job = aiplatform.PipelineJob(
display_name = display_name,
location = pipeline_job_location,
template_path = pipeline_spec_path,
pipeline_root = pipeline_root,
parameter_values = pipeline_params,
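After Jinja rendering, the module-level constants at the top of the generated submission service (from `main.py.j2`, per the `kfp.py` change above) would look roughly as follows; the values are hypothetical examples, not shipped defaults.

```python
# Roughly what the template placeholders render to -- hypothetical values only.
NAMING_PREFIX = 'automlops-default-prefix'
PROJECT_ID = 'my-gcp-project'
PIPELINE_ROOT = 'gs://my-gcp-project-automlops-default-prefix-bucket/pipeline_root'
PIPELINE_JOB_LOCATION = 'us-central1'  # new in 1.3.3
PIPELINE_JOB_RUNNER_SERVICE_ACCOUNT = 'vertex-pipelines@my-gcp-project.iam.gserviceaccount.com'
```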
3 changes: 0 additions & 3 deletions google_cloud_automlops/provisioning/base.py
@@ -25,7 +25,6 @@

from google_cloud_automlops.utils.enums import (
ArtifactRepository,
CodeRepository,
Orchestrator,
PipelineJobSubmitter
)
@@ -113,8 +112,6 @@ def _get_required_apis(self):
required_apis.append('run.googleapis.com')
if self.pipeline_job_submission_service_type == PipelineJobSubmitter.CLOUD_FUNCTIONS.value:
required_apis.append('cloudfunctions.googleapis.com')
if self.source_repo_type == CodeRepository.CLOUD_SOURCE_REPOSITORIES.value:
required_apis.append('sourcerepo.googleapis.com')
if self.setup_model_monitoring:
required_apis.append('logging.googleapis.com')
return required_apis