The Ray provider 0.2.1 only allowed users to define a hard-coded configuration to materialize the Ray cluster in Kubernetes. This PR enables users to define a callable that receives the Airflow context and generates the configuration dynamically from context properties. This request came from an Astronomer customer.

An example DAG file illustrates how to use this feature: a parent DAG triggers two child DAGs, each of which leverages the newly introduced `@ray.task` callable configuration (a hedged sketch of the pattern is included at the end of this description). The screenshots below show the DAGs succeeding when run locally with the Astro CLI, following the [local development instructions](https://github.com/astronomer/astro-provider-ray/blob/main/docs/getting_started/local_development_setup.rst).

Parent DAG:

<img width="1624" alt="Screenshot 2024-11-29 at 12 15 13" src="https://github.com/user-attachments/assets/586b4575-ee62-4344-bbd7-a1a6423360ce">

Child 1 DAG:

<img width="1624" alt="Screenshot 2024-11-29 at 12 15 56" src="https://github.com/user-attachments/assets/23d89288-c68a-498e-848a-743fb2684c4f">

Example logs illustrating that the RayCluster with the dynamic configuration was created and used in Kubernetes, with its own IP address:

```
(...)
[2024-11-29T12:14:52.276+0000] {standard_task_runner.py:104} INFO - Running: ['airflow', 'tasks', 'run', 'ray_dynamic_config_child_1', 'process_data_with_ray', 'manual__2024-11-29T12:14:50.273712+00:00', '--job-id', '773', '--raw', '--subdir', 'DAGS_FOLDER/ray_dynamic_config.py', '--cfg-path', '/tmp/tmpkggwlv23']
[2024-11-29T12:14:52.278+0000] {logging_mixin.py:190} WARNING - /usr/local/lib/python3.12/site-packages/airflow/task/task_runner/standard_task_runner.py:70 DeprecationWarning: This process (pid=238) is multi-threaded, use of fork() may lead to deadlocks in the child.
(...)
[2024-11-29T12:14:52.745+0000] {decorators.py:94} INFO - Using the following config {'conn_id': 'ray_conn', 'runtime_env': {'working_dir': '/usr/local/airflow/dags/ray_scripts', 'pip': ['numpy']}, 'num_cpus': 1, 'num_gpus': 0, 'memory': 0, 'poll_interval': 5, 'ray_cluster_yaml': '/usr/local/airflow/dags/scripts/first-254.yaml', 'xcom_task_key': 'dashboard'}
(...)
[2024-11-29T12:14:55.430+0000] {hooks.py:474} INFO - ::group::Create Ray Cluster
[2024-11-29T12:14:55.430+0000] {hooks.py:475} INFO - Loading yaml content for Ray cluster CRD...
[2024-11-29T12:14:55.451+0000] {hooks.py:410} INFO - Creating new Ray cluster: first-254
[2024-11-29T12:14:55.456+0000] {hooks.py:494} INFO - ::endgroup::
(...)
[2024-11-29T12:14:55.663+0000] {hooks.py:498} INFO - ::group::Setup Load Balancer service
[2024-11-29T12:14:55.663+0000] {hooks.py:334} INFO - Attempt 1: Checking LoadBalancer status...
[2024-11-29T12:14:55.669+0000] {hooks.py:278} ERROR - Error getting service first-254-head-svc: (404)
Reason: Not Found
HTTP response headers: HTTPHeaderDict({'Audit-Id': '81b07ac4-db3b-48a6-b336-f52ae93bee55', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'X-Kubernetes-Pf-Flowschema-Uid': '955e8bb0-08b1-4d45-a768-e49387a9767c', 'X-Kubernetes-Pf-Prioritylevel-Uid': 'd5240328-288d-4366-b094-d8fd793c7431', 'Date': 'Fri, 29 Nov 2024 12:14:55 GMT', 'Content-Length': '212'})
HTTP response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"services \"first-254-head-svc\" not found","reason":"NotFound","details":{"name":"first-254-head-svc","kind":"services"},"code":404}
[2024-11-29T12:14:55.669+0000] {hooks.py:355} INFO - LoadBalancer service is not available yet...
[2024-11-29T12:15:35.670+0000] {hooks.py:334} INFO - Attempt 2: Checking LoadBalancer status...
[2024-11-29T12:15:35.688+0000] {hooks.py:348} INFO - LoadBalancer is ready.
[2024-11-29T12:15:35.688+0000] {hooks.py:441} INFO - {'ip': '172.18.255.1', 'hostname': None, 'ports': [{'name': 'client', 'port': 10001}, {'name': 'dashboard', 'port': 8265}, {'name': 'gcs', 'port': 6379}, {'name': 'metrics', 'port': 8080}, {'name': 'serve', 'port': 8000}], 'working_address': '172.18.255.1'}
(...)
[2024-11-29T12:15:38.345+0000] {triggers.py:124} INFO - ::group:: Trigger 1/2: Checking the job status
[2024-11-29T12:15:38.345+0000] {triggers.py:125} INFO - Polling for job raysubmit_paxAkyLiKxEHPmwG every 5 seconds...
(...)
[2024-11-29T12:15:38.354+0000] {hooks.py:156} INFO - Dashboard URL is: http://172.18.255.1:8265
[2024-11-29T12:15:38.361+0000] {hooks.py:208} INFO - Job raysubmit_paxAkyLiKxEHPmwG status: PENDING
[2024-11-29T12:15:38.361+0000] {triggers.py:100} INFO - Status of job raysubmit_paxAkyLiKxEHPmwG is: PENDING
[2024-11-29T12:15:38.361+0000] {triggers.py:108} INFO - ::group::raysubmit_paxAkyLiKxEHPmwG logs
[2024-11-29T12:15:43.416+0000] {hooks.py:208} INFO - Job raysubmit_paxAkyLiKxEHPmwG status: RUNNING
[2024-11-29T12:15:43.416+0000] {triggers.py:100} INFO - Status of job raysubmit_paxAkyLiKxEHPmwG is: RUNNING
[2024-11-29T12:15:43.417+0000] {triggers.py:112} INFO - 2024-11-29 04:15:40,813 INFO worker.py:1429 -- Using address 10.244.0.140:6379 set in the environment variable RAY_ADDRESS
[2024-11-29T12:15:43.417+0000] {triggers.py:112} INFO - 2024-11-29 04:15:40,814 INFO worker.py:1564 -- Connecting to existing Ray cluster at address: 10.244.0.140:6379...
[2024-11-29T12:15:43.417+0000] {triggers.py:112} INFO - 2024-11-29 04:15:40,820 INFO worker.py:1740 -- Connected to Ray cluster. View the dashboard at 10.244.0.140:8265
[2024-11-29T12:15:48.430+0000] {hooks.py:208} INFO - Job raysubmit_paxAkyLiKxEHPmwG status: SUCCEEDED
[2024-11-29T12:15:48.430+0000] {triggers.py:112} INFO - Mean of this population is 12.0
[2024-11-29T12:15:48.430+0000] {triggers.py:112} INFO - (autoscaler +5s) Tip: use `ray status` to view detailed cluster status. To disable these messages, set RAY_SCHEDULER_EVENTS=0.
[2024-11-29T12:15:48.430+0000] {triggers.py:112} INFO - (autoscaler +5s) Adding 1 node(s) of type small-group.
[2024-11-29T12:15:49.448+0000] {triggers.py:113} INFO - ::endgroup::
[2024-11-29T12:15:49.448+0000] {triggers.py:144} INFO - ::endgroup::
[2024-11-29T12:15:49.448+0000] {triggers.py:145} INFO - ::group:: Trigger 2/2: Job reached a terminal state
[2024-11-29T12:15:49.448+0000] {triggers.py:146} INFO - Status of completed job raysubmit_paxAkyLiKxEHPmwG is: SUCCEEDED
(...)
```

Child 2 DAG:

<img width="1624" alt="Screenshot 2024-11-29 at 12 17 20" src="https://github.com/user-attachments/assets/5f0320a2-3bce-49a9-8580-a584d1f894dc">

RayClusters spun up in Kubernetes:

<img width="758" alt="Screenshot 2024-11-29 at 12 15 37" src="https://github.com/user-attachments/assets/aabcce4a-ce4a-47db-b9cf-e87cf68f2316">

**Limitations**

The example DAGs are not yet executed in the CI; there is a dedicated ticket for that work: #95

**References**

This PR was inspired by #67.
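**Usage sketch**

For reference, below is a minimal sketch of the callable-configuration pattern this PR introduces, modeled on the config printed in the logs above. It is not the actual example DAG file: the import path, the assumption that the callable receives the Airflow context dict as its single argument, the `raycluster_suffix` key read from `dag_run.conf`, and the task body are illustrative assumptions.

```python
from datetime import datetime

from airflow.decorators import dag
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

# Assumed import path; it may differ depending on the provider version.
from ray_provider.decorators.ray import ray


def dynamic_config(context: dict) -> dict:
    """Build the Ray cluster config at runtime from the Airflow context.

    The keys and values mirror the config shown in the task logs above;
    deriving the cluster suffix from `dag_run.conf` is a hypothetical
    example of using context properties.
    """
    suffix = (context["dag_run"].conf or {}).get("raycluster_suffix", "default")
    return {
        "conn_id": "ray_conn",
        "runtime_env": {
            "working_dir": "/usr/local/airflow/dags/ray_scripts",
            "pip": ["numpy"],
        },
        "num_cpus": 1,
        "num_gpus": 0,
        "memory": 0,
        "poll_interval": 5,
        "ray_cluster_yaml": f"/usr/local/airflow/dags/scripts/first-{suffix}.yaml",
        "xcom_task_key": "dashboard",
    }


@dag(start_date=datetime(2024, 11, 1), schedule=None, catchup=False)
def ray_dynamic_config_child_1():
    # Passing a callable instead of a dict lets the provider resolve the
    # configuration per DAG run, using that run's context.
    @ray.task(config=dynamic_config)
    def process_data_with_ray():
        # Illustrative workload only; the real example runs a script from
        # the `ray_scripts` working directory.
        import numpy as np

        print(f"Mean of this population is {np.mean(range(25))}")

    process_data_with_ray()


@dag(start_date=datetime(2024, 11, 1), schedule=None, catchup=False)
def ray_dynamic_config_parent():
    # The parent passes a different conf to each child, so each child builds
    # its own RayCluster configuration from its run context.
    TriggerDagRunOperator(
        task_id="trigger_child_1",
        trigger_dag_id="ray_dynamic_config_child_1",
        conf={"raycluster_suffix": "254"},
    )


ray_dynamic_config_child_1()
ray_dynamic_config_parent()
```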