chore: Adding Spark35 support #2182
Build #20250109.4 had test failures
Details
- Failed: 12 (0.70%, 12 new, 0 recurring)
- Passed: 1,683 (98.25%)
- Other: 18 (1.05%)
- Total: 1,713
Annotations
Check failure on line 2206 in Build log
azure-pipelines / SynapseML-Official
Script failed with exit code: 1
Check failure on line 4129 in Build log
azure-pipelines / SynapseML-Official
Script failed with exit code: 1
Check failure on line 1218 in Build log
azure-pipelines / SynapseML-Official
Script failed with exit code: 1
Check failure on line 1056 in Build log
azure-pipelines / SynapseML-Official
Script failed with exit code: 1
Check failure on line 1 in (It is not a test it is a sbt.testing.SuiteSelector)
azure-pipelines / SynapseML-Official
Failed: the PUT to the Synapse bigDataPools endpoint returned HTTP/1.1 400 Bad Request (ValidationFailed / InvalidSparkComputeVersion: "Spark Compute version: 3.5 is invalid"). The full request and response are reproduced in the raw output below; a minimal sketch of the failing call follows the stack trace.
Raw output
java.lang.RuntimeException: Failed:
response: HttpResponseProxy{HTTP/1.1 400 Bad Request [Cache-Control: no-cache, Pragma: no-cache, Content-Length: 187, Content-Type: application/json; charset=utf-8, Expires: -1, Strict-Transport-Security: max-age=31536000; includeSubDomains, x-ms-request-id: 46a418f2-4051-4944-9895-bd54641c902c, x-ms-ratelimit-remaining-subscription-writes: 799, x-ms-ratelimit-remaining-subscription-global-writes: 11999, x-ms-correlation-request-id: 2b15b821-a0f2-4a5a-b6a5-75fc1523792a, x-ms-routing-request-id: WESTUS:20250110T004419Z:2b15b821-a0f2-4a5a-b6a5-75fc1523792a, X-Content-Type-Options: nosniff, X-Cache: CONFIG_NOCACHE, X-MSEdge-Ref: Ref A: B95978D083DC4FB1BDB1079F15E86DA7 Ref B: SJC211051205045 Ref C: 2025-01-10T00:44:18Z, Date: Fri, 10 Jan 2025 00:44:19 GMT] ResponseEntityProxy{[Content-Type: application/json; charset=utf-8,Content-Length: 187,Chunked: false]}}
requestUrl: https://management.azure.com/subscriptions/e342c2c0-f844-4b18-9208-52c8c234c30e/resourceGroups/marhamil-mmlspark/providers/Microsoft.Synapse/workspaces/mmlsparkbuild/bigDataPools/tc1736469851485?api-version=2021-06-01-preview
requestBody:
{
  "name": "tc1736469851485",
  "location": "eastus2",
  "tags": {
    "createdBy": "SynapseE2E Tests",
    "createdAt": "Fri Jan 10 00:44:11 UTC 2025",
    "buildId": "unknown",
    "buildNumber": "unknown",
  },
  "properties": {
    "autoPause": {
      "delayInMinutes": "10",
      "enabled": "true"
    },
    "autoScale": {
      "enabled": "true",
      "maxNodeCount": "10",
      "minNodeCount": "3"
    },
    "cacheSize": "20",
    "dynamicExecutorAllocation": {
      "enabled": "true",
      "maxExecutors": "8",
      "minExecutors": "2"
    },
    "isComputeIsolationEnabled": "false",
    "nodeCount": "0",
    "nodeSize": "Small",
    "nodeSizeFamily": "MemoryOptimized",
    "provisioningState": "Succeeded",
    "sessionLevelPackagesEnabled": "true",
    "sparkVersion": "3.5"
  }
}
responseBody: {"error":{"code":"ValidationFailed","message":"Spark pool request validation failed.","details":[{"code":"InvalidSparkComputeVersion","message":"Spark Compute version: 3.5 is invalid"}]}}
at com.microsoft.azure.synapse.ml.io.http.RESTHelpers$.$anonfun$safeSend$1(RESTHelpers.scala:77)
at com.microsoft.azure.synapse.ml.io.http.RESTHelpers$.retry(RESTHelpers.scala:40)
at com.microsoft.azure.synapse.ml.io.http.RESTHelpers$.safeSend(RESTHelpers.scala:57)
at com.microsoft.azure.synapse.ml.nbtest.SynapseUtilities$.$anonfun$createSparkPools$1(SynapseUtilities.scala:310)
at com.microsoft.azure.synapse.ml.nbtest.SynapseUtilities$.$anonfun$createSparkPools$1$adapted(SynapseUtilities.scala:297)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.immutable.Range.foreach(Range.scala:158)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at com.microsoft.azure.synapse.ml.nbtest.SynapseUtilities$.createSparkPools(SynapseUtilities.scala:297)
at com.microsoft.azure.synapse.ml.nbtest.SynapseTests.<init>(SynapseTests.scala:75)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:454)
at sbt.TestRunner.runTest$1(TestFramework.scala:140)
at sbt.TestRunner.run(TestFramework.scala:155)
at sbt.TestFramework$$anon$3$$anonfun$$lessinit$greater$1.$anonfun$apply$1(TestFramework.scala:318)
at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:278)
at sbt.TestFramework$$an
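The root cause of this failure is in SynapseUtilities.createSparkPools: the harness issues a PUT against the Synapse bigDataPools management endpoint with sparkVersion set to "3.5", and api-version 2021-06-01-preview rejects it with InvalidSparkComputeVersion. Below is a minimal sketch of that call, built only from the URL and payload fields shown in the raw output above; it is not the project's RESTHelpers.safeSend path, and the token handling, object name, and trimmed payload are illustrative assumptions.

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object CreateSparkPoolSketch {
  def main(args: Array[String]): Unit = {
    // Assumption: an ARM bearer token for https://management.azure.com is obtained elsewhere.
    val token = sys.env("AZURE_MGMT_TOKEN")
    val poolName = s"tc${System.currentTimeMillis()}"
    val url = "https://management.azure.com/subscriptions/e342c2c0-f844-4b18-9208-52c8c234c30e" +
      "/resourceGroups/marhamil-mmlspark/providers/Microsoft.Synapse/workspaces/mmlsparkbuild" +
      s"/bigDataPools/$poolName?api-version=2021-06-01-preview"

    // "3.5" is the value the service rejects with InvalidSparkComputeVersion under this api-version.
    val sparkVersion = "3.5"
    val body =
      s"""{
         |  "location": "eastus2",
         |  "properties": {
         |    "autoPause": { "delayInMinutes": 10, "enabled": true },
         |    "autoScale": { "enabled": true, "minNodeCount": 3, "maxNodeCount": 10 },
         |    "nodeSize": "Small",
         |    "nodeSizeFamily": "MemoryOptimized",
         |    "sparkVersion": "$sparkVersion"
         |  }
         |}""".stripMargin

    val request = HttpRequest.newBuilder(URI.create(url))
      .header("Authorization", s"Bearer $token")
      .header("Content-Type", "application/json")
      .PUT(HttpRequest.BodyPublishers.ofString(body))
      .build()

    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())
    // A 400 with code InvalidSparkComputeVersion reproduces the failure seen in this build.
    println(s"${response.statusCode()} ${response.body()}")
  }
}

In the pool payload, sparkVersion is the only field the error message objects to; whether the fix is a newer api-version or a different compute version cannot be determined from this log alone.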
Check failure on line 1 in Quickstart - Fine-tune a Text Classifier.ipynb
azure-pipelines / SynapseML-Official
Notebook /SynapseMLBuild/build_1.0.9-spark3.5/Explore Algorithms/Deep Learning/Quickstart - Fine-tune a Text Classifier.ipynb failed with state FAILED. For more information check the run page: https://adb-1885762835647850.10.azuredatabricks.net/?o=1885762835647850#job/753926283260071/run/620097467013607
Raw output
java.lang.RuntimeException: Notebook /SynapseMLBuild/build_1.0.9-spark3.5/Explore Algorithms/Deep Learning/Quickstart - Fine-tune a Text Classifier.ipynb failed with state FAILED. For more information check the run page:
https://adb-1885762835647850.10.azuredatabricks.net/?o=1885762835647850#job/753926283260071/run/620097467013607
at com.microsoft.azure.synapse.ml.nbtest.DatabricksUtilities$.monitorJob(DatabricksUtilities.scala:350)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksNotebookRun.monitor(DatabricksUtilities.scala:475)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksUtilities$.runNotebook(DatabricksUtilities.scala:372)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksTestHelper.$anonfun$databricksTestHelper$6(DatabricksUtilities.scala:453)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Check failure on line 1 in Langchain.ipynb
azure-pipelines / SynapseML-Official
Notebook /SynapseMLBuild/build_1.0.9-spark3.5/Explore Algorithms/OpenAI/Langchain.ipynb failed with state FAILED. For more information check the run page: https://adb-1885762835647850.10.azuredatabricks.net/?o=1885762835647850#job/420721597746215/run/87232412102932
Raw output
java.lang.RuntimeException: Notebook /SynapseMLBuild/build_1.0.9-spark3.5/Explore Algorithms/OpenAI/Langchain.ipynb failed with state FAILED. For more information check the run page:
https://adb-1885762835647850.10.azuredatabricks.net/?o=1885762835647850#job/420721597746215/run/87232412102932
at com.microsoft.azure.synapse.ml.nbtest.DatabricksUtilities$.monitorJob(DatabricksUtilities.scala:350)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksNotebookRun.monitor(DatabricksUtilities.scala:475)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksUtilities$.runNotebook(DatabricksUtilities.scala:372)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksTestHelper.$anonfun$databricksTestHelper$6(DatabricksUtilities.scala:453)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Check failure on line 1 in Quickstart - Fine-tune a Vision Classifier.ipynb
azure-pipelines / SynapseML-Official
Notebook /SynapseMLBuild/build_1.0.9-spark3.5/Explore Algorithms/Deep Learning/Quickstart - Fine-tune a Vision Classifier.ipynb failed with state FAILED. For more information check the run page: https://adb-1885762835647850.10.azuredatabricks.net/?o=1885762835647850#job/888381655880512/run/699333404805940
Raw output
java.lang.RuntimeException: Notebook /SynapseMLBuild/build_1.0.9-spark3.5/Explore Algorithms/Deep Learning/Quickstart - Fine-tune a Vision Classifier.ipynb failed with state FAILED. For more information check the run page:
https://adb-1885762835647850.10.azuredatabricks.net/?o=1885762835647850#job/888381655880512/run/699333404805940
at com.microsoft.azure.synapse.ml.nbtest.DatabricksUtilities$.monitorJob(DatabricksUtilities.scala:350)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksNotebookRun.monitor(DatabricksUtilities.scala:475)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksUtilities$.runNotebook(DatabricksUtilities.scala:372)
at com.microsoft.azure.synapse.ml.nbtest.DatabricksTestHelper.$anonfun$databricksTestHelper$6(DatabricksUtilities.scala:453)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
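All three notebook failures above come from the same path: DatabricksUtilities.monitorJob polls the Databricks run until it reaches a terminal state and raises a RuntimeException when the result is FAILED, so the real errors live in the linked run pages rather than in these stack traces. The following is a rough sketch of that polling pattern, assuming the standard Jobs API 2.1 runs/get endpoint, environment-variable credentials, and regex-based field extraction in place of real JSON parsing; the project's actual implementation in DatabricksUtilities.scala may differ.

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object MonitorRunSketch {
  private val client = HttpClient.newHttpClient()

  // Fetch the raw JSON for a run via the Jobs API 2.1.
  private def getRun(runId: Long): String = {
    val host = sys.env("DATABRICKS_HOST")   // e.g. https://adb-1885762835647850.10.azuredatabricks.net
    val token = sys.env("DATABRICKS_TOKEN") // assumption: a PAT with access to the workspace
    val request = HttpRequest.newBuilder(URI.create(s"$host/api/2.1/jobs/runs/get?run_id=$runId"))
      .header("Authorization", s"Bearer $token")
      .GET()
      .build()
    client.send(request, HttpResponse.BodyHandlers.ofString()).body()
  }

  // Crude extraction of a string field; real code would use a JSON library.
  private def field(body: String, name: String): Option[String] =
    ("\"" + name + "\"\\s*:\\s*\"([A-Z_]+)\"").r.findFirstMatchIn(body).map(_.group(1))

  // Poll until the run leaves its active states, then fail if the result is not SUCCESS.
  def monitor(runId: Long, pollSeconds: Int = 30): Unit = {
    var lifeCycle = field(getRun(runId), "life_cycle_state")
    while (lifeCycle.exists(s => s == "PENDING" || s == "RUNNING" || s == "TERMINATING")) {
      Thread.sleep(pollSeconds * 1000L)
      lifeCycle = field(getRun(runId), "life_cycle_state")
    }
    val result = field(getRun(runId), "result_state")
    if (!result.contains("SUCCESS"))
      throw new RuntimeException(s"Notebook run $runId failed with state ${result.getOrElse("UNKNOWN")}")
  }
}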