Test failure: test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n query_response = backend.fetch("DESCRIBE __RANDOM__")\n print("FAILED")\nexcept NotFound as e:\n print("PASSED")\n]
#426
Open
Labels
bug (Something isn't working)
Description
❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n query_response = backend.fetch("DESCRIBE __RANDOM__")\n print("FAILED")\nexcept NotFound as e:\n print("PASSED")\n]: databricks.sdk.errors.platform.ResourceDoesNotExist: Cluster DATABRICKS_CLUSTER_ID does not exist (4.394s)
databricks.sdk.errors.platform.ResourceDoesNotExist: Cluster DATABRICKS_CLUSTER_ID does not exist
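For readability, the code embedded in the test's parameter ID expands to the snippet below. It expects `RuntimeBackend.fetch` to translate the failure of `DESCRIBE __RANDOM__` into a `NotFound` error:

```python
from databricks.labs.lsql.backends import RuntimeBackend
from databricks.sdk.errors import NotFound

backend = RuntimeBackend()
try:
    query_response = backend.fetch("DESCRIBE __RANDOM__")
    print("FAILED")
except NotFound as e:
    print("PASSED")
```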
04:56 DEBUG [databricks.sdk] Loaded from environment
04:56 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
04:56 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
04:56 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
04:56 INFO [databricks.sdk] Using Databricks Metadata Service authentication
[gw0] linux -- Python 3.10.18 /home/runner/work/lsql/lsql/.venv/bin/python
04:56 DEBUG [databricks.sdk] Loaded from environment
04:56 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
04:56 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
04:56 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
04:56 INFO [databricks.sdk] Using Databricks Metadata Service authentication
04:56 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
< "active": true,
< "displayName": "labs-runtime-identity",
< "emails": [
< {
< "primary": true,
< "type": "work",
< "value": "**REDACTED**"
< }
< ],
< "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
< "groups": [
< {
< "$ref": "Groups/300667344111082",
< "display": "labs.scope.runtime",
< "type": "direct",
< "value": "**REDACTED**"
< }
< ],
< "id": "4643477475987733",
< "name": {
< "givenName": "labs-runtime-identity"
< },
< "schemas": [
< "urn:ietf:params:scim:schemas:core:2.0:User",
< "... (1 additional elements)"
< ],
< "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
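The log above shows the SDK resolving credentials from the environment, preferring metadata-service auth over PAT and basic auth, and then confirming the identity with a SCIM `Me` call. A minimal sketch of the same check, assuming standard `databricks-sdk` usage with environment-based configuration (the names below are illustrative, not taken from the test code):

```python
from databricks.sdk import WorkspaceClient

# Credentials come from the environment; when a metadata service URL is configured,
# it takes precedence over PAT and basic auth, as in the log above.
w = WorkspaceClient()

# Equivalent to the GET /api/2.0/preview/scim/v2/Me request in the log.
me = w.current_user.me()
print(me.user_name, me.display_name)
```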
04:56 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmpbaveicc_/working-copy in /tmp/tmpbaveicc_
04:56 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.hJp6/wheels/databricks_labs_lsql-0.16.1+320250718045631-py3-none-any.whl
04:56 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
04:56 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
< "error_code": "RESOURCE_DOES_NOT_EXIST",
< "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.hJp6/wheels) does not exist."
< }
04:56 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.hJp6/wheels
04:56 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
> "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.hJp6/wheels"
> }
< 200 OK
< {}
04:56 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
04:56 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
< "object_id": 284701136197760
< }
04:56 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
04:56 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.hJp6/version.json
04:56 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
04:56 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
< "object_id": 284701136197762
< }
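The 404 on the first `workspace/import`, followed by `mkdirs` and a successful retry, is the installer creating the missing parent folder on demand. A rough sketch of that pattern using the workspace API, assuming the `databricks-sdk` `workspace.upload`/`workspace.mkdirs` helpers (this is not the actual blueprint implementation, and the helper name is hypothetical):

```python
import io
from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import NotFound
from databricks.sdk.service.workspace import ImportFormat

w = WorkspaceClient()

def upload_with_parents(path: str, payload: bytes) -> None:
    """Upload a workspace file, creating the parent folder if the first attempt 404s."""
    try:
        w.workspace.upload(path, io.BytesIO(payload), format=ImportFormat.AUTO, overwrite=True)
    except NotFound:
        # Mirrors the log: create the missing parent folder, then retry the import.
        w.workspace.mkdirs(path.rsplit("/", 1)[0])
        w.workspace.upload(path, io.BytesIO(payload), format=ImportFormat.AUTO, overwrite=True)
```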
04:56 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 400 Bad Request
< {
< "details": [
< {
< "@type": "type.googleapis.com/google.rpc.RequestInfo",
< "request_id": "3539dfc7-9952-4f6c-90d6-cb17e1379e5a",
< "serving_data": ""
< }
< ],
< "error_code": "INVALID_PARAMETER_VALUE",
< "message": "Cluster DATABRICKS_CLUSTER_ID does not exist"
< }
[gw0] linux -- Python 3.10.18 /home/runner/work/lsql/lsql/.venv/bin/python
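The test never reaches the `RuntimeBackend` assertion: the run fails earlier because the `DATABRICKS_CLUSTER_ID` placeholder appears to have been passed to `GET /api/2.1/clusters/get` literally, so the workspace answers `INVALID_PARAMETER_VALUE` and the SDK raises `ResourceDoesNotExist`. A hedged sketch of a pre-flight check that would surface this misconfiguration before the test body runs (the `require_cluster` helper is illustrative and not part of this repository):

```python
import os
from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import ResourceDoesNotExist

def require_cluster() -> str:
    """Fail fast with a clear message when DATABRICKS_CLUSTER_ID is unset or stale."""
    cluster_id = os.environ.get("DATABRICKS_CLUSTER_ID", "")
    if not cluster_id or cluster_id == "DATABRICKS_CLUSTER_ID":
        raise RuntimeError("DATABRICKS_CLUSTER_ID is not set to a real cluster id")
    try:
        # Same call that returned 400 in the log above.
        WorkspaceClient().clusters.get(cluster_id=cluster_id)
    except ResourceDoesNotExist as err:
        raise RuntimeError(f"Cluster {cluster_id} does not exist in this workspace") from err
    return cluster_id
```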
Running from nightly #248