
Error: 3 INVALID_ARGUMENT:, batch #523

@cristobalufro

Description


Context:

Project ID: alimetra-fc43f
Cloud Function: processSurveyAnalysisCore (Node.js 20, in southamerica-west1)
Vertex AI Model: text-multilingual-embedding-002 (Publisher Model)
Vertex AI Client: @google-cloud/aiplatform PredictionServiceClient
Target Vertex AI Location for API Call: us-central1 (also tested southamerica-east1)
Authentication: Firebase Cloud Function's runtime service account (alimetra-fc43f@appspot.gserviceaccount.com) with "Vertex AI User" IAM role.
Previous Success: Individual (non-batched) embedding calls using Vertex AI were reportedly working at an earlier development stage. A separate Python script using vertexai.language_models.TextEmbeddingModel with project="alimetra-fc43f" and location="southamerica-east1" also successfully generates embeddings for the same model.
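
The report does not show how the PredictionServiceClient is initialized. Below is a minimal sketch of the assumed setup for the us-central1 target location; the variable names are illustrative, and only the apiEndpoint/location pairing reflects the configuration described above.

```js
// Assumed client setup (not shown in the report); names are illustrative.
// The regional apiEndpoint must match the location used in the request path.
const aiplatform = require('@google-cloud/aiplatform');
const {PredictionServiceClient} = aiplatform.v1;

const PROJECT_ID = 'alimetra-fc43f';
const VERTEX_AI_LOCATION = 'us-central1'; // southamerica-east1 was also tested

const predictionServiceClient = new PredictionServiceClient({
  apiEndpoint: `${VERTEX_AI_LOCATION}-aiplatform.googleapis.com`,
});
```
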
Expected Behavior:

The Node.js Cloud Function (processSurveyAnalysisCore, specifically its getEmbeddingsBatch helper) constructs a request to the Vertex AI publisher model text-multilingual-embedding-002 (a sketch of this construction follows this list).
The endpoint in this request is correctly formatted as projects/alimetra-fc43f/locations/us-central1/publishers/google/models/text-multilingual-embedding-002 (or the specified VERTEX_AI_LOCATION).
The instances array contains valid text content, e.g., [{ "content": "manzana" }] or [{ "content": "hello world" }].
The parameters object includes { task_type: "RETRIEVAL_DOCUMENT" } (or is an empty object {}).
The predictionServiceClient.predict(request) call successfully contacts Vertex AI.
Vertex AI returns a valid response containing the embedding vectors for the provided text instances.
The aiplatform.googleapis.com%2Fprediction_access log (the server-side Vertex AI log) should show jsonPayload.endpoint reflecting the publisher model path under project alimetra-fc43f.
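
The getEmbeddingsBatch helper itself is not included in this report. The sketch below reconstructs the request described above, assuming the instances and parameters are wrapped into protobuf Values with the library's helpers.toValue; the response parsing follows the shape used in the official @google-cloud/aiplatform embedding samples and is likewise an assumption.

```js
// Hedged reconstruction of the request described above; not the actual helper.
// Reuses PROJECT_ID, VERTEX_AI_LOCATION, and predictionServiceClient from the
// client-setup sketch under Context.
const {helpers} = require('@google-cloud/aiplatform');

async function getEmbeddingsBatch(texts) {
  const endpoint =
    `projects/${PROJECT_ID}/locations/${VERTEX_AI_LOCATION}` +
    '/publishers/google/models/text-multilingual-embedding-002';

  // e.g. texts = ['manzana', 'hello world'] -> instances = [{content: 'manzana'}, ...]
  const instances = texts.map((content) => helpers.toValue({content}));
  const parameters = helpers.toValue({task_type: 'RETRIEVAL_DOCUMENT'}); // {} was also tested

  // This is the call that fails with "Error: 3 INVALID_ARGUMENT:".
  const [response] = await predictionServiceClient.predict({endpoint, instances, parameters});

  // Assumed parsing of the embedding vectors from the protobuf Value response.
  return response.predictions.map((prediction) => {
    const values =
      prediction.structValue.fields.embeddings.structValue.fields.values.listValue.values;
    return values.map((v) => v.numberValue);
  });
}
```
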
Actual Behavior:

The Node.js Cloud Function constructs the request to Vertex AI with the seemingly correct endpoint for project alimetra-fc43f, valid instances (e.g., [{ "content": "manzana" }] or [{ "content": "hello world" }]), and parameters (either including task_type or an empty object {}).
The predictionServiceClient.predict(request) call fails.
The error caught in the Cloud Function's stderr log is consistently Error: 3 INVALID_ARGUMENT:, with error.details being an empty string. The gRPC stack trace points to an issue during the onReceiveStatus phase of the client call (see the call-site sketch after this list).
Crucially, the aiplatform.googleapis.com%2Fprediction_access log (server-side Vertex AI log) for the failed request shows:
logName: "projects/alimetra-fc43f/logs/aiplatform.googleapis.com%2Fprediction_access" (correctly attributing the call to the user's project).
resource.labels.resource_container: "alimetra-fc43f" (correctly identifying the user's project as the resource consuming the API).
BUT, jsonPayload.endpoint shows: "projects/3972195257/locations/us-central1/endpoints/text-multilingual-embedding-002" (where 3972195257 is not the user's project ID alimetra-fc43f). This path format (.../endpoints/...) also differs from the expected publisher model path format (.../publishers/google/models/...).
This same prediction_access log entry contains jsonPayload.error.code: 3.
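
For reference, here is a minimal sketch of a call site that would produce the stderr output described above; the surrounding code and names are assumptions, not the actual function code. gRPC status code 3 maps to INVALID_ARGUMENT.

```js
// Illustrative call site only; not the actual function code.
async function predictWithLogging(request) {
  try {
    const [response] = await predictionServiceClient.predict(request);
    return response;
  } catch (err) {
    // Observed behavior: err.code === 3 (gRPC INVALID_ARGUMENT) and err.details === '',
    // which is why stderr shows only "Error: 3 INVALID_ARGUMENT:" with nothing after it.
    console.error('Vertex AI predict failed:', err.code, `details="${err.details}"`);
    throw err;
  }
}
```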

Metadata

    Labels

    api: aiplatform - Issues related to the googleapis/nodejs-vertexai API.
    priority: p2 - Moderately-important priority. Fix may not be included in next release.
    type: bug - Error or flaw in code with unintended results or allowing sub-optimal usage patterns.
