Describe the bug
I'm trying to run the PPL tool using OpenAI as the LLM.
I followed the PPL tool tutorial step by step, but instead of SageMaker I used an OpenAI LLM. I configured the connector exactly as the connector blueprint document mentions, and I used the exact sample data the document suggests.
But I got an error when I tried to execute the agent:
POST /_plugins/_ml/agents/J32A9ZQBuiy8gn6F9NLo/_execute
{
"parameters": {
"verbose": true,
"question": "what is the error rate yesterday",
"index": "opensearch_dashboards_sample_data_logs"
}
}
The weird thing: I tried changing the index name to a fake index that is not in my cluster, and got:
{
"status": 400,
"error": {
"type": "IllegalArgumentException",
"reason": "Invalid Request",
"details": "Return this final answer to human directly and do not use other tools: 'Please provide index name'. Please try to directly send this message to human to ask for index name"
}
}
I checked the code, and it should return that this index doesn't exist, right? And why is the message different from the previous one?
I searched the documentation for anything OpenAI-specific, such as a special prompt or payload configuration for OpenAI in agents or connectors, but I couldn't find anything useful.
Note
I tried the model with the _predict action, and it responded correctly to my questions:
POST /_plugins/_ml/models/xxxxxxxxxxxx/_predict
{
"parameters": {
"messages": [
{
"role": "system",
"content": "You are a helpful assistant."
},
{
"role": "user",
"content": "what is Sagemaker?"
}
]
}
}
The response was:
{
"inference_results": [
{
"output": [
{
"name": "response",
"dataAsMap": {
"id": "chatcmpl-B0OHUsTTc1w3WMzO1qDJ5Tq5lnofq",
"object": "chat.completion",
"created": 1739433036,
"model": "gpt-3.5-turbo-0125",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Amazon SageMaker is a fully-managed service that enables data scientists and developers to build, train, and deploy machine learning models at scale. It provides all the necessary tools and resources for every step of the machine learning process, from data preparation and model training to deployment and monitoring. SageMaker simplifies the machine learning workflow and helps to accelerate the development of AI applications."
},
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 23,
"completion_tokens": 75,
"total_tokens": 98,
"prompt_tokens_details": {
"cached_tokens": 0,
"audio_tokens": 0
},
"completion_tokens_details": {
"reasoning_tokens": 0,
"audio_tokens": 0,
"accepted_prediction_tokens": 0,
"rejected_prediction_tokens": 0
}
},
"service_tier": "default"
}
}
],
"status_code": 200
}
]
}
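For reference, the assistant text sits several levels deep in this _predict response. A minimal Python sketch, assuming the response above has already been parsed into a dict (fields abbreviated here):

```python
# Minimal sketch: extract the assistant's text from an ml-commons
# _predict response shaped like the one pasted above (abbreviated).
response = {
    "inference_results": [
        {
            "output": [
                {
                    "name": "response",
                    "dataAsMap": {
                        "choices": [
                            {
                                "message": {
                                    "role": "assistant",
                                    "content": "Amazon SageMaker is a fully-managed service ...",
                                },
                                "finish_reason": "stop",
                            }
                        ]
                    },
                }
            ],
            "status_code": 200,
        }
    ]
}

# Walk inference_results -> output -> dataAsMap -> choices -> message.
data = response["inference_results"][0]["output"][0]["dataAsMap"]
content = data["choices"][0]["message"]["content"]
print(content)  # -> "Amazon SageMaker is a fully-managed service ..."
```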
It seems the issue is in the connection between the tool and the LLM.
To Reproduce
Register the agent:
POST /_plugins/_ml/agents/_register
{
"name": "Test_Agent_For_PPL",
"type": "flow",
"description": "this is a test agent",
"memory": {
"type": "demo"
},
"tools": [
{
"type": "PPLTool",
"name": "TransferQuestionToPPLAndExecuteTool",
"description": "Use this tool to transfer natural language to generate PPL and execute PPL to query inside. Use this tool after you know the index name, otherwise, call IndexRoutingTool first. The input parameters are: {index:IndexName, question:UserQuestion}",
"parameters": {
"model_id": "xxxxxxxxxxxxxxxxx",
"model_type": "OPENAI",
"execute": true,
"input": "{\"index\": \"${parameters.index}\", \"question\": ${parameters.question} }"
}
}
]
}
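One thing worth double-checking in this registration is the "input" template: ${parameters.question} is not wrapped in quotes, so after substitution the tool may receive a string that is not valid JSON. This is only a guess at the cause, but a minimal Python sketch, using string.Template as a stand-in for the plugin's real ${parameters.*} substitution, shows the difference:

```python
import json
from string import Template

# The "input" template as registered above, with ${question} unquoted,
# plus a variant with the value quoted. Template/params here are
# stand-ins for the plugin's real ${parameters.*} substitution.
unquoted = '{"index": "${index}", "question": ${question} }'
quoted = '{"index": "${index}", "question": "${question}"}'

params = {
    "index": "opensearch_dashboards_sample_data_logs",
    "question": "what is the error rate yesterday",
}

def is_valid_json(template: str) -> bool:
    # Substitute the parameters, then try to parse the result as JSON.
    try:
        json.loads(Template(template).safe_substitute(params))
        return True
    except json.JSONDecodeError:
        return False

print(is_valid_json(unquoted))  # -> False: bare text is not a JSON value
print(is_valid_json(quoted))    # -> True
```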
Execute the agent as in the tutorial:
POST /_plugins/_ml/agents/xxxxxxxxxxxxx/_execute
{
"parameters": {
"verbose": true,
"question": "what is the error rate yesterday",
"index": "opensearch_dashboards_sample_data"
}
}
Expected behavior
As the tutorial mentions, I should get the PPL query for my question.
Related component
No response
Additional Details
Plugins
GET _cat/plugins
opensearch-alerting
opensearch-anomaly-detection
opensearch-asynchronous-search
opensearch-cross-cluster-replication
opensearch-custom-codecs
opensearch-flow-framework
opensearch-geospatial
opensearch-index-management
opensearch-job-scheduler
opensearch-knn
opensearch-ml
opensearch-neural-search
opensearch-notifications
opensearch-notifications-core
opensearch-observability
opensearch-performance-analyzer
opensearch-reports-scheduler
opensearch-security
opensearch-security-analytics
opensearch-skills
opensearch-sql
opensearch-system-templates
prometheus-exporter
query-insights
Host/Environment (please complete the following information):