Overview
The List Prompts endpoint retrieves the prompt templates available within a flow context. Prompts define how LLM nodes interpret context, structure responses, and maintain conversational coherence, making them critical components for response quality and consistency.

- Method: GET
- URL: https://{flow_name}.flows.graphorlm.com/prompts
- Authentication: Required (API Token)
Authentication
All requests must include a valid API token in the Authorization header:

Authorization: Bearer YOUR_API_TOKEN

Learn how to generate API tokens in the API Tokens guide.
Request Format
Headers
| Header | Value | Required |
|---|---|---|
| Authorization | Bearer YOUR_API_TOKEN | Yes |
Parameters
No query parameters are required for this endpoint.

Example Request
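For example, the request can be issued with Python's standard library (the flow name and token below are placeholders):

```python
import json
import urllib.request

def prompts_url(flow_name: str) -> str:
    # Endpoint URL pattern: https://{flow_name}.flows.graphorlm.com/prompts
    return f"https://{flow_name}.flows.graphorlm.com/prompts"

def list_prompts(flow_name: str, api_token: str) -> list:
    # GET request with the required Authorization header
    request = urllib.request.Request(
        prompts_url(flow_name),
        headers={"Authorization": f"Bearer {api_token}"},
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        return json.load(response)
```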
Response Format
Success Response (200 OK)
The response contains an array of prompt template objects.

Response Structure
Each prompt template in the array contains:

| Field | Type | Description |
|---|---|---|
| id | string | Unique identifier for the prompt template (UUID or system ID) |
| name | string | Human-readable name of the prompt template |
| text | string | Complete prompt template text with instructions and placeholders |
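Based on these fields, a response body might look like the following sketch (the values are illustrative, not taken from a real flow):

```python
# Illustrative response: an array of prompt template objects
example_response = [
    {
        "id": "550e8400-e29b-41d4-a716-446655440000",
        "name": "Default RAG Prompt",
        "text": "Answer using only the provided context.\n\nContext:\n{context}",
    }
]

def is_valid_template(template: dict) -> bool:
    # Every template carries the three documented string fields
    return all(
        isinstance(template.get(field), str) for field in ("id", "name", "text")
    )
```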
Prompt Template Structure
System Prompt Fields
| Component | Description | Purpose |
|---|---|---|
| Instructions | Core behavioral guidelines for the LLM | Defines response style, tone, and approach |
| Context Placeholder | {context} variable for retrieved information | Insertion point for RAG context |
| Response Guidelines | Specific rules for answer generation | Ensures consistency and quality |
| Formatting Rules | Structure and presentation requirements | Controls output format and organization |
| Error Handling | Instructions for unknown or ambiguous queries | Manages edge cases and limitations |
Code Examples
JavaScript/Node.js
Python
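A sketch in Python with basic handling of the documented error codes (the helper names are illustrative, not part of an official SDK):

```python
import json
import urllib.error
import urllib.request

def error_message(status: int, body: dict) -> str:
    # Formats the documented error shape, e.g. {"detail": "Flow not found"}
    return f"List Prompts failed ({status}): {body.get('detail', 'Unknown error')}"

def fetch_prompts(flow_name: str, api_token: str) -> list:
    request = urllib.request.Request(
        f"https://{flow_name}.flows.graphorlm.com/prompts",
        headers={"Authorization": f"Bearer {api_token}"},
    )
    try:
        with urllib.request.urlopen(request, timeout=30) as response:
            return json.load(response)  # array of {id, name, text} objects
    except urllib.error.HTTPError as error:
        body = json.load(error)  # error responses carry a JSON "detail" field
        raise RuntimeError(error_message(error.code, body)) from error
```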
cURL
PHP
Error Responses
Common Error Codes
| Status Code | Description | Example Response |
|---|---|---|
| 401 | Unauthorized - Invalid or missing API token | {"detail": "Invalid authentication credentials"} |
| 404 | Not Found - Flow not found | {"detail": "Flow not found"} |
| 500 | Internal Server Error - Server error | {"detail": "Failed to retrieve prompts"} |
Error Response Format
Example Error Responses
Invalid API Token
Flow Not Found
Server Error
Use Cases
Prompt Template Management
Use this endpoint to:

- Template Discovery: Explore available prompt templates for LLM configuration
- Prompt Engineering: Analyze existing templates for optimization opportunities
- Consistency Auditing: Review prompt structures across different use cases
- Template Selection: Choose appropriate templates for specific LLM node configurations
Integration Examples
Prompt Template Optimizer
Template Compliance Validator
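As a sketch of the "Template Compliance Validator" idea, assuming the response shape described above (id, name, text fields), each template could be audited against a few of the best practices listed below. The function names and heuristics are illustrative:

```python
def audit_template(template: dict) -> list:
    """Return a list of compliance issues for one prompt template (heuristic sketch)."""
    issues = []
    text = template.get("text", "")
    if "{context}" not in text:
        issues.append("missing {context} placeholder for RAG context")
    # Heuristic anti-hallucination check: look for an explicit instruction
    if "provided context" not in text.lower():
        issues.append("no explicit 'use only the provided context' instruction")
    if len(text) < 40:
        issues.append("template may be too short to give clear instructions")
    return issues

def audit_all(templates: list) -> dict:
    # Map template name -> issues, keeping only non-compliant templates
    return {t["name"]: issues for t in templates if (issues := audit_template(t))}
```

Run this against the array returned by the endpoint to flag templates that need revision.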
Best Practices
Prompt Engineering Excellence
- Context Integration: Always include the {context} placeholder for RAG functionality
- Clear Instructions: Provide specific, actionable guidelines for LLM behavior
- Anti-Hallucination: Include explicit instructions to use only provided context
- Error Handling: Define behavior for unknown or ambiguous queries
Template Organization
- Naming Conventions: Use descriptive names that indicate template purpose and scope
- Categorization: Organize templates by use case (technical, support, general, etc.)
- Version Control: Maintain template versions for iterative improvements
- Documentation: Document template purposes and optimization rationales
Performance Optimization
- Length Balance: Optimize template length for clarity without overwhelming the LLM
- Instruction Clarity: Use precise language to minimize ambiguous interpretations
- Response Structure: Provide formatting guidelines for consistent output
- Testing: Regularly test templates with representative queries
Quality Assurance
- Compliance Validation: Regularly audit templates against quality criteria
- Performance Monitoring: Track response quality metrics for template effectiveness
- User Feedback: Incorporate user feedback for continuous template improvement
- A/B Testing: Compare template variants to optimize performance
Troubleshooting
Flow Not Found Error
Solution: Verify that:
- The flow name in the URL is correct and matches exactly
- The flow exists in your project
- Your API token has access to the correct project
- The flow has been created and saved properly
Empty Prompts Array
Solution: If no prompts are returned:
- Note that the system default prompt should always be present, so an empty array usually indicates a problem
- Check for API configuration issues affecting prompt service
- Ensure the dataset has proper access to prompt templates
- Contact support if default prompt is missing
Template Content Issues
Solution: If templates have formatting or content problems:
- Verify template text encoding and special characters
- Check for proper placeholder syntax ({context})
- Ensure the template structure follows expected formats
- Validate that custom templates were created correctly
Missing Context Integration
Solution: If templates lack RAG context capability:
- Ensure templates include the {context} placeholder
- Verify that context integration instructions are present
- Check that templates are designed for RAG workflows
- Update templates to include proper context handling
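The placeholder check can be automated with a small helper (a sketch; the function name is illustrative):

```python
def templates_missing_context(templates: list) -> list:
    # Names of templates whose text lacks the {context} placeholder
    return [t["name"] for t in templates if "{context}" not in t.get("text", "")]
```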
LLM Hallucination Issues
Solution: If LLM generates responses without using context:
- Add explicit anti-hallucination instructions to templates
- Include phrases like “use only the provided context”
- Emphasize that LLM should not use internal knowledge
- Test templates with queries that require context-only responses
Connection Issues
Solution: For connectivity problems:
- Check your internet connection
- Verify the flow URL is accessible
- Ensure your firewall allows HTTPS traffic to *.flows.graphorlm.com
- Try accessing the endpoint from a different network
Next Steps
After retrieving prompt templates, you might want to:

Update LLM Configuration
Configure LLM nodes to use specific prompt templates for response generation
List LLM Nodes
View LLM nodes and their current prompt template configurations
Create Custom Prompts
Design and create custom prompt templates for specialized use cases
Flow Overview
Learn about all available flow management endpoints