Salesforce Agent
This workflow leverages AI to optimize business conversations by integrating a query function for the Salesforce database. Its main purpose is to provide conversational responses to the user's inputs.
```mermaid
graph TD
    %%{init: {'theme': 'mc', 'layout': 'elk'}}%%
    OpenAIModel-1kevp[OpenAI]
    style OpenAIModel-1kevp stroke:#a170ff
    ChatOutput-e303u[Chat Output]
    style ChatOutput-e303u stroke:#a170ff
    ChatInput-wkxmm[Chat Input]
    style ChatInput-wkxmm stroke:#a170ff
    LanggraphReactAgent-fy1on[Agent]
    style LanggraphReactAgent-fy1on stroke:#a170ff
    SalesForceTool-b81do[Salesforce Tool]
    style SalesForceTool-b81do stroke:#a170ff
    ChatInput-wkxmm -.- LanggraphReactAgent-fy1on
    linkStyle 0 stroke:#a170ff
    OpenAIModel-1kevp -.- LanggraphReactAgent-fy1on
    linkStyle 1 stroke:#a170ff
    LanggraphReactAgent-fy1on -.- ChatOutput-e303u
    linkStyle 2 stroke:#a170ff
    SalesForceTool-b81do -.- LanggraphReactAgent-fy1on
    linkStyle 3 stroke:#a170ff
```
Salesforce Agent Documentation
🧩 Overview
The Salesforce Agent integrates an OpenAI language model into a Langgraph React Agent and augments it with a Salesforce query tool. It accepts user messages through a chat interface, processes them with the LLM and optional Salesforce data, and returns a natural‑language response. The workflow streamlines business conversations by automatically querying Salesforce and generating context‑aware replies.
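To make this wiring concrete outside the visual editor, here is a minimal sketch of the same arrangement in Python, assuming the `langgraph` prebuilt ReAct agent, `langchain-openai`, and the `simple_salesforce` client; the credentials, tool name, and sample question are illustrative placeholders rather than part of the workflow.

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from simple_salesforce import Salesforce

# Salesforce connection (username, password, security token, domain; see Notes).
sf = Salesforce(
    username="user@example.com",
    password="secret",
    security_token="token",
    domain="login",
)

@tool
def salesforce_query(soql: str) -> list[dict]:
    """Run a SOQL query against Salesforce and return the matching records."""
    return sf.query(soql)["records"]

# LLM used by the agent (gpt-4o-mini is the workflow's default model).
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.2)

# ReAct agent that can call the Salesforce tool while answering chat input.
agent = create_react_agent(llm, tools=[salesforce_query])

reply = agent.invoke(
    {"messages": [("user", "How many open opportunities do we have?")]}
)
print(reply["messages"][-1].content)
```

In the running workflow, the Chat Input and Chat Output components play the role of the `invoke` call and the final `print` in this sketch.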
⚙️ Main Features
- Seamlessly connects a chat UI to an OpenAI language model via a Langgraph React Agent.
- Allows the agent to execute Salesforce queries during the conversation.
- Provides real‑time chat output with sender and session metadata.
- Supports configurable temperature, token limits, and system prompts for the LLM (see the configuration sketch after this list).
- Enables fallback model handling and streaming responses (currently disabled).
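The temperature, token-limit, and system-prompt settings mentioned above map onto ordinary model parameters. A rough equivalent, assuming `langchain-openai` (the names here illustrate the component's fields rather than reproduce its exact configuration):

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o-mini",  # workflow default; switchable via the model selector
    temperature=0.1,      # lower values give more deterministic replies
    max_tokens=512,       # upper bound on tokens generated per reply
)

# The system prompt can be supplied as the first message of the conversation.
system_prompt = (
    "You are a business assistant. Use the Salesforce tool to look up records "
    "before answering questions about accounts or opportunities."
)
response = llm.invoke([("system", system_prompt), ("user", "Hello!")])
print(response.content)
```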
🔄 Workflow Steps
| Component Name | Role in the Workflow | Key Inputs | Key Outputs |
|---|---|---|---|
| Chat Input | Captures the user’s message and any attached files. | User Text, Sender Type, Session ID, Conversation ID | Message object containing the user input |
| Salesforce Tool | Executes a Salesforce query supplied by the agent. | Query string | Data object with query results |
| OpenAI Model | Generates text using the OpenAI LLM. | Prompt text, System Message, Max Tokens, Temperature, Model name | Generated text (Message) |
| Langgraph React Agent | Orchestrates the conversation, integrates the LLM and tools, and produces a reply. | Initial input message, LLM (OpenAI Model), Tool list (Salesforce Tool), System prompt | Response message |
| Chat Output | Renders the agent’s reply in the chat interface. | Response text, Sender info, Session ID, Conversation ID | Displayed chat message |
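To make the Salesforce Tool row concrete, the sketch below runs a SOQL query in isolation, assuming the `simple_salesforce` client; the credentials and query are placeholders standing in for the values the agent supplies at runtime.

```python
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",
    password="secret",
    security_token="token",
    domain="login",
)

# Query string in -> data object with query results out.
result = sf.query("SELECT Name, StageName FROM Opportunity WHERE IsClosed = false")
for record in result["records"]:
    print(record["Name"], record["StageName"])
```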
🧠 Notes
- An OpenAI API key and a valid Salesforce credential are required for the workflow to operate.
- The agent uses the `gpt-4o-mini` model by default; this can be changed via the model selector.
- The Salesforce Tool must be provisioned with a Salesforce username, password, security token, and domain before use.
- Streaming responses are disabled by default; enable `use_stream` to receive incremental output (see the sketch after this list).
- Fallback model support is available but currently disabled; enable `enable_fallback` to provide a secondary LLM.
- The workflow stores chat history if `should_store_message` is true, allowing context to be maintained across turns.
- The Langgraph React Agent framework manages memory and execution limits; `max_execution_time` and `max_iterations` can be tuned for longer interactions.
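The streaming and fallback behaviours described above can be pictured with ordinary `langchain-openai` primitives; this is an illustration of the concepts, not the workflow's own `use_stream` / `enable_fallback` switches.

```python
from langchain_openai import ChatOpenAI

# Primary model streams tokens as they are generated (analogue of use_stream).
primary = ChatOpenAI(model="gpt-4o-mini", streaming=True)

# Secondary model used only if the primary fails (analogue of enable_fallback).
fallback = ChatOpenAI(model="gpt-4o")
llm = primary.with_fallbacks([fallback])

for chunk in llm.stream("Summarise our open opportunities."):
    print(chunk.content, end="", flush=True)
```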