CreateSamplingMessageRequest extends Request
A request from the server to sample an LLM via the client. The client has full discretion over which model to select.
The client should also inform the user before beginning sampling, to allow them to inspect the request (human in the loop) and decide whether to approve it.
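On the wire, this class corresponds to the MCP `sampling/createMessage` JSON-RPC request. A sketch of the payload it represents (field values here are illustrative, not taken from the SDK):

```php
<?php
// Illustrative sampling/createMessage payload; all values are made up.
$request = [
    'jsonrpc' => '2.0',
    'id'      => 1,
    'method'  => 'sampling/createMessage',
    'params'  => [
        'messages' => [
            ['role' => 'user', 'content' => ['type' => 'text', 'text' => 'Summarize this file.']],
        ],
        'maxTokens'      => 200,
        'systemPrompt'   => 'You are a concise assistant.',
        'includeContext' => 'thisServer', // "none" | "thisServer" | "allServers"
        'temperature'    => 0.7,
    ],
];

echo json_encode($request, JSON_PRETTY_PRINT), "\n";
```

The client receiving this payload decides which model to run, and may alter or drop any of the optional fields before sampling.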
Table of Contents
Properties
- $includeContext : SamplingContext|null
- $maxTokens : int
- $messages : array<string|int, mixed>
- $metadata : array<string|int, mixed>|null
- $preferences : ModelPreferences|null
- $stopSequences : array<string|int, mixed>|null
- $systemPrompt : string|null
- $temperature : float|null
- $id : string|int
- $meta : array<string, mixed>|null
Methods
- __construct() : mixed
- fromArray() : static
- getId() : string|int
- getMeta() : array<string, mixed>|null
- getMethod() : string
- jsonSerialize() : RequestData
- withId() : static
- withMeta() : static
- fromParams() : static
- getParams() : array{messages: SamplingMessage[], maxTokens: int, modelPreferences?: ModelPreferences, systemPrompt?: string, includeContext?: string, temperature?: float, stopSequences?: string[], metadata?: array}
Properties
$includeContext read-only
public
SamplingContext|null
$includeContext
= null
$maxTokens read-only
public
int
$maxTokens
$messages read-only
public
array<string|int, mixed>
$messages
$metadata read-only
public
array<string|int, mixed>|null
$metadata
= null
$preferences read-only
public
ModelPreferences|null
$preferences
= null
$stopSequences read-only
public
array<string|int, mixed>|null
$stopSequences
= null
$systemPrompt read-only
public
string|null
$systemPrompt
= null
$temperature read-only
public
float|null
$temperature
= null
$id
protected
string|int
$id
$meta
protected
array<string, mixed>|null
$meta
= null
Methods
__construct()
public
__construct(array<string|int, SamplingMessage> $messages, int $maxTokens[, ModelPreferences|null $preferences = null ][, string|null $systemPrompt = null ][, SamplingContext|null $includeContext = null ][, float|null $temperature = null ][, array<string|int, string>|null $stopSequences = null ][, array<string, mixed>|null $metadata = null ]) : mixed
Parameters
- $messages : array<string|int, SamplingMessage>
-
The messages to send to the model.
- $maxTokens : int
-
The maximum number of tokens to sample, as requested by the server. The client MAY choose to sample fewer tokens than requested.
- $preferences : ModelPreferences|null = null
-
The server's preferences for which model to select. The client MAY ignore these preferences.
- $systemPrompt : string|null = null
-
An optional system prompt the server wants to use for sampling. The client MAY modify or omit this prompt.
- $includeContext : SamplingContext|null = null
-
A request to include context from one or more MCP servers (including the caller), to be attached to the prompt. The client MAY ignore this request. Allowed values: "none", "thisServer", "allServers"
- $temperature : float|null = null
-
The temperature to use for sampling. The client MAY ignore this request.
- $stopSequences : array<string|int, string>|null = null
-
A list of sequences to stop sampling at. The client MAY ignore this request.
- $metadata : array<string, mixed>|null = null
-
Optional metadata to pass through to the LLM provider. The format of this metadata is provider-specific.
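Before building the request, a server may want to validate the argument values it is about to pass. A self-contained sketch of such checks (the allowed `includeContext` values come from the parameter docs above; the temperature bound is an assumption about typical provider limits, not an SDK rule):

```php
<?php
// Sketch: validating constructor arguments before building the request.
$maxTokens      = 200;
$includeContext = 'thisServer';
$temperature    = 0.7;

// Allowed values per the includeContext parameter documentation.
$allowedContexts = ['none', 'thisServer', 'allServers'];

if ($maxTokens < 1) {
    throw new InvalidArgumentException('maxTokens must be positive');
}
if (!in_array($includeContext, $allowedContexts, true)) {
    throw new InvalidArgumentException('invalid includeContext value');
}
// Providers commonly accept temperatures in [0, 2]; this bound is an assumption.
if ($temperature < 0.0 || $temperature > 2.0) {
    throw new InvalidArgumentException('temperature out of range');
}

echo "arguments valid\n";
```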
fromArray()
public
static fromArray(RequestData $data) : static
Parameters
- $data : RequestData
Return values
static
getId()
public
getId() : string|int
Return values
string|int
getMeta()
public
getMeta() : array<string, mixed>|null
Return values
array<string, mixed>|null
getMethod()
public
static getMethod() : string
Return values
string
jsonSerialize()
public
jsonSerialize() : RequestData
Return values
RequestData
withId()
public
withId(string|int $id) : static
Parameters
- $id : string|int
Return values
static
withMeta()
public
withMeta(array<string, mixed>|null $meta) : static
Parameters
- $meta : array<string, mixed>|null
Return values
static
fromParams()
protected
static fromParams(array<string|int, mixed>|null $params) : static
Parameters
- $params : array<string|int, mixed>|null
Return values
static
getParams()
protected
getParams() : array{messages: SamplingMessage[], maxTokens: int, modelPreferences?: ModelPreferences, systemPrompt?: string, includeContext?: string, temperature?: float, stopSequences?: string[], metadata?: array}