MCP PHP SDK

CreateSamplingMessageRequest extends Request

Final: Yes

A request from the server to sample an LLM via the client. The client has full discretion over which model to select.

The client should also inform the user before beginning sampling, to allow them to inspect the request (human in the loop) and decide whether to approve it.

Tags
author

Kyrian Obikwelu koshnawaza@gmail.com

Table of Contents

Properties

$includeContext  : SamplingContext|null
$maxTokens  : int
$messages  : array<string|int, mixed>
$metadata  : array<string|int, mixed>|null
$preferences  : ModelPreferences|null
$stopSequences  : array<string|int, mixed>|null
$systemPrompt  : string|null
$temperature  : float|null
$id  : string|int
$meta  : array<string, mixed>|null

Methods

__construct()  : mixed
fromArray()  : static
getId()  : string|int
getMeta()  : array<string, mixed>|null
getMethod()  : string
jsonSerialize()  : RequestData
withId()  : static
withMeta()  : static
fromParams()  : static
getParams()  : array{messages: SamplingMessage[], maxTokens: int, modelPreferences?: ModelPreferences, systemPrompt?: string, includeContext?: string, temperature?: float, stopSequences?: string[], metadata?: array}

Properties

$meta

protected array<string, mixed>|null $meta = null

Methods

__construct()

public __construct(array<string|int, SamplingMessage> $messages, int $maxTokens[, ModelPreferences|null $preferences = null ][, string|null $systemPrompt = null ][, SamplingContext|null $includeContext = null ][, float|null $temperature = null ][, array<string|int, string>|null $stopSequences = null ][, array<string, mixed>|null $metadata = null ]) : mixed
Parameters
$messages : array<string|int, SamplingMessage>

the messages to send to the model

$maxTokens : int

The maximum number of tokens to sample, as requested by the server. The client MAY choose to sample fewer tokens than requested.

$preferences : ModelPreferences|null = null

The server's preferences for which model to select. The client MAY ignore these preferences.

$systemPrompt : string|null = null

An optional system prompt the server wants to use for sampling. The client MAY modify or omit this prompt.

$includeContext : SamplingContext|null = null

A request to include context from one or more MCP servers (including the caller), to be attached to the prompt. The client MAY ignore this request. Allowed values: "none", "thisServer", "allServers"

$temperature : float|null = null

The temperature to use for sampling. The client MAY ignore this request.

$stopSequences : array<string|int, string>|null = null

A list of sequences to stop sampling at. The client MAY ignore this request.

$metadata : array<string, mixed>|null = null

Optional metadata to pass through to the LLM provider. The format of this metadata is provider-specific.
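A minimal sketch of building a request from these parameters. The class names come from this page; the `SamplingMessage`, `Role`, and `TextContent` constructor signatures and any namespace imports are assumptions and may differ in your installed SDK version.

```php
<?php
// Hypothetical example: ask the client to sample up to 256 tokens.
// Adjust `use` statements and value-object constructors to your SDK version.
$request = new CreateSamplingMessageRequest(
    messages: [
        new SamplingMessage(
            role: Role::User,                                   // assumed enum
            content: new TextContent('Summarize this document.') // assumed value object
        ),
    ],
    maxTokens: 256,
    systemPrompt: 'You are a concise assistant.', // the client MAY modify or omit this
    temperature: 0.7,                             // the client MAY ignore this
    stopSequences: ['###'],
);
```

Remember that every parameter beyond `$messages` and `$maxTokens` is advisory: the client has final say over the model, prompt, and sampling settings.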

fromArray()

public static fromArray(RequestData $data) : static
Parameters
$data : RequestData
Return values
static

getId()

public getId() : string|int
Return values
string|int

getMeta()

public getMeta() : array<string, mixed>|null
Return values
array<string, mixed>|null

jsonSerialize()

public jsonSerialize() : RequestData
Return values
RequestData
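Because the class implements `jsonSerialize()`, a request can be passed straight to `json_encode()` to produce the JSON-RPC payload; the intermediate `RequestData` value is produced automatically. A sketch, assuming `$request` was constructed as above:

```php
<?php
// jsonSerialize() is invoked implicitly by json_encode(); the result is the
// JSON-RPC request body for the sampling/createMessage method.
$payload = json_encode($request, JSON_THROW_ON_ERROR);
```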

withId()

public withId(string|int $id) : static
Parameters
$id : string|int
Return values
static

withMeta()

public withMeta(array<string, mixed>|null $meta) : static
Parameters
$meta : array<string, mixed>|null
Return values
static
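The `static` return types and `with` prefixes suggest an immutable "wither" style: each call returns a modified copy rather than mutating the receiver. A sketch of chaining them, with hypothetical id and meta values:

```php
<?php
// Each call returns a new instance; $request itself is left unchanged.
$prepared = $request
    ->withId('req-42')                          // hypothetical request id
    ->withMeta(['progressToken' => 'tok-1']);   // hypothetical _meta payload
```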

fromParams()

protected static fromParams(array<string|int, mixed>|null $params) : static
Parameters
$params : array<string|int, mixed>|null
Return values
static

getParams()

protected getParams() : array{messages: SamplingMessage[], maxTokens: int, modelPreferences?: ModelPreferences, systemPrompt?: string, includeContext?: string, temperature?: float, stopSequences?: string[], metadata?: array}
Return values
array{messages: SamplingMessage[], maxTokens: int, modelPreferences?: ModelPreferences, systemPrompt?: string, includeContext?: string, temperature?: float, stopSequences?: string[], metadata?: array}
