
We recommend new projects start with resources from the AWS provider.

AWS Cloud Control v1.27.0 published on Monday, Apr 14, 2025 by Pulumi

aws-native.sagemaker.InferenceExperiment


Resource Type definition for AWS::SageMaker::InferenceExperiment

Create InferenceExperiment Resource

Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

Constructor syntax

new InferenceExperiment(name: string, args: InferenceExperimentArgs, opts?: CustomResourceOptions);
@overload
def InferenceExperiment(resource_name: str,
                        args: InferenceExperimentArgs,
                        opts: Optional[ResourceOptions] = None)

@overload
def InferenceExperiment(resource_name: str,
                        opts: Optional[ResourceOptions] = None,
                        model_variants: Optional[Sequence[InferenceExperimentModelVariantConfigArgs]] = None,
                        type: Optional[InferenceExperimentType] = None,
                        role_arn: Optional[str] = None,
                        endpoint_name: Optional[str] = None,
                        name: Optional[str] = None,
                        kms_key: Optional[str] = None,
                        data_storage_config: Optional[InferenceExperimentDataStorageConfigArgs] = None,
                        desired_state: Optional[InferenceExperimentDesiredState] = None,
                        schedule: Optional[InferenceExperimentScheduleArgs] = None,
                        shadow_mode_config: Optional[InferenceExperimentShadowModeConfigArgs] = None,
                        status_reason: Optional[str] = None,
                        tags: Optional[Sequence[_root_inputs.TagArgs]] = None,
                        description: Optional[str] = None)
func NewInferenceExperiment(ctx *Context, name string, args InferenceExperimentArgs, opts ...ResourceOption) (*InferenceExperiment, error)
public InferenceExperiment(string name, InferenceExperimentArgs args, CustomResourceOptions? opts = null)
public InferenceExperiment(String name, InferenceExperimentArgs args)
public InferenceExperiment(String name, InferenceExperimentArgs args, CustomResourceOptions options)
type: aws-native:sagemaker:InferenceExperiment
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
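Following the YAML template above, a minimal ShadowMode declaration might look like the sketch below. The resource name, endpoint, role ARN, and model names are placeholders rather than values from this page, and the shadowModeConfig field names follow the AWS CloudFormation schema for this resource type:

```yaml
resources:
  myExperiment:
    type: aws-native:sagemaker:InferenceExperiment
    properties:
      type: ShadowMode                 # currently the only experiment type listed on this page
      endpointName: my-endpoint        # placeholder: an existing SageMaker endpoint
      roleArn: arn:aws:iam::123456789012:role/my-sagemaker-role  # placeholder IAM role
      modelVariants:
        - modelName: my-production-model   # placeholder SageMaker Model name
          variantName: production
          infrastructureConfig:
            infrastructureType: RealTimeInference
            realTimeInferenceConfig:
              instanceType: ml.m5.large
              instanceCount: 1
        - modelName: my-shadow-model       # placeholder SageMaker Model name
          variantName: shadow
          infrastructureConfig:
            infrastructureType: RealTimeInference
            realTimeInferenceConfig:
              instanceType: ml.m5.large
              instanceCount: 1
      shadowModeConfig:                # field names assumed from the CloudFormation schema
        sourceModelVariantName: production
        shadowModelVariants:
          - shadowModelVariantName: shadow
            samplingPercentage: 50
```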

Parameters

name This property is required. string
The unique name of the resource.
args This property is required. InferenceExperimentArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name This property is required. str
The unique name of the resource.
args This property is required. InferenceExperimentArgs
The arguments to resource properties.
opts ResourceOptions
Bag of options to control resource's behavior.
ctx Context
Context object for the current deployment.
name This property is required. string
The unique name of the resource.
args This property is required. InferenceExperimentArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name This property is required. string
The unique name of the resource.
args This property is required. InferenceExperimentArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
name This property is required. String
The unique name of the resource.
args This property is required. InferenceExperimentArgs
The arguments to resource properties.
options CustomResourceOptions
Bag of options to control resource's behavior.

InferenceExperiment Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

Inputs

In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

The InferenceExperiment resource accepts the following input properties:

EndpointName This property is required. string
The name of the endpoint.
ModelVariants This property is required. List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelVariantConfig>
An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
RoleArn This property is required. string
The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
Type This property is required. Pulumi.AwsNative.SageMaker.InferenceExperimentType
The type of the inference experiment that you want to run.
DataStorageConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentDataStorageConfig
The Amazon S3 location and configuration for storing inference request and response data.
Description string
The description of the inference experiment.
DesiredState Pulumi.AwsNative.SageMaker.InferenceExperimentDesiredState
The desired state of the experiment after a start or stop operation.
KmsKey string
The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
Name string
The name for the inference experiment.
Schedule Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentSchedule

The duration for which the inference experiment ran or will run.

The maximum duration that you can set for an inference experiment is 30 days.

ShadowModeConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModeConfig
The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
StatusReason string
The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
Tags List<Pulumi.AwsNative.Inputs.Tag>
An array of key-value pairs to apply to this resource.
EndpointName This property is required. string
The name of the endpoint.
ModelVariants This property is required. []InferenceExperimentModelVariantConfigArgs
An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
RoleArn This property is required. string
The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
Type This property is required. InferenceExperimentType
The type of the inference experiment that you want to run.
DataStorageConfig InferenceExperimentDataStorageConfigArgs
The Amazon S3 location and configuration for storing inference request and response data.
Description string
The description of the inference experiment.
DesiredState InferenceExperimentDesiredState
The desired state of the experiment after a start or stop operation.
KmsKey string
The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
Name string
The name for the inference experiment.
Schedule InferenceExperimentScheduleArgs

The duration for which the inference experiment ran or will run.

The maximum duration that you can set for an inference experiment is 30 days.

ShadowModeConfig InferenceExperimentShadowModeConfigArgs
The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
StatusReason string
The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
Tags TagArgs
An array of key-value pairs to apply to this resource.
endpointName This property is required. String
The name of the endpoint.
modelVariants This property is required. List<InferenceExperimentModelVariantConfig>
An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
roleArn This property is required. String
The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
type This property is required. InferenceExperimentType
The type of the inference experiment that you want to run.
dataStorageConfig InferenceExperimentDataStorageConfig
The Amazon S3 location and configuration for storing inference request and response data.
description String
The description of the inference experiment.
desiredState InferenceExperimentDesiredState
The desired state of the experiment after a start or stop operation.
kmsKey String
The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
name String
The name for the inference experiment.
schedule InferenceExperimentSchedule

The duration for which the inference experiment ran or will run.

The maximum duration that you can set for an inference experiment is 30 days.

shadowModeConfig InferenceExperimentShadowModeConfig
The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
statusReason String
The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
tags List<Tag>
An array of key-value pairs to apply to this resource.
endpointName This property is required. string
The name of the endpoint.
modelVariants This property is required. InferenceExperimentModelVariantConfig[]
An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
roleArn This property is required. string
The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
type This property is required. InferenceExperimentType
The type of the inference experiment that you want to run.
dataStorageConfig InferenceExperimentDataStorageConfig
The Amazon S3 location and configuration for storing inference request and response data.
description string
The description of the inference experiment.
desiredState InferenceExperimentDesiredState
The desired state of the experiment after a start or stop operation.
kmsKey string
The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
name string
The name for the inference experiment.
schedule InferenceExperimentSchedule

The duration for which the inference experiment ran or will run.

The maximum duration that you can set for an inference experiment is 30 days.

shadowModeConfig InferenceExperimentShadowModeConfig
The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
statusReason string
The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
tags Tag[]
An array of key-value pairs to apply to this resource.
endpoint_name This property is required. str
The name of the endpoint.
model_variants This property is required. Sequence[InferenceExperimentModelVariantConfigArgs]
An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
role_arn This property is required. str
The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
type This property is required. InferenceExperimentType
The type of the inference experiment that you want to run.
data_storage_config InferenceExperimentDataStorageConfigArgs
The Amazon S3 location and configuration for storing inference request and response data.
description str
The description of the inference experiment.
desired_state InferenceExperimentDesiredState
The desired state of the experiment after a start or stop operation.
kms_key str
The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
name str
The name for the inference experiment.
schedule InferenceExperimentScheduleArgs

The duration for which the inference experiment ran or will run.

The maximum duration that you can set for an inference experiment is 30 days.

shadow_mode_config InferenceExperimentShadowModeConfigArgs
The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
status_reason str
The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
tags Sequence[TagArgs]
An array of key-value pairs to apply to this resource.
endpointName This property is required. String
The name of the endpoint.
modelVariants This property is required. List<Property Map>
An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
roleArn This property is required. String
The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
type This property is required. "ShadowMode"
The type of the inference experiment that you want to run.
dataStorageConfig Property Map
The Amazon S3 location and configuration for storing inference request and response data.
description String
The description of the inference experiment.
desiredState "Running" | "Completed" | "Cancelled"
The desired state of the experiment after a start or stop operation.
kmsKey String
The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
name String
The name for the inference experiment.
schedule Property Map

The duration for which the inference experiment ran or will run.

The maximum duration that you can set for an inference experiment is 30 days.

shadowModeConfig Property Map
The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
statusReason String
The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
tags List<Property Map>
An array of key-value pairs to apply to this resource.
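The tags input above takes an array of key/value objects rather than a plain map. As an illustrative sketch (the helper name is ours, not part of the provider), a Python dict can be reshaped into the dictionary-literal form the input accepts:

```python
def to_tag_list(tags: dict) -> list:
    """Reshape {"team": "ml"} into the [{"key": ..., "value": ...}] array-of-pairs form.

    Sorted by key so the output is deterministic across runs.
    """
    return [{"key": k, "value": v} for k, v in sorted(tags.items())]

# Example: two tags in the array-of-pairs shape expected by the tags input.
print(to_tag_list({"env": "dev", "team": "ml"}))
```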

Outputs

All input properties are implicitly available as output properties. Additionally, the InferenceExperiment resource produces the following output properties:

Arn string
The Amazon Resource Name (ARN) of the inference experiment.
CreationTime string
The timestamp at which you created the inference experiment.
EndpointMetadata Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentEndpointMetadata
Id string
The provider-assigned unique ID for this managed resource.
LastModifiedTime string
The timestamp at which you last modified the inference experiment.
Status Pulumi.AwsNative.SageMaker.InferenceExperimentStatus
The status of the inference experiment.
Arn string
The Amazon Resource Name (ARN) of the inference experiment.
CreationTime string
The timestamp at which you created the inference experiment.
EndpointMetadata InferenceExperimentEndpointMetadata
Id string
The provider-assigned unique ID for this managed resource.
LastModifiedTime string
The timestamp at which you last modified the inference experiment.
Status InferenceExperimentStatus
The status of the inference experiment.
arn String
The Amazon Resource Name (ARN) of the inference experiment.
creationTime String
The timestamp at which you created the inference experiment.
endpointMetadata InferenceExperimentEndpointMetadata
id String
The provider-assigned unique ID for this managed resource.
lastModifiedTime String
The timestamp at which you last modified the inference experiment.
status InferenceExperimentStatus
The status of the inference experiment.
arn string
The Amazon Resource Name (ARN) of the inference experiment.
creationTime string
The timestamp at which you created the inference experiment.
endpointMetadata InferenceExperimentEndpointMetadata
id string
The provider-assigned unique ID for this managed resource.
lastModifiedTime string
The timestamp at which you last modified the inference experiment.
status InferenceExperimentStatus
The status of the inference experiment.
arn str
The Amazon Resource Name (ARN) of the inference experiment.
creation_time str
The timestamp at which you created the inference experiment.
endpoint_metadata InferenceExperimentEndpointMetadata
id str
The provider-assigned unique ID for this managed resource.
last_modified_time str
The timestamp at which you last modified the inference experiment.
status InferenceExperimentStatus
The status of the inference experiment.
arn String
The Amazon Resource Name (ARN) of the inference experiment.
creationTime String
The timestamp at which you created the inference experiment.
endpointMetadata Property Map
id String
The provider-assigned unique ID for this managed resource.
lastModifiedTime String
The timestamp at which you last modified the inference experiment.
status "Creating" | "Created" | "Updating" | "Starting" | "Stopping" | "Running" | "Completed" | "Cancelled"
The status of the inference experiment.

Supporting Types

InferenceExperimentCaptureContentTypeHeader
, InferenceExperimentCaptureContentTypeHeaderArgs

CsvContentTypes List<string>
The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
JsonContentTypes List<string>
The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
CsvContentTypes []string
The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
JsonContentTypes []string
The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
csvContentTypes List<String>
The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
jsonContentTypes List<String>
The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
csvContentTypes string[]
The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
jsonContentTypes string[]
The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
csv_content_types Sequence[str]
The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
json_content_types Sequence[str]
The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
csvContentTypes List<String>
The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
jsonContentTypes List<String>
The list of all content type headers that SageMaker will treat as JSON and capture accordingly.

InferenceExperimentDataStorageConfig
, InferenceExperimentDataStorageConfigArgs

Destination This property is required. string
The Amazon S3 bucket where the inference request and response data is stored.
ContentType Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentCaptureContentTypeHeader
Configuration specifying how to treat different headers. If no headers are specified, SageMaker base64-encodes the data by default when capturing it.
KmsKey string
The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
Destination This property is required. string
The Amazon S3 bucket where the inference request and response data is stored.
ContentType InferenceExperimentCaptureContentTypeHeader
Configuration specifying how to treat different headers. If no headers are specified, SageMaker base64-encodes the data by default when capturing it.
KmsKey string
The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
destination This property is required. String
The Amazon S3 bucket where the inference request and response data is stored.
contentType InferenceExperimentCaptureContentTypeHeader
Configuration specifying how to treat different headers. If no headers are specified, SageMaker base64-encodes the data by default when capturing it.
kmsKey String
The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
destination This property is required. string
The Amazon S3 bucket where the inference request and response data is stored.
contentType InferenceExperimentCaptureContentTypeHeader
Configuration specifying how to treat different headers. If no headers are specified, SageMaker base64-encodes the data by default when capturing it.
kmsKey string
The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
destination This property is required. str
The Amazon S3 bucket where the inference request and response data is stored.
content_type InferenceExperimentCaptureContentTypeHeader
Configuration specifying how to treat different headers. If no headers are specified, SageMaker base64-encodes the data by default when capturing it.
kms_key str
The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
destination This property is required. String
The Amazon S3 bucket where the inference request and response data is stored.
contentType Property Map
Configuration specifying how to treat different headers. If no headers are specified, SageMaker base64-encodes the data by default when capturing it.
kmsKey String
The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.

InferenceExperimentDesiredState
, InferenceExperimentDesiredStateArgs

Running
Running
Completed
Completed
Cancelled
Cancelled
InferenceExperimentDesiredStateRunning
Running
InferenceExperimentDesiredStateCompleted
Completed
InferenceExperimentDesiredStateCancelled
Cancelled
Running
Running
Completed
Completed
Cancelled
Cancelled
Running
Running
Completed
Completed
Cancelled
Cancelled
RUNNING
Running
COMPLETED
Completed
CANCELLED
Cancelled
"Running"
Running
"Completed"
Completed
"Cancelled"
Cancelled

InferenceExperimentEndpointMetadata
, InferenceExperimentEndpointMetadataArgs

EndpointName This property is required. string
The name of the endpoint.
EndpointConfigName string
The name of the endpoint configuration.
EndpointStatus Pulumi.AwsNative.SageMaker.InferenceExperimentEndpointMetadataEndpointStatus
The status of the endpoint. For the possible values, see the InferenceExperimentEndpointMetadataEndpointStatus type below.
EndpointName This property is required. string
The name of the endpoint.
EndpointConfigName string
The name of the endpoint configuration.
EndpointStatus InferenceExperimentEndpointMetadataEndpointStatus
The status of the endpoint. For the possible values, see the InferenceExperimentEndpointMetadataEndpointStatus type below.
endpointName This property is required. String
The name of the endpoint.
endpointConfigName String
The name of the endpoint configuration.
endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
The status of the endpoint. For the possible values, see the InferenceExperimentEndpointMetadataEndpointStatus type below.
endpointName This property is required. string
The name of the endpoint.
endpointConfigName string
The name of the endpoint configuration.
endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
The status of the endpoint. For the possible values, see the InferenceExperimentEndpointMetadataEndpointStatus type below.
endpoint_name This property is required. str
The name of the endpoint.
endpoint_config_name str
The name of the endpoint configuration.
endpoint_status InferenceExperimentEndpointMetadataEndpointStatus
The status of the endpoint. For the possible values, see the InferenceExperimentEndpointMetadataEndpointStatus type below.
endpointName This property is required. String
The name of the endpoint.
endpointConfigName String
The name of the endpoint configuration.
endpointStatus "Creating" | "Updating" | "SystemUpdating" | "RollingBack" | "InService" | "OutOfService" | "Deleting" | "Failed"
The status of the endpoint. For the possible values, see the InferenceExperimentEndpointMetadataEndpointStatus type below.

InferenceExperimentEndpointMetadataEndpointStatus
, InferenceExperimentEndpointMetadataEndpointStatusArgs

Creating
Creating
Updating
Updating
SystemUpdating
SystemUpdating
RollingBack
RollingBack
InService
InService
OutOfService
OutOfService
Deleting
Deleting
Failed
Failed
InferenceExperimentEndpointMetadataEndpointStatusCreating
Creating
InferenceExperimentEndpointMetadataEndpointStatusUpdating
Updating
InferenceExperimentEndpointMetadataEndpointStatusSystemUpdating
SystemUpdating
InferenceExperimentEndpointMetadataEndpointStatusRollingBack
RollingBack
InferenceExperimentEndpointMetadataEndpointStatusInService
InService
InferenceExperimentEndpointMetadataEndpointStatusOutOfService
OutOfService
InferenceExperimentEndpointMetadataEndpointStatusDeleting
Deleting
InferenceExperimentEndpointMetadataEndpointStatusFailed
Failed
Creating
Creating
Updating
Updating
SystemUpdating
SystemUpdating
RollingBack
RollingBack
InService
InService
OutOfService
OutOfService
Deleting
Deleting
Failed
Failed
Creating
Creating
Updating
Updating
SystemUpdating
SystemUpdating
RollingBack
RollingBack
InService
InService
OutOfService
OutOfService
Deleting
Deleting
Failed
Failed
CREATING
Creating
UPDATING
Updating
SYSTEM_UPDATING
SystemUpdating
ROLLING_BACK
RollingBack
IN_SERVICE
InService
OUT_OF_SERVICE
OutOfService
DELETING
Deleting
FAILED
Failed
"Creating"
Creating
"Updating"
Updating
"SystemUpdating"
SystemUpdating
"RollingBack"
RollingBack
"InService"
InService
"OutOfService"
OutOfService
"Deleting"
Deleting
"Failed"
Failed

InferenceExperimentModelInfrastructureConfig
, InferenceExperimentModelInfrastructureConfigArgs

InfrastructureType This property is required. Pulumi.AwsNative.SageMaker.InferenceExperimentModelInfrastructureConfigInfrastructureType
The type of the inference experiment that you want to run.
RealTimeInferenceConfig This property is required. Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentRealTimeInferenceConfig
The infrastructure configuration for deploying the model to real-time inference.
InfrastructureType This property is required. InferenceExperimentModelInfrastructureConfigInfrastructureType
The type of the inference experiment that you want to run.
RealTimeInferenceConfig This property is required. InferenceExperimentRealTimeInferenceConfig
The infrastructure configuration for deploying the model to real-time inference.
infrastructureType This property is required. InferenceExperimentModelInfrastructureConfigInfrastructureType
The type of the inference experiment that you want to run.
realTimeInferenceConfig This property is required. InferenceExperimentRealTimeInferenceConfig
The infrastructure configuration for deploying the model to real-time inference.
infrastructureType This property is required. InferenceExperimentModelInfrastructureConfigInfrastructureType
The type of the inference experiment that you want to run.
realTimeInferenceConfig This property is required. InferenceExperimentRealTimeInferenceConfig
The infrastructure configuration for deploying the model to real-time inference.
infrastructure_type This property is required. InferenceExperimentModelInfrastructureConfigInfrastructureType
The type of the inference experiment that you want to run.
real_time_inference_config This property is required. InferenceExperimentRealTimeInferenceConfig
The infrastructure configuration for deploying the model to real-time inference.
infrastructureType This property is required. "RealTimeInference"
The type of the inference experiment that you want to run.
realTimeInferenceConfig This property is required. Property Map
The infrastructure configuration for deploying the model to real-time inference.

InferenceExperimentModelInfrastructureConfigInfrastructureType
, InferenceExperimentModelInfrastructureConfigInfrastructureTypeArgs

RealTimeInference
RealTimeInference
InferenceExperimentModelInfrastructureConfigInfrastructureTypeRealTimeInference
RealTimeInference
RealTimeInference
RealTimeInference
RealTimeInference
RealTimeInference
REAL_TIME_INFERENCE
RealTimeInference
"RealTimeInference"
RealTimeInference

InferenceExperimentModelVariantConfig
, InferenceExperimentModelVariantConfigArgs

C#
InfrastructureConfig This property is required. Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelInfrastructureConfig
The configuration for the infrastructure that the model will be deployed to.
ModelName This property is required. string
The name of the Amazon SageMaker Model entity.
VariantName This property is required. string
The name of the variant.

Go
InfrastructureConfig This property is required. InferenceExperimentModelInfrastructureConfig
The configuration for the infrastructure that the model will be deployed to.
ModelName This property is required. string
The name of the Amazon SageMaker Model entity.
VariantName This property is required. string
The name of the variant.

Java
infrastructureConfig This property is required. InferenceExperimentModelInfrastructureConfig
The configuration for the infrastructure that the model will be deployed to.
modelName This property is required. String
The name of the Amazon SageMaker Model entity.
variantName This property is required. String
The name of the variant.

TypeScript
infrastructureConfig This property is required. InferenceExperimentModelInfrastructureConfig
The configuration for the infrastructure that the model will be deployed to.
modelName This property is required. string
The name of the Amazon SageMaker Model entity.
variantName This property is required. string
The name of the variant.

Python
infrastructure_config This property is required. InferenceExperimentModelInfrastructureConfig
The configuration for the infrastructure that the model will be deployed to.
model_name This property is required. str
The name of the Amazon SageMaker Model entity.
variant_name This property is required. str
The name of the variant.

YAML
infrastructureConfig This property is required. Property Map
The configuration for the infrastructure that the model will be deployed to.
modelName This property is required. String
The name of the Amazon SageMaker Model entity.
variantName This property is required. String
The name of the variant.
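Putting the pieces together, a model variant entry nests the infrastructure config, which in turn nests the real-time inference config. The sketch below builds the YAML/"Property Map" shape as a plain Python dict; it is illustrative only (the helper name and sample values are not from the Pulumi SDK), but the key names and nesting match the properties documented above.

```python
# Illustrative helper (not part of the Pulumi SDK): builds the YAML/"Property
# Map" shape of one model variant entry, nesting the infrastructure config.
def build_model_variant(model_name: str, variant_name: str,
                        instance_type: str, instance_count: int) -> dict:
    return {
        "modelName": model_name,          # required: SageMaker Model entity name
        "variantName": variant_name,      # required: name of the variant
        "infrastructureConfig": {         # required: deployment infrastructure
            "infrastructureType": "RealTimeInference",  # only supported value
            "realTimeInferenceConfig": {
                "instanceType": instance_type,   # e.g. an ml.* instance type
                "instanceCount": instance_count, # number of instances
            },
        },
    }

variant = build_model_variant("my-model", "prod", "ml.m5.xlarge", 2)
```

The same structure, passed as typed args classes instead of a dict, is what `model_variants` expects in the Python constructor shown earlier.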

InferenceExperimentRealTimeInferenceConfig
, InferenceExperimentRealTimeInferenceConfigArgs

C#
InstanceCount This property is required. int
The number of instances of the type specified by InstanceType.
InstanceType This property is required. string
The instance type the model is deployed to.

Go
InstanceCount This property is required. int
The number of instances of the type specified by InstanceType.
InstanceType This property is required. string
The instance type the model is deployed to.

Java
instanceCount This property is required. Integer
The number of instances of the type specified by InstanceType.
instanceType This property is required. String
The instance type the model is deployed to.

TypeScript
instanceCount This property is required. number
The number of instances of the type specified by InstanceType.
instanceType This property is required. string
The instance type the model is deployed to.

Python
instance_count This property is required. int
The number of instances of the type specified by InstanceType.
instance_type This property is required. str
The instance type the model is deployed to.

YAML
instanceCount This property is required. Number
The number of instances of the type specified by InstanceType.
instanceType This property is required. String
The instance type the model is deployed to.

InferenceExperimentSchedule
, InferenceExperimentScheduleArgs

C#
EndTime string
The timestamp at which the inference experiment ended or will end.
StartTime string
The timestamp at which the inference experiment started or will start.

Go
EndTime string
The timestamp at which the inference experiment ended or will end.
StartTime string
The timestamp at which the inference experiment started or will start.

Java
endTime String
The timestamp at which the inference experiment ended or will end.
startTime String
The timestamp at which the inference experiment started or will start.

TypeScript
endTime string
The timestamp at which the inference experiment ended or will end.
startTime string
The timestamp at which the inference experiment started or will start.

Python
end_time str
The timestamp at which the inference experiment ended or will end.
start_time str
The timestamp at which the inference experiment started or will start.

YAML
endTime String
The timestamp at which the inference experiment ended or will end.
startTime String
The timestamp at which the inference experiment started or will start.
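The schedule is a pair of string timestamps bounding the experiment window. A minimal sketch, assuming ISO 8601 strings are accepted (check the AWS::SageMaker::InferenceExperiment CloudFormation docs for the exact timestamp format):

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch: a one-week experiment window expressed as ISO 8601
# timestamps (assumed format); the dict mirrors the YAML/"Property Map" shape.
start = datetime(2025, 5, 1, tzinfo=timezone.utc)
end = start + timedelta(days=7)

schedule = {
    "startTime": start.isoformat(),  # when the experiment starts
    "endTime": end.isoformat(),      # when the experiment ends
}
```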

InferenceExperimentShadowModeConfig
, InferenceExperimentShadowModeConfigArgs

C#
ShadowModelVariants This property is required. List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModelVariantConfig>
List of shadow variant configurations.
SourceModelVariantName This property is required. string
The name of the production variant, which takes all the inference requests.

Go
ShadowModelVariants This property is required. []InferenceExperimentShadowModelVariantConfig
List of shadow variant configurations.
SourceModelVariantName This property is required. string
The name of the production variant, which takes all the inference requests.

Java
shadowModelVariants This property is required. List<InferenceExperimentShadowModelVariantConfig>
List of shadow variant configurations.
sourceModelVariantName This property is required. String
The name of the production variant, which takes all the inference requests.

TypeScript
shadowModelVariants This property is required. InferenceExperimentShadowModelVariantConfig[]
List of shadow variant configurations.
sourceModelVariantName This property is required. string
The name of the production variant, which takes all the inference requests.

Python
shadow_model_variants This property is required. Sequence[InferenceExperimentShadowModelVariantConfig]
List of shadow variant configurations.
source_model_variant_name This property is required. str
The name of the production variant, which takes all the inference requests.

YAML
shadowModelVariants This property is required. List<Property Map>
List of shadow variant configurations.
sourceModelVariantName This property is required. String
The name of the production variant, which takes all the inference requests.

InferenceExperimentShadowModelVariantConfig
, InferenceExperimentShadowModelVariantConfigArgs

C#
SamplingPercentage This property is required. int
The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
ShadowModelVariantName This property is required. string
The name of the shadow variant.

Go
SamplingPercentage This property is required. int
The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
ShadowModelVariantName This property is required. string
The name of the shadow variant.

Java
samplingPercentage This property is required. Integer
The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
shadowModelVariantName This property is required. String
The name of the shadow variant.

TypeScript
samplingPercentage This property is required. number
The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
shadowModelVariantName This property is required. string
The name of the shadow variant.

Python
sampling_percentage This property is required. int
The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
shadow_model_variant_name This property is required. str
The name of the shadow variant.

YAML
samplingPercentage This property is required. Number
The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
shadowModelVariantName This property is required. String
The name of the shadow variant.
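A shadow-mode config names the production variant (which serves all requests) and lists the shadow variants, each with a sampling percentage of replicated traffic. The sketch below builds the YAML/"Property Map" shape as a plain dict; the helper name and the range check are illustrative, not part of the Pulumi SDK, though a percentage must plausibly fall between 0 and 100.

```python
# Illustrative sketch (not the Pulumi SDK): shadow-mode config with a single
# shadow variant and a simple range check on samplingPercentage.
def shadow_mode_config(source_variant: str, shadow_variant: str,
                       sampling_percentage: int) -> dict:
    if not 0 <= sampling_percentage <= 100:
        raise ValueError("samplingPercentage must be between 0 and 100")
    return {
        # Production variant that takes all the inference requests.
        "sourceModelVariantName": source_variant,
        "shadowModelVariants": [
            {
                "shadowModelVariantName": shadow_variant,
                # Percent of production requests replicated to the shadow variant.
                "samplingPercentage": sampling_percentage,
            }
        ],
    }

cfg = shadow_mode_config("prod", "shadow", 30)
```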

InferenceExperimentStatus
, InferenceExperimentStatusArgs

The status takes one of eight values — Creating, Created, Updating, Starting, Stopping, Running, Completed, Cancelled — exposed per language as:

C#, Java, TypeScript: Creating, Created, Updating, Starting, Stopping, Running, Completed, Cancelled
Go: InferenceExperimentStatusCreating, InferenceExperimentStatusCreated, InferenceExperimentStatusUpdating, InferenceExperimentStatusStarting, InferenceExperimentStatusStopping, InferenceExperimentStatusRunning, InferenceExperimentStatusCompleted, InferenceExperimentStatusCancelled
Python: CREATING, CREATED, UPDATING, STARTING, STOPPING, RUNNING, COMPLETED, CANCELLED
YAML: "Creating", "Created", "Updating", "Starting", "Stopping", "Running", "Completed", "Cancelled"

Each enum member maps to the string value of the same spelling (for example, Python's CREATING is the string "Creating").

InferenceExperimentType
, InferenceExperimentTypeArgs

ShadowMode is the only supported value; it is exposed per language as:

C#, Java, TypeScript: ShadowMode
Go: InferenceExperimentTypeShadowMode
Python: SHADOW_MODE
YAML: "ShadowMode"

All map to the string value ShadowMode.

Tag
, TagArgs

C#
Key This property is required. string
The key name of the tag.
Value This property is required. string
The value of the tag.

Go
Key This property is required. string
The key name of the tag.
Value This property is required. string
The value of the tag.

Java
key This property is required. String
The key name of the tag.
value This property is required. String
The value of the tag.

TypeScript
key This property is required. string
The key name of the tag.
value This property is required. string
The value of the tag.

Python
key This property is required. str
The key name of the tag.
value This property is required. str
The value of the tag.

YAML
key This property is required. String
The key name of the tag.
value This property is required. String
The value of the tag.
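Tags are a list of key/value maps rather than a single dict. A small illustrative helper (not part of the Pulumi SDK) that converts a plain dict into the list shape this resource expects:

```python
# Illustrative helper: convert a plain dict of tags into the list of
# {"key", "value"} maps used by the tags property (YAML/"Property Map" shape).
def to_tag_list(tags: dict) -> list:
    # Sorted for a deterministic order; AWS does not require any ordering.
    return [{"key": k, "value": v} for k, v in sorted(tags.items())]

tags = to_tag_list({"team": "ml", "env": "prod"})
```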

Package Details

Repository
AWS Native pulumi/pulumi-aws-native
License
Apache-2.0
