We recommend new projects start with resources from the AWS provider.
aws-native.sagemaker.InferenceExperiment
Resource Type definition for AWS::SageMaker::InferenceExperiment
Create InferenceExperiment Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new InferenceExperiment(name: string, args: InferenceExperimentArgs, opts?: CustomResourceOptions);
@overload
def InferenceExperiment(resource_name: str,
                        args: InferenceExperimentArgs,
                        opts: Optional[ResourceOptions] = None)
@overload
def InferenceExperiment(resource_name: str,
                        opts: Optional[ResourceOptions] = None,
                        model_variants: Optional[Sequence[InferenceExperimentModelVariantConfigArgs]] = None,
                        type: Optional[InferenceExperimentType] = None,
                        role_arn: Optional[str] = None,
                        endpoint_name: Optional[str] = None,
                        name: Optional[str] = None,
                        kms_key: Optional[str] = None,
                        data_storage_config: Optional[InferenceExperimentDataStorageConfigArgs] = None,
                        desired_state: Optional[InferenceExperimentDesiredState] = None,
                        schedule: Optional[InferenceExperimentScheduleArgs] = None,
                        shadow_mode_config: Optional[InferenceExperimentShadowModeConfigArgs] = None,
                        status_reason: Optional[str] = None,
                        tags: Optional[Sequence[_root_inputs.TagArgs]] = None,
                        description: Optional[str] = None)
func NewInferenceExperiment(ctx *Context, name string, args InferenceExperimentArgs, opts ...ResourceOption) (*InferenceExperiment, error)
public InferenceExperiment(string name, InferenceExperimentArgs args, CustomResourceOptions? opts = null)
public InferenceExperiment(String name, InferenceExperimentArgs args)
public InferenceExperiment(String name, InferenceExperimentArgs args, CustomResourceOptions options)
type: aws-native:sagemaker:InferenceExperiment
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args InferenceExperimentArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args InferenceExperimentArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args InferenceExperimentArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args InferenceExperimentArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args InferenceExperimentArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
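To make the constructor shapes above concrete, here is a minimal, hypothetical Pulumi YAML program for a shadow-mode experiment. The endpoint, role, model, and variant names are placeholders that must exist in your account, and the field names inside `shadowModelVariants` are assumed from the SageMaker API rather than taken from this page:

```yaml
# Hypothetical sketch; all names and ARNs are placeholders.
resources:
  shadowTest:
    type: aws-native:sagemaker:InferenceExperiment
    properties:
      name: my-shadow-experiment
      type: ShadowMode
      endpointName: my-endpoint
      roleArn: arn:aws:iam::111122223333:role/sagemaker-execution-role
      modelVariants:
        - modelName: my-model
          variantName: shadow-variant
          infrastructureConfig:
            infrastructureType: RealTimeInference
            realTimeInferenceConfig:
              instanceType: ml.m5.xlarge
              instanceCount: 1
      shadowModeConfig:
        sourceModelVariantName: production-variant
        shadowModelVariants:
          - shadowModelVariantName: shadow-variant   # assumed field names
            samplingPercentage: 50
outputs:
  experimentArn: ${shadowTest.arn}
```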
InferenceExperiment Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
The InferenceExperiment resource accepts the following input properties:
- EndpointName string
- The name of the endpoint.
- ModelVariants List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelVariantConfig>
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- RoleArn string
- The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
- Type Pulumi.AwsNative.SageMaker.InferenceExperimentType
- The type of the inference experiment that you want to run.
- DataStorageConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- Description string
- The description of the inference experiment.
- DesiredState Pulumi.AwsNative.SageMaker.InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- KmsKey string
- The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
- Name string
- The name for the inference experiment.
- Schedule Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- ShadowModeConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModeConfig
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- StatusReason string
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- Tags List<Pulumi.AwsNative.Inputs.Tag>
- An array of key-value pairs to apply to this resource.
- EndpointName string
- The name of the endpoint.
- ModelVariants []InferenceExperimentModelVariantConfigArgs
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- RoleArn string
- The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
- Type InferenceExperimentType
- The type of the inference experiment that you want to run.
- DataStorageConfig InferenceExperimentDataStorageConfigArgs
- The Amazon S3 location and configuration for storing inference request and response data.
- Description string
- The description of the inference experiment.
- DesiredState InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- KmsKey string
- The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
- Name string
- The name for the inference experiment.
- Schedule InferenceExperimentScheduleArgs
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- ShadowModeConfig InferenceExperimentShadowModeConfigArgs
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- StatusReason string
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- Tags []TagArgs
- An array of key-value pairs to apply to this resource.
- endpointName String
- The name of the endpoint.
- modelVariants List<InferenceExperimentModelVariantConfig>
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- roleArn String
- The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
- type InferenceExperimentType
- The type of the inference experiment that you want to run.
- dataStorageConfig InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- description String
- The description of the inference experiment.
- desiredState InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- kmsKey String
- The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
- name String
- The name for the inference experiment.
- schedule InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadowModeConfig InferenceExperimentShadowModeConfig
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- statusReason String
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- tags List<Tag>
- An array of key-value pairs to apply to this resource.
- endpointName string
- The name of the endpoint.
- modelVariants InferenceExperimentModelVariantConfig[]
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- roleArn string
- The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
- type InferenceExperimentType
- The type of the inference experiment that you want to run.
- dataStorageConfig InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- description string
- The description of the inference experiment.
- desiredState InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- kmsKey string
- The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
- name string
- The name for the inference experiment.
- schedule InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadowModeConfig InferenceExperimentShadowModeConfig
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- statusReason string
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- tags Tag[]
- An array of key-value pairs to apply to this resource.
- endpoint_name str
- The name of the endpoint.
- model_variants Sequence[InferenceExperimentModelVariantConfigArgs]
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- role_arn str
- The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
- type InferenceExperimentType
- The type of the inference experiment that you want to run.
- data_storage_config InferenceExperimentDataStorageConfigArgs
- The Amazon S3 location and configuration for storing inference request and response data.
- description str
- The description of the inference experiment.
- desired_state InferenceExperimentDesiredState
- The desired state of the experiment after starting or stopping operation.
- kms_key str
- The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
- name str
- The name for the inference experiment.
- schedule InferenceExperimentScheduleArgs
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadow_mode_config InferenceExperimentShadowModeConfigArgs
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- status_reason str
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- tags Sequence[TagArgs]
- An array of key-value pairs to apply to this resource.
- endpointName String
- The name of the endpoint.
- modelVariants List<Property Map>
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- roleArn String
- The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
- type
"ShadowMode" 
- The type of the inference experiment that you want to run.
- dataStorageConfig Property Map
- The Amazon S3 location and configuration for storing inference request and response data.
- description String
- The description of the inference experiment.
- desiredState "Running" | "Completed" | "Cancelled"
- The desired state of the experiment after starting or stopping operation.
- kmsKey String
- The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
- name String
- The name for the inference experiment.
- schedule Property Map
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadowModeConfig Property Map
- The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- statusReason String
- The error message or client-specified reason from the StopInferenceExperiment API, that explains the status of the inference experiment.
- tags List<Property Map>
- An array of key-value pairs to apply to this resource.
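As a sketch of the YAML flavor above, where nested object inputs are plain property maps, a hypothetical `dataStorageConfig` and `schedule` fragment might look like the following (bucket, key alias, and timestamps are placeholders):

```yaml
# Hypothetical property fragments for the inputs listed above.
dataStorageConfig:
  destination: s3://my-bucket/inference-data   # where request/response data lands
  kmsKey: alias/my-kms-key
  contentType:
    csvContentTypes: ["text/csv"]
    jsonContentTypes: ["application/json"]
schedule:
  startTime: "2024-01-01T00:00:00Z"
  endTime: "2024-01-08T00:00:00Z"
```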
Outputs
All input properties are implicitly available as output properties. Additionally, the InferenceExperiment resource produces the following output properties:
- Arn string
- The Amazon Resource Name (ARN) of the inference experiment.
- CreationTime string
- The timestamp at which you created the inference experiment.
- EndpointMetadata Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentEndpointMetadata
- Id string
- The provider-assigned unique ID for this managed resource.
- LastModifiedTime string
- The timestamp at which you last modified the inference experiment.
- Status Pulumi.AwsNative.SageMaker.InferenceExperimentStatus
- The status of the inference experiment.
- Arn string
- The Amazon Resource Name (ARN) of the inference experiment.
- CreationTime string
- The timestamp at which you created the inference experiment.
- EndpointMetadata InferenceExperimentEndpointMetadata
- Id string
- The provider-assigned unique ID for this managed resource.
- LastModifiedTime string
- The timestamp at which you last modified the inference experiment.
- Status InferenceExperimentStatus
- The status of the inference experiment.
- arn String
- The Amazon Resource Name (ARN) of the inference experiment.
- creationTime String
- The timestamp at which you created the inference experiment.
- endpointMetadata InferenceExperimentEndpointMetadata
- id String
- The provider-assigned unique ID for this managed resource.
- lastModifiedTime String
- The timestamp at which you last modified the inference experiment.
- status InferenceExperimentStatus
- The status of the inference experiment.
- arn string
- The Amazon Resource Name (ARN) of the inference experiment.
- creationTime string
- The timestamp at which you created the inference experiment.
- endpointMetadata InferenceExperimentEndpointMetadata
- id string
- The provider-assigned unique ID for this managed resource.
- lastModifiedTime string
- The timestamp at which you last modified the inference experiment.
- status InferenceExperimentStatus
- The status of the inference experiment.
- arn str
- The Amazon Resource Name (ARN) of the inference experiment.
- creation_time str
- The timestamp at which you created the inference experiment.
- endpoint_metadata InferenceExperimentEndpointMetadata
- id str
- The provider-assigned unique ID for this managed resource.
- last_modified_time str
- The timestamp at which you last modified the inference experiment.
- status InferenceExperimentStatus
- The status of the inference experiment.
- arn String
- The Amazon Resource Name (ARN) of the inference experiment.
- creationTime String
- The timestamp at which you created the inference experiment.
- endpointMetadata Property Map
- id String
- The provider-assigned unique ID for this managed resource.
- lastModifiedTime String
- The timestamp at which you last modified the inference experiment.
- status "Creating" | "Created" | "Updating" | "Starting" | "Stopping" | "Running" | "Completed" | "Cancelled"
- The status of the inference experiment.
Supporting Types
InferenceExperimentCaptureContentTypeHeader, InferenceExperimentCaptureContentTypeHeaderArgs            
- CsvContentTypes List<string>
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- JsonContentTypes List<string>
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- CsvContentTypes []string
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- JsonContentTypes []string
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- csvContentTypes List<String>
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- jsonContentTypes List<String>
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- csvContentTypes string[]
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- jsonContentTypes string[]
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- csv_content_types Sequence[str]
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- json_content_types Sequence[str]
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
- csvContentTypes List<String>
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- jsonContentTypes List<String>
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
InferenceExperimentDataStorageConfig, InferenceExperimentDataStorageConfigArgs          
- Destination string
- The Amazon S3 bucket where the inference request and response data is stored.
- ContentType Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- KmsKey string
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- Destination string
- The Amazon S3 bucket where the inference request and response data is stored.
- ContentType InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- KmsKey string
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- destination String
- The Amazon S3 bucket where the inference request and response data is stored.
- contentType InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- kmsKey String
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- destination string
- The Amazon S3 bucket where the inference request and response data is stored.
- contentType InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- kmsKey string
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- destination str
- The Amazon S3 bucket where the inference request and response data is stored.
- content_type InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- kms_key str
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
- destination String
- The Amazon S3 bucket where the inference request and response data is stored.
- contentType Property Map
- Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
- kmsKey String
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
InferenceExperimentDesiredState, InferenceExperimentDesiredStateArgs        
- Running
- Running
- Completed
- Completed
- Cancelled
- Cancelled
- InferenceExperimentDesiredStateRunning
- Running
- InferenceExperimentDesiredStateCompleted
- Completed
- InferenceExperimentDesiredStateCancelled
- Cancelled
- Running
- Running
- Completed
- Completed
- Cancelled
- Cancelled
- Running
- Running
- Completed
- Completed
- Cancelled
- Cancelled
- RUNNING
- Running
- COMPLETED
- Completed
- CANCELLED
- Cancelled
- "Running"
- Running
- "Completed"
- Completed
- "Cancelled"
- Cancelled
InferenceExperimentEndpointMetadata, InferenceExperimentEndpointMetadataArgs        
- EndpointName string
- The name of the endpoint.
- EndpointConfigName string
- The name of the endpoint configuration.
- EndpointStatus Pulumi.AwsNative.SageMaker.InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For the possible values, see InferenceExperimentEndpointMetadataEndpointStatus below.
- EndpointName string
- The name of the endpoint.
- EndpointConfigName string
- The name of the endpoint configuration.
- EndpointStatus InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For the possible values, see InferenceExperimentEndpointMetadataEndpointStatus below.
- endpointName String
- The name of the endpoint.
- endpointConfigName String
- The name of the endpoint configuration.
- endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For the possible values, see InferenceExperimentEndpointMetadataEndpointStatus below.
- endpointName string
- The name of the endpoint.
- endpointConfigName string
- The name of the endpoint configuration.
- endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For the possible values, see InferenceExperimentEndpointMetadataEndpointStatus below.
- endpoint_name str
- The name of the endpoint.
- endpoint_config_name str
- The name of the endpoint configuration.
- endpoint_status InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For the possible values, see InferenceExperimentEndpointMetadataEndpointStatus below.
- endpointName String
- The name of the endpoint.
- endpointConfigName String
- The name of the endpoint configuration.
- endpointStatus "Creating" | "Updating" | "SystemUpdating" | "RollingBack" | "InService" | "OutOfService" | "Deleting" | "Failed"
- The status of the endpoint. For the possible values, see InferenceExperimentEndpointMetadataEndpointStatus below.
InferenceExperimentEndpointMetadataEndpointStatus, InferenceExperimentEndpointMetadataEndpointStatusArgs            
- Creating
- Creating
- Updating
- Updating
- SystemUpdating 
- SystemUpdating
- RollingBack 
- RollingBack
- InService 
- InService
- OutOfService
- OutOfService
- Deleting
- Deleting
- Failed
- Failed
- InferenceExperimentEndpointMetadataEndpointStatusCreating
- Creating
- InferenceExperimentEndpointMetadataEndpointStatusUpdating
- Updating
- InferenceExperimentEndpointMetadataEndpointStatusSystemUpdating
- SystemUpdating
- InferenceExperimentEndpointMetadataEndpointStatusRollingBack
- RollingBack
- InferenceExperimentEndpointMetadataEndpointStatusInService
- InService
- InferenceExperimentEndpointMetadataEndpointStatusOutOfService
- OutOfService
- InferenceExperimentEndpointMetadataEndpointStatusDeleting
- Deleting
- InferenceExperimentEndpointMetadataEndpointStatusFailed
- Failed
- Creating
- Creating
- Updating
- Updating
- SystemUpdating 
- SystemUpdating
- RollingBack 
- RollingBack
- InService 
- InService
- OutOfService
- OutOfService
- Deleting
- Deleting
- Failed
- Failed
- Creating
- Creating
- Updating
- Updating
- SystemUpdating 
- SystemUpdating
- RollingBack 
- RollingBack
- InService 
- InService
- OutOfService
- OutOfService
- Deleting
- Deleting
- Failed
- Failed
- CREATING
- Creating
- UPDATING
- Updating
- SYSTEM_UPDATING
- SystemUpdating
- ROLLING_BACK
- RollingBack
- IN_SERVICE
- InService
- OUT_OF_SERVICE
- OutOfService
- DELETING
- Deleting
- FAILED
- Failed
- "Creating"
- Creating
- "Updating"
- Updating
- "SystemUpdating" 
- SystemUpdating
- "RollingBack" 
- RollingBack
- "InService" 
- InService
- "OutOfService"
- OutOfService
- "Deleting"
- Deleting
- "Failed"
- Failed
InferenceExperimentModelInfrastructureConfig, InferenceExperimentModelInfrastructureConfigArgs          
- InfrastructureType Pulumi.AwsNative.SageMaker.InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- RealTimeInferenceConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- InfrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- RealTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- infrastructure_type InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- real_time_inference_config InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
- infrastructureType "RealTimeInference"
- The type of the inference experiment that you want to run.
- realTimeInferenceConfig Property Map
- The infrastructure configuration for deploying the model to real-time inference.
InferenceExperimentModelInfrastructureConfigInfrastructureType, InferenceExperimentModelInfrastructureConfigInfrastructureTypeArgs              
- RealTimeInference
- RealTimeInference
- InferenceExperimentModelInfrastructureConfigInfrastructureTypeRealTimeInference
- RealTimeInference
- RealTimeInference
- RealTimeInference
- RealTimeInference
- RealTimeInference
- REAL_TIME_INFERENCE
- RealTimeInference
- "RealTimeInference"
- RealTimeInference
InferenceExperimentModelVariantConfig, InferenceExperimentModelVariantConfigArgs          
- InfrastructureConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- ModelName string
- The name of the Amazon SageMaker Model entity.
- VariantName string
- The name of the variant.
- InfrastructureConfig InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- ModelName string
- The name of the Amazon SageMaker Model entity.
- VariantName string
- The name of the variant.
- infrastructureConfig InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- modelName String
- The name of the Amazon SageMaker Model entity.
- variantName String
- The name of the variant.
- infrastructureConfig InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- modelName string
- The name of the Amazon SageMaker Model entity.
- variantName string
- The name of the variant.
- infrastructure_config InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- model_name str
- The name of the Amazon SageMaker Model entity.
- variant_name str
- The name of the variant.
- infrastructureConfig Property Map
- The configuration for the infrastructure that the model will be deployed to.
- modelName String
- The name of the Amazon SageMaker Model entity.
- variantName String
- The name of the variant.
InferenceExperimentRealTimeInferenceConfig, InferenceExperimentRealTimeInferenceConfigArgs            
- InstanceCount int
- The number of instances of the type specified by InstanceType.
- InstanceType string
- The instance type the model is deployed to.
- InstanceCount int
- The number of instances of the type specified by InstanceType.
- InstanceType string
- The instance type the model is deployed to.
- instanceCount Integer
- The number of instances of the type specified by InstanceType.
- instanceType String
- The instance type the model is deployed to.
- instanceCount number
- The number of instances of the type specified by InstanceType.
- instanceType string
- The instance type the model is deployed to.
- instance_count int
- The number of instances of the type specified by InstanceType.
- instance_type str
- The instance type the model is deployed to.
- instanceCount Number
- The number of instances of the type specified by InstanceType.
- instanceType String
- The instance type the model is deployed to.
InferenceExperimentSchedule, InferenceExperimentScheduleArgs      
- end_time str
- The timestamp at which the inference experiment ended or will end.
- start_time str
- The timestamp at which the inference experiment started or will start.
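The schedule bounds the experiment with a start and an end timestamp. A sketch that builds the two strings with the standard library; ISO-8601 formatting and the one-week duration are assumptions, since this chunk does not state the exact timestamp format the API expects:

```python
from datetime import datetime, timedelta, timezone

# Run the experiment for one week starting now (duration is illustrative).
start = datetime.now(timezone.utc)
end = start + timedelta(days=7)

schedule = {
    "start_time": start.isoformat(),  # when the experiment starts
    "end_time": end.isoformat(),      # when the experiment ends or will end
}
```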
InferenceExperimentShadowModeConfig, InferenceExperimentShadowModeConfigArgs          
- ShadowModelVariants List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModelVariantConfig>
- List of shadow variant configurations.
- SourceModelVariantName string
- The name of the production variant, which takes all the inference requests.
- ShadowModelVariants []InferenceExperimentShadowModelVariantConfig
- List of shadow variant configurations.
- SourceModelVariantName string
- The name of the production variant, which takes all the inference requests.
- shadowModelVariants List<InferenceExperimentShadowModelVariantConfig>
- List of shadow variant configurations.
- sourceModelVariantName String
- The name of the production variant, which takes all the inference requests.
- shadowModelVariants InferenceExperimentShadowModelVariantConfig[]
- List of shadow variant configurations.
- sourceModelVariantName string
- The name of the production variant, which takes all the inference requests.
- shadow_model_variants Sequence[InferenceExperimentShadowModelVariantConfig]
- List of shadow variant configurations.
- source_model_variant_name str
- The name of the production variant, which takes all the inference requests.
- shadowModelVariants List<Property Map>
- List of shadow variant configurations.
- sourceModelVariantName String
- The name of the production variant, which takes all the inference requests.
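A shadow-mode config designates one production variant to receive all inference requests and lists shadow variants that each replicate a percentage of that traffic. A dict sketch in the Property Map form (the variant names and sampling percentage are illustrative):

```python
# InferenceExperimentShadowModeConfig shape (Property Map form).
# Variant names and percentage are illustrative.
shadow_mode_config = {
    "sourceModelVariantName": "production",  # variant that takes all inference requests
    "shadowModelVariants": [                 # list of shadow variant configurations
        {
            "shadowModelVariantName": "shadow-v2",
            "samplingPercentage": 50,  # replicate 50% of production requests to the shadow
        },
    ],
}
```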
InferenceExperimentShadowModelVariantConfig, InferenceExperimentShadowModelVariantConfigArgs            
- SamplingPercentage int
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- ShadowModelVariantName string
- The name of the shadow variant.
- SamplingPercentage int
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- ShadowModelVariantName string
- The name of the shadow variant.
- samplingPercentage Integer
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadowModelVariantName String
- The name of the shadow variant.
- samplingPercentage number
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadowModelVariantName string
- The name of the shadow variant.
- sampling_percentage int
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadow_model_variant_name str
- The name of the shadow variant.
- samplingPercentage Number
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadowModelVariantName String
- The name of the shadow variant.
InferenceExperimentStatus, InferenceExperimentStatusArgs      
- Creating
- Creating
- Created
- Created
- Updating
- Updating
- Starting
- Starting
- Stopping
- Stopping
- Running
- Running
- Completed
- Completed
- Cancelled
- Cancelled
- InferenceExperimentStatusCreating
- Creating
- InferenceExperimentStatusCreated
- Created
- InferenceExperimentStatusUpdating
- Updating
- InferenceExperimentStatusStarting
- Starting
- InferenceExperimentStatusStopping
- Stopping
- InferenceExperimentStatusRunning
- Running
- InferenceExperimentStatusCompleted
- Completed
- InferenceExperimentStatusCancelled
- Cancelled
- Creating
- Creating
- Created
- Created
- Updating
- Updating
- Starting
- Starting
- Stopping
- Stopping
- Running
- Running
- Completed
- Completed
- Cancelled
- Cancelled
- Creating
- Creating
- Created
- Created
- Updating
- Updating
- Starting
- Starting
- Stopping
- Stopping
- Running
- Running
- Completed
- Completed
- Cancelled
- Cancelled
- CREATING
- Creating
- CREATED
- Created
- UPDATING
- Updating
- STARTING
- Starting
- STOPPING
- Stopping
- RUNNING
- Running
- COMPLETED
- Completed
- CANCELLED
- Cancelled
- "Creating"
- Creating
- "Created"
- Created
- "Updating"
- Updating
- "Starting"
- Starting
- "Stopping"
- Stopping
- "Running"
- Running
- "Completed"
- Completed
- "Cancelled"
- Cancelled
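The status values above trace the experiment lifecycle. As a small helper, one might treat Completed and Cancelled as terminal states; note this grouping is an inference from the value names, not something this schema states:

```python
# Status values listed on this page; the terminal/in-flight split is an assumption.
ALL_STATUSES = {
    "Creating", "Created", "Updating", "Starting",
    "Stopping", "Running", "Completed", "Cancelled",
}
TERMINAL = {"Completed", "Cancelled"}  # assumed: no transitions out of these

def is_terminal(status: str) -> bool:
    """Return True if the experiment is assumed to no longer change state."""
    return status in TERMINAL
```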
InferenceExperimentType, InferenceExperimentTypeArgs      
- ShadowMode 
- ShadowMode
- InferenceExperimentTypeShadowMode
- ShadowMode
- ShadowMode 
- ShadowMode
- ShadowMode 
- ShadowMode
- SHADOW_MODE
- ShadowMode
- "ShadowMode" 
- ShadowMode
Tag, TagArgs  
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0