akamai.Datastream
Create Datastream Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new Datastream(name: string, args: DatastreamArgs, opts?: CustomResourceOptions);
@overload
def Datastream(resource_name: str,
               args: DatastreamArgs,
               opts: Optional[ResourceOptions] = None)
@overload
def Datastream(resource_name: str,
               opts: Optional[ResourceOptions] = None,
               properties: Optional[Sequence[str]] = None,
               active: Optional[bool] = None,
               stream_name: Optional[str] = None,
               contract_id: Optional[str] = None,
               group_id: Optional[str] = None,
               dataset_fields: Optional[Sequence[int]] = None,
               delivery_configuration: Optional[DatastreamDeliveryConfigurationArgs] = None,
               splunk_connector: Optional[DatastreamSplunkConnectorArgs] = None,
               sumologic_connector: Optional[DatastreamSumologicConnectorArgs] = None,
               azure_connector: Optional[DatastreamAzureConnectorArgs] = None,
               elasticsearch_connector: Optional[DatastreamElasticsearchConnectorArgs] = None,
               loggly_connector: Optional[DatastreamLogglyConnectorArgs] = None,
               new_relic_connector: Optional[DatastreamNewRelicConnectorArgs] = None,
               notification_emails: Optional[Sequence[str]] = None,
               oracle_connector: Optional[DatastreamOracleConnectorArgs] = None,
               https_connector: Optional[DatastreamHttpsConnectorArgs] = None,
               s3_connector: Optional[DatastreamS3ConnectorArgs] = None,
               datadog_connector: Optional[DatastreamDatadogConnectorArgs] = None,
               collect_midgress: Optional[bool] = None,
               gcs_connector: Optional[DatastreamGcsConnectorArgs] = None)
func NewDatastream(ctx *Context, name string, args DatastreamArgs, opts ...ResourceOption) (*Datastream, error)
public Datastream(string name, DatastreamArgs args, CustomResourceOptions? opts = null)
public Datastream(String name, DatastreamArgs args)
public Datastream(String name, DatastreamArgs args, CustomResourceOptions options)
type: akamai:Datastream
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
TypeScript (JavaScript)
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
Python
- resource_name str
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
Go
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
C#
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
Java
- name String
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Constructor example
The following reference example uses placeholder values for all input properties.
var datastreamResource = new Akamai.Datastream("datastreamResource", new()
{
    Properties = new[]
    {
        "string",
    },
    Active = false,
    StreamName = "string",
    ContractId = "string",
    GroupId = "string",
    DatasetFields = new[]
    {
        0,
    },
    DeliveryConfiguration = new Akamai.Inputs.DatastreamDeliveryConfigurationArgs
    {
        Format = "string",
        Frequency = new Akamai.Inputs.DatastreamDeliveryConfigurationFrequencyArgs
        {
            IntervalInSecs = 0,
        },
        FieldDelimiter = "string",
        UploadFilePrefix = "string",
        UploadFileSuffix = "string",
    },
    SplunkConnector = new Akamai.Inputs.DatastreamSplunkConnectorArgs
    {
        DisplayName = "string",
        Endpoint = "string",
        EventCollectorToken = "string",
        CaCert = "string",
        ClientCert = "string",
        ClientKey = "string",
        CompressLogs = false,
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
        MTls = false,
        TlsHostname = "string",
    },
    SumologicConnector = new Akamai.Inputs.DatastreamSumologicConnectorArgs
    {
        CollectorCode = "string",
        DisplayName = "string",
        Endpoint = "string",
        CompressLogs = false,
        ContentType = "string",
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
    },
    AzureConnector = new Akamai.Inputs.DatastreamAzureConnectorArgs
    {
        AccessKey = "string",
        AccountName = "string",
        ContainerName = "string",
        DisplayName = "string",
        Path = "string",
        CompressLogs = false,
    },
    ElasticsearchConnector = new Akamai.Inputs.DatastreamElasticsearchConnectorArgs
    {
        DisplayName = "string",
        UserName = "string",
        Password = "string",
        IndexName = "string",
        Endpoint = "string",
        ContentType = "string",
        CustomHeaderValue = "string",
        CustomHeaderName = "string",
        CaCert = "string",
        MTls = false,
        ClientKey = "string",
        TlsHostname = "string",
        ClientCert = "string",
    },
    LogglyConnector = new Akamai.Inputs.DatastreamLogglyConnectorArgs
    {
        AuthToken = "string",
        DisplayName = "string",
        Endpoint = "string",
        ContentType = "string",
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
        Tags = "string",
    },
    NewRelicConnector = new Akamai.Inputs.DatastreamNewRelicConnectorArgs
    {
        AuthToken = "string",
        DisplayName = "string",
        Endpoint = "string",
        ContentType = "string",
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
    },
    NotificationEmails = new[]
    {
        "string",
    },
    OracleConnector = new Akamai.Inputs.DatastreamOracleConnectorArgs
    {
        AccessKey = "string",
        Bucket = "string",
        DisplayName = "string",
        Namespace = "string",
        Path = "string",
        Region = "string",
        SecretAccessKey = "string",
        CompressLogs = false,
    },
    HttpsConnector = new Akamai.Inputs.DatastreamHttpsConnectorArgs
    {
        AuthenticationType = "string",
        Endpoint = "string",
        DisplayName = "string",
        ClientKey = "string",
        CompressLogs = false,
        ContentType = "string",
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
        ClientCert = "string",
        CaCert = "string",
        MTls = false,
        Password = "string",
        TlsHostname = "string",
        UserName = "string",
    },
    S3Connector = new Akamai.Inputs.DatastreamS3ConnectorArgs
    {
        AccessKey = "string",
        Bucket = "string",
        DisplayName = "string",
        Path = "string",
        Region = "string",
        SecretAccessKey = "string",
        CompressLogs = false,
    },
    DatadogConnector = new Akamai.Inputs.DatastreamDatadogConnectorArgs
    {
        AuthToken = "string",
        DisplayName = "string",
        Endpoint = "string",
        CompressLogs = false,
        Service = "string",
        Source = "string",
        Tags = "string",
    },
    CollectMidgress = false,
    GcsConnector = new Akamai.Inputs.DatastreamGcsConnectorArgs
    {
        Bucket = "string",
        DisplayName = "string",
        PrivateKey = "string",
        ProjectId = "string",
        ServiceAccountName = "string",
        CompressLogs = false,
        Path = "string",
    },
});
example, err := akamai.NewDatastream(ctx, "datastreamResource", &akamai.DatastreamArgs{
	Properties: pulumi.StringArray{
		pulumi.String("string"),
	},
	Active:     pulumi.Bool(false),
	StreamName: pulumi.String("string"),
	ContractId: pulumi.String("string"),
	GroupId:    pulumi.String("string"),
	DatasetFields: pulumi.IntArray{
		pulumi.Int(0),
	},
	DeliveryConfiguration: &akamai.DatastreamDeliveryConfigurationArgs{
		Format: pulumi.String("string"),
		Frequency: &akamai.DatastreamDeliveryConfigurationFrequencyArgs{
			IntervalInSecs: pulumi.Int(0),
		},
		FieldDelimiter:   pulumi.String("string"),
		UploadFilePrefix: pulumi.String("string"),
		UploadFileSuffix: pulumi.String("string"),
	},
	SplunkConnector: &akamai.DatastreamSplunkConnectorArgs{
		DisplayName:         pulumi.String("string"),
		Endpoint:            pulumi.String("string"),
		EventCollectorToken: pulumi.String("string"),
		CaCert:              pulumi.String("string"),
		ClientCert:          pulumi.String("string"),
		ClientKey:           pulumi.String("string"),
		CompressLogs:        pulumi.Bool(false),
		CustomHeaderName:    pulumi.String("string"),
		CustomHeaderValue:   pulumi.String("string"),
		MTls:                pulumi.Bool(false),
		TlsHostname:         pulumi.String("string"),
	},
	SumologicConnector: &akamai.DatastreamSumologicConnectorArgs{
		CollectorCode:     pulumi.String("string"),
		DisplayName:       pulumi.String("string"),
		Endpoint:          pulumi.String("string"),
		CompressLogs:      pulumi.Bool(false),
		ContentType:       pulumi.String("string"),
		CustomHeaderName:  pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
	},
	AzureConnector: &akamai.DatastreamAzureConnectorArgs{
		AccessKey:     pulumi.String("string"),
		AccountName:   pulumi.String("string"),
		ContainerName: pulumi.String("string"),
		DisplayName:   pulumi.String("string"),
		Path:          pulumi.String("string"),
		CompressLogs:  pulumi.Bool(false),
	},
	ElasticsearchConnector: &akamai.DatastreamElasticsearchConnectorArgs{
		DisplayName:       pulumi.String("string"),
		UserName:          pulumi.String("string"),
		Password:          pulumi.String("string"),
		IndexName:         pulumi.String("string"),
		Endpoint:          pulumi.String("string"),
		ContentType:       pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
		CustomHeaderName:  pulumi.String("string"),
		CaCert:            pulumi.String("string"),
		MTls:              pulumi.Bool(false),
		ClientKey:         pulumi.String("string"),
		TlsHostname:       pulumi.String("string"),
		ClientCert:        pulumi.String("string"),
	},
	LogglyConnector: &akamai.DatastreamLogglyConnectorArgs{
		AuthToken:         pulumi.String("string"),
		DisplayName:       pulumi.String("string"),
		Endpoint:          pulumi.String("string"),
		ContentType:       pulumi.String("string"),
		CustomHeaderName:  pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
		Tags:              pulumi.String("string"),
	},
	NewRelicConnector: &akamai.DatastreamNewRelicConnectorArgs{
		AuthToken:         pulumi.String("string"),
		DisplayName:       pulumi.String("string"),
		Endpoint:          pulumi.String("string"),
		ContentType:       pulumi.String("string"),
		CustomHeaderName:  pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
	},
	NotificationEmails: pulumi.StringArray{
		pulumi.String("string"),
	},
	OracleConnector: &akamai.DatastreamOracleConnectorArgs{
		AccessKey:       pulumi.String("string"),
		Bucket:          pulumi.String("string"),
		DisplayName:     pulumi.String("string"),
		Namespace:       pulumi.String("string"),
		Path:            pulumi.String("string"),
		Region:          pulumi.String("string"),
		SecretAccessKey: pulumi.String("string"),
		CompressLogs:    pulumi.Bool(false),
	},
	HttpsConnector: &akamai.DatastreamHttpsConnectorArgs{
		AuthenticationType: pulumi.String("string"),
		Endpoint:           pulumi.String("string"),
		DisplayName:        pulumi.String("string"),
		ClientKey:          pulumi.String("string"),
		CompressLogs:       pulumi.Bool(false),
		ContentType:        pulumi.String("string"),
		CustomHeaderName:   pulumi.String("string"),
		CustomHeaderValue:  pulumi.String("string"),
		ClientCert:         pulumi.String("string"),
		CaCert:             pulumi.String("string"),
		MTls:               pulumi.Bool(false),
		Password:           pulumi.String("string"),
		TlsHostname:        pulumi.String("string"),
		UserName:           pulumi.String("string"),
	},
	S3Connector: &akamai.DatastreamS3ConnectorArgs{
		AccessKey:       pulumi.String("string"),
		Bucket:          pulumi.String("string"),
		DisplayName:     pulumi.String("string"),
		Path:            pulumi.String("string"),
		Region:          pulumi.String("string"),
		SecretAccessKey: pulumi.String("string"),
		CompressLogs:    pulumi.Bool(false),
	},
	DatadogConnector: &akamai.DatastreamDatadogConnectorArgs{
		AuthToken:    pulumi.String("string"),
		DisplayName:  pulumi.String("string"),
		Endpoint:     pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
		Service:      pulumi.String("string"),
		Source:       pulumi.String("string"),
		Tags:         pulumi.String("string"),
	},
	CollectMidgress: pulumi.Bool(false),
	GcsConnector: &akamai.DatastreamGcsConnectorArgs{
		Bucket:             pulumi.String("string"),
		DisplayName:        pulumi.String("string"),
		PrivateKey:         pulumi.String("string"),
		ProjectId:          pulumi.String("string"),
		ServiceAccountName: pulumi.String("string"),
		CompressLogs:       pulumi.Bool(false),
		Path:               pulumi.String("string"),
	},
})
var datastreamResource = new Datastream("datastreamResource", DatastreamArgs.builder()
    .properties("string")
    .active(false)
    .streamName("string")
    .contractId("string")
    .groupId("string")
    .datasetFields(0)
    .deliveryConfiguration(DatastreamDeliveryConfigurationArgs.builder()
        .format("string")
        .frequency(DatastreamDeliveryConfigurationFrequencyArgs.builder()
            .intervalInSecs(0)
            .build())
        .fieldDelimiter("string")
        .uploadFilePrefix("string")
        .uploadFileSuffix("string")
        .build())
    .splunkConnector(DatastreamSplunkConnectorArgs.builder()
        .displayName("string")
        .endpoint("string")
        .eventCollectorToken("string")
        .caCert("string")
        .clientCert("string")
        .clientKey("string")
        .compressLogs(false)
        .customHeaderName("string")
        .customHeaderValue("string")
        .mTls(false)
        .tlsHostname("string")
        .build())
    .sumologicConnector(DatastreamSumologicConnectorArgs.builder()
        .collectorCode("string")
        .displayName("string")
        .endpoint("string")
        .compressLogs(false)
        .contentType("string")
        .customHeaderName("string")
        .customHeaderValue("string")
        .build())
    .azureConnector(DatastreamAzureConnectorArgs.builder()
        .accessKey("string")
        .accountName("string")
        .containerName("string")
        .displayName("string")
        .path("string")
        .compressLogs(false)
        .build())
    .elasticsearchConnector(DatastreamElasticsearchConnectorArgs.builder()
        .displayName("string")
        .userName("string")
        .password("string")
        .indexName("string")
        .endpoint("string")
        .contentType("string")
        .customHeaderValue("string")
        .customHeaderName("string")
        .caCert("string")
        .mTls(false)
        .clientKey("string")
        .tlsHostname("string")
        .clientCert("string")
        .build())
    .logglyConnector(DatastreamLogglyConnectorArgs.builder()
        .authToken("string")
        .displayName("string")
        .endpoint("string")
        .contentType("string")
        .customHeaderName("string")
        .customHeaderValue("string")
        .tags("string")
        .build())
    .newRelicConnector(DatastreamNewRelicConnectorArgs.builder()
        .authToken("string")
        .displayName("string")
        .endpoint("string")
        .contentType("string")
        .customHeaderName("string")
        .customHeaderValue("string")
        .build())
    .notificationEmails("string")
    .oracleConnector(DatastreamOracleConnectorArgs.builder()
        .accessKey("string")
        .bucket("string")
        .displayName("string")
        .namespace("string")
        .path("string")
        .region("string")
        .secretAccessKey("string")
        .compressLogs(false)
        .build())
    .httpsConnector(DatastreamHttpsConnectorArgs.builder()
        .authenticationType("string")
        .endpoint("string")
        .displayName("string")
        .clientKey("string")
        .compressLogs(false)
        .contentType("string")
        .customHeaderName("string")
        .customHeaderValue("string")
        .clientCert("string")
        .caCert("string")
        .mTls(false)
        .password("string")
        .tlsHostname("string")
        .userName("string")
        .build())
    .s3Connector(DatastreamS3ConnectorArgs.builder()
        .accessKey("string")
        .bucket("string")
        .displayName("string")
        .path("string")
        .region("string")
        .secretAccessKey("string")
        .compressLogs(false)
        .build())
    .datadogConnector(DatastreamDatadogConnectorArgs.builder()
        .authToken("string")
        .displayName("string")
        .endpoint("string")
        .compressLogs(false)
        .service("string")
        .source("string")
        .tags("string")
        .build())
    .collectMidgress(false)
    .gcsConnector(DatastreamGcsConnectorArgs.builder()
        .bucket("string")
        .displayName("string")
        .privateKey("string")
        .projectId("string")
        .serviceAccountName("string")
        .compressLogs(false)
        .path("string")
        .build())
    .build());
datastream_resource = akamai.Datastream("datastreamResource",
    properties=["string"],
    active=False,
    stream_name="string",
    contract_id="string",
    group_id="string",
    dataset_fields=[0],
    delivery_configuration={
        "format": "string",
        "frequency": {
            "interval_in_secs": 0,
        },
        "field_delimiter": "string",
        "upload_file_prefix": "string",
        "upload_file_suffix": "string",
    },
    splunk_connector={
        "display_name": "string",
        "endpoint": "string",
        "event_collector_token": "string",
        "ca_cert": "string",
        "client_cert": "string",
        "client_key": "string",
        "compress_logs": False,
        "custom_header_name": "string",
        "custom_header_value": "string",
        "m_tls": False,
        "tls_hostname": "string",
    },
    sumologic_connector={
        "collector_code": "string",
        "display_name": "string",
        "endpoint": "string",
        "compress_logs": False,
        "content_type": "string",
        "custom_header_name": "string",
        "custom_header_value": "string",
    },
    azure_connector={
        "access_key": "string",
        "account_name": "string",
        "container_name": "string",
        "display_name": "string",
        "path": "string",
        "compress_logs": False,
    },
    elasticsearch_connector={
        "display_name": "string",
        "user_name": "string",
        "password": "string",
        "index_name": "string",
        "endpoint": "string",
        "content_type": "string",
        "custom_header_value": "string",
        "custom_header_name": "string",
        "ca_cert": "string",
        "m_tls": False,
        "client_key": "string",
        "tls_hostname": "string",
        "client_cert": "string",
    },
    loggly_connector={
        "auth_token": "string",
        "display_name": "string",
        "endpoint": "string",
        "content_type": "string",
        "custom_header_name": "string",
        "custom_header_value": "string",
        "tags": "string",
    },
    new_relic_connector={
        "auth_token": "string",
        "display_name": "string",
        "endpoint": "string",
        "content_type": "string",
        "custom_header_name": "string",
        "custom_header_value": "string",
    },
    notification_emails=["string"],
    oracle_connector={
        "access_key": "string",
        "bucket": "string",
        "display_name": "string",
        "namespace": "string",
        "path": "string",
        "region": "string",
        "secret_access_key": "string",
        "compress_logs": False,
    },
    https_connector={
        "authentication_type": "string",
        "endpoint": "string",
        "display_name": "string",
        "client_key": "string",
        "compress_logs": False,
        "content_type": "string",
        "custom_header_name": "string",
        "custom_header_value": "string",
        "client_cert": "string",
        "ca_cert": "string",
        "m_tls": False,
        "password": "string",
        "tls_hostname": "string",
        "user_name": "string",
    },
    s3_connector={
        "access_key": "string",
        "bucket": "string",
        "display_name": "string",
        "path": "string",
        "region": "string",
        "secret_access_key": "string",
        "compress_logs": False,
    },
    datadog_connector={
        "auth_token": "string",
        "display_name": "string",
        "endpoint": "string",
        "compress_logs": False,
        "service": "string",
        "source": "string",
        "tags": "string",
    },
    collect_midgress=False,
    gcs_connector={
        "bucket": "string",
        "display_name": "string",
        "private_key": "string",
        "project_id": "string",
        "service_account_name": "string",
        "compress_logs": False,
        "path": "string",
    })
const datastreamResource = new akamai.Datastream("datastreamResource", {
    properties: ["string"],
    active: false,
    streamName: "string",
    contractId: "string",
    groupId: "string",
    datasetFields: [0],
    deliveryConfiguration: {
        format: "string",
        frequency: {
            intervalInSecs: 0,
        },
        fieldDelimiter: "string",
        uploadFilePrefix: "string",
        uploadFileSuffix: "string",
    },
    splunkConnector: {
        displayName: "string",
        endpoint: "string",
        eventCollectorToken: "string",
        caCert: "string",
        clientCert: "string",
        clientKey: "string",
        compressLogs: false,
        customHeaderName: "string",
        customHeaderValue: "string",
        mTls: false,
        tlsHostname: "string",
    },
    sumologicConnector: {
        collectorCode: "string",
        displayName: "string",
        endpoint: "string",
        compressLogs: false,
        contentType: "string",
        customHeaderName: "string",
        customHeaderValue: "string",
    },
    azureConnector: {
        accessKey: "string",
        accountName: "string",
        containerName: "string",
        displayName: "string",
        path: "string",
        compressLogs: false,
    },
    elasticsearchConnector: {
        displayName: "string",
        userName: "string",
        password: "string",
        indexName: "string",
        endpoint: "string",
        contentType: "string",
        customHeaderValue: "string",
        customHeaderName: "string",
        caCert: "string",
        mTls: false,
        clientKey: "string",
        tlsHostname: "string",
        clientCert: "string",
    },
    logglyConnector: {
        authToken: "string",
        displayName: "string",
        endpoint: "string",
        contentType: "string",
        customHeaderName: "string",
        customHeaderValue: "string",
        tags: "string",
    },
    newRelicConnector: {
        authToken: "string",
        displayName: "string",
        endpoint: "string",
        contentType: "string",
        customHeaderName: "string",
        customHeaderValue: "string",
    },
    notificationEmails: ["string"],
    oracleConnector: {
        accessKey: "string",
        bucket: "string",
        displayName: "string",
        namespace: "string",
        path: "string",
        region: "string",
        secretAccessKey: "string",
        compressLogs: false,
    },
    httpsConnector: {
        authenticationType: "string",
        endpoint: "string",
        displayName: "string",
        clientKey: "string",
        compressLogs: false,
        contentType: "string",
        customHeaderName: "string",
        customHeaderValue: "string",
        clientCert: "string",
        caCert: "string",
        mTls: false,
        password: "string",
        tlsHostname: "string",
        userName: "string",
    },
    s3Connector: {
        accessKey: "string",
        bucket: "string",
        displayName: "string",
        path: "string",
        region: "string",
        secretAccessKey: "string",
        compressLogs: false,
    },
    datadogConnector: {
        authToken: "string",
        displayName: "string",
        endpoint: "string",
        compressLogs: false,
        service: "string",
        source: "string",
        tags: "string",
    },
    collectMidgress: false,
    gcsConnector: {
        bucket: "string",
        displayName: "string",
        privateKey: "string",
        projectId: "string",
        serviceAccountName: "string",
        compressLogs: false,
        path: "string",
    },
});
type: akamai:Datastream
properties:
    active: false
    azureConnector:
        accessKey: string
        accountName: string
        compressLogs: false
        containerName: string
        displayName: string
        path: string
    collectMidgress: false
    contractId: string
    datadogConnector:
        authToken: string
        compressLogs: false
        displayName: string
        endpoint: string
        service: string
        source: string
        tags: string
    datasetFields:
        - 0
    deliveryConfiguration:
        fieldDelimiter: string
        format: string
        frequency:
            intervalInSecs: 0
        uploadFilePrefix: string
        uploadFileSuffix: string
    elasticsearchConnector:
        caCert: string
        clientCert: string
        clientKey: string
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
        indexName: string
        mTls: false
        password: string
        tlsHostname: string
        userName: string
    gcsConnector:
        bucket: string
        compressLogs: false
        displayName: string
        path: string
        privateKey: string
        projectId: string
        serviceAccountName: string
    groupId: string
    httpsConnector:
        authenticationType: string
        caCert: string
        clientCert: string
        clientKey: string
        compressLogs: false
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
        mTls: false
        password: string
        tlsHostname: string
        userName: string
    logglyConnector:
        authToken: string
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
        tags: string
    newRelicConnector:
        authToken: string
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
    notificationEmails:
        - string
    oracleConnector:
        accessKey: string
        bucket: string
        compressLogs: false
        displayName: string
        namespace: string
        path: string
        region: string
        secretAccessKey: string
    properties:
        - string
    s3Connector:
        accessKey: string
        bucket: string
        compressLogs: false
        displayName: string
        path: string
        region: string
        secretAccessKey: string
    splunkConnector:
        caCert: string
        clientCert: string
        clientKey: string
        compressLogs: false
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
        eventCollectorToken: string
        mTls: false
        tlsHostname: string
    streamName: string
    sumologicConnector:
        collectorCode: string
        compressLogs: false
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
Datastream Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
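Both Python forms carry the same data. As a minimal sketch of that equivalence (using a plain dataclass as a stand-in for the generated DatastreamDeliveryConfigurationArgs class, with illustrative field values, so no Pulumi installation is assumed):

```python
# Sketch only: a plain dataclass standing in for the generated
# akamai.DatastreamDeliveryConfigurationArgs input class.
from dataclasses import dataclass, asdict


@dataclass
class DeliveryConfigurationArgs:
    format: str
    field_delimiter: str


# The argument-class form and the dictionary-literal form describe the
# same input; Pulumi accepts either wherever this object is expected.
as_args_class = DeliveryConfigurationArgs(format="STRUCTURED", field_delimiter="SPACE")
as_dict_literal = {"format": "STRUCTURED", "field_delimiter": "SPACE"}

assert asdict(as_args_class) == as_dict_literal
```

In practice this means either form can be passed to a parameter such as `delivery_configuration`, as the dictionary-literal Python example above shows.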
The Datastream resource accepts the following input properties:
C#
- Active bool
- Defines whether the stream should be active.
- ContractId string
- Identifies the contract that has access to the product.
- DatasetFields List<int>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines.
- DeliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the log delivery configuration (format, file names, delivery frequency).
- GroupId string
- Identifies the group that has access to the product and for which the stream configuration was created.
- Properties List<string>
- Identifies the properties monitored in the stream.
- StreamName string
- The name of the stream.
- AzureConnector DatastreamAzureConnector
- CollectMidgress bool
- Specifies whether the stream collects midgress data.
- DatadogConnector DatastreamDatadogConnector
- ElasticsearchConnector DatastreamElasticsearchConnector
- GcsConnector DatastreamGcsConnector
- HttpsConnector DatastreamHttpsConnector
- LogglyConnector DatastreamLogglyConnector
- NewRelicConnector DatastreamNewRelicConnector
- NotificationEmails List<string>
- A list of email addresses the system notifies about activations and deactivations of the stream.
- OracleConnector DatastreamOracleConnector
- S3Connector DatastreamS3Connector
- SplunkConnector DatastreamSplunkConnector
- SumologicConnector DatastreamSumologicConnector
Go
- Active bool
- Defines whether the stream should be active.
- ContractId string
- Identifies the contract that has access to the product.
- DatasetFields []int
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines.
- DeliveryConfiguration DatastreamDeliveryConfigurationArgs
- Provides information about the log delivery configuration (format, file names, delivery frequency).
- GroupId string
- Identifies the group that has access to the product and for which the stream configuration was created.
- Properties []string
- Identifies the properties monitored in the stream.
- StreamName string
- The name of the stream.
- AzureConnector DatastreamAzureConnectorArgs
- CollectMidgress bool
- Specifies whether the stream collects midgress data.
- DatadogConnector DatastreamDatadogConnectorArgs
- ElasticsearchConnector DatastreamElasticsearchConnectorArgs
- GcsConnector DatastreamGcsConnectorArgs
- HttpsConnector DatastreamHttpsConnectorArgs
- LogglyConnector DatastreamLogglyConnectorArgs
- NewRelicConnector DatastreamNewRelicConnectorArgs
- NotificationEmails []string
- A list of email addresses the system notifies about activations and deactivations of the stream.
- OracleConnector DatastreamOracleConnectorArgs
- S3Connector DatastreamS3ConnectorArgs
- SplunkConnector DatastreamSplunkConnectorArgs
- SumologicConnector DatastreamSumologicConnectorArgs
- active Boolean
- Defines whether the stream should be active
- contractId String
- Identifies the contract that has access to the product
- datasetFields List<Integer>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- groupId String
- Identifies the group that has access to the product and for which the stream configuration was created
- properties List<String>
- Identifies the properties monitored in the stream
- streamName String
- The name of the stream
- azureConnector DatastreamAzureConnector
- collectMidgress Boolean
- Indicates whether the stream should collect midgress data
- datadogConnector DatastreamDatadogConnector
- elasticsearchConnector DatastreamElasticsearchConnector
- gcsConnector DatastreamGcsConnector
- httpsConnector DatastreamHttpsConnector
- logglyConnector DatastreamLogglyConnector
- newRelicConnector DatastreamNewRelicConnector
- notificationEmails List<String>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector DatastreamOracleConnector
- s3Connector DatastreamS3Connector
- splunkConnector DatastreamSplunkConnector
- sumologicConnector DatastreamSumologicConnector
- active boolean
- Defines whether the stream should be active
- contractId string
- Identifies the contract that has access to the product
- datasetFields number[]
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- groupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- properties string[]
- Identifies the properties monitored in the stream
- streamName string
- The name of the stream
- azureConnector DatastreamAzureConnector
- collectMidgress boolean
- Indicates whether the stream should collect midgress data
- datadogConnector DatastreamDatadogConnector
- elasticsearchConnector DatastreamElasticsearchConnector
- gcsConnector DatastreamGcsConnector
- httpsConnector DatastreamHttpsConnector
- logglyConnector DatastreamLogglyConnector
- newRelicConnector DatastreamNewRelicConnector
- notificationEmails string[]
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector DatastreamOracleConnector
- s3Connector DatastreamS3Connector
- splunkConnector DatastreamSplunkConnector
- sumologicConnector DatastreamSumologicConnector
- active bool
- Defines whether the stream should be active
- contract_id str
- Identifies the contract that has access to the product
- dataset_fields Sequence[int]
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- delivery_configuration DatastreamDeliveryConfigurationArgs
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- group_id str
- Identifies the group that has access to the product and for which the stream configuration was created
- properties Sequence[str]
- Identifies the properties monitored in the stream
- stream_name str
- The name of the stream
- azure_connector DatastreamAzureConnectorArgs
- collect_midgress bool
- Indicates whether the stream should collect midgress data
- datadog_connector DatastreamDatadogConnectorArgs
- elasticsearch_connector DatastreamElasticsearchConnectorArgs
- gcs_connector DatastreamGcsConnectorArgs
- https_connector DatastreamHttpsConnectorArgs
- loggly_connector DatastreamLogglyConnectorArgs
- new_relic_connector DatastreamNewRelicConnectorArgs
- notification_emails Sequence[str]
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracle_connector DatastreamOracleConnectorArgs
- s3_connector DatastreamS3ConnectorArgs
- splunk_connector DatastreamSplunkConnectorArgs
- sumologic_connector DatastreamSumologicConnectorArgs
- active Boolean
- Defines whether the stream should be active
- contractId String
- Identifies the contract that has access to the product
- datasetFields List<Number>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration Property Map
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- groupId String
- Identifies the group that has access to the product and for which the stream configuration was created
- properties List<String>
- Identifies the properties monitored in the stream
- streamName String
- The name of the stream
- azureConnector Property Map
- collectMidgress Boolean
- Indicates whether the stream should collect midgress data
- datadogConnector Property Map
- elasticsearchConnector Property Map
- gcsConnector Property Map
- httpsConnector Property Map
- logglyConnector Property Map
- newRelicConnector Property Map
- notificationEmails List<String>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector Property Map
- papiJson is not applicable here
- s3Connector Property Map
- splunkConnector Property Map
- sumologicConnector Property Map
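The inputs above can be combined into a minimal stream declaration. A hedged Pulumi YAML sketch using the Azure connector and delivery-configuration fields documented in this reference; the IDs, property ID, credentials, enum values (`STRUCTURED`, `SPACE`), and the frequency sub-field `intervalInSecs` are placeholder assumptions to verify against your provider version:

```yaml
resources:
  stream:
    type: akamai:Datastream
    properties:
      active: false                 # activate once the destination is verified
      contractId: C-0N7RAC7         # placeholder contract ID
      groupId: "12345"              # placeholder group ID
      streamName: example-stream
      properties:
        - "1234567"                 # placeholder ID of a monitored property
      datasetFields:                # order defines field order in each log line
        - 1000
        - 1002
        - 2000
      deliveryConfiguration:
        format: STRUCTURED          # assumed enum value
        fieldDelimiter: SPACE       # assumed enum value
        uploadFilePrefix: ak
        uploadFileSuffix: ds
        frequency:
          intervalInSecs: 30        # assumed sub-field of the Frequency type
      azureConnector:
        displayName: azure-destination
        accountName: examplestorage # placeholder storage account
        containerName: logs
        path: datastream/logs
        accessKey: ${azureAccessKey} # supply via stack config or a secret
        compressLogs: true
```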
Outputs
All input properties are implicitly available as output properties. Additionally, the Datastream resource produces the following output properties:
- CreatedBy string
- The username who created the stream
- CreatedDate string
- The date and time when the stream was created
- Id string
- The provider-assigned unique ID for this managed resource.
- LatestVersion int
- Identifies the latest active configuration version of the stream
- ModifiedBy string
- The username who modified the stream
- ModifiedDate string
- The date and time when the stream was modified
- PapiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- ProductId string
- The ID of the product for which the stream was created
- StreamVersion int
- Identifies the configuration version of the stream
- CreatedBy string
- The username who created the stream
- CreatedDate string
- The date and time when the stream was created
- Id string
- The provider-assigned unique ID for this managed resource.
- LatestVersion int
- Identifies the latest active configuration version of the stream
- ModifiedBy string
- The username who modified the stream
- ModifiedDate string
- The date and time when the stream was modified
- PapiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- ProductId string
- The ID of the product for which the stream was created
- StreamVersion int
- Identifies the configuration version of the stream
- createdBy String
- The username who created the stream
- createdDate String
- The date and time when the stream was created
- id String
- The provider-assigned unique ID for this managed resource.
- latestVersion Integer
- Identifies the latest active configuration version of the stream
- modifiedBy String
- The username who modified the stream
- modifiedDate String
- The date and time when the stream was modified
- papiJson String
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
- The ID of the product for which the stream was created
- streamVersion Integer
- Identifies the configuration version of the stream
- createdBy string
- The username who created the stream
- createdDate string
- The date and time when the stream was created
- id string
- The provider-assigned unique ID for this managed resource.
- latestVersion number
- Identifies the latest active configuration version of the stream
- modifiedBy string
- The username who modified the stream
- modifiedDate string
- The date and time when the stream was modified
- papiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId string
- The ID of the product for which the stream was created
- streamVersion number
- Identifies the configuration version of the stream
- created_by str
- The username who created the stream
- created_date str
- The date and time when the stream was created
- id str
- The provider-assigned unique ID for this managed resource.
- latest_version int
- Identifies the latest active configuration version of the stream
- modified_by str
- The username who modified the stream
- modified_date str
- The date and time when the stream was modified
- papi_json str
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- product_id str
- The ID of the product for which the stream was created
- stream_version int
- Identifies the configuration version of the stream
- createdBy String
- The username who created the stream
- createdDate String
- The date and time when the stream was created
- id String
- The provider-assigned unique ID for this managed resource.
- latestVersion Number
- Identifies the latest active configuration version of the stream
- modifiedBy String
- The username who modified the stream
- modifiedDate String
- The date and time when the stream was modified
- papiJson String
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
- The ID of the product for which the stream was created
- streamVersion Number
- Identifies the configuration version of the stream
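The `papiJson` output is a JSON behavior snippet meant to be pasted into a property's PAPI rule tree. A minimal sketch of doing that programmatically; the `papi_json` value and the simplified rule-tree shape below are hypothetical, for illustration only:

```python
import json

def add_datastream_behavior(rule_tree: dict, papi_json: str) -> dict:
    """Insert the behavior exported via the stream's papiJson output into the
    default rule's behavior list, replacing any stale copy of it."""
    behavior = json.loads(papi_json)
    behaviors = rule_tree["rules"]["behaviors"]
    # Drop any existing behavior of the same name before appending the new one.
    behaviors[:] = [b for b in behaviors if b.get("name") != behavior.get("name")]
    behaviors.append(behavior)
    return rule_tree

# Hypothetical values for illustration only.
papi_json = json.dumps({"name": "datastream", "options": {"streamType": "LOG"}})
tree = {"rules": {"name": "default", "behaviors": [{"name": "origin"}]}}
updated = add_datastream_behavior(tree, papi_json)
```

In a Pulumi program you would feed the resource's `papiJson` output into a helper like this inside an `apply`, since outputs resolve asynchronously.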
Look up Existing Datastream Resource
Get an existing Datastream resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: DatastreamState, opts?: CustomResourceOptions): Datastream@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        active: Optional[bool] = None,
        azure_connector: Optional[DatastreamAzureConnectorArgs] = None,
        collect_midgress: Optional[bool] = None,
        contract_id: Optional[str] = None,
        created_by: Optional[str] = None,
        created_date: Optional[str] = None,
        datadog_connector: Optional[DatastreamDatadogConnectorArgs] = None,
        dataset_fields: Optional[Sequence[int]] = None,
        delivery_configuration: Optional[DatastreamDeliveryConfigurationArgs] = None,
        elasticsearch_connector: Optional[DatastreamElasticsearchConnectorArgs] = None,
        gcs_connector: Optional[DatastreamGcsConnectorArgs] = None,
        group_id: Optional[str] = None,
        https_connector: Optional[DatastreamHttpsConnectorArgs] = None,
        latest_version: Optional[int] = None,
        loggly_connector: Optional[DatastreamLogglyConnectorArgs] = None,
        modified_by: Optional[str] = None,
        modified_date: Optional[str] = None,
        new_relic_connector: Optional[DatastreamNewRelicConnectorArgs] = None,
        notification_emails: Optional[Sequence[str]] = None,
        oracle_connector: Optional[DatastreamOracleConnectorArgs] = None,
        papi_json: Optional[str] = None,
        product_id: Optional[str] = None,
        properties: Optional[Sequence[str]] = None,
        s3_connector: Optional[DatastreamS3ConnectorArgs] = None,
        splunk_connector: Optional[DatastreamSplunkConnectorArgs] = None,
        stream_name: Optional[str] = None,
        stream_version: Optional[int] = None,
        sumologic_connector: Optional[DatastreamSumologicConnectorArgs] = None) -> Datastream
func GetDatastream(ctx *Context, name string, id IDInput, state *DatastreamState, opts ...ResourceOption) (*Datastream, error)
public static Datastream Get(string name, Input<string> id, DatastreamState? state, CustomResourceOptions? opts = null)
public static Datastream get(String name, Output<String> id, DatastreamState state, CustomResourceOptions options)
resources:
  _:
    type: akamai:Datastream
    get:
      id: ${id}
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- Active bool
- Defines whether the stream should be active
- AzureConnector DatastreamAzureConnector
- CollectMidgress bool
- Indicates whether the stream should collect midgress data
- ContractId string
- Identifies the contract that has access to the product
- CreatedBy string
- The username who created the stream
- CreatedDate string
- The date and time when the stream was created
- DatadogConnector DatastreamDatadogConnector
- DatasetFields List<int>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- DeliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- ElasticsearchConnector DatastreamElasticsearchConnector
- GcsConnector DatastreamGcsConnector
- GroupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- HttpsConnector DatastreamHttpsConnector
- LatestVersion int
- Identifies the latest active configuration version of the stream
- LogglyConnector DatastreamLogglyConnector
- ModifiedBy string
- The username who modified the stream
- ModifiedDate string
- The date and time when the stream was modified
- NewRelicConnector DatastreamNewRelicConnector
- NotificationEmails List<string>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- OracleConnector DatastreamOracleConnector
- PapiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- ProductId string
- The ID of the product for which the stream was created
- Properties List<string>
- Identifies the properties monitored in the stream
- S3Connector DatastreamS3Connector
- SplunkConnector DatastreamSplunkConnector
- StreamName string
- The name of the stream
- StreamVersion int
- Identifies the configuration version of the stream
- SumologicConnector DatastreamSumologicConnector
- Active bool
- Defines whether the stream should be active
- AzureConnector DatastreamAzureConnectorArgs
- CollectMidgress bool
- Indicates whether the stream should collect midgress data
- ContractId string
- Identifies the contract that has access to the product
- CreatedBy string
- The username who created the stream
- CreatedDate string
- The date and time when the stream was created
- DatadogConnector DatastreamDatadogConnectorArgs
- DatasetFields []int
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- DeliveryConfiguration DatastreamDeliveryConfigurationArgs
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- ElasticsearchConnector DatastreamElasticsearchConnectorArgs
- GcsConnector DatastreamGcsConnectorArgs
- GroupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- HttpsConnector DatastreamHttpsConnectorArgs
- LatestVersion int
- Identifies the latest active configuration version of the stream
- LogglyConnector DatastreamLogglyConnectorArgs
- ModifiedBy string
- The username who modified the stream
- ModifiedDate string
- The date and time when the stream was modified
- NewRelicConnector DatastreamNewRelicConnectorArgs
- NotificationEmails []string
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- OracleConnector DatastreamOracleConnectorArgs
- PapiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- ProductId string
- The ID of the product for which the stream was created
- Properties []string
- Identifies the properties monitored in the stream
- S3Connector DatastreamS3ConnectorArgs
- SplunkConnector DatastreamSplunkConnectorArgs
- StreamName string
- The name of the stream
- StreamVersion int
- Identifies the configuration version of the stream
- SumologicConnector DatastreamSumologicConnectorArgs
- active Boolean
- Defines whether the stream should be active
- azureConnector DatastreamAzureConnector
- collectMidgress Boolean
- Indicates whether the stream should collect midgress data
- contractId String
- Identifies the contract that has access to the product
- createdBy String
- The username who created the stream
- createdDate String
- The date and time when the stream was created
- datadogConnector DatastreamDatadogConnector
- datasetFields List<Integer>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- elasticsearchConnector DatastreamElasticsearchConnector
- gcsConnector DatastreamGcsConnector
- groupId String
- Identifies the group that has access to the product and for which the stream configuration was created
- httpsConnector DatastreamHttpsConnector
- latestVersion Integer
- Identifies the latest active configuration version of the stream
- logglyConnector DatastreamLogglyConnector
- modifiedBy String
- The username who modified the stream
- modifiedDate String
- The date and time when the stream was modified
- newRelicConnector DatastreamNewRelicConnector
- notificationEmails List<String>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector DatastreamOracleConnector
- papiJson String
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
- The ID of the product for which the stream was created
- properties List<String>
- Identifies the properties monitored in the stream
- s3Connector DatastreamS3Connector
- splunkConnector DatastreamSplunkConnector
- streamName String
- The name of the stream
- streamVersion Integer
- Identifies the configuration version of the stream
- sumologicConnector DatastreamSumologicConnector
- active boolean
- Defines whether the stream should be active
- azureConnector DatastreamAzureConnector
- collectMidgress boolean
- Indicates whether the stream should collect midgress data
- contractId string
- Identifies the contract that has access to the product
- createdBy string
- The username who created the stream
- createdDate string
- The date and time when the stream was created
- datadogConnector DatastreamDatadogConnector
- datasetFields number[]
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- elasticsearchConnector DatastreamElasticsearchConnector
- gcsConnector DatastreamGcsConnector
- groupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- httpsConnector DatastreamHttpsConnector
- latestVersion number
- Identifies the latest active configuration version of the stream
- logglyConnector DatastreamLogglyConnector
- modifiedBy string
- The username who modified the stream
- modifiedDate string
- The date and time when the stream was modified
- newRelicConnector DatastreamNewRelicConnector
- notificationEmails string[]
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector DatastreamOracleConnector
- papiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId string
- The ID of the product for which the stream was created
- properties string[]
- Identifies the properties monitored in the stream
- s3Connector DatastreamS3Connector
- splunkConnector DatastreamSplunk Connector
- streamName string
- The name of the stream
- streamVersion number
- Identifies the configuration version of the stream
- sumologicConnector DatastreamSumologicConnector
- active bool
- Defines whether the stream should be active
- azure_connector DatastreamAzureConnectorArgs
- collect_midgress bool
- Indicates whether the stream should collect midgress data
- contract_id str
- Identifies the contract that has access to the product
- created_by str
- The username who created the stream
- created_date str
- The date and time when the stream was created
- datadog_connector DatastreamDatadogConnectorArgs
- dataset_fields Sequence[int]
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- delivery_configuration DatastreamDeliveryConfigurationArgs
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- elasticsearch_connector DatastreamElasticsearchConnectorArgs
- gcs_connector DatastreamGcsConnectorArgs
- group_id str
- Identifies the group that has access to the product and for which the stream configuration was created
- https_connector DatastreamHttpsConnectorArgs
- latest_version int
- Identifies the latest active configuration version of the stream
- loggly_connector DatastreamLogglyConnectorArgs
- modified_by str
- The username who modified the stream
- modified_date str
- The date and time when the stream was modified
- new_relic_connector DatastreamNewRelicConnectorArgs
- notification_emails Sequence[str]
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracle_connector DatastreamOracleConnectorArgs
- papi_json str
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- product_id str
- The ID of the product for which the stream was created
- properties Sequence[str]
- Identifies the properties monitored in the stream
- s3_connector DatastreamS3ConnectorArgs
- splunk_connector DatastreamSplunkConnectorArgs
- stream_name str
- The name of the stream
- stream_version int
- Identifies the configuration version of the stream
- sumologic_connector DatastreamSumologicConnectorArgs
- active Boolean
- Defines whether the stream should be active
- azureConnector Property Map
- collectMidgress Boolean
- Indicates whether the stream should collect midgress data
- contractId String
- Identifies the contract that has access to the product
- createdBy String
- The username who created the stream
- createdDate String
- The date and time when the stream was created
- datadogConnector Property Map
- datasetFields List<Number>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration Property Map
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- elasticsearchConnector Property Map
- gcsConnector Property Map
- groupId String
- Identifies the group that has access to the product and for which the stream configuration was created
- httpsConnector Property Map
- latestVersion Number
- Identifies the latest active configuration version of the stream
- logglyConnector Property Map
- modifiedBy String
- The username who modified the stream
- modifiedDate String
- The date and time when the stream was modified
- newRelicConnector Property Map
- notificationEmails List<String>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector Property Map
- papiJson String
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
- The ID of the product for which the stream was created
- properties List<String>
- Identifies the properties monitored in the stream
- s3Connector Property Map
- splunkConnector Property Map
- streamName String
- The name of the stream
- streamVersion Number
- Identifies the configuration version of the stream
- sumologicConnector Property Map
Supporting Types
DatastreamAzureConnector, DatastreamAzureConnectorArgs      
- AccessKey string
- Access keys associated with Azure Storage account
- AccountName string
- Specifies the Azure Storage account name
- ContainerName string
- Specifies the Azure Storage container name
- DisplayName string
- The name of the connector
- Path string
- The path to the folder within Azure Storage container where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- AccessKey string
- Access keys associated with Azure Storage account
- AccountName string
- Specifies the Azure Storage account name
- ContainerName string
- Specifies the Azure Storage container name
- DisplayName string
- The name of the connector
- Path string
- The path to the folder within Azure Storage container where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- accessKey String
- Access keys associated with Azure Storage account
- accountName String
- Specifies the Azure Storage account name
- containerName String
- Specifies the Azure Storage container name
- displayName String
- The name of the connector
- path String
- The path to the folder within Azure Storage container where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- accessKey string
- Access keys associated with Azure Storage account
- accountName string
- Specifies the Azure Storage account name
- containerName string
- Specifies the Azure Storage container name
- displayName string
- The name of the connector
- path string
- The path to the folder within Azure Storage container where logs will be stored
- compressLogs boolean
- Indicates whether the logs should be compressed
- access_key str
- Access keys associated with Azure Storage account
- account_name str
- Specifies the Azure Storage account name
- container_name str
- Specifies the Azure Storage container name
- display_name str
- The name of the connector
- path str
- The path to the folder within Azure Storage container where logs will be stored
- compress_logs bool
- Indicates whether the logs should be compressed
- accessKey String
- Access keys associated with Azure Storage account
- accountName String
- Specifies the Azure Storage account name
- containerName String
- Specifies the Azure Storage container name
- displayName String
- The name of the connector
- path String
- The path to the folder within Azure Storage container where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
DatastreamDatadogConnector, DatastreamDatadogConnectorArgs      
- AuthToken string
- The API key associated with Datadog account
- DisplayName string
- The name of the connector
- Endpoint string
- The Datadog endpoint where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- Service string
- The service of the Datadog connector
- Source string
- The source of the Datadog connector
- Tags string
- The tags of the Datadog connector
- AuthToken string
- The API key associated with Datadog account
- DisplayName string
- The name of the connector
- Endpoint string
- The Datadog endpoint where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- Service string
- The service of the Datadog connector
- Source string
- The source of the Datadog connector
- Tags string
- The tags of the Datadog connector
- authToken String
- The API key associated with Datadog account
- displayName String
- The name of the connector
- endpoint String
- The Datadog endpoint where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- service String
- The service of the Datadog connector
- source String
- The source of the Datadog connector
- tags String
- The tags of the Datadog connector
- authToken string
- The API key associated with Datadog account
- displayName string
- The name of the connector
- endpoint string
- The Datadog endpoint where logs will be stored
- compressLogs boolean
- Indicates whether the logs should be compressed
- service string
- The service of the Datadog connector
- source string
- The source of the Datadog connector
- tags string
- The tags of the Datadog connector
- auth_token str
- The API key associated with Datadog account
- display_name str
- The name of the connector
- endpoint str
- The Datadog endpoint where logs will be stored
- compress_logs bool
- Indicates whether the logs should be compressed
- service str
- The service of the Datadog connector
- source str
- The source of the Datadog connector
- tags str
- The tags of the Datadog connector
- authToken String
- The API key associated with Datadog account
- displayName String
- The name of the connector
- endpoint String
- The Datadog endpoint where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- service String
- The service of the Datadog connector
- source String
- The source of the Datadog connector
- tags String
- The tags of the Datadog connector
DatastreamDeliveryConfiguration, DatastreamDeliveryConfigurationArgs      
- Format string
- The format in which logs will be received
- Frequency DatastreamDeliveryConfigurationFrequency
- The frequency of collecting logs from each uploader and sending these logs to a destination
- FieldDelimiter string
- A delimiter that you use to separate data set fields in log lines
- UploadFilePrefix string
- The prefix of the log file that will be sent to a destination
- UploadFileSuffix string
- The suffix of the log file that will be sent to a destination
- Format string
- The format in which logs will be received
- Frequency
DatastreamDelivery Configuration Frequency 
- The frequency of collecting logs from each uploader and sending these logs to a destination
- FieldDelimiter string
- A delimiter that you use to separate data set fields in log lines
- UploadFile stringPrefix 
- The prefix of the log file that will be send to a destination
- UploadFile stringSuffix 
- The suffix of the log file that will be send to a destination
- format String
- The format in which logs will be received
- frequency
DatastreamDelivery Configuration Frequency 
- The frequency of collecting logs from each uploader and sending these logs to a destination
- fieldDelimiter String
- A delimiter that you use to separate data set fields in log lines
- uploadFile StringPrefix 
- The prefix of the log file that will be send to a destination
- uploadFile StringSuffix 
- The suffix of the log file that will be send to a destination
- format string
- The format in which logs will be received
- frequency
DatastreamDelivery Configuration Frequency 
- The frequency of collecting logs from each uploader and sending these logs to a destination
- fieldDelimiter string
- A delimiter that you use to separate data set fields in log lines
- uploadFile stringPrefix 
- The prefix of the log file that will be send to a destination
- uploadFile stringSuffix 
- The suffix of the log file that will be send to a destination
- format str
- The format in which logs will be received
- frequency
DatastreamDelivery Configuration Frequency 
- The frequency of collecting logs from each uploader and sending these logs to a destination
- field_delimiter str
- A delimiter that you use to separate data set fields in log lines
- upload_file_ strprefix 
- The prefix of the log file that will be send to a destination
- upload_file_ strsuffix 
- The suffix of the log file that will be send to a destination
- format String
- The format in which logs will be received
- frequency Property Map
- The frequency of collecting logs from each uploader and sending these logs to a destination
- fieldDelimiter String
- A delimiter that you use to separate data set fields in log lines
- uploadFile StringPrefix 
- The prefix of the log file that will be send to a destination
- uploadFile StringSuffix 
- The suffix of the log file that will be send to a destination
DatastreamDeliveryConfigurationFrequency, DatastreamDeliveryConfigurationFrequencyArgs        
- IntervalInSecs int
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- IntervalInSecs int
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- intervalInSecs Integer
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- intervalInSecs number
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- interval_in_secs int
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- intervalInSecs Number
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
DatastreamElasticsearchConnector, DatastreamElasticsearchConnectorArgs      
- DisplayName string
- The name of the connector.
- Endpoint string
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- IndexName string
- The index name of the Elastic cloud where you want to store log files.
- Password string
- The Elasticsearch basic access authentication password.
- UserName string
- The Elasticsearch basic access authentication username.
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- MTls bool
- Indicates whether mTLS is enabled or not.
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- DisplayName string
- The name of the connector.
- Endpoint string
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- IndexName string
- The index name of the Elastic cloud where you want to store log files.
- Password string
- The Elasticsearch basic access authentication password.
- UserName string
- The Elasticsearch basic access authentication username.
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- MTls bool
- Indicates whether mTLS is enabled or not.
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName String
- The name of the connector.
- endpoint String
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- indexName String
- The index name of the Elastic cloud where you want to store log files.
- password String
- The Elasticsearch basic access authentication password.
- userName String
- The Elasticsearch basic access authentication username.
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName string
- The name of the connector.
- endpoint string
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- indexName string
- The index name of the Elastic cloud where you want to store log files.
- password string
- The Elasticsearch basic access authentication password.
- userName string
- The Elasticsearch basic access authentication username.
- caCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert string
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- contentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- mTls boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- display_name str
- The name of the connector.
- endpoint str
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- index_name str
- The index name of the Elastic cloud where you want to store log files.
- password str
- The Elasticsearch basic access authentication password.
- user_name str
- The Elasticsearch basic access authentication username.
- ca_cert str
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- client_cert str
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client_key str
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- content_type str
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- custom_header_name str
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- custom_header_value str
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- m_tls bool
- Indicates whether mTLS is enabled or not.
- tls_hostname str
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName String
- The name of the connector.
- endpoint String
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- indexName String
- The index name of the Elastic cloud where you want to store log files.
- password String
- The Elasticsearch basic access authentication password.
- userName String
- The Elasticsearch basic access authentication username.
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
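These fields map onto the `elasticsearchConnector` block. A TypeScript sketch; the endpoint, index, credentials, and the two PEM variables are placeholders, and with mTLS both the client certificate and the client key must be supplied:

```typescript
// Placeholder PEM material; in practice load these from secure config.
const clientCertPem = "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----";
const clientKeyPem = "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----";

// Elasticsearch connector fragment for an akamai.Datastream resource.
const elasticsearchConnector = {
    displayName: "elastic-connector",
    endpoint: "https://hostname.elastic-cloud.com:9243/_bulk/",
    indexName: "akamai-logs",    // set here rather than in the endpoint URL
    userName: "elastic-user",    // basic-auth credentials (placeholders)
    password: "elastic-password",
    mTls: true,
    clientCert: clientCertPem,   // PEM client certificate
    clientKey: clientKeyPem,     // non-encrypted PKCS8 private key
};
```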
DatastreamGcsConnector, DatastreamGcsConnectorArgs      
- Bucket string
- The name of the storage bucket created in Google Cloud account
- DisplayName string
- The name of the connector
- PrivateKey string
- The contents of the JSON private key generated and downloaded in Google Cloud Storage account
- ProjectId string
- The unique ID of Google Cloud project
- ServiceAccountName string
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- CompressLogs bool
- Indicates whether the logs should be compressed
- Path string
- The path to the folder within Google Cloud bucket where logs will be stored
- Bucket string
- The name of the storage bucket created in Google Cloud account
- DisplayName string
- The name of the connector
- PrivateKey string
- The contents of the JSON private key generated and downloaded in Google Cloud Storage account
- ProjectId string
- The unique ID of Google Cloud project
- ServiceAccountName string
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- CompressLogs bool
- Indicates whether the logs should be compressed
- Path string
- The path to the folder within Google Cloud bucket where logs will be stored
- bucket String
- The name of the storage bucket created in Google Cloud account
- displayName String
- The name of the connector
- privateKey String
- The contents of the JSON private key generated and downloaded in Google Cloud Storage account
- projectId String
- The unique ID of Google Cloud project
- serviceAccountName String
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- compressLogs Boolean
- Indicates whether the logs should be compressed
- path String
- The path to the folder within Google Cloud bucket where logs will be stored
- bucket string
- The name of the storage bucket created in Google Cloud account
- displayName string
- The name of the connector
- privateKey string
- The contents of the JSON private key generated and downloaded in Google Cloud Storage account
- projectId string
- The unique ID of Google Cloud project
- serviceAccountName string
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- compressLogs boolean
- Indicates whether the logs should be compressed
- path string
- The path to the folder within Google Cloud bucket where logs will be stored
- bucket str
- The name of the storage bucket created in Google Cloud account
- display_name str
- The name of the connector
- private_key str
- The contents of the JSON private key generated and downloaded in Google Cloud Storage account
- project_id str
- The unique ID of Google Cloud project
- service_account_name str
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- compress_logs bool
- Indicates whether the logs should be compressed
- path str
- The path to the folder within Google Cloud bucket where logs will be stored
- bucket String
- The name of the storage bucket created in Google Cloud account
- displayName String
- The name of the connector
- privateKey String
- The contents of the JSON private key generated and downloaded in Google Cloud Storage account
- projectId String
- The unique ID of Google Cloud project
- serviceAccountName String
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- compressLogs Boolean
- Indicates whether the logs should be compressed
- path String
- The path to the folder within Google Cloud bucket where logs will be stored
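These fields form the `gcsConnector` block. A TypeScript sketch with placeholder bucket, project, and key material; the named service account is assumed to hold the Storage Object Creator role:

```typescript
// Placeholder service-account key; in practice load the downloaded
// JSON key file from secure config rather than inlining it.
const gcpKeyJson = `{"type": "service_account", "private_key": "..."}`;

// Google Cloud Storage connector fragment for an akamai.Datastream resource.
const gcsConnector = {
    displayName: "gcs-connector",
    bucket: "akamai-log-bucket",             // placeholder bucket name
    projectId: "my-gcp-project",             // placeholder GCP project ID
    serviceAccountName: "datastream-writer", // needs storage.object.create
    privateKey: gcpKeyJson,                  // contents of the JSON private key
    path: "logs/akamai",                     // folder inside the bucket
    compressLogs: true,
};
```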
DatastreamHttpsConnector, DatastreamHttpsConnectorArgs      
- AuthenticationType string
- Either NONE for no authentication, or BASIC for username and password authentication
- DisplayName string
- The name of the connector
- Endpoint string
- URL where logs will be stored
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- CompressLogs bool
- Indicates whether the logs should be compressed
- ContentType string
- Content type to pass in the log file header
- CustomHeaderName string
- The name of the custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- MTls bool
- Indicates whether mTLS is enabled or not.
- Password string
- Password set for custom HTTPS endpoint for authentication
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- UserName string
- Username used for authentication
- AuthenticationType string
- Either NONE for no authentication, or BASIC for username and password authentication
- DisplayName string
- The name of the connector
- Endpoint string
- URL where logs will be stored
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- CompressLogs bool
- Indicates whether the logs should be compressed
- ContentType string
- Content type to pass in the log file header
- CustomHeaderName string
- The name of the custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- MTls bool
- Indicates whether mTLS is enabled or not.
- Password string
- Password set for custom HTTPS endpoint for authentication
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- UserName string
- Username used for authentication
- authenticationType String
- Either NONE for no authentication, or BASIC for username and password authentication
- displayName String
- The name of the connector
- endpoint String
- URL where logs will be stored
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Indicates whether the logs should be compressed
- contentType String
- Content type to pass in the log file header
- customHeaderName String
- The name of the custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- password String
- Password set for custom HTTPS endpoint for authentication
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName String
- Username used for authentication
- authenticationType string
- Either NONE for no authentication, or BASIC for username and password authentication
- displayName string
- The name of the connector
- endpoint string
- URL where logs will be stored
- caCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs boolean
- Indicates whether the logs should be compressed
- contentType string
- Content type to pass in the log file header
- customHeaderName string
- The name of the custom header passed with the request to the destination
- customHeaderValue string
- The custom header's contents passed with the request to the destination
- mTls boolean
- Indicates whether mTLS is enabled or not.
- password string
- Password set for custom HTTPS endpoint for authentication
- tlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName string
- Username used for authentication
- authentication_type str
- Either NONE for no authentication, or BASIC for username and password authentication
- display_name str
- The name of the connector
- endpoint str
- URL where logs will be stored
- ca_cert str
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- client_cert str
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- client_key str
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress_logs bool
- Indicates whether the logs should be compressed
- content_type str
- Content type to pass in the log file header
- custom_header_name str
- The name of the custom header passed with the request to the destination
- custom_header_value str
- The custom header's contents passed with the request to the destination
- m_tls bool
- Indicates whether mTLS is enabled or not.
- password str
- Password set for custom HTTPS endpoint for authentication
- tls_hostname str
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- user_name str
- Username used for authentication
- authenticationType String
- Either NONE for no authentication, or BASIC for username and password authentication
- displayName String
- The name of the connector
- endpoint String
- URL where logs will be stored
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Indicates whether the logs should be compressed
- contentType String
- Content type to pass in the log file header
- customHeaderName String
- The name of the custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- password String
- Password set for custom HTTPS endpoint for authentication
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName String
- Username used for authentication
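These fields form the `httpsConnector` block for a custom HTTPS endpoint. A TypeScript sketch with placeholder endpoint and credentials; with `authenticationType` set to `BASIC`, both username and password are expected:

```typescript
// Custom HTTPS connector fragment for an akamai.Datastream resource;
// endpoint, credentials, and header values are placeholders.
const httpsConnector = {
    displayName: "https-connector",
    endpoint: "https://logs.example.com/ingest", // placeholder destination URL
    authenticationType: "BASIC",                 // or "NONE"
    userName: "log-user",                        // placeholder credentials
    password: "log-password",
    compressLogs: true,
    contentType: "application/json",         // content type for the log file header
    customHeaderName: "client-id",           // optional custom header
    customHeaderValue: "akamai-datastream",
};
```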
DatastreamLogglyConnector, DatastreamLogglyConnectorArgs      
- AuthToken string
- The unique HTTP code for your Loggly bulk endpoint.
- DisplayName string
- The name of the connector.
- Endpoint string
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- Tags string
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- AuthToken string
- The unique HTTP code for your Loggly bulk endpoint.
- DisplayName string
- The name of the connector.
- Endpoint string
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- Tags string
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- authToken String
- The unique HTTP code for your Loggly bulk endpoint.
- displayName String
- The name of the connector.
- endpoint String
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- tags String
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- authToken string
- The unique HTTP code for your Loggly bulk endpoint.
- displayName string
- The name of the connector.
- endpoint string
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- contentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- tags string
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- auth_token str
- The unique HTTP code for your Loggly bulk endpoint.
- display_name str
- The name of the connector.
- endpoint str
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- content_type str
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- custom_header_name str
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- custom_header_value str
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- tags str
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- authToken String
- The unique HTTP code for your Loggly bulk endpoint.
- displayName String
- The name of the connector.
- endpoint String
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- tags String
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
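Put together from the Python argument names above, a minimal Loggly connector configuration might look like the following sketch. All values are placeholders, and the dict shape mirrors the `loggly_connector` argument of the constructor shown at the top of this page.

```python
# Sketch of a Loggly connector configuration, placeholder values only.
# In the Python SDK, a dict like this can be passed as the
# loggly_connector argument of akamai.Datastream.
loggly_connector = {
    "display_name": "loggly-connector",               # name of the connector
    "endpoint": "https://hostname.loggly.com/bulk/",  # bulk endpoint URL, without the code
    "auth_token": "example-endpoint-code",            # the endpoint code goes here, not in the URL
    "tags": "cdn,access-logs",                        # optional: segment and filter log events
}

# Per the description above, the endpoint code belongs in auth_token,
# not appended to the endpoint URL.
assert "example-endpoint-code" not in loggly_connector["endpoint"]
```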
DatastreamNewRelicConnector, DatastreamNewRelicConnectorArgs        
- AuthToken string
- Your Log API token for your account in New Relic.
- DisplayName string
- The name of the connector.
- Endpoint string
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/ if you want to retrieve your New Relic endpoint URL.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- AuthToken string
- Your Log API token for your account in New Relic.
- DisplayName string
- The name of the connector.
- Endpoint string
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/ if you want to retrieve your New Relic endpoint URL.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- authToken String
- Your Log API token for your account in New Relic.
- displayName String
- The name of the connector.
- endpoint String
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/ if you want to retrieve your New Relic endpoint URL.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- authToken string
- Your Log API token for your account in New Relic.
- displayName string
- The name of the connector.
- endpoint string
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/ if you want to retrieve your New Relic endpoint URL.
- contentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- auth_token str
- Your Log API token for your account in New Relic.
- display_name str
- The name of the connector.
- endpoint str
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/ if you want to retrieve your New Relic endpoint URL.
- content_type str
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- custom_header_name str
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- custom_header_value str
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- authToken String
- Your Log API token for your account in New Relic.
- displayName String
- The name of the connector.
- endpoint String
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/ if you want to retrieve your New Relic endpoint URL.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
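Using the Python argument names above, a minimal New Relic connector configuration could be sketched as follows. The endpoint and token are placeholders; the dict shape mirrors the `new_relic_connector` argument of the constructor.

```python
# Sketch of a New Relic connector configuration, placeholder values only.
# A dict like this maps to the new_relic_connector argument of akamai.Datastream.
new_relic_connector = {
    "display_name": "new-relic-connector",               # name of the connector
    "endpoint": "https://log-api.newrelic.com/log/v1/",  # placeholder Log API endpoint URL
    "auth_token": "example-log-api-token",               # Log API token for the account
}

# All three fields above are required and non-empty in this sketch.
assert all(new_relic_connector.values())
```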
DatastreamOracleConnector, DatastreamOracleConnectorArgs      
- AccessKey string
- The access key identifier used to authenticate requests to the Oracle Cloud account
- Bucket string
- The name of the Oracle Cloud Storage bucket
- DisplayName string
- The name of the connector
- Namespace string
- The namespace of the Oracle Cloud Storage account
- Path string
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- Region string
- The Oracle Cloud Storage region where the bucket resides
- SecretAccessKey string
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- CompressLogs bool
- Indicates whether the logs should be compressed
- AccessKey string
- The access key identifier used to authenticate requests to the Oracle Cloud account
- Bucket string
- The name of the Oracle Cloud Storage bucket
- DisplayName string
- The name of the connector
- Namespace string
- The namespace of the Oracle Cloud Storage account
- Path string
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- Region string
- The Oracle Cloud Storage region where the bucket resides
- SecretAccessKey string
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- CompressLogs bool
- Indicates whether the logs should be compressed
- accessKey String
- The access key identifier used to authenticate requests to the Oracle Cloud account
- bucket String
- The name of the Oracle Cloud Storage bucket
- displayName String
- The name of the connector
- namespace String
- The namespace of the Oracle Cloud Storage account
- path String
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- region String
- The Oracle Cloud Storage region where the bucket resides
- secretAccessKey String
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- compressLogs Boolean
- Indicates whether the logs should be compressed
- accessKey string
- The access key identifier used to authenticate requests to the Oracle Cloud account
- bucket string
- The name of the Oracle Cloud Storage bucket
- displayName string
- The name of the connector
- namespace string
- The namespace of the Oracle Cloud Storage account
- path string
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- region string
- The Oracle Cloud Storage region where the bucket resides
- secretAccessKey string
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- compressLogs boolean
- Indicates whether the logs should be compressed
- access_key str
- The access key identifier used to authenticate requests to the Oracle Cloud account
- bucket str
- The name of the Oracle Cloud Storage bucket
- display_name str
- The name of the connector
- namespace str
- The namespace of the Oracle Cloud Storage account
- path str
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- region str
- The Oracle Cloud Storage region where the bucket resides
- secret_access_key str
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- compress_logs bool
- Indicates whether the logs should be compressed
- accessKey String
- The access key identifier used to authenticate requests to the Oracle Cloud account
- bucket String
- The name of the Oracle Cloud Storage bucket
- displayName String
- The name of the connector
- namespace String
- The namespace of the Oracle Cloud Storage account
- path String
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- region String
- The Oracle Cloud Storage region where the bucket resides
- secretAccessKey String
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- compressLogs Boolean
- Indicates whether the logs should be compressed
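Assembled from the Python argument names above, an Oracle Cloud Storage connector configuration might be sketched like this, with placeholder credentials, bucket, and region; the dict maps to the `oracle_connector` argument.

```python
# Sketch of an Oracle Cloud Storage connector configuration, placeholder values only.
# A dict like this maps to the oracle_connector argument of akamai.Datastream.
oracle_connector = {
    "display_name": "oracle-connector",
    "access_key": "example-access-key",          # access key identifier
    "secret_access_key": "example-secret-key",   # matching secret access key
    "namespace": "example-namespace",            # Oracle Cloud Storage namespace
    "bucket": "example-bucket",
    "region": "us-ashburn-1",                    # placeholder region
    "path": "logs/datastream",                   # folder within the bucket for log files
    "compress_logs": True,                       # optional: compress uploaded logs
}
```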
DatastreamS3Connector, DatastreamS3ConnectorArgs    
- AccessKey string
- The access key identifier used to authenticate requests to the Amazon S3 account
- Bucket string
- The name of the Amazon S3 bucket
- DisplayName string
- The name of the connector
- Path string
- The path to the folder within the Amazon S3 bucket where logs will be stored
- Region string
- The AWS region where the Amazon S3 bucket resides
- SecretAccessKey string
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- CompressLogs bool
- Indicates whether the logs should be compressed
- AccessKey string
- The access key identifier used to authenticate requests to the Amazon S3 account
- Bucket string
- The name of the Amazon S3 bucket
- DisplayName string
- The name of the connector
- Path string
- The path to the folder within the Amazon S3 bucket where logs will be stored
- Region string
- The AWS region where the Amazon S3 bucket resides
- SecretAccessKey string
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- CompressLogs bool
- Indicates whether the logs should be compressed
- accessKey String
- The access key identifier used to authenticate requests to the Amazon S3 account
- bucket String
- The name of the Amazon S3 bucket
- displayName String
- The name of the connector
- path String
- The path to the folder within the Amazon S3 bucket where logs will be stored
- region String
- The AWS region where the Amazon S3 bucket resides
- secretAccessKey String
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- compressLogs Boolean
- Indicates whether the logs should be compressed
- accessKey string
- The access key identifier used to authenticate requests to the Amazon S3 account
- bucket string
- The name of the Amazon S3 bucket
- displayName string
- The name of the connector
- path string
- The path to the folder within the Amazon S3 bucket where logs will be stored
- region string
- The AWS region where the Amazon S3 bucket resides
- secretAccessKey string
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- compressLogs boolean
- Indicates whether the logs should be compressed
- access_key str
- The access key identifier used to authenticate requests to the Amazon S3 account
- bucket str
- The name of the Amazon S3 bucket
- display_name str
- The name of the connector
- path str
- The path to the folder within the Amazon S3 bucket where logs will be stored
- region str
- The AWS region where the Amazon S3 bucket resides
- secret_access_key str
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- compress_logs bool
- Indicates whether the logs should be compressed
- accessKey String
- The access key identifier used to authenticate requests to the Amazon S3 account
- bucket String
- The name of the Amazon S3 bucket
- displayName String
- The name of the connector
- path String
- The path to the folder within the Amazon S3 bucket where logs will be stored
- region String
- The AWS region where the Amazon S3 bucket resides
- secretAccessKey String
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- compressLogs Boolean
- Indicates whether the logs should be compressed
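Following the Python argument names above, an Amazon S3 connector configuration could be sketched as below. Bucket, region, and credentials are placeholders; the dict maps to the `s3_connector` argument.

```python
# Sketch of an Amazon S3 connector configuration, placeholder values only.
# A dict like this maps to the s3_connector argument of akamai.Datastream.
s3_connector = {
    "display_name": "s3-connector",
    "access_key": "EXAMPLEACCESSKEY",            # access key identifier
    "secret_access_key": "example-secret-key",   # matching secret access key
    "bucket": "example-log-bucket",
    "region": "us-east-1",                       # placeholder AWS region
    "path": "logs/datastream",                   # folder within the bucket for log files
    "compress_logs": True,                       # optional: compress uploaded logs
}
```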
DatastreamSplunkConnector, DatastreamSplunkConnectorArgs      
- DisplayName string
- The name of the connector
- Endpoint string
- The raw event Splunk URL where logs will be stored
- EventCollectorToken string
- The Event Collector token associated with the Splunk account
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- CompressLogs bool
- Indicates whether the logs should be compressed
- CustomHeaderName string
- The name of the custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- MTls bool
- Indicates whether mTLS is enabled or not.
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- DisplayName string
- The name of the connector
- Endpoint string
- The raw event Splunk URL where logs will be stored
- EventCollectorToken string
- The Event Collector token associated with the Splunk account
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- CompressLogs bool
- Indicates whether the logs should be compressed
- CustomHeaderName string
- The name of the custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- MTls bool
- Indicates whether mTLS is enabled or not.
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName String
- The name of the connector
- endpoint String
- The raw event Splunk URL where logs will be stored
- eventCollectorToken String
- The Event Collector token associated with the Splunk account
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Indicates whether the logs should be compressed
- customHeaderName String
- The name of the custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName string
- The name of the connector
- endpoint string
- The raw event Splunk URL where logs will be stored
- eventCollectorToken string
- The Event Collector token associated with the Splunk account
- caCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs boolean
- Indicates whether the logs should be compressed
- customHeaderName string
- The name of the custom header passed with the request to the destination
- customHeaderValue string
- The custom header's contents passed with the request to the destination
- mTls boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- display_name str
- The name of the connector
- endpoint str
- The raw event Splunk URL where logs will be stored
- event_collector_token str
- The Event Collector token associated with the Splunk account
- ca_cert str
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- client_cert str
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- client_key str
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress_logs bool
- Indicates whether the logs should be compressed
- custom_header_name str
- The name of the custom header passed with the request to the destination
- custom_header_value str
- The custom header's contents passed with the request to the destination
- m_tls bool
- Indicates whether mTLS is enabled or not.
- tls_hostname str
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName String
- The name of the connector
- endpoint String
- The raw event Splunk URL where logs will be stored
- eventCollectorToken String
- The Event Collector token associated with the Splunk account
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Indicates whether the logs should be compressed
- customHeaderName String
- The name of the custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
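Drawing on the Python argument names above, a Splunk connector configuration might be sketched as follows. The endpoint URL and Event Collector token are placeholders; mTLS is left off in this sketch, and enabling it would additionally require the client certificate and key fields described above.

```python
# Sketch of a Splunk connector configuration, placeholder values only.
# A dict like this maps to the splunk_connector argument of akamai.Datastream.
splunk_connector = {
    "display_name": "splunk-connector",
    # Placeholder raw-event Splunk URL where logs will be sent.
    "endpoint": "https://splunk.example.com:8088/services/collector/raw",
    "event_collector_token": "example-hec-token",  # Event Collector token
    "compress_logs": True,                         # optional
    "m_tls": False,                                # optional: mutual TLS disabled here
}

# If mTLS were enabled, client_cert and client_key (PEM / PKCS8) would also be needed.
if splunk_connector["m_tls"]:
    assert "client_cert" in splunk_connector and "client_key" in splunk_connector
```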
DatastreamSumologicConnector, DatastreamSumologicConnectorArgs      
- CollectorCode string
- The unique HTTP collector code of the Sumo Logic endpoint
- DisplayName string
- The name of the connector
- Endpoint string
- The Sumo Logic collection endpoint where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- ContentType string
- Content type to pass in the log file header
- CustomHeaderName string
- The name of the custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- CollectorCode string
- The unique HTTP collector code of the Sumo Logic endpoint
- DisplayName string
- The name of the connector
- Endpoint string
- The Sumo Logic collection endpoint where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- ContentType string
- Content type to pass in the log file header
- CustomHeaderName string
- The name of the custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- collectorCode String
- The unique HTTP collector code of the Sumo Logic endpoint
- displayName String
- The name of the connector
- endpoint String
- The Sumo Logic collection endpoint where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- contentType String
- Content type to pass in the log file header
- customHeaderName String
- The name of the custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- collectorCode string
- The unique HTTP collector code of the Sumo Logic endpoint
- displayName string
- The name of the connector
- endpoint string
- The Sumo Logic collection endpoint where logs will be stored
- compressLogs boolean
- Indicates whether the logs should be compressed
- contentType string
- Content type to pass in the log file header
- customHeaderName string
- The name of the custom header passed with the request to the destination
- customHeaderValue string
- The custom header's contents passed with the request to the destination
- collector_code str
- The unique HTTP collector code of the Sumo Logic endpoint
- display_name str
- The name of the connector
- endpoint str
- The Sumo Logic collection endpoint where logs will be stored
- compress_logs bool
- Indicates whether the logs should be compressed
- content_type str
- Content type to pass in the log file header
- custom_header_name str
- The name of the custom header passed with the request to the destination
- custom_header_value str
- The custom header's contents passed with the request to the destination
- collectorCode String
- The unique HTTP collector code of the Sumo Logic endpoint
- displayName String
- The name of the connector
- endpoint String
- The Sumo Logic collection endpoint where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- contentType String
- Content type to pass in the log file header
- customHeaderName String
- The name of the custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
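Based on the Python argument names above, a Sumo Logic connector configuration could be sketched like this. The collection endpoint and collector code are placeholders; the dict maps to the `sumologic_connector` argument.

```python
# Sketch of a Sumo Logic connector configuration, placeholder values only.
# A dict like this maps to the sumologic_connector argument of akamai.Datastream.
sumologic_connector = {
    "display_name": "sumologic-connector",
    # Placeholder Sumo Logic collection endpoint where logs will be sent.
    "endpoint": "https://collectors.sumologic.com/receiver/v1/http/",
    "collector_code": "example-collector-code",  # unique HTTP collector code
    "compress_logs": True,                       # optional
    "content_type": "application/json",          # optional: log file header content type
}
```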
Package Details
- Repository
- Akamai pulumi/pulumi-akamai
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the akamai Terraform Provider.