libZSservicesZSamazonka-dmsZSamazonka-dms
Copyright: (c) 2013-2021 Brendan Hay
License: Mozilla Public License, v. 2.0.
Maintainer: Brendan Hay <brendan.g.hay+amazonka@gmail.com>
Stability: auto-generated
Portability: non-portable (GHC extensions)
Safe Haskell: None

Amazonka.DMS.Types.KafkaSettings

Synopsis

Documentation

data KafkaSettings Source #

Provides information that describes an Apache Kafka endpoint. This information includes the output format of records applied to the endpoint and details of transaction and control table data.

See: newKafkaSettings smart constructor.

Constructors

KafkaSettings' 

Fields

  • sslClientKeyArn :: Maybe Text

    The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.

  • includeTransactionDetails :: Maybe Bool

    Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.

  • includeTableAlterOperations :: Maybe Bool

    Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.

  • sslClientCertificateArn :: Maybe Text

    The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.

  • sslCaCertificateArn :: Maybe Text

    The Amazon Resource Name (ARN) for the private certificate authority (CA) cert that DMS uses to securely connect to your Kafka target endpoint.

  • partitionIncludeSchemaTable :: Maybe Bool

    Prefixes schema and table names to partition values when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.

  • topic :: Maybe Text

    The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.

  • includeControlDetails :: Maybe Bool

    Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.

  • noHexPrefix :: Maybe Bool

    Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format moving from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.

  • saslPassword :: Maybe (Sensitive Text)

    The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

  • sslClientKeyPassword :: Maybe (Sensitive Text)

    The password for the client private key used to securely connect to a Kafka target endpoint.

  • includePartitionValue :: Maybe Bool

    Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.

  • messageFormat :: Maybe MessageFormatValue

    The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).

  • securityProtocol :: Maybe KafkaSecurityProtocol

    Sets a secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.

  • saslUsername :: Maybe Text

    The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

  • broker :: Maybe Text

    A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.

  • messageMaxBytes :: Maybe Int

    The maximum size in bytes for records created on the endpoint. The default is 1,000,000.

  • includeNullAndEmpty :: Maybe Bool

    Include NULL and empty columns for records migrated to the endpoint. The default is false.

Instances

Eq KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

Show KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

Generic KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

Associated Types

type Rep KafkaSettings :: Type -> Type #

NFData KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

Methods

rnf :: KafkaSettings -> () #

Hashable KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

ToJSON KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

FromJSON KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

type Rep KafkaSettings Source # 
Instance details

Defined in Amazonka.DMS.Types.KafkaSettings

type Rep KafkaSettings = D1 ('MetaData "KafkaSettings" "Amazonka.DMS.Types.KafkaSettings" "libZSservicesZSamazonka-dmsZSamazonka-dms" 'False) (C1 ('MetaCons "KafkaSettings'" 'PrefixI 'True) ((((S1 ('MetaSel ('Just "sslClientKeyArn") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)) :*: S1 ('MetaSel ('Just "includeTransactionDetails") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool))) :*: (S1 ('MetaSel ('Just "includeTableAlterOperations") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)) :*: S1 ('MetaSel ('Just "sslClientCertificateArn") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)))) :*: ((S1 ('MetaSel ('Just "sslCaCertificateArn") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)) :*: S1 ('MetaSel ('Just "partitionIncludeSchemaTable") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool))) :*: (S1 ('MetaSel ('Just "topic") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)) :*: (S1 ('MetaSel ('Just "includeControlDetails") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)) :*: S1 ('MetaSel ('Just "noHexPrefix") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)))))) :*: (((S1 ('MetaSel ('Just "saslPassword") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe (Sensitive Text))) :*: S1 ('MetaSel ('Just "sslClientKeyPassword") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe (Sensitive Text)))) :*: (S1 ('MetaSel ('Just "includePartitionValue") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool)) :*: S1 ('MetaSel ('Just "messageFormat") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe MessageFormatValue)))) :*: ((S1 ('MetaSel ('Just "securityProtocol") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe KafkaSecurityProtocol)) :*: S1 ('MetaSel ('Just "saslUsername") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text))) :*: (S1 ('MetaSel ('Just "broker") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Text)) :*: (S1 ('MetaSel ('Just "messageMaxBytes") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Int)) :*: S1 ('MetaSel ('Just "includeNullAndEmpty") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedStrict) (Rec0 (Maybe Bool))))))))

newKafkaSettings :: KafkaSettings Source #

Create a value of KafkaSettings with all optional fields omitted.

Use generic-lens or optics to modify other optional fields.
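
For example, a value targeting a single broker and topic can be built from newKafkaSettings with record update syntax. This is a minimal sketch; the broker location and topic name below are placeholders chosen for illustration.

  {-# LANGUAGE OverloadedStrings #-}

  import Amazonka.DMS.Types.KafkaSettings

  -- Start from the all-defaults value and override only the fields needed.
  -- The broker location and topic name are placeholder values.
  exampleSettings :: KafkaSettings
  exampleSettings =
    newKafkaSettings
      { broker = Just "ec2-12-345-678-901.compute-1.amazonaws.com:2345",
        topic = Just "dms-migration-topic"
      }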

The following record fields are available, with the corresponding lenses provided for backwards compatibility:

$sel:sslClientKeyArn:KafkaSettings', kafkaSettings_sslClientKeyArn - The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.

$sel:includeTransactionDetails:KafkaSettings', kafkaSettings_includeTransactionDetails - Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.

$sel:includeTableAlterOperations:KafkaSettings', kafkaSettings_includeTableAlterOperations - Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.

$sel:sslClientCertificateArn:KafkaSettings', kafkaSettings_sslClientCertificateArn - The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.

$sel:sslCaCertificateArn:KafkaSettings', kafkaSettings_sslCaCertificateArn - The Amazon Resource Name (ARN) for the private certificate authority (CA) cert that DMS uses to securely connect to your Kafka target endpoint.

$sel:partitionIncludeSchemaTable:KafkaSettings', kafkaSettings_partitionIncludeSchemaTable - Prefixes schema and table names to partition values when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.

$sel:topic:KafkaSettings', kafkaSettings_topic - The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.

$sel:includeControlDetails:KafkaSettings', kafkaSettings_includeControlDetails - Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.

$sel:noHexPrefix:KafkaSettings', kafkaSettings_noHexPrefix - Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format moving from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.

$sel:saslPassword:KafkaSettings', kafkaSettings_saslPassword - The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

$sel:sslClientKeyPassword:KafkaSettings', kafkaSettings_sslClientKeyPassword - The password for the client private key used to securely connect to a Kafka target endpoint.

$sel:includePartitionValue:KafkaSettings', kafkaSettings_includePartitionValue - Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.

$sel:messageFormat:KafkaSettings', kafkaSettings_messageFormat - The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).

$sel:securityProtocol:KafkaSettings', kafkaSettings_securityProtocol - Sets a secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.

$sel:saslUsername:KafkaSettings', kafkaSettings_saslUsername - The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

$sel:broker:KafkaSettings', kafkaSettings_broker - A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.

$sel:messageMaxBytes:KafkaSettings', kafkaSettings_messageMaxBytes - The maximum size in bytes for records created on the endpoint. The default is 1,000,000.

$sel:includeNullAndEmpty:KafkaSettings', kafkaSettings_includeNullAndEmpty - Include NULL and empty columns for records migrated to the endpoint. The default is false.

kafkaSettings_sslClientKeyArn :: Lens' KafkaSettings (Maybe Text) Source #

The Amazon Resource Name (ARN) for the client private key used to securely connect to a Kafka target endpoint.

kafkaSettings_includeTransactionDetails :: Lens' KafkaSettings (Maybe Bool) Source #

Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous transaction_id, and transaction_record_id (the record offset within a transaction). The default is false.

kafkaSettings_includeTableAlterOperations :: Lens' KafkaSettings (Maybe Bool) Source #

Includes any data definition language (DDL) operations that change the table in the control data, such as rename-table, drop-table, add-column, drop-column, and rename-column. The default is false.

kafkaSettings_sslClientCertificateArn :: Lens' KafkaSettings (Maybe Text) Source #

The Amazon Resource Name (ARN) of the client certificate used to securely connect to a Kafka target endpoint.

kafkaSettings_sslCaCertificateArn :: Lens' KafkaSettings (Maybe Text) Source #

The Amazon Resource Name (ARN) for the private certificate authority (CA) cert that DMS uses to securely connect to your Kafka target endpoint.

kafkaSettings_partitionIncludeSchemaTable :: Lens' KafkaSettings (Maybe Bool) Source #

Prefixes schema and table names to partition values when the partition type is primary-key-type. Doing this increases data distribution among Kafka partitions. For example, suppose that a SysBench schema has thousands of tables and each table has only a limited range for a primary key. In this case, the same primary key is sent from thousands of tables to the same partition, which causes throttling. The default is false.

kafkaSettings_topic :: Lens' KafkaSettings (Maybe Text) Source #

The topic to which you migrate the data. If you don't specify a topic, DMS specifies "kafka-default-topic" as the migration topic.

kafkaSettings_includeControlDetails :: Lens' KafkaSettings (Maybe Bool) Source #

Shows detailed control information for table definition, column definition, and table and column changes in the Kafka message output. The default is false.

kafkaSettings_noHexPrefix :: Lens' KafkaSettings (Maybe Bool) Source #

Set this optional parameter to true to avoid adding a '0x' prefix to raw data in hexadecimal format. For example, by default, DMS adds a '0x' prefix to the LOB column type in hexadecimal format moving from an Oracle source to a Kafka target. Use the NoHexPrefix endpoint setting to enable migration of RAW data type columns without adding the '0x' prefix.

kafkaSettings_saslPassword :: Lens' KafkaSettings (Maybe Text) Source #

The secure password you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

kafkaSettings_sslClientKeyPassword :: Lens' KafkaSettings (Maybe Text) Source #

The password for the client private key used to securely connect to a Kafka target endpoint.

kafkaSettings_includePartitionValue :: Lens' KafkaSettings (Maybe Bool) Source #

Shows the partition value within the Kafka message output unless the partition type is schema-table-type. The default is false.

kafkaSettings_messageFormat :: Lens' KafkaSettings (Maybe MessageFormatValue) Source #

The output format for the records created on the endpoint. The message format is JSON (default) or JSON_UNFORMATTED (a single line with no tab).

kafkaSettings_securityProtocol :: Lens' KafkaSettings (Maybe KafkaSecurityProtocol) Source #

Sets a secure connection to a Kafka target endpoint using Transport Layer Security (TLS). Options include ssl-encryption, ssl-authentication, and sasl-ssl. sasl-ssl requires SaslUsername and SaslPassword.
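
Since sasl-ssl requires both SaslUsername and SaslPassword, a SASL-SSL configuration can be sketched with these lenses as follows. This is an illustrative sketch: the credentials are placeholders, and the module path for KafkaSecurityProtocol and the pattern name KafkaSecurityProtocol_Sasl_ssl are assumed from the library's usual naming for the sasl-ssl value.

  {-# LANGUAGE OverloadedStrings #-}

  import Amazonka.DMS.Types.KafkaSecurityProtocol (KafkaSecurityProtocol (..))
  import Amazonka.DMS.Types.KafkaSettings
  import Control.Lens ((&), (?~))

  -- SASL-SSL needs the security protocol plus both SASL credentials set.
  -- The username and password here are placeholders.
  saslSettings :: KafkaSettings
  saslSettings =
    newKafkaSettings
      & kafkaSettings_securityProtocol ?~ KafkaSecurityProtocol_Sasl_ssl
      & kafkaSettings_saslUsername ?~ "dms-sasl-user"
      & kafkaSettings_saslPassword ?~ "dms-sasl-password"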

kafkaSettings_saslUsername :: Lens' KafkaSettings (Maybe Text) Source #

The secure user name you created when you first set up your MSK cluster to validate a client identity and make an encrypted connection between server and client using SASL-SSL authentication.

kafkaSettings_broker :: Lens' KafkaSettings (Maybe Text) Source #

A comma-separated list of one or more broker locations in your Kafka cluster that host your Kafka instance. Specify each broker location in the form broker-hostname-or-ip:port. For example, "ec2-12-345-678-901.compute-1.amazonaws.com:2345". For more information and examples of specifying a list of broker locations, see Using Apache Kafka as a target for Database Migration Service in the Database Migration Service User Guide.
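
Multiple broker locations go into a single comma-separated string, for example (a sketch; the hostnames and port below are placeholders):

  {-# LANGUAGE OverloadedStrings #-}

  import Amazonka.DMS.Types.KafkaSettings
  import Control.Lens ((&), (?~))

  -- Two broker locations joined by a comma in one Text value.
  -- The hostnames and port are placeholder values.
  multiBrokerSettings :: KafkaSettings
  multiBrokerSettings =
    newKafkaSettings
      & kafkaSettings_broker
        ?~ "b-1.example-cluster.amazonaws.com:9092,b-2.example-cluster.amazonaws.com:9092"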

kafkaSettings_messageMaxBytes :: Lens' KafkaSettings (Maybe Int) Source #

The maximum size in bytes for records created on the endpoint. The default is 1,000,000.

kafkaSettings_includeNullAndEmpty :: Lens' KafkaSettings (Maybe Bool) Source #

Include NULL and empty columns for records migrated to the endpoint. The default is false.