amazonka-lookoutequipment
Copyright     (c) 2013-2021 Brendan Hay
License       Mozilla Public License, v. 2.0.
Maintainer    Brendan Hay <brendan.g.hay+amazonka@gmail.com>
Stability     auto-generated
Portability   non-portable (GHC extensions)
Safe Haskell  None

Amazonka.LookoutEquipment.Lens


Operations

StartInferenceScheduler

startInferenceScheduler_inferenceSchedulerName :: Lens' StartInferenceScheduler Text Source #

The name of the inference scheduler to be started.

startInferenceSchedulerResponse_modelArn :: Lens' StartInferenceSchedulerResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the ML model being used by the inference scheduler.

startInferenceSchedulerResponse_modelName :: Lens' StartInferenceSchedulerResponse (Maybe Text) Source #

The name of the ML model being used by the inference scheduler.

startInferenceSchedulerResponse_inferenceSchedulerArn :: Lens' StartInferenceSchedulerResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the inference scheduler being started.
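
A minimal usage sketch, assuming amazonka 2.0-style credential discovery (newEnv discover) and the generated newStartInferenceScheduler smart constructor; the scheduler name is a placeholder:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (discover, newEnv, runResourceT, send)
import Amazonka.LookoutEquipment (newStartInferenceScheduler)
import Amazonka.LookoutEquipment.Lens (startInferenceSchedulerResponse_inferenceSchedulerArn)
import Control.Lens ((^.))

-- Start the named inference scheduler and print the ARN echoed back in the
-- response ("my-scheduler" is a placeholder name).
startScheduler :: IO ()
startScheduler = do
  env <- newEnv discover
  resp <- runResourceT (send env (newStartInferenceScheduler "my-scheduler"))
  print (resp ^. startInferenceSchedulerResponse_inferenceSchedulerArn)
```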

DescribeDataset

describeDataset_datasetName :: Lens' DescribeDataset Text Source #

The name of the dataset to be described.

describeDatasetResponse_ingestionInputConfiguration :: Lens' DescribeDatasetResponse (Maybe IngestionInputConfiguration) Source #

Specifies the S3 location configuration for the data input for the data ingestion job.

describeDatasetResponse_datasetArn :: Lens' DescribeDatasetResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the dataset being described.

describeDatasetResponse_lastUpdatedAt :: Lens' DescribeDatasetResponse (Maybe UTCTime) Source #

Specifies the time the dataset was last updated, if it was.

describeDatasetResponse_createdAt :: Lens' DescribeDatasetResponse (Maybe UTCTime) Source #

Specifies the time the dataset was created in Amazon Lookout for Equipment.

describeDatasetResponse_schema :: Lens' DescribeDatasetResponse (Maybe Text) Source #

A JSON description of the data that is in each time series dataset, including names, column names, and data types.

describeDatasetResponse_serverSideKmsKeyId :: Lens' DescribeDatasetResponse (Maybe Text) Source #

Provides the identifier of the KMS key used to encrypt dataset data by Amazon Lookout for Equipment.
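
A short sketch of reading these response fields through the lenses above, assuming the generated newDescribeDataset smart constructor and an Env obtained elsewhere; the dataset name is a placeholder:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (Env, runResourceT, send)
import Amazonka.LookoutEquipment (newDescribeDataset)
import Amazonka.LookoutEquipment.Lens
  ( describeDatasetResponse_lastUpdatedAt
  , describeDatasetResponse_schema
  )
import Control.Lens ((^.))

-- Fetch a dataset description and print its schema JSON and last-update time.
printDatasetInfo :: Env -> IO ()
printDatasetInfo env = do
  resp <- runResourceT (send env (newDescribeDataset "my-dataset"))
  print (resp ^. describeDatasetResponse_schema)
  print (resp ^. describeDatasetResponse_lastUpdatedAt)
```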

ListTagsForResource

listTagsForResource_resourceArn :: Lens' ListTagsForResource Text Source #

The Amazon Resource Name (ARN) of the resource (such as the dataset or model) that is the focus of the ListTagsForResource operation.

DescribeDataIngestionJob

describeDataIngestionJobResponse_ingestionInputConfiguration :: Lens' DescribeDataIngestionJobResponse (Maybe IngestionInputConfiguration) Source #

Specifies the S3 location configuration for the data input for the data ingestion job.

describeDataIngestionJobResponse_datasetArn :: Lens' DescribeDataIngestionJobResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the dataset being used in the data ingestion job.

describeDataIngestionJobResponse_failedReason :: Lens' DescribeDataIngestionJobResponse (Maybe Text) Source #

Specifies the reason for failure when a data ingestion job has failed.

describeDataIngestionJobResponse_roleArn :: Lens' DescribeDataIngestionJobResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of an IAM role with permission to access the data source being ingested.

CreateModel

createModel_dataPreProcessingConfiguration :: Lens' CreateModel (Maybe DataPreProcessingConfiguration) Source #

The configuration is the TargetSamplingRate, which is the sampling rate of the data after post processing by Amazon Lookout for Equipment. For example, if you provide data that has been collected at a 1 second level and you want the system to resample the data at a 1 minute rate before training, the TargetSamplingRate is 1 minute.

When providing a value for the TargetSamplingRate, you must attach the prefix "PT" to the rate you want. The value for a 1 second rate is therefore PT1S, the value for a 15 minute rate is PT15M, and the value for a 1 hour rate is PT1H.

createModel_trainingDataEndTime :: Lens' CreateModel (Maybe UTCTime) Source #

Indicates the time reference in the dataset that should be used to end the subset of training data for the ML model.

createModel_datasetSchema :: Lens' CreateModel (Maybe DatasetSchema) Source #

The data schema for the ML model being created.

createModel_evaluationDataStartTime :: Lens' CreateModel (Maybe UTCTime) Source #

Indicates the time reference in the dataset that should be used to begin the subset of evaluation data for the ML model.

createModel_offCondition :: Lens' CreateModel (Maybe Text) Source #

Indicates that the asset associated with this sensor has been shut off. As long as this condition is met, Lookout for Equipment will not use data from this asset for training, evaluation, or inference.

createModel_evaluationDataEndTime :: Lens' CreateModel (Maybe UTCTime) Source #

Indicates the time reference in the dataset that should be used to end the subset of evaluation data for the ML model.

createModel_trainingDataStartTime :: Lens' CreateModel (Maybe UTCTime) Source #

Indicates the time reference in the dataset that should be used to begin the subset of training data for the ML model.

createModel_labelsInputConfiguration :: Lens' CreateModel (Maybe LabelsInputConfiguration) Source #

The input configuration for the labels being used for the ML model that's being created.

createModel_tags :: Lens' CreateModel (Maybe [Tag]) Source #

Any tags associated with the ML model being created.

createModel_serverSideKmsKeyId :: Lens' CreateModel (Maybe Text) Source #

Provides the identifier of the KMS key used to encrypt model data by Amazon Lookout for Equipment.

createModel_roleArn :: Lens' CreateModel (Maybe Text) Source #

The Amazon Resource Name (ARN) of a role with permission to access the data source being used to create the ML model.

createModel_modelName :: Lens' CreateModel Text Source #

The name for the ML model to be created.

createModel_datasetName :: Lens' CreateModel Text Source #

The name of the dataset for the ML model being created.

createModel_clientToken :: Lens' CreateModel Text Source #

A unique identifier for the request. If you do not set the client request token, Amazon Lookout for Equipment generates one.

createModelResponse_status :: Lens' CreateModelResponse (Maybe ModelStatus) Source #

Indicates the status of the CreateModel operation.

createModelResponse_modelArn :: Lens' CreateModelResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the model being created.
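
The sketch below fills the required fields through newCreateModel and an optional field through a lens; the argument order of the smart constructor (model name, dataset name, client token) and the role ARN are assumptions:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (Env, runResourceT, send)
import Amazonka.LookoutEquipment (newCreateModel)
import Amazonka.LookoutEquipment.Lens
  ( createModelResponse_status
  , createModel_roleArn
  )
import Control.Lens ((?~), (^.))
import Data.Function ((&))

-- Create a model from an existing dataset, supplying an optional IAM role
-- through its lens, and report the resulting model status.
createMyModel :: Env -> IO ()
createMyModel env = do
  let req =
        newCreateModel "my-model" "my-dataset" "create-model-client-token-1"
          & createModel_roleArn ?~ "arn:aws:iam::123456789012:role/LookoutEquipmentAccess"
  resp <- runResourceT (send env req)
  print (resp ^. createModelResponse_status)
```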

DeleteDataset

deleteDataset_datasetName :: Lens' DeleteDataset Text Source #

The name of the dataset to be deleted.

CreateDataset

createDataset_tags :: Lens' CreateDataset (Maybe [Tag]) Source #

Any tags associated with the ingested data described in the dataset.

createDataset_serverSideKmsKeyId :: Lens' CreateDataset (Maybe Text) Source #

Provides the identifier of the KMS key used to encrypt dataset data by Amazon Lookout for Equipment.

createDataset_datasetName :: Lens' CreateDataset Text Source #

The name of the dataset being created.

createDataset_datasetSchema :: Lens' CreateDataset DatasetSchema Source #

A JSON description of the data that is in each time series dataset, including names, column names, and data types.

createDataset_clientToken :: Lens' CreateDataset Text Source #

A unique identifier for the request. If you do not set the client request token, Amazon Lookout for Equipment generates one.

createDatasetResponse_status :: Lens' CreateDatasetResponse (Maybe DatasetStatus) Source #

Indicates the status of the CreateDataset operation.

createDatasetResponse_datasetArn :: Lens' CreateDatasetResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the dataset being created.

DeleteModel

deleteModel_modelName :: Lens' DeleteModel Text Source #

The name of the ML model to be deleted.

ListModels

listModels_status :: Lens' ListModels (Maybe ModelStatus) Source #

The status of the ML model.

listModels_nextToken :: Lens' ListModels (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of ML models.

listModels_datasetNameBeginsWith :: Lens' ListModels (Maybe Text) Source #

The beginning of the name of the dataset of the ML models to be listed.

listModels_modelNameBeginsWith :: Lens' ListModels (Maybe Text) Source #

The beginning of the name of the ML models being listed.

listModels_maxResults :: Lens' ListModels (Maybe Natural) Source #

Specifies the maximum number of ML models to list.

listModelsResponse_nextToken :: Lens' ListModelsResponse (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of ML models.

listModelsResponse_modelSummaries :: Lens' ListModelsResponse (Maybe [ModelSummary]) Source #

Provides information on the specified model, including created time, model and dataset ARNs, and status.
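
A hedged sketch of a filtered listing; optional filters are set through the lenses above, and a non-empty nextToken in the response would be fed back via listModels_nextToken on a follow-up request:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (Env, runResourceT, send)
import Amazonka.LookoutEquipment (newListModels)
import Amazonka.LookoutEquipment.Lens
  ( listModelsResponse_modelSummaries
  , listModelsResponse_nextToken
  , listModels_datasetNameBeginsWith
  , listModels_maxResults
  )
import Control.Lens ((?~), (^.))
import Data.Function ((&))

-- List up to 10 models whose dataset name starts with the placeholder
-- prefix "pump-", then print the summaries and the pagination token.
listSomeModels :: Env -> IO ()
listSomeModels env = do
  let req =
        newListModels
          & listModels_maxResults ?~ 10
          & listModels_datasetNameBeginsWith ?~ "pump-"
  resp <- runResourceT (send env req)
  print (resp ^. listModelsResponse_modelSummaries)
  print (resp ^. listModelsResponse_nextToken)
```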

StopInferenceScheduler

stopInferenceScheduler_inferenceSchedulerName :: Lens' StopInferenceScheduler Text Source #

The name of the inference scheduler to be stopped.

stopInferenceSchedulerResponse_modelArn :: Lens' StopInferenceSchedulerResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the ML model used by the inference scheduler being stopped.

stopInferenceSchedulerResponse_modelName :: Lens' StopInferenceSchedulerResponse (Maybe Text) Source #

The name of the ML model used by the inference scheduler being stopped.

stopInferenceSchedulerResponse_inferenceSchedulerArn :: Lens' StopInferenceSchedulerResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the inference scheduler being stopped.

ListDataIngestionJobs

listDataIngestionJobs_status :: Lens' ListDataIngestionJobs (Maybe IngestionJobStatus) Source #

Indicates the status of the data ingestion job.

listDataIngestionJobs_nextToken :: Lens' ListDataIngestionJobs (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of data ingestion jobs.

listDataIngestionJobs_datasetName :: Lens' ListDataIngestionJobs (Maybe Text) Source #

The name of the dataset being used for the data ingestion job.

listDataIngestionJobs_maxResults :: Lens' ListDataIngestionJobs (Maybe Natural) Source #

Specifies the maximum number of data ingestion jobs to list.

listDataIngestionJobsResponse_nextToken :: Lens' ListDataIngestionJobsResponse (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of data ingestion jobs.

listDataIngestionJobsResponse_dataIngestionJobSummaries :: Lens' ListDataIngestionJobsResponse (Maybe [DataIngestionJobSummary]) Source #

Specifies information about the specific data ingestion job, including dataset name and status.

DescribeModel

describeModel_modelName :: Lens' DescribeModel Text Source #

The name of the ML model to be described.

describeModelResponse_status :: Lens' DescribeModelResponse (Maybe ModelStatus) Source #

Specifies the current status of the model being described. Status describes the status of the most recent action of the model.

describeModelResponse_dataPreProcessingConfiguration :: Lens' DescribeModelResponse (Maybe DataPreProcessingConfiguration) Source #

The configuration is the TargetSamplingRate, which is the sampling rate of the data after post processing by Amazon Lookout for Equipment. For example, if you provide data that has been collected at a 1 second level and you want the system to resample the data at a 1 minute rate before training, the TargetSamplingRate is 1 minute.

When providing a value for the TargetSamplingRate, you must attach the prefix "PT" to the rate you want. The value for a 1 second rate is therefore PT1S, the value for a 15 minute rate is PT15M, and the value for a 1 hour rate is PT1H.

describeModelResponse_trainingExecutionStartTime :: Lens' DescribeModelResponse (Maybe UTCTime) Source #

Indicates the time at which the training of the ML model began.

describeModelResponse_datasetArn :: Lens' DescribeModelResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the dataset used to create the ML model being described.

describeModelResponse_failedReason :: Lens' DescribeModelResponse (Maybe Text) Source #

If the training of the ML model failed, this indicates the reason for that failure.

describeModelResponse_modelArn :: Lens' DescribeModelResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the ML model being described.

describeModelResponse_lastUpdatedTime :: Lens' DescribeModelResponse (Maybe UTCTime) Source #

Indicates the last time the ML model was updated. The type of update is not specified.

describeModelResponse_trainingDataEndTime :: Lens' DescribeModelResponse (Maybe UTCTime) Source #

Indicates the time reference in the dataset that was used to end the subset of training data for the ML model.

describeModelResponse_createdAt :: Lens' DescribeModelResponse (Maybe UTCTime) Source #

Indicates the time and date at which the ML model was created.

describeModelResponse_modelName :: Lens' DescribeModelResponse (Maybe Text) Source #

The name of the ML model being described.

describeModelResponse_modelMetrics :: Lens' DescribeModelResponse (Maybe Text) Source #

The Model Metrics show an aggregated summary of the model's performance within the evaluation time range. This is the JSON content of the metrics created when evaluating the model.

describeModelResponse_evaluationDataStartTime :: Lens' DescribeModelResponse (Maybe UTCTime) Source #

Indicates the time reference in the dataset that was used to begin the subset of evaluation data for the ML model.

describeModelResponse_schema :: Lens' DescribeModelResponse (Maybe Text) Source #

A JSON description of the data that is in each time series dataset, including names, column names, and data types.

describeModelResponse_offCondition :: Lens' DescribeModelResponse (Maybe Text) Source #

Indicates that the asset associated with this sensor has been shut off. As long as this condition is met, Lookout for Equipment will not use data from this asset for training, evaluation, or inference.

describeModelResponse_evaluationDataEndTime :: Lens' DescribeModelResponse (Maybe UTCTime) Source #

Indicates the time reference in the dataset that was used to end the subset of evaluation data for the ML model.

describeModelResponse_datasetName :: Lens' DescribeModelResponse (Maybe Text) Source #

The name of the dataset being used by the ML model being described.

describeModelResponse_trainingDataStartTime :: Lens' DescribeModelResponse (Maybe UTCTime) Source #

Indicates the time reference in the dataset that was used to begin the subset of training data for the ML model.

describeModelResponse_trainingExecutionEndTime :: Lens' DescribeModelResponse (Maybe UTCTime) Source #

Indicates the time at which the training of the ML model was completed.

describeModelResponse_labelsInputConfiguration :: Lens' DescribeModelResponse (Maybe LabelsInputConfiguration) Source #

Specifies configuration information about the labels input, including its S3 location.

describeModelResponse_serverSideKmsKeyId :: Lens' DescribeModelResponse (Maybe Text) Source #

Provides the identifier of the KMS key used to encrypt model data by Amazon Lookout for Equipment.

describeModelResponse_roleArn :: Lens' DescribeModelResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of a role with permission to access the data source for the ML model being described.
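
A small sketch that checks a model's status and metrics through the response lenses above, assuming the generated newDescribeModel smart constructor; the model name is a placeholder:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (Env, runResourceT, send)
import Amazonka.LookoutEquipment (newDescribeModel)
import Amazonka.LookoutEquipment.Lens
  ( describeModelResponse_modelMetrics
  , describeModelResponse_status
  )
import Control.Lens ((^.))

-- Check a model's training status and, once training is finished, the JSON
-- content of its evaluation metrics.
checkModel :: Env -> IO ()
checkModel env = do
  resp <- runResourceT (send env (newDescribeModel "my-model"))
  print (resp ^. describeModelResponse_status)
  print (resp ^. describeModelResponse_modelMetrics)
```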

StartDataIngestionJob

startDataIngestionJob_datasetName :: Lens' StartDataIngestionJob Text Source #

The name of the dataset being used by the data ingestion job.

startDataIngestionJob_ingestionInputConfiguration :: Lens' StartDataIngestionJob IngestionInputConfiguration Source #

Specifies information for the input data for the data ingestion job, including dataset S3 location.

startDataIngestionJob_roleArn :: Lens' StartDataIngestionJob Text Source #

The Amazon Resource Name (ARN) of a role with permission to access the data source for the data ingestion job.

startDataIngestionJob_clientToken :: Lens' StartDataIngestionJob Text Source #

A unique identifier for the request. If you do not set the client request token, Amazon Lookout for Equipment generates one.

startDataIngestionJobResponse_status :: Lens' StartDataIngestionJobResponse (Maybe IngestionJobStatus) Source #

Indicates the status of the StartDataIngestionJob operation.

startDataIngestionJobResponse_jobId :: Lens' StartDataIngestionJobResponse (Maybe Text) Source #

Indicates the job ID of the data ingestion job.
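
A sketch of wiring the nested S3 input configuration into a request; the smart-constructor argument order (dataset name, input configuration, role ARN, client token), the bucket name, and the role ARN are assumptions:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (Env, runResourceT, send)
import Amazonka.LookoutEquipment
  ( newIngestionInputConfiguration
  , newIngestionS3InputConfiguration
  , newStartDataIngestionJob
  )
import Amazonka.LookoutEquipment.Lens
  ( ingestionS3InputConfiguration_prefix
  , startDataIngestionJobResponse_jobId
  , startDataIngestionJobResponse_status
  )
import Control.Lens ((?~), (^.))
import Data.Function ((&))

-- Kick off an ingestion job reading from an S3 prefix. Bucket, prefix, and
-- role ARN below are placeholders.
startIngestion :: Env -> IO ()
startIngestion env = do
  let s3Input =
        newIngestionS3InputConfiguration "my-sensor-data-bucket"
          & ingestionS3InputConfiguration_prefix ?~ "plant-1/"
      req =
        newStartDataIngestionJob
          "my-dataset"
          (newIngestionInputConfiguration s3Input)
          "arn:aws:iam::123456789012:role/LookoutEquipmentIngestion"
          "ingestion-client-token-1"
  resp <- runResourceT (send env req)
  print (resp ^. startDataIngestionJobResponse_jobId)
  print (resp ^. startDataIngestionJobResponse_status)
```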

ListInferenceSchedulers

listInferenceSchedulers_modelName :: Lens' ListInferenceSchedulers (Maybe Text) Source #

The name of the ML model used by the inference scheduler to be listed.

listInferenceSchedulers_nextToken :: Lens' ListInferenceSchedulers (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of inference schedulers.

listInferenceSchedulers_inferenceSchedulerNameBeginsWith :: Lens' ListInferenceSchedulers (Maybe Text) Source #

The beginning of the name of the inference schedulers to be listed.

listInferenceSchedulers_maxResults :: Lens' ListInferenceSchedulers (Maybe Natural) Source #

Specifies the maximum number of inference schedulers to list.

listInferenceSchedulersResponse_inferenceSchedulerSummaries :: Lens' ListInferenceSchedulersResponse (Maybe [InferenceSchedulerSummary]) Source #

Provides information about the specified inference scheduler, including data upload frequency, model name and ARN, and status.

listInferenceSchedulersResponse_nextToken :: Lens' ListInferenceSchedulersResponse (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of inference schedulers.

UpdateInferenceScheduler

updateInferenceScheduler_dataUploadFrequency :: Lens' UpdateInferenceScheduler (Maybe DataUploadFrequency) Source #

How often data is uploaded to the source S3 bucket for the input data. The value chosen is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.

updateInferenceScheduler_dataDelayOffsetInMinutes :: Lens' UpdateInferenceScheduler (Maybe Natural) Source #

A period of time (in minutes) by which inference on the data is delayed after the data starts. For instance, if you select an offset delay time of five minutes, inference will not begin on the data until the first data measurement after the five minute mark. For example, if five minutes is selected, the inference scheduler will wake up at the configured frequency with the additional five minute delay time to check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to stop and restart the scheduler when uploading new data.

updateInferenceScheduler_dataOutputConfiguration :: Lens' UpdateInferenceScheduler (Maybe InferenceOutputConfiguration) Source #

Specifies information for the output results from the inference scheduler, including the output S3 location.

updateInferenceScheduler_dataInputConfiguration :: Lens' UpdateInferenceScheduler (Maybe InferenceInputConfiguration) Source #

Specifies information for the input data for the inference scheduler, including delimiter, format, and dataset location.

updateInferenceScheduler_roleArn :: Lens' UpdateInferenceScheduler (Maybe Text) Source #

The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler.

updateInferenceScheduler_inferenceSchedulerName :: Lens' UpdateInferenceScheduler Text Source #

The name of the inference scheduler to be updated.
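
A sketch of changing a single optional setting on an existing scheduler, assuming the generated newUpdateInferenceScheduler smart constructor; the scheduler name is a placeholder:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (Env, runResourceT, send)
import Amazonka.LookoutEquipment (newUpdateInferenceScheduler)
import Amazonka.LookoutEquipment.Lens (updateInferenceScheduler_dataDelayOffsetInMinutes)
import Control.Lens ((?~))
import Data.Function ((&))

-- Give the scheduler a 5-minute delay before it looks for newly uploaded data.
delayScheduler :: Env -> IO ()
delayScheduler env = do
  let req =
        newUpdateInferenceScheduler "my-scheduler"
          & updateInferenceScheduler_dataDelayOffsetInMinutes ?~ 5
  _ <- runResourceT (send env req)
  pure ()
```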

DeleteInferenceScheduler

deleteInferenceScheduler_inferenceSchedulerName :: Lens' DeleteInferenceScheduler Text Source #

The name of the inference scheduler to be deleted.

TagResource

tagResource_resourceArn :: Lens' TagResource Text Source #

The Amazon Resource Name (ARN) of the specific resource to which the tag should be associated.

tagResource_tags :: Lens' TagResource [Tag] Source #

The tag or tags to be associated with a specific resource. Both the tag key and value are specified.

ListInferenceExecutions

listInferenceExecutions_dataEndTimeBefore :: Lens' ListInferenceExecutions (Maybe UTCTime) Source #

The time reference in the inferenced dataset before which Amazon Lookout for Equipment stopped the inference execution.

listInferenceExecutions_nextToken :: Lens' ListInferenceExecutions (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of inference executions.

listInferenceExecutions_maxResults :: Lens' ListInferenceExecutions (Maybe Natural) Source #

Specifies the maximum number of inference executions to list.

listInferenceExecutions_dataStartTimeAfter :: Lens' ListInferenceExecutions (Maybe UTCTime) Source #

The time reference in the inferenced dataset after which Amazon Lookout for Equipment started the inference execution.

listInferenceExecutions_inferenceSchedulerName :: Lens' ListInferenceExecutions Text Source #

The name of the inference scheduler for the inference execution listed.

listInferenceExecutionsResponse_nextToken :: Lens' ListInferenceExecutionsResponse (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of inference executions.

listInferenceExecutionsResponse_inferenceExecutionSummaries :: Lens' ListInferenceExecutionsResponse (Maybe [InferenceExecutionSummary]) Source #

Provides an array of information about the individual inference executions returned from the ListInferenceExecutions operation, including model used, inference scheduler, data configuration, and so on.
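
A sketch that restricts the listing with one of the time filters above, assuming the generated newListInferenceExecutions smart constructor; the cut-off date is arbitrary and the scheduler name is a placeholder:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (Env, runResourceT, send)
import Amazonka.LookoutEquipment (newListInferenceExecutions)
import Amazonka.LookoutEquipment.Lens
  ( listInferenceExecutionsResponse_inferenceExecutionSummaries
  , listInferenceExecutions_dataStartTimeAfter
  )
import Control.Lens ((?~), (^.))
import Data.Function ((&))
import Data.Time (UTCTime (..), fromGregorian)

-- List executions of a scheduler whose data window starts after 2021-10-01.
listRecentExecutions :: Env -> IO ()
listRecentExecutions env = do
  let after = UTCTime (fromGregorian 2021 10 1) 0
      req =
        newListInferenceExecutions "my-scheduler"
          & listInferenceExecutions_dataStartTimeAfter ?~ after
  resp <- runResourceT (send env req)
  print (resp ^. listInferenceExecutionsResponse_inferenceExecutionSummaries)
```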

CreateInferenceScheduler

createInferenceScheduler_dataDelayOffsetInMinutes :: Lens' CreateInferenceScheduler (Maybe Natural) Source #

A period of time (in minutes) by which inference on the data is delayed after the data starts. For instance, if you select an offset delay time of five minutes, inference will not begin on the data until the first data measurement after the five minute mark. For example, if five minutes is selected, the inference scheduler will wake up at the configured frequency with the additional five minute delay time to check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to stop and restart the scheduler when uploading new data.

createInferenceScheduler_tags :: Lens' CreateInferenceScheduler (Maybe [Tag]) Source #

Any tags associated with the inference scheduler.

createInferenceScheduler_serverSideKmsKeyId :: Lens' CreateInferenceScheduler (Maybe Text) Source #

Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.

createInferenceScheduler_modelName :: Lens' CreateInferenceScheduler Text Source #

The name of the previously trained ML model being used to create the inference scheduler.

createInferenceScheduler_inferenceSchedulerName :: Lens' CreateInferenceScheduler Text Source #

The name of the inference scheduler being created.

createInferenceScheduler_dataUploadFrequency :: Lens' CreateInferenceScheduler DataUploadFrequency Source #

How often data is uploaded to the source S3 bucket for the input data. The value chosen is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.

createInferenceScheduler_dataInputConfiguration :: Lens' CreateInferenceScheduler InferenceInputConfiguration Source #

Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.

createInferenceScheduler_dataOutputConfiguration :: Lens' CreateInferenceScheduler InferenceOutputConfiguration Source #

Specifies configuration information for the output results for the inference scheduler, including the S3 location for the output.

createInferenceScheduler_roleArn :: Lens' CreateInferenceScheduler Text Source #

The Amazon Resource Name (ARN) of a role with permission to access the data source being used for the inference.

createInferenceScheduler_clientToken :: Lens' CreateInferenceScheduler Text Source #

A unique identifier for the request. If you do not set the client request token, Amazon Lookout for Equipment generates one.

createInferenceSchedulerResponse_inferenceSchedulerArn :: Lens' CreateInferenceSchedulerResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the inference scheduler being created.

ListDatasets

listDatasets_nextToken :: Lens' ListDatasets (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of datasets.

listDatasets_datasetNameBeginsWith :: Lens' ListDatasets (Maybe Text) Source #

The beginning of the name of the datasets to be listed.

listDatasets_maxResults :: Lens' ListDatasets (Maybe Natural) Source #

Specifies the maximum number of datasets to list.

listDatasetsResponse_nextToken :: Lens' ListDatasetsResponse (Maybe Text) Source #

An opaque pagination token indicating where to continue the listing of datasets.

listDatasetsResponse_datasetSummaries :: Lens' ListDatasetsResponse (Maybe [DatasetSummary]) Source #

Provides information about the specified dataset, including creation time, dataset ARN, and status.

UntagResource

untagResource_resourceArn :: Lens' UntagResource Text Source #

The Amazon Resource Name (ARN) of the resource to which the tag is currently associated.

untagResource_tagKeys :: Lens' UntagResource [Text] Source #

Specifies the key of the tag to be removed from a specified resource.

DescribeInferenceScheduler

describeInferenceSchedulerResponse_dataUploadFrequency :: Lens' DescribeInferenceSchedulerResponse (Maybe DataUploadFrequency) Source #

Specifies how often data is uploaded to the source S3 bucket for the input data. This value is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.

describeInferenceSchedulerResponse_dataDelayOffsetInMinutes :: Lens' DescribeInferenceSchedulerResponse (Maybe Natural) Source #

A period of time (in minutes) by which inference on the data is delayed after the data starts. For instance, if you select an offset delay time of five minutes, inference will not begin on the data until the first data measurement after the five minute mark. For example, if five minutes is selected, the inference scheduler will wake up at the configured frequency with the additional five minute delay time to check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to stop and restart the scheduler when uploading new data.

describeInferenceSchedulerResponse_modelArn :: Lens' DescribeInferenceSchedulerResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the ML model of the inference scheduler being described.

describeInferenceSchedulerResponse_createdAt :: Lens' DescribeInferenceSchedulerResponse (Maybe UTCTime) Source #

Specifies the time at which the inference scheduler was created.

describeInferenceSchedulerResponse_modelName :: Lens' DescribeInferenceSchedulerResponse (Maybe Text) Source #

The name of the ML model of the inference scheduler being described.

describeInferenceSchedulerResponse_inferenceSchedulerArn :: Lens' DescribeInferenceSchedulerResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of the inference scheduler being described.

describeInferenceSchedulerResponse_dataOutputConfiguration :: Lens' DescribeInferenceSchedulerResponse (Maybe InferenceOutputConfiguration) Source #

Specifies information for the output results for the inference scheduler, including the output S3 location.

describeInferenceSchedulerResponse_updatedAt :: Lens' DescribeInferenceSchedulerResponse (Maybe UTCTime) Source #

Specifies the time at which the inference scheduler was last updated, if it was.

describeInferenceSchedulerResponse_dataInputConfiguration :: Lens' DescribeInferenceSchedulerResponse (Maybe InferenceInputConfiguration) Source #

Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.

describeInferenceSchedulerResponse_serverSideKmsKeyId :: Lens' DescribeInferenceSchedulerResponse (Maybe Text) Source #

Provides the identifier of the KMS key used to encrypt inference scheduler data by Amazon Lookout for Equipment.

describeInferenceSchedulerResponse_roleArn :: Lens' DescribeInferenceSchedulerResponse (Maybe Text) Source #

The Amazon Resource Name (ARN) of a role with permission to access the data source for the inference scheduler being described.
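
A sketch of reading two of the response fields above, assuming the generated newDescribeInferenceScheduler smart constructor (which takes the scheduler name); the name is a placeholder:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Amazonka (Env, runResourceT, send)
import Amazonka.LookoutEquipment (newDescribeInferenceScheduler)
import Amazonka.LookoutEquipment.Lens
  ( describeInferenceSchedulerResponse_dataUploadFrequency
  , describeInferenceSchedulerResponse_modelName
  )
import Control.Lens ((^.))

-- Look up a scheduler and report which model it runs and how often data arrives.
inspectScheduler :: Env -> IO ()
inspectScheduler env = do
  resp <- runResourceT (send env (newDescribeInferenceScheduler "my-scheduler"))
  print (resp ^. describeInferenceSchedulerResponse_modelName)
  print (resp ^. describeInferenceSchedulerResponse_dataUploadFrequency)
```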

Types

DataIngestionJobSummary

dataIngestionJobSummary_ingestionInputConfiguration :: Lens' DataIngestionJobSummary (Maybe IngestionInputConfiguration) Source #

Specifies information for the input data for the data ingestion job, including data S3 location parameters.

dataIngestionJobSummary_datasetArn :: Lens' DataIngestionJobSummary (Maybe Text) Source #

The Amazon Resource Name (ARN) of the dataset used in the data ingestion job.

dataIngestionJobSummary_jobId :: Lens' DataIngestionJobSummary (Maybe Text) Source #

Indicates the job ID of the data ingestion job.

dataIngestionJobSummary_datasetName :: Lens' DataIngestionJobSummary (Maybe Text) Source #

The name of the dataset used for the data ingestion job.

DataPreProcessingConfiguration

dataPreProcessingConfiguration_targetSamplingRate :: Lens' DataPreProcessingConfiguration (Maybe TargetSamplingRate) Source #

The sampling rate of the data after post processing by Amazon Lookout for Equipment. For example, if you provide data that has been collected at a 1 second level and you want the system to resample the data at a 1 minute rate before training, the TargetSamplingRate is 1 minute.

When providing a value for the TargetSamplingRate, you must attach the prefix "PT" to the rate you want. The value for a 1 second rate is therefore PT1S, the value for a 15 minute rate is PT15M, and the value for a 1 hour rate is PT1H.
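
A sketch of building a configuration with a 1-minute target sampling rate; the TargetSamplingRate_PT1M pattern name is assumed to follow the library's usual Type_Value convention for enums:

```haskell
import Amazonka.LookoutEquipment
  ( DataPreProcessingConfiguration
  , TargetSamplingRate (..)
  , newDataPreProcessingConfiguration
  )
import Amazonka.LookoutEquipment.Lens (dataPreProcessingConfiguration_targetSamplingRate)
import Control.Lens ((?~))
import Data.Function ((&))

-- Resample data collected at 1-second resolution down to 1 minute before
-- training, expressed with the "PT" prefix (PT1M).
oneMinuteResampling :: DataPreProcessingConfiguration
oneMinuteResampling =
  newDataPreProcessingConfiguration
    & dataPreProcessingConfiguration_targetSamplingRate ?~ TargetSamplingRate_PT1M
```

This value can then be attached to a CreateModel request through createModel_dataPreProcessingConfiguration.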

DatasetSchema

DatasetSummary

datasetSummary_status :: Lens' DatasetSummary (Maybe DatasetStatus) Source #

Indicates the status of the dataset.

datasetSummary_datasetArn :: Lens' DatasetSummary (Maybe Text) Source #

The Amazon Resource Name (ARN) of the specified dataset.

datasetSummary_createdAt :: Lens' DatasetSummary (Maybe UTCTime) Source #

The time at which the dataset was created in Amazon Lookout for Equipment.

InferenceExecutionSummary

inferenceExecutionSummary_failedReason :: Lens' InferenceExecutionSummary (Maybe Text) Source #

Specifies the reason for failure when an inference execution has failed.

inferenceExecutionSummary_modelArn :: Lens' InferenceExecutionSummary (Maybe Text) Source #

The Amazon Resource Name (ARN) of the ML model used for the inference execution.

inferenceExecutionSummary_dataStartTime :: Lens' InferenceExecutionSummary (Maybe UTCTime) Source #

Indicates the time reference in the dataset at which the inference execution began.

inferenceExecutionSummary_modelName :: Lens' InferenceExecutionSummary (Maybe Text) Source #

The name of the ML model being used for the inference execution.

inferenceExecutionSummary_inferenceSchedulerArn :: Lens' InferenceExecutionSummary (Maybe Text) Source #

The Amazon Resource Name (ARN) of the inference scheduler being used for the inference execution.

inferenceExecutionSummary_scheduledStartTime :: Lens' InferenceExecutionSummary (Maybe UTCTime) Source #

Indicates the start time at which the inference scheduler began the specific inference execution.

inferenceExecutionSummary_dataOutputConfiguration :: Lens' InferenceExecutionSummary (Maybe InferenceOutputConfiguration) Source #

Specifies configuration information for the output results for the inference execution, including the output S3 location.

inferenceExecutionSummary_dataEndTime :: Lens' InferenceExecutionSummary (Maybe UTCTime) Source #

Indicates the time reference in the dataset at which the inference execution stopped.

inferenceExecutionSummary_inferenceSchedulerName :: Lens' InferenceExecutionSummary (Maybe Text) Source #

The name of the inference scheduler being used for the inference execution.

inferenceExecutionSummary_dataInputConfiguration :: Lens' InferenceExecutionSummary (Maybe InferenceInputConfiguration) Source #

Specifies configuration information for the input data for the inference scheduler, including delimiter, format, and dataset location.

InferenceInputConfiguration

inferenceInputConfiguration_inputTimeZoneOffset :: Lens' InferenceInputConfiguration (Maybe Text) Source #

Indicates the difference between your time zone and Greenwich Mean Time (GMT).

inferenceInputConfiguration_s3InputConfiguration :: Lens' InferenceInputConfiguration (Maybe InferenceS3InputConfiguration) Source #

Specifies configuration information for the input data for the inference, including the S3 location of the input data.

inferenceInputConfiguration_inferenceInputNameConfiguration :: Lens' InferenceInputConfiguration (Maybe InferenceInputNameConfiguration) Source #

Specifies configuration information for the input data for the inference, including timestamp format and delimiter.

InferenceInputNameConfiguration

inferenceInputNameConfiguration_timestampFormat :: Lens' InferenceInputNameConfiguration (Maybe Text) Source #

The format of the timestamp, whether Epoch time, or standard, with or without hyphens (-).

inferenceInputNameConfiguration_componentTimestampDelimiter :: Lens' InferenceInputNameConfiguration (Maybe Text) Source #

Indicates the delimiter character used between items in the data.

InferenceOutputConfiguration

inferenceOutputConfiguration_kmsKeyId :: Lens' InferenceOutputConfiguration (Maybe Text) Source #

The ID number for the AWS KMS key used to encrypt the inference output.

inferenceOutputConfiguration_s3OutputConfiguration :: Lens' InferenceOutputConfiguration InferenceS3OutputConfiguration Source #

Specifies configuration information for the output results for the inference, including the output S3 location.

InferenceS3InputConfiguration

inferenceS3InputConfiguration_prefix :: Lens' InferenceS3InputConfiguration (Maybe Text) Source #

The prefix for the S3 bucket used for the input data for the inference.

inferenceS3InputConfiguration_bucket :: Lens' InferenceS3InputConfiguration Text Source #

The bucket containing the input dataset for the inference.

InferenceS3OutputConfiguration

inferenceS3OutputConfiguration_prefix :: Lens' InferenceS3OutputConfiguration (Maybe Text) Source #

The prefix for the S3 bucket used for the output results from the inference.

inferenceS3OutputConfiguration_bucket :: Lens' InferenceS3OutputConfiguration Text Source #

The bucket containing the output results from the inference.

InferenceSchedulerSummary

inferenceSchedulerSummary_dataUploadFrequency :: Lens' InferenceSchedulerSummary (Maybe DataUploadFrequency) Source #

How often data is uploaded to the source S3 bucket for the input data. This value is the length of time between data uploads. For instance, if you select 5 minutes, Amazon Lookout for Equipment will upload the real-time data to the source bucket once every 5 minutes. This frequency also determines how often Amazon Lookout for Equipment starts a scheduled inference on your data. In this example, it starts once every 5 minutes.

inferenceSchedulerSummary_dataDelayOffsetInMinutes :: Lens' InferenceSchedulerSummary (Maybe Natural) Source #

A period of time (in minutes) by which inference on the data is delayed after the data starts. For instance, if an offset delay time of five minutes was selected, inference will not begin on the data until the first data measurement after the five minute mark. For example, if five minutes is selected, the inference scheduler will wake up at the configured frequency with the additional five minute delay time to check the customer S3 bucket. The customer can upload data at the same frequency and they don't need to stop and restart the scheduler when uploading new data.

inferenceSchedulerSummary_modelArn :: Lens' InferenceSchedulerSummary (Maybe Text) Source #

The Amazon Resource Name (ARN) of the ML model used by the inference scheduler.

inferenceSchedulerSummary_modelName :: Lens' InferenceSchedulerSummary (Maybe Text) Source #

The name of the ML model used for the inference scheduler.

inferenceSchedulerSummary_inferenceSchedulerArn :: Lens' InferenceSchedulerSummary (Maybe Text) Source #

The Amazon Resource Name (ARN) of the inference scheduler.

IngestionInputConfiguration

ingestionInputConfiguration_s3InputConfiguration :: Lens' IngestionInputConfiguration IngestionS3InputConfiguration Source #

The location information for the S3 bucket used for input data for the data ingestion.

IngestionS3InputConfiguration

ingestionS3InputConfiguration_prefix :: Lens' IngestionS3InputConfiguration (Maybe Text) Source #

The prefix for the S3 location being used for the input data for the data ingestion.

ingestionS3InputConfiguration_bucket :: Lens' IngestionS3InputConfiguration Text Source #

The name of the S3 bucket used for the input data for the data ingestion.

LabelsInputConfiguration

labelsInputConfiguration_s3InputConfiguration :: Lens' LabelsInputConfiguration LabelsS3InputConfiguration Source #

Contains location information for the S3 location being used for label data.

LabelsS3InputConfiguration

labelsS3InputConfiguration_prefix :: Lens' LabelsS3InputConfiguration (Maybe Text) Source #

The prefix for the S3 bucket used for the label data.

labelsS3InputConfiguration_bucket :: Lens' LabelsS3InputConfiguration Text Source #

The name of the S3 bucket holding the label data.

ModelSummary

modelSummary_status :: Lens' ModelSummary (Maybe ModelStatus) Source #

Indicates the status of the ML model.

modelSummary_datasetArn :: Lens' ModelSummary (Maybe Text) Source #

The Amazon Resource Name (ARN) of the dataset used to create the model.

modelSummary_modelArn :: Lens' ModelSummary (Maybe Text) Source #

The Amazon Resource Name (ARN) of the ML model.

modelSummary_createdAt :: Lens' ModelSummary (Maybe UTCTime) Source #

The time at which the specific model was created.

modelSummary_datasetName :: Lens' ModelSummary (Maybe Text) Source #

The name of the dataset being used for the ML model.

S3Object

s3Object_bucket :: Lens' S3Object Text Source #

The name of the specific S3 bucket.

s3Object_key :: Lens' S3Object Text Source #

The key (path and file name) identifying the specific S3 object.

Tag

tag_key :: Lens' Tag Text Source #

The key for the specified tag.

tag_value :: Lens' Tag Text Source #

The value for the specified tag.