You can use the psql command-line client to import a CSV data file into PostgreSQL, which lets you automate the import process.
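As a minimal sketch, psql's \copy meta-command reads the CSV on the client side and loads it into a table. The database name, table, columns, and file path below are placeholders:

```shell
# Load data.csv into the users table; connection details, table name,
# and column list are assumptions for this example.
psql "host=localhost dbname=mydb user=postgres" \
  -c "\copy users(id, name, email) FROM 'data.csv' WITH (FORMAT csv, HEADER true)"
```

Because \copy runs in the client, the CSV file only needs to be readable on the machine running psql, not on the database server, which makes it convenient for scripted imports.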
You can create a Kubernetes cluster with the kops command in your existing VPC and hosted zone; kops will create the rest of the required AWS resources.
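A hedged sketch of the kops invocation: the cluster name, zones, VPC ID, DNS zone, and state-store bucket below are all placeholders you would replace with your own values.

```shell
# kops stores cluster state in an S3 bucket (placeholder name here).
export KOPS_STATE_STORE=s3://my-kops-state-bucket

# Create the cluster inside an existing VPC and hosted zone;
# kops provisions the remaining resources (subnets, security groups, ASGs).
kops create cluster \
  --name=k8s.example.com \
  --zones=us-east-1a,us-east-1b \
  --vpc=vpc-0abc123 \
  --dns-zone=example.com \
  --yes
```

When reusing an existing VPC, make sure the CIDR ranges kops chooses for subnets do not overlap with subnets already present in that VPC.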
You do not need to set up SSO in a different AWS account to restrict or deny users' permissions for individual services, especially IAM and AWS SSO itself.
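One way to express such a restriction, as a sketch, is an explicit-deny IAM policy attached to the users or groups in the same account. The statement ID is a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyIAMAndSSOChanges",
      "Effect": "Deny",
      "Action": [
        "iam:*",
        "sso:*"
      ],
      "Resource": "*"
    }
  ]
}
```

An explicit Deny overrides any Allow the user otherwise has, so this blocks IAM and SSO actions without needing a separate account.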
To filter Athena query results between two dates, you have to convert the date column (stored as a string) into a timestamp and then cast it with the date() function.
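A sketch of that query, assuming a hypothetical orders table whose order_date column holds strings like '2021-03-15 10:22:07':

```sql
-- date_parse converts the string to a timestamp using Presto format
-- specifiers; date() then casts it so BETWEEN compares whole days.
SELECT *
FROM   orders
WHERE  date(date_parse(order_date, '%Y-%m-%d %H:%i:%s'))
       BETWEEN date '2021-03-01' AND date '2021-03-31';
```

Adjust the format string to match however the dates are actually stored; date_parse fails the query if a row does not match the pattern.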
How to Stream Data from Amazon DynamoDB to Amazon S3 Using AWS Lambda and Amazon Kinesis Firehose, and Analyse It with Microsoft Power BI
With DynamoDB Streams and the data-transformation feature of Amazon Kinesis Firehose, you have a powerful and scalable way to replicate data from DynamoDB into destinations such as Amazon S3 and then analyse it using Power BI.
Create a Lambda function that buffers items newly added to the DynamoDB table and sends a batch of these items (JSON-formatted source records) to the Amazon Kinesis Firehose delivery stream. This Lambda function takes all messages in the DynamoDB stream and forwards them to the Firehose delivery stream.
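The Lambda function described above can be sketched in Python as follows. The delivery stream name is an assumption, and the record format (newline-delimited JSON of each item's NewImage) is one reasonable choice for downstream S3/Athena consumption, not the only one:

```python
import json

FIREHOSE_STREAM = "ddb-to-s3-stream"  # assumed delivery stream name


def build_records(event):
    """Convert DynamoDB stream records into Firehose batch records."""
    records = []
    for record in event.get("Records", []):
        new_image = record.get("dynamodb", {}).get("NewImage")
        if new_image is None:  # e.g. REMOVE events carry no new image
            continue
        # Firehose expects a bytes payload; newline-delimit for S3/Athena.
        payload = json.dumps(new_image) + "\n"
        records.append({"Data": payload.encode("utf-8")})
    return records


def lambda_handler(event, context):
    records = build_records(event)
    if not records:
        return {"forwarded": 0}
    import boto3  # imported lazily so the helper stays testable without AWS
    firehose = boto3.client("firehose")
    # put_record_batch accepts up to 500 records per call.
    firehose.put_record_batch(DeliveryStreamName=FIREHOSE_STREAM,
                              Records=records)
    return {"forwarded": len(records)}
```

Because Lambda invokes the handler with a batch of stream records, one put_record_batch call per invocation is usually enough; for tables with very high write rates you would also split batches above the 500-record limit.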
Kinesis Firehose delivery streams continuously collect, transform, and load streaming data into the destinations that you specify.
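A delivery stream with an S3 destination can be created from the CLI; as a sketch, with the stream name, IAM role ARN, and bucket ARN below all placeholders:

```shell
# DirectPut means producers (such as the Lambda above) write straight
# to the stream; the role must allow Firehose to write to the bucket.
aws firehose create-delivery-stream \
  --delivery-stream-name ddb-to-s3-stream \
  --delivery-stream-type DirectPut \
  --extended-s3-destination-configuration \
    RoleARN=arn:aws:iam::123456789012:role/firehose-delivery-role,BucketARN=arn:aws:s3:::my-analytics-bucket
```

Buffering hints (size and interval) and data-transformation Lambdas can be added to the extended S3 configuration if you need them.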
AWS Glue crawlers can crawl both file-based and table-based data stores, including Amazon Simple Storage Service (Amazon S3) and Amazon DynamoDB.
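As an illustrative sketch, a crawler targeting an S3 path can be created from the CLI; the crawler name, role, database, and path are placeholders:

```shell
# The role must grant Glue read access to the S3 path being crawled.
aws glue create-crawler \
  --name orders-crawler \
  --role AWSGlueServiceRole-demo \
  --database-name analytics_db \
  --targets 'S3Targets=[{Path="s3://my-analytics-bucket/orders/"}]'
```

For DynamoDB, the targets argument would use DynamoDBTargets with the table name instead of an S3 path.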
You can also enable or disable a stream on an existing table, or change the settings of a stream. DynamoDB Streams operates asynchronously, so there is no performance impact on a table if you enable a stream.
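Enabling a stream on an existing table is a single CLI call; the table name below is a placeholder:

```shell
# NEW_AND_OLD_IMAGES records both the before and after state of each item;
# other view types are KEYS_ONLY, NEW_IMAGE, and OLD_IMAGE.
aws dynamodb update-table \
  --table-name orders \
  --stream-specification StreamEnabled=true,StreamViewType=NEW_AND_OLD_IMAGES
```

Setting StreamEnabled=false in the same call disables the stream again.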
This blog post will cover how to integrate AWS CodePipeline with Azure Repos Git.