Creates an Amazon Managed Workflows for Apache Airflow (MWAA) environment.
See https://www.paws-r-sdk.com/docs/mwaa_create_environment/ for full documentation.
mwaa_create_environment(
  AirflowConfigurationOptions = NULL,
  AirflowVersion = NULL,
  DagS3Path,
  EnvironmentClass = NULL,
  ExecutionRoleArn,
  KmsKey = NULL,
  LoggingConfiguration = NULL,
  MaxWorkers = NULL,
  MinWorkers = NULL,
  Name,
  NetworkConfiguration,
  PluginsS3ObjectVersion = NULL,
  PluginsS3Path = NULL,
  RequirementsS3ObjectVersion = NULL,
  RequirementsS3Path = NULL,
  Schedulers = NULL,
  SourceBucketArn,
  StartupScriptS3ObjectVersion = NULL,
  StartupScriptS3Path = NULL,
  Tags = NULL,
  WebserverAccessMode = NULL,
  WeeklyMaintenanceWindowStart = NULL
)
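A minimal sketch of calling this operation with only the required arguments, assuming the usual paws client pattern (construct an mwaa client, then call the operation as a method). The bucket ARN, role ARN, subnet IDs, and security group ID are placeholders, not real resources:

```r
library(paws)

# Create an MWAA client; region and credentials come from your usual
# AWS configuration (environment variables, config files, and so on).
svc <- mwaa()

# Placeholder ARNs and IDs -- replace with resources from your account.
svc$create_environment(
  Name = "MyMWAAEnvironment",
  DagS3Path = "dags",
  SourceBucketArn = "arn:aws:s3:::my-airflow-bucket-unique-name",
  ExecutionRoleArn = "arn:aws:iam::123456789012:role/my-execution-role",
  NetworkConfiguration = list(
    SubnetIds = list("subnet-0123456789abcdef0", "subnet-0123456789abcdef1"),
    SecurityGroupIds = list("sg-0123456789abcdef0")
  )
)
```

All other arguments are optional and default to NULL, in which case MWAA applies its own service defaults.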
AirflowConfigurationOptions: A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.
AirflowVersion: The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, and 2.5.1. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (MWAA).
DagS3Path: [required] The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.
EnvironmentClass: The environment class type. Valid values: mw1.small, mw1.medium, and mw1.large. For more information, see Amazon MWAA environment class.
ExecutionRoleArn: [required] The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.
KmsKey: The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.
LoggingConfiguration: Defines the Apache Airflow logs to send to CloudWatch Logs.
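A hedged sketch of what a LoggingConfiguration value might look like, assuming the nested structure used by the MWAA CreateEnvironment API (one block per Airflow log type, each with an Enabled flag and a LogLevel); the chosen levels are illustrative:

```r
# One entry per Apache Airflow log category; LogLevel is one of
# CRITICAL, ERROR, WARNING, INFO, or DEBUG.
logging <- list(
  DagProcessingLogs = list(Enabled = TRUE,  LogLevel = "INFO"),
  SchedulerLogs     = list(Enabled = TRUE,  LogLevel = "WARNING"),
  TaskLogs          = list(Enabled = TRUE,  LogLevel = "INFO"),
  WebserverLogs     = list(Enabled = FALSE, LogLevel = "ERROR"),
  WorkerLogs        = list(Enabled = TRUE,  LogLevel = "CRITICAL")
)
```

The resulting list would be passed as the LoggingConfiguration argument of mwaa_create_environment().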
MaxWorkers: The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
MinWorkers: The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
Name: [required] The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.
NetworkConfiguration: [required] The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.
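As a sketch, the NetworkConfiguration value is a nested list naming the subnets and security groups for the environment. MWAA's networking requirements call for two private subnets in different Availability Zones; the IDs below are placeholders:

```r
# Two private subnets (in different Availability Zones) and at least
# one security group; all IDs here are illustrative.
network <- list(
  SubnetIds = list("subnet-0123456789abcdef0", "subnet-0123456789abcdef1"),
  SecurityGroupIds = list("sg-0123456789abcdef0")
)
```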
PluginsS3ObjectVersion: The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.
PluginsS3Path: The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.
RequirementsS3ObjectVersion: The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.
RequirementsS3Path: The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.
Schedulers: The number of Apache Airflow schedulers to run in your environment. Valid values: for Apache Airflow v2, accepts between 2 and 5 (defaults to 2); for Apache Airflow v1, accepts 1.
SourceBucketArn: [required] The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.
StartupScriptS3ObjectVersion: The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
StartupScriptS3Path: The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
Tags: The key-value tag pairs you want to associate to your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.
WebserverAccessMode: The Apache Airflow Web server access mode. For more information, see Apache Airflow access modes.
WeeklyMaintenanceWindowStart: The day and time of the week, in Coordinated Universal Time (UTC) 24-hour standard time, to start weekly maintenance updates of your environment, in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30-minute increments only.