Creates a new job definition.
glue_create_job(Name, Description, LogUri, Role, ExecutionProperty,
Command, DefaultArguments, Connections, MaxRetries, AllocatedCapacity,
Timeout, MaxCapacity, NotificationProperty, SecurityConfiguration, Tags)
Name: [required] The name you assign to this job definition. It must be unique in your account.
Description: A description of the job being defined.
LogUri: This field is reserved for future use.
Role: [required] The name or ARN of the IAM role associated with this job.
ExecutionProperty: An ExecutionProperty specifying the maximum number of concurrent runs allowed for this job.
Command: [required] The JobCommand that executes this job.
DefaultArguments: The default arguments for this job.
You can specify arguments here that your own job-execution script consumes, as well as arguments that AWS Glue itself consumes.
For information about how to specify and consume your own job arguments, see the Calling AWS Glue APIs in Python topic in the developer guide.
For information about the key-value pairs that AWS Glue consumes to set up your job, see the Special Parameters Used by AWS Glue topic in the developer guide.
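As a sketch, DefaultArguments might be built like this (assuming a paws Glue client `svc` as shown in the usage section; `--job-language` and `--TempDir` are key-value pairs consumed by AWS Glue itself, while `--my_input_path` is a hypothetical argument consumed by your own script, and the S3 paths are placeholders):

```r
# Mix of AWS Glue-consumed and script-consumed arguments (hypothetical values)
default_args <- list(
  "--job-language"  = "python",                     # consumed by AWS Glue
  "--TempDir"       = "s3://my-bucket/glue-temp/",  # consumed by AWS Glue; placeholder bucket
  "--my_input_path" = "s3://my-bucket/input/"       # consumed by your own script; hypothetical
)
```

The resulting list is passed as the DefaultArguments parameter of the create_job call.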
Connections: The connections used for this job.
MaxRetries: The maximum number of times to retry this job if it fails.
AllocatedCapacity: This parameter is deprecated; use MaxCapacity instead. The number of AWS Glue data processing units (DPUs) to allocate to this job. From 2 to 100 DPUs can be allocated; the default is 10. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.
Timeout: The job timeout in minutes. This is the maximum time that a job run can consume resources before it is terminated and enters TIMEOUT status. The default is 2,880 minutes (48 hours).
MaxCapacity: The number of AWS Glue data processing units (DPUs) that can be allocated when this job runs. A DPU is a relative measure of processing power that consists of 4 vCPUs of compute capacity and 16 GB of memory. For more information, see the AWS Glue pricing page.
The value that can be allocated for MaxCapacity depends on whether you are running a Python shell job or an Apache Spark ETL job:
- When you specify a Python shell job (JobCommand.Name = "pythonshell"), you can allocate either 0.0625 or 1 DPU. The default is 0.0625 DPU.
- When you specify an Apache Spark ETL job (JobCommand.Name = "glueetl"), you can allocate from 2 to 100 DPUs. The default is 10 DPUs. This job type cannot have a fractional DPU allocation.
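The two MaxCapacity regimes can be contrasted with a sketch (assuming a paws Glue client `svc`; the job names, role, and script locations are hypothetical placeholders):

```r
# Python shell job: fractional DPU allocation is allowed (0.0625 or 1)
svc$create_job(
  Name = "nightly-report",                                # hypothetical
  Role = "GlueServiceRole",                               # hypothetical
  Command = list(
    Name = "pythonshell",
    ScriptLocation = "s3://my-bucket/scripts/report.py"   # placeholder
  ),
  MaxCapacity = 0.0625
)

# Apache Spark ETL job: whole DPUs only, from 2 to 100
svc$create_job(
  Name = "daily-etl",                                     # hypothetical
  Role = "GlueServiceRole",
  Command = list(
    Name = "glueetl",
    ScriptLocation = "s3://my-bucket/scripts/etl.py"      # placeholder
  ),
  MaxCapacity = 10
)
```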
NotificationProperty: Specifies the configuration properties of a job notification.
SecurityConfiguration: The name of the SecurityConfiguration structure to be used with this job.
Tags: The tags to use with this job. You may use tags to limit access to the job. For more information about tags in AWS Glue, see AWS Tags in AWS Glue in the developer guide.
svc$create_job(
  Name = "string",
  Description = "string",
  LogUri = "string",
  Role = "string",
  ExecutionProperty = list(
    MaxConcurrentRuns = 123
  ),
  Command = list(
    Name = "string",
    ScriptLocation = "string"
  ),
  DefaultArguments = list(
    "string"
  ),
  Connections = list(
    Connections = list(
      "string"
    )
  ),
  MaxRetries = 123,
  AllocatedCapacity = 123,
  Timeout = 123,
  MaxCapacity = 123.0,
  NotificationProperty = list(
    NotifyDelayAfter = 123
  ),
  SecurityConfiguration = "string",
  Tags = list(
    "string"
  )
)
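For a more concrete picture, a minimal call might look like the following sketch (the job name, role, bucket, and tag values are hypothetical; only the parameters shown are required or commonly set):

```r
# Minimal job definition with retry, timeout, and tagging (hypothetical values)
svc$create_job(
  Name = "example-etl-job",                            # hypothetical, must be unique per account
  Role = "GlueServiceRole",                            # hypothetical IAM role name or ARN
  Command = list(
    Name = "glueetl",
    ScriptLocation = "s3://my-bucket/scripts/etl.py"   # placeholder
  ),
  MaxRetries = 1,
  Timeout = 60,                                        # minutes; default is 2,880
  Tags = list(team = "data-eng")                       # hypothetical tag
)
```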