flytekitplugins.spark.models
Directory
Classes
Class | Description |
---|---|
SparkJob | This defines a SparkJob target. It will execute the appropriate SparkJob. |
flytekitplugins.spark.models.SparkJob
class SparkJob(
spark_type: <enum 'SparkType'>,
application_file: str,
main_class: str,
spark_conf: typing.Dict[str, str],
hadoop_conf: typing.Dict[str, str],
executor_path: str,
databricks_conf: typing.Optional[typing.Dict[str, typing.Dict[str, typing.Dict]]],
databricks_instance: typing.Optional[str],
driver_pod: typing.Optional[flytekit.models.task.K8sPod],
executor_pod: typing.Optional[flytekit.models.task.K8sPod],
)
This defines a SparkJob target. It will execute the appropriate SparkJob.
Parameter | Type |
---|---|
spark_type | <enum 'SparkType'> |
application_file | str |
main_class | str |
spark_conf | typing.Dict[str, str] |
hadoop_conf | typing.Dict[str, str] |
executor_path | str |
databricks_conf | typing.Optional[typing.Dict[str, typing.Dict[str, typing.Dict]]] |
databricks_instance | typing.Optional[str] |
driver_pod | typing.Optional[flytekit.models.task.K8sPod] |
executor_pod | typing.Optional[flytekit.models.task.K8sPod] |
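As a hedged sketch of how these parameters fit together, the snippet below constructs a SparkJob model for a PySpark application. It assumes SparkType is importable from the same module and has a PYTHON member; all paths and conf values are illustrative placeholders.

```python
from flytekitplugins.spark.models import SparkJob, SparkType  # SparkType assumed to live in this module

# Illustrative SparkJob for a PySpark application; every value below is a placeholder.
job = SparkJob(
    spark_type=SparkType.PYTHON,          # assumed enum member for PySpark jobs
    application_file="local:///usr/local/bin/entrypoint.py",
    main_class="",                        # only relevant for JVM (Scala/Java) jobs
    spark_conf={
        "spark.driver.memory": "1000M",
        "spark.executor.instances": "2",
    },
    hadoop_conf={},
    executor_path="/usr/bin/python3",
    databricks_conf={},                   # only populated when targeting Databricks
    databricks_instance=None,
    driver_pod=None,                      # optional flytekit.models.task.K8sPod overrides
    executor_pod=None,
)
```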
Methods
Method | Description |
---|---|
from_flyte_idl() | |
serialize_to_string() | |
short_string() | :rtype: Text |
to_flyte_idl() | :rtype: flyteidl.plugins.spark_pb2.SparkJob |
verbose_string() | :rtype: Text |
with_overrides() | |
from_flyte_idl()
def from_flyte_idl(
pb2_object,
) -> SparkJob
Parameter | Type |
---|---|
pb2_object | |
serialize_to_string()
def serialize_to_string()
short_string()
def short_string()
:rtype: Text
to_flyte_idl()
def to_flyte_idl()
:rtype: flyteidl.plugins.spark_pb2.SparkJob
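A sketch of the protobuf round trip, assuming the constructor shown above: to_flyte_idl() yields the flyteidl.plugins.spark_pb2.SparkJob message and from_flyte_idl() rebuilds an equivalent model from it. The job values are placeholders.

```python
from flytekitplugins.spark.models import SparkJob, SparkType

job = SparkJob(
    spark_type=SparkType.PYTHON,
    application_file="local:///usr/local/bin/entrypoint.py",
    main_class="",
    spark_conf={"spark.executor.instances": "2"},
    hadoop_conf={},
    executor_path="/usr/bin/python3",
    databricks_conf={},
    databricks_instance=None,
    driver_pod=None,
    executor_pod=None,
)

# Model -> protobuf message (flyteidl.plugins.spark_pb2.SparkJob) ...
pb2_job = job.to_flyte_idl()

# ... and protobuf message -> model again.
rebuilt = SparkJob.from_flyte_idl(pb2_job)
assert rebuilt.spark_conf == job.spark_conf
```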
verbose_string()
def verbose_string()
:rtype: Text
with_overrides()
def with_overrides(
new_spark_conf: typing.Optional[typing.Dict[str, str]],
new_hadoop_conf: typing.Optional[typing.Dict[str, str]],
new_databricks_conf: typing.Optional[typing.Dict[str, typing.Dict]],
) -> SparkJob
Parameter | Type |
---|---|
new_spark_conf | typing.Optional[typing.Dict[str, str]] |
new_hadoop_conf | typing.Optional[typing.Dict[str, str]] |
new_databricks_conf | typing.Optional[typing.Dict[str, typing.Dict]] |
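A usage sketch, assuming with_overrides() returns a new SparkJob in which any conf passed here replaces the original value while the rest of the job is carried over unchanged; all values are placeholders.

```python
from flytekitplugins.spark.models import SparkJob, SparkType

base_job = SparkJob(
    spark_type=SparkType.PYTHON,
    application_file="local:///usr/local/bin/entrypoint.py",
    main_class="",
    spark_conf={"spark.executor.instances": "2"},
    hadoop_conf={},
    executor_path="/usr/bin/python3",
    databricks_conf={},
    databricks_instance=None,
    driver_pod=None,
    executor_pod=None,
)

# Ask for more executors; Hadoop and Databricks conf are not overridden here.
tuned_job = base_job.with_overrides(
    new_spark_conf={"spark.executor.instances": "8", "spark.executor.memory": "4g"},
    new_hadoop_conf=None,
    new_databricks_conf=None,
)

assert tuned_job.spark_conf["spark.executor.instances"] == "8"
assert base_job.spark_conf["spark.executor.instances"] == "2"  # original left untouched
```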
Properties
Property | Type | Description |
---|---|---|
application_file | Text | The main application file to execute. |
databricks_conf | dict[Text, dict[Text, Text]] | Databricks job configuration. The config structure is documented at https://docs.databricks.com/dev-tools/api/2.0/jobs.html#request-structure. |
databricks_instance | str | Domain name of your Databricks deployment, in the form <account>.cloud.databricks.com. |
driver_pod | K8sPod | Additional pod specs for the driver pod. |
executor_path | Text | The Python executable to use. |
executor_pod | K8sPod | Additional pod specs for the worker node pods. |
hadoop_conf | dict[Text, Text] | Key-value pairs of Hadoop config for the job. |
is_empty | | |
main_class | Text | The main class to execute. |
spark_conf | dict[Text, Text] | Key-value pairs of Spark config for the job. |
spark_type | Text | The Spark job type. |
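To illustrate the Databricks-related properties, the sketch below builds a job that targets Databricks and reads the fields back; the workspace hostname and cluster settings are made-up placeholders.

```python
from flytekitplugins.spark.models import SparkJob, SparkType

db_job = SparkJob(
    spark_type=SparkType.PYTHON,
    application_file="dbfs:/FileStore/jobs/entrypoint.py",
    main_class="",
    spark_conf={"spark.executor.instances": "2"},
    hadoop_conf={},
    executor_path="/usr/bin/python3",
    # Placeholder settings following the Databricks Jobs API request structure linked above.
    databricks_conf={"new_cluster": {"spark_version": "11.3.x-scala2.12", "num_workers": 2}},
    databricks_instance="myaccount.cloud.databricks.com",
    driver_pod=None,
    executor_pod=None,
)

print(db_job.databricks_instance)   # myaccount.cloud.databricks.com
print(db_job.application_file)      # dbfs:/FileStore/jobs/entrypoint.py
print(db_job.databricks_conf["new_cluster"]["num_workers"])  # 2
```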