NAME¶
saharaclient - Sahara Client
This is a client for the OpenStack Sahara API. It provides a Python API client (the saharaclient module) and a command-line utility (installed as an OpenStackClient plugin). Each implements the entire OpenStack Sahara API.
To use the sahara client, you need credentials for an OpenStack cloud that implements the Data Processing API.
You may want to read the OpenStack Sahara Docs -- the overview, at least -- to get an idea of the concepts; this library will make more sense once you understand them.
REFERENCE GUIDE¶
Python Sahara client¶
Overview¶
Sahara Client provides a set of Python interfaces to communicate with the Sahara REST API. It enables users to perform most of the existing operations, such as retrieving template lists, creating Clusters, and submitting EDP Jobs.
Instantiating a Client¶
To start using the Sahara Client, create an instance of the Client class. The client constructor accepts a number of parameters used to authenticate and to locate the Sahara endpoint.
- Important!
- It is not mandatory to provide all of these parameters. Supply just enough to determine the Sahara endpoint, authenticate the user, and select the tenant to operate in.
Authentication check¶
Passing authentication parameters directly to the Sahara Client is deprecated. Use a Keystone Session object instead. For example:
from keystoneauth1.identity import v2
from keystoneauth1 import session
from saharaclient import client

auth = v2.Password(auth_url=AUTH_URL,
                   username=USERNAME,
                   password=PASSWORD,
                   tenant_name=PROJECT_ID)
ses = session.Session(auth=auth)
sahara = client.Client('1.1', session=ses)
For more information about Keystone Sessions, see Using Sessions.
Sahara endpoint discovery¶
If you have a direct URL pointing to the Sahara REST API, it may be specified as sahara_url. If this parameter is missing, the Sahara client will use the Keystone Service Catalog to find the endpoint. Two parameters, service_type and endpoint_type, configure the endpoint search; both have default values.
from keystoneauth1.identity import v2
from keystoneauth1 import session
from saharaclient import client

auth = v2.Password(auth_url=AUTH_URL,
                   username=USERNAME,
                   password=PASSWORD,
                   tenant_name=PROJECT_ID)
ses = session.Session(auth=auth)
sahara = client.Client('1.1', session=ses,
                       service_type="non-default-service-type",
                       endpoint_type="internalURL")
Object managers¶
Sahara Client has a list of fields to operate with:
- plugins
- clusters
- cluster_templates
- node_group_templates
- images
- data_sources
- job_binaries
- job_binary_internals
- job_executions
- job_types
Each of these fields is a reference to a Manager for the corresponding group of REST calls.
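For example, the managers are used as attributes of the client instance. The following is a minimal sketch, assuming sahara is the Client instance created above and that attribute names such as name, status, and title mirror the fields of the REST API responses:

# `sahara` is the Client instance created in the examples above.
for cluster in sahara.clusters.list():
    print(cluster.name, cluster.status)

plugin = sahara.plugins.get('vanilla')   # 'vanilla' is a placeholder plugin name
print(plugin.title)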
Supported operations¶
Plugin ops¶
- convert_to_cluster_template(plugin_name, hadoop_version, template_name, filecontent)
- Convert to cluster template
Create Cluster Template directly, avoiding Cluster Template mechanism.
- get(plugin_name)
- Get information about a Plugin.
- get_version_details(plugin_name, hadoop_version)
- Get version details
Get the list of Services and Service Parameters for a specified Plugin and Plugin Version.
- list(search_opts=None)
- Get a list of Plugins.
- update(plugin_name, values)
- Update plugin and then return updated result to user
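As an illustration, the plugin operations above map onto the manager like this (a sketch; 'vanilla' and '2.7.1' are placeholder plugin name and version values):

# Assumes `sahara` is the Client instance created earlier.
plugins = sahara.plugins.list()
plugin = sahara.plugins.get('vanilla')
details = sahara.plugins.get_version_details('vanilla', '2.7.1')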
Image Registry ops¶
- get(id)
- Get information about an image
- list(search_opts=None)
- Get a list of registered images.
- unregister_image(image_id)
- Remove an Image from Sahara Image Registry.
- update_image(image_id, user_name, desc=None)
- Create or update an Image in Image Registry.
Node Group Template ops¶
- delete(ng_template_id)
- Delete a Node Group Template.
- export(ng_template_id)
- Export a Node Group Template.
- get(ng_template_id)
- Get information about a Node Group Template.
- list(search_opts=None, marker=None, limit=None, sort_by=None, reverse=None)
- Get a list of Node Group Templates.
Cluster Template ops¶
- delete(cluster_template_id)
- Delete a Cluster Template.
- export(cluster_template_id)
- Export a Cluster Template.
- get(cluster_template_id)
- Get information about a Cluster Template.
- list(search_opts=None, marker=None, limit=None, sort_by=None, reverse=None)
- Get list of Cluster Templates.
Cluster ops¶
- delete(cluster_id)
- Delete a Cluster.
- get(cluster_id, show_progress=False)
- Get information about a Cluster.
- list(search_opts=None, limit=None, marker=None, sort_by=None, reverse=None)
- Get a list of Clusters.
- scale(cluster_id, scale_object)
- Scale an existing Cluster.
- Parameters
- scale_object -- dict that describes scaling operation
- Example
The following scale_object can be used to change the number of instances in an existing node group and to add instances of a new node group to an existing cluster (a Python sketch of the corresponding scale call follows this list):
{
    "add_node_groups": [
        {
            "count": 3,
            "name": "new_ng",
            "node_group_template_id": "ngt_id"
        }
    ],
    "resize_node_groups": [
        {
            "count": 2,
            "name": "old_ng"
        }
    ]
}
- verification_update(cluster_id, status)
- Start a verification for a Cluster.
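As referenced in the scale example above, the scale_object dict is passed directly to the manager. A minimal sketch, assuming sahara is the client instance and cluster_id and "ngt_id" are placeholders:

scale_object = {
    "add_node_groups": [
        {"count": 3, "name": "new_ng", "node_group_template_id": "ngt_id"}
    ],
    "resize_node_groups": [
        {"count": 2, "name": "old_ng"}
    ]
}
sahara.clusters.scale(cluster_id, scale_object)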
Data Source ops¶
- delete(data_source_id)
- Delete a Data Source.
- get(data_source_id)
- Get information about a Data Source.
- list(search_opts=None, limit=None, marker=None, sort_by=None, reverse=None)
- Get a list of Data Sources.
- update(data_source_id, update_data)
- Update a Data Source.
- Parameters
- update_data (dict) -- dict that contains fields that should be updated with new values.
Fields that can be updated:
- name
- description
- type
- url
- is_public
- is_protected
- credentials - dict with the keys user and password for data source in Swift, or with the keys accesskey, secretkey, endpoint, ssl, and bucket_in_path for data source in S3
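For example, updating the name and Swift credentials of a data source could look like this (a sketch; data_source_id and all values are placeholders):

update_data = {
    "name": "renamed-input",
    "url": "swift://my-container/input",
    "credentials": {"user": "swift-user", "password": "swift-pass"}
}
sahara.data_sources.update(data_source_id, update_data)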
Job Binary Internal ops¶
- update(job_binary_id, name=NotUpdated, is_public=NotUpdated, is_protected=NotUpdated)
- Update a Job Binary Internal.
Job Binary ops¶
- create(name, url, description=None, extra=None, is_public=None, is_protected=None)
- Create a Job Binary.
- Parameters
- extra (dict) -- authentication info needed for some job binaries, containing the keys user and password for job binary in Swift or the keys accesskey, secretkey, and endpoint for job binary in S3
- delete(job_binary_id)
- Delete a Job Binary.
- get(job_binary_id)
- Get information about a Job Binary.
- get_file(job_binary_id)
- Download a Job Binary.
- list(search_opts=None, limit=None, marker=None, sort_by=None, reverse=None)
- Get a list of Job Binaries.
- update(job_binary_id, data)
- Update Job Binary.
- Parameters
- data (dict) -- dict that contains fields that should be updated with new values.
Fields that can be updated:
- name
- description
- url
- is_public
- is_protected
- extra - dict with the keys user and password for job binary in Swift, or with the keys accesskey, secretkey, and endpoint for job binary in S3
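For example, creating a Swift-backed job binary with the extra credentials dict described above could look like this (a sketch; all names and values are placeholders):

extra = {"user": "swift-user", "password": "swift-pass"}
job_binary = sahara.job_binaries.create(
    name="wordcount.jar",
    url="swift://my-container/wordcount.jar",
    description="WordCount job binary",
    extra=extra)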
Job ops¶
- delete(job_id)
- Delete a Job
- get(job_id)
- Get information about a Job
- get_configs(job_type)
- Get config hints for a specified Job type.
- list(search_opts=None, limit=None, marker=None, sort_by=None, reverse=None)
- Get a list of Jobs.
Job Execution ops¶
- delete(obj_id)
- Delete a Job Execution.
- get(obj_id)
- Get information about a Job Execution.
- list(search_opts=None, marker=None, limit=None, sort_by=None, reverse=None)
- Get a list of Job Executions.
- update(obj_id, is_public=NotUpdated, is_protected=NotUpdated)
- Update a Job Execution.
Job Types ops¶
- list(search_opts=None)
- Get a list of job types supported by plugins.
Python Sahara client for APIv2¶
Overview¶
There is also support for Sahara's experimental APIv2.
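Client instantiation mirrors the APIv1 example above; the sketch below assumes that passing version '2' to the client factory selects the APIv2 client (this is an assumption -- older releases may only accept '1.1'):

from keystoneauth1.identity import v2
from keystoneauth1 import session
from saharaclient import client

auth = v2.Password(auth_url=AUTH_URL,
                   username=USERNAME,
                   password=PASSWORD,
                   tenant_name=PROJECT_ID)
ses = session.Session(auth=auth)
sahara = client.Client('2', session=ses)  # assumption: '2' selects the APIv2 client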
Supported operations¶
Plugin ops¶
- get(plugin_name)
- Get information about a Plugin.
- get_version_details(plugin_name, plugin_version)
- Get version details
Get the list of Services and Service Parameters for a specified Plugin and Plugin Version.
- list(search_opts=None)
- Get a list of Plugins.
- update(plugin_name, values)
- Update plugin and then return updated result to user
Image Registry ops¶
- get(id)
- Get information about an image
- list(search_opts=None)
- Get a list of registered images.
- unregister_image(image_id)
- Remove an Image from Sahara Image Registry.
- update_image(image_id, user_name, desc=None)
- Create or update an Image in Image Registry.
Node Group Template ops¶
- delete(ng_template_id)
- Delete a Node Group Template.
- export(ng_template_id)
- Export a Node Group Template.
- get(ng_template_id)
- Get information about a Node Group Template.
- list(search_opts=None, marker=None, limit=None, sort_by=None, reverse=None)
- Get a list of Node Group Templates.
Cluster Template ops¶
- delete(cluster_template_id)
- Delete a Cluster Template.
- export(cluster_template_id)
- Export a Cluster Template.
- get(cluster_template_id)
- Get information about a Cluster Template.
- list(search_opts=None, marker=None, limit=None, sort_by=None, reverse=None)
- Get list of Cluster Templates.
Cluster ops¶
- delete(cluster_id)
- Delete a Cluster.
- force_delete(cluster_id)
- Force Delete a Cluster.
- get(cluster_id, show_progress=False)
- Get information about a Cluster.
- list(search_opts=None, limit=None, marker=None, sort_by=None, reverse=None)
- Get a list of Clusters.
- scale(cluster_id, scale_object)
- Scale an existing Cluster.
- Parameters
- scale_object -- dict that describes scaling operation
- Example
The following scale_object can be used to change the number of instances in a node group (optionally specifying which instances to delete) or to add instances of a new node group to an existing cluster:
{
    "add_node_groups": [
        {
            "count": 3,
            "name": "new_ng",
            "node_group_template_id": "ngt_id"
        }
    ],
    "resize_node_groups": [
        {
            "count": 2,
            "name": "old_ng",
            "instances": ["instance_id1", "instance_id2"]
        }
    ]
}
- verification_update(cluster_id, status)
- Start a verification for a Cluster.
Data Source ops¶
- delete(data_source_id)
- Delete a Data Source.
- get(data_source_id)
- Get information about a Data Source.
- list(search_opts=None, limit=None, marker=None, sort_by=None, reverse=None)
- Get a list of Data Sources.
- update(data_source_id, update_data)
- Update a Data Source.
- Parameters
- update_data (dict) -- dict that contains fields that should be updated with new values.
Fields that can be updated:
- name
- description
- type
- url
- is_public
- is_protected
- credentials - dict with the keys user and password for data source in Swift, or with the keys accesskey, secretkey, endpoint, ssl, and bucket_in_path for data source in S3
Job Binary ops¶
- create(name, url, description=None, extra=None, is_public=None, is_protected=None)
- Create a Job Binary.
- Parameters
- extra (dict) -- authentication info needed for some job binaries, containing the keys user and password for job binary in Swift or the keys accesskey, secretkey, and endpoint for job binary in S3
- delete(job_binary_id)
- Delete a Job Binary.
- get(job_binary_id)
- Get information about a Job Binary.
- get_file(job_binary_id)
- Download a Job Binary.
- list(search_opts=None, limit=None, marker=None, sort_by=None, reverse=None)
- Get a list of Job Binaries.
- update(job_binary_id, data)
- Update Job Binary.
- Parameters
- data (dict) -- dict that contains fields that should be updated with new values.
Fields that can be updated:
- name
- description
- url
- is_public
- is_protected
- extra - dict with the keys user and password for job binary in Swift, or with the keys accesskey, secretkey, and endpoint for job binary in S3
Job Template ops¶
- delete(job_id)
- Delete a Job Template.
- get(job_id)
- Get information about a Job Template.
- get_configs(job_type)
- Get config hints for a specified Job Template type.
- list(search_opts=None, limit=None, marker=None, sort_by=None, reverse=None)
- Get a list of Job Templates.
Job ops¶
- delete(obj_id)
- Delete a Job.
- get(obj_id)
- Get information about a Job.
- list(search_opts=None, marker=None, limit=None, sort_by=None, reverse=None)
- Get a list of Jobs.
- refresh_status(obj_id)
- Refresh Job Status.
Job Types ops¶
- list(search_opts=None)
- Get a list of job types supported by plugins.
SAHARA CLI CLIENT¶
Introduction¶
The Sahara shell utility is now part of the OpenStackClient, so all shell commands take the following form:
$ openstack dataprocessing <command> [arguments...]
To get a list of all possible commands you can run:
$ openstack help dataprocessing
To get detailed help for the command you can run:
$ openstack help dataprocessing <command>
For more information about commands and their parameters, refer to the Sahara CLI commands.
For more information about the capabilities and features of the OpenStackClient CLI, refer to the OpenStackClient documentation.
Configuration¶
The CLI is configured via environment variables and command-line options which are described in https://docs.openstack.org/python-openstackclient/latest/cli/authentication.html.
Authentication with username/password is the most common method and can be provided with environment variables:
export OS_AUTH_URL=<url-to-openstack-identity>
export OS_PROJECT_NAME=<project-name>
export OS_USERNAME=<username>
export OS_PASSWORD=<password>  # (optional)
or command-line options:
--os-auth-url <url>
--os-project-name <project-name>
--os-username <username>
[--os-password <password>]
Additionally, the Sahara API URL can be configured with the parameter:
--os-data-processing-url
or with environment variable:
export OS_DATA_PROCESSING_URL=<url-to-sahara-API>
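Putting it together, a typical session could look like this (all values are illustrative placeholders):

$ export OS_AUTH_URL=https://keystone.example.com:5000/v3
$ export OS_PROJECT_NAME=demo
$ export OS_USERNAME=demo
$ export OS_PASSWORD=secret
$ openstack dataprocessing plugin list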
CLI Reference¶
The following commands are currently supported by the Sahara CLI:
Plugins¶
dataprocessing plugin configs get
Get plugin configs
usage: dataprocessing plugin configs get [-h] [--file <file>]
<plugin> <plugin_version>
This command is provided by the python-saharaclient plugin.
dataprocessing plugin list
Lists plugins
usage: dataprocessing plugin list [-h] [-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent] [--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN] [--long]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
This command is provided by the python-saharaclient plugin.
dataprocessing plugin show
Display plugin details
usage: dataprocessing plugin show [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent] [--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
[--plugin-version <plugin_version>]
<plugin>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --plugin-version
- Version of the plugin to display
This command is provided by the python-saharaclient plugin.
dataprocessing plugin update
Updates plugin
usage: dataprocessing plugin update [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent] [--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
<plugin> <json>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
This command is provided by the python-saharaclient plugin.
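For example, assuming update.json is a hypothetical file containing the fields to change, a plugin could be updated like this:

$ openstack dataprocessing plugin update vanilla update.json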
Images¶
dataprocessing image tags add
Add image tags
usage: dataprocessing image tags add [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX] [--max-width <integer>]
[--fit-width] [--print-empty] --tags
<tag> [<tag> ...]
<image>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --tags
- Tag(s) to add [REQUIRED]
This command is provided by the python-saharaclient plugin.
dataprocessing image list
Lists registered images
usage: dataprocessing image list [-h] [-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent] [--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN] [--long]
[--name <name-regex>]
[--tags <tag> [<tag> ...]]
[--username <username>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
- --name
- Regular expression to match image name
- --tags
- List images with specific tag(s)
- --username
- List images with specific username
This command is provided by the python-saharaclient plugin.
dataprocessing image register
Register an image
usage: dataprocessing image register [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX] [--max-width <integer>]
[--fit-width] [--print-empty] --username
<username> [--description <description>]
<image>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --username
- Username of privileged user in the image [REQUIRED]
- --description
- Description of the image. If not provided, description of the image will be reset to empty
This command is provided by the python-saharaclient plugin.
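For example (image name, username, and description are placeholders):

$ openstack dataprocessing image register --username ubuntu \
    --description "Vanilla 2.7.1 image" my-sahara-image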
dataprocessing image tags remove
Remove image tags
usage: dataprocessing image tags remove [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
[--tags <tag> [<tag> ...] | --all]
<image>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --tags
- Tag(s) to remove
- --all=False
- Remove all tags from image
This command is provided by the python-saharaclient plugin.
dataprocessing image tags set
Set image tags (Replace current image tags with provided ones)
usage: dataprocessing image tags set [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX] [--max-width <integer>]
[--fit-width] [--print-empty] --tags
<tag> [<tag> ...]
<image>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --tags
- Tag(s) to set [REQUIRED]
This command is provided by the python-saharaclient plugin.
dataprocessing image show
Display image details
usage: dataprocessing image show [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent] [--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
<image>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
This command is provided by the python-saharaclient plugin.
dataprocessing image unregister
Unregister image(s)
usage: dataprocessing image unregister [-h] <image> [<image> ...]
This command is provided by the python-saharaclient plugin.
Node Group Templates¶
dataprocessing node group template create
Creates node group template
usage: dataprocessing node group template create [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--plugin <plugin>]
[--plugin-version <plugin_version>]
[--processes <processes> [<processes> ...]]
[--flavor <flavor>]
[--security-groups <security-groups> [<security-groups> ...]]
[--auto-security-group]
[--availability-zone <availability-zone>]
[--floating-ip-pool <floating-ip-pool>]
[--volumes-per-node <volumes-per-node>]
[--volumes-size <volumes-size>]
[--volumes-type <volumes-type>]
[--volumes-availability-zone <volumes-availability-zone>]
[--volumes-mount-prefix <volumes-mount-prefix>]
[--volumes-locality]
[--description <description>]
[--autoconfig]
[--proxy-gateway] [--public]
[--protected]
[--json <filename>]
[--shares <filename>]
[--configs <filename>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- Name of the node group template [REQUIRED if JSON is not provided]
- --plugin
- Name of the plugin [REQUIRED if JSON is not provided]
- --plugin-version
- Version of the plugin [REQUIRED if JSON is not provided]
- --processes
- List of the processes that will be launched on each instance [REQUIRED if JSON is not provided]
- --flavor
- Name or ID of the flavor [REQUIRED if JSON is not provided]
- --security-groups
- List of the security groups for the instances in this node group
- --auto-security-group=False
- Indicates if an additional security group should be created for the node group
- --availability-zone
- Name of the availability zone where instances will be created
- --floating-ip-pool
- ID of the floating IP pool
- --volumes-per-node
- Number of volumes attached to every node
- --volumes-size
- Size of volumes attached to node (GB). This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-type
- Type of the volumes. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-availability-zone
- Name of the availability zone where volumes will be created. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-mount-prefix
- Prefix for mount point directory. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-locality=False
- If enabled, instance and attached volumes will be created on the same physical host. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --description
- Description of the node group template
- --autoconfig=False
- If enabled, instances of the node group will be automatically configured
- --proxy-gateway=False
- If enabled, instances of the node group will be used to access other instances in the cluster
- --public=False
- Make the node group template public (Visible from other projects)
- --protected=False
- Make the node group template protected
- --json
- JSON representation of the node group template. Other arguments will not be taken into account if this one is provided
- --shares
- JSON representation of the manila shares
- --configs
- JSON representation of the node group template configs
This command is provided by the python-saharaclient plugin.
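For example, a worker node group template for a hypothetical vanilla 2.7.1 plugin could be created like this (all names and values are placeholders):

$ openstack dataprocessing node group template create \
    --name vanilla-worker --plugin vanilla --plugin-version 2.7.1 \
    --processes datanode nodemanager --flavor m1.medium \
    --auto-security-group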
dataprocessing node group template delete
Deletes node group template
usage: dataprocessing node group template delete [-h]
<node-group-template>
[<node-group-template> ...]
This command is provided by the python-saharaclient plugin.
dataprocessing node group template export
Export node group template to JSON
usage: dataprocessing node group template export [-h] [--file <filename>]
<node-group-template>
This command is provided by the python-saharaclient plugin.
dataprocessing node group template import
Imports node group template
usage: dataprocessing node group template import [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--security_groups <security_groups>]
[--floating_ip_pool <floating_ip_pool>]
--image_id <image_id>
--flavor_id <flavor_id>
<json>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- Name of the node group template
- --security_groups
- Security groups of the node group template
- --floating_ip_pool
- Floating IP pool of the node group template
- --image_id
- Image ID of the node group template
- --flavor_id
- Flavor ID of the node group template
This command is provided by the python-saharaclient plugin.
dataprocessing node group template list
Lists node group templates
usage: dataprocessing node group template list [-h]
[-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN]
[--long] [--plugin <plugin>]
[--plugin-version <plugin_version>]
[--name <name-substring>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
- --plugin
- List node group templates for specific plugin
- --plugin-version
- List node group templates with specific version of the plugin
- --name
- List node group templates with specific substring in the name
This command is provided by the python-saharaclient plugin.
dataprocessing node group template show
Display node group template details
usage: dataprocessing node group template show [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
<node-group-template>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
This command is provided by the python-saharaclient plugin.
dataprocessing node group template update
Updates node group template
usage: dataprocessing node group template update [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--plugin <plugin>]
[--plugin-version <plugin_version>]
[--processes <processes> [<processes> ...]]
[--security-groups <security-groups> [<security-groups> ...]]
[--auto-security-group-enable | --auto-security-group-disable]
[--availability-zone <availability-zone>]
[--flavor <flavor>]
[--floating-ip-pool <floating-ip-pool>]
[--volumes-per-node <volumes-per-node>]
[--volumes-size <volumes-size>]
[--volumes-type <volumes-type>]
[--volumes-availability-zone <volumes-availability-zone>]
[--volumes-mount-prefix <volumes-mount-prefix>]
[--volumes-locality-enable | --volumes-locality-disable]
[--description <description>]
[--autoconfig-enable | --autoconfig-disable]
[--proxy-gateway-enable | --proxy-gateway-disable]
[--public | --private]
[--protected | --unprotected]
[--json <filename>]
[--shares <filename>]
[--configs <filename>]
<node-group-template>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- New name of the node group template
- --plugin
- Name of the plugin
- --plugin-version
- Version of the plugin
- --processes
- List of the processes that will be launched on each instance
- --security-groups
- List of the security groups for the instances in this node group
- --auto-security-group-enable
- Additional security group should be created for the node group
- --auto-security-group-disable
- Additional security group should not be created for the node group
- --availability-zone
- Name of the availability zone where instances will be created
- --flavor
- Name or ID of the flavor
- --floating-ip-pool
- ID of the floating IP pool
- --volumes-per-node
- Number of volumes attached to every node
- --volumes-size
- Size of volumes attached to node (GB). This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-type
- Type of the volumes. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-availability-zone
- Name of the availability zone where volumes will be created. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-mount-prefix
- Prefix for mount point directory. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-locality-enable
- Instance and attached volumes will be created on the same physical host. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --volumes-locality-disable
- Instance and attached volumes creation on the same physical host will not be regulated. This parameter will be taken into account only if volumes-per-node is set and non-zero
- --description
- Description of the node group template
- --autoconfig-enable
- Instances of the node group will be automatically configured
- --autoconfig-disable
- Instances of the node group will not be automatically configured
- --proxy-gateway-enable
- Instances of the node group will be used to access other instances in the cluster
- --proxy-gateway-disable
- Instances of the node group will not be used to access other instances in the cluster
- --public
- Make the node group template public (Visible from other projects)
- --private
- Make the node group template private (Visible only from this project)
- --protected
- Make the node group template protected
- --unprotected
- Make the node group template unprotected
- --json
- JSON representation of the node group template update fields. Other arguments will not be taken into account if this one is provided
- --shares
- JSON representation of the manila shares
- --configs
- JSON representation of the node group template configs
This command is provided by the python-saharaclient plugin.
Cluster Templates¶
dataprocessing cluster template create
Creates cluster template
usage: dataprocessing cluster template create [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--node-groups <node-group:instances_count> [<node-group:instances_count> ...]]
[--anti-affinity <anti-affinity> [<anti-affinity> ...]]
[--description <description>]
[--autoconfig] [--public]
[--protected]
[--json <filename>]
[--shares <filename>]
[--configs <filename>]
[--domain-name <domain-name>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- Name of the cluster template [REQUIRED if JSON is not provided]
- --node-groups
- List of the node groups (names or IDs) and numbers of instances for each one of them [REQUIRED if JSON is not provided]
- --anti-affinity
- List of processes that should be added to an anti-affinity group
- --description
- Description of the cluster template
- --autoconfig=False
- If enabled, instances of the cluster will be automatically configured
- --public=False
- Make the cluster template public (Visible from other projects)
- --protected=False
- Make the cluster template protected
- --json
- JSON representation of the cluster template. Other arguments will not be taken into account if this one is provided
- --shares
- JSON representation of the manila shares
- --configs
- JSON representation of the cluster template configs
- --domain-name
- Domain name for instances of this cluster template. This option is available if 'use_designate' config is True
This command is provided by the python-saharaclient plugin.
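For example, using hypothetical node group template names:

$ openstack dataprocessing cluster template create \
    --name vanilla-cluster-template \
    --node-groups vanilla-master:1 vanilla-worker:3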
dataprocessing cluster template delete
Deletes cluster template
usage: dataprocessing cluster template delete [-h]
<cluster-template>
[<cluster-template> ...]
This command is provided by the python-saharaclient plugin.
dataprocessing cluster template export
Export cluster template to JSON
usage: dataprocessing cluster template export [-h] [--file <filename>]
<cluster-template>
This command is provided by the python-saharaclient plugin.
dataprocessing cluster template import
Imports cluster template
usage: dataprocessing cluster template import [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--default-image-id <default_image_id>]
--node-groups
<node-group:instances_count>
[<node-group:instances_count> ...]
<json>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- Name of the cluster template
- --default-image-id
- Default image ID to be used
- --node-groups
- List of the node groups (names or IDs) and numbers of instances for each one of them
This command is provided by the python-saharaclient plugin.
dataprocessing cluster template list
Lists cluster templates
usage: dataprocessing cluster template list [-h]
[-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN]
[--long] [--plugin <plugin>]
[--plugin-version <plugin_version>]
[--name <name-substring>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
- --plugin
- List cluster templates for specific plugin
- --plugin-version
- List cluster templates with specific version of the plugin
- --name
- List cluster templates with specific substring in the name
This command is provided by the python-saharaclient plugin.
dataprocessing cluster template show
Display cluster template details
usage: dataprocessing cluster template show [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
<cluster-template>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
This command is provided by the python-saharaclient plugin.
dataprocessing cluster template update
Updates cluster template
usage: dataprocessing cluster template update [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--node-groups <node-group:instances_count> [<node-group:instances_count> ...]]
[--anti-affinity <anti-affinity> [<anti-affinity> ...]]
[--description <description>]
[--autoconfig-enable | --autoconfig-disable]
[--public | --private]
[--protected | --unprotected]
[--json <filename>]
[--shares <filename>]
[--configs <filename>]
[--domain-name <domain-name>]
<cluster-template>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- New name of the cluster template
- --node-groups
- List of the node groups (names or IDs) and numbers of instances for each one of them
- --anti-affinity
- List of processes that should be added to an anti-affinity group
- --description
- Description of the cluster template
- --autoconfig-enable
- Instances of the cluster will be automatically configured
- --autoconfig-disable
- Instances of the cluster will not be automatically configured
- --public
- Make the cluster template public (Visible from other projects)
- --private
- Make the cluster template private (Visible only from this tenant)
- --protected
- Make the cluster template protected
- --unprotected
- Make the cluster template unprotected
- --json
- JSON representation of the cluster template. Other arguments will not be taken into account if this one is provided
- --shares
- JSON representation of the manila shares
- --configs
- JSON representation of the cluster template configs
- --domain-name
- Domain name for instances of this cluster template. This option is available if 'use_designate' config is True
This command is provided by the python-saharaclient plugin.
Clusters¶
dataprocessing cluster create
Creates cluster
usage: dataprocessing cluster create [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX] [--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--cluster-template <cluster-template>]
[--image <image>]
[--description <description>]
[--user-keypair <keypair>]
[--neutron-network <network>]
[--count <count>] [--public]
[--protected] [--transient]
[--json <filename>] [--wait]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- Name of the cluster [REQUIRED if JSON is not provided]
- --cluster-template
- Cluster template name or ID [REQUIRED if JSON is not provided]
- --image
- Image that will be used for cluster deployment (Name or ID) [REQUIRED if JSON is not provided]
- --description
- Description of the cluster
- --user-keypair
- User keypair to get access to VMs after cluster creation
- --neutron-network
- Instances of the cluster will get fixed IP addresses in this network. (Name or ID should be provided)
- --count
- Number of clusters to be created
- --public=False
- Make the cluster public (Visible from other projects)
- --protected=False
- Make the cluster protected
- --transient=False
- Create transient cluster
- --json
- JSON representation of the cluster. Other arguments (except for --wait) will not be taken into account if this one is provided
- --wait=False
- Wait for the cluster creation to complete
This command is provided by the python-saharaclient plugin.
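For example (template, image, keypair, and network names are placeholders):

$ openstack dataprocessing cluster create \
    --name my-cluster --cluster-template vanilla-cluster-template \
    --image my-sahara-image --user-keypair my-keypair \
    --neutron-network private --wait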
dataprocessing cluster delete
Deletes cluster
usage: dataprocessing cluster delete [-h] [--wait] <cluster> [<cluster> ...]
This command is provided by the python-saharaclient plugin.
dataprocessing cluster list
Lists clusters
usage: dataprocessing cluster list [-h] [-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent] [--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN] [--long]
[--plugin <plugin>]
[--plugin-version <plugin_version>]
[--name <name-substring>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
- --plugin
- List clusters with specific plugin
- --plugin-version
- List clusters with specific version of the plugin
- --name
- List clusters with specific substring in the name
This command is provided by the python-saharaclient plugin.
dataprocessing cluster scale
Scales cluster
usage: dataprocessing cluster scale [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent] [--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
[--instances <node-group-template:instances_count> [<node-group-template:instances_count> ...]]
[--json <filename>] [--wait]
<cluster>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --instances
- Node group templates and the number of instances each should be scaled to [REQUIRED if JSON is not provided]
- --json
- JSON representation of the cluster scale object. Other arguments (except for --wait) will not be taken into account if this one is provided
- --wait=False
- Wait for the cluster scale to complete
This command is provided by the python-saharaclient plugin.
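For example, scaling the hypothetical vanilla-worker node group of my-cluster to five instances:

$ openstack dataprocessing cluster scale my-cluster \
    --instances vanilla-worker:5 --wait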
dataprocessing cluster show
Display cluster details
usage: dataprocessing cluster show [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent] [--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty] [--verification]
[--show-progress] [--full-dump-events]
<cluster>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --verification=False
- List additional fields for verifications
- --show-progress=False
- Provides ability to show brief details of event logs.
- --full-dump-events=False
- Provides ability to make full dump with event log details.
This command is provided by the python-saharaclient plugin.
dataprocessing cluster update
Updates cluster
usage: dataprocessing cluster update [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--description <description>]
[--shares <filename>]
[--public | --private]
[--protected | --unprotected]
<cluster>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- New name of the cluster
- --description
- Description of the cluster
- --shares
- JSON representation of the manila shares
- --public
- Make the cluster public (Visible from other projects)
- --private
- Make the cluster private (Visible only from this tenant)
- --protected
- Make the cluster protected
- --unprotected
- Make the cluster unprotected
This command is provided by the python-saharaclient plugin.
dataprocessing cluster verification
Updates cluster verifications
usage: dataprocessing cluster verification [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
(--start | --show)
<cluster>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --start
- Start health verification for the cluster
- --show=False
- Show health of the cluster
This command is provided by the python-saharaclient plugin.
Data Sources¶
dataprocessing data source create
Creates data source
usage: dataprocessing data source create [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty] --type <type> --url
<url>
[--username <username> | --access-key <accesskey>]
[--password <password> | --secret-key <secretkey>]
[--s3-endpoint <endpoint>]
[--enable-s3-ssl | --disable-s3-ssl]
[--enable-s3-bucket-in-path | --disable-s3-bucket-in-path]
[--description <description>]
[--public] [--protected]
<name>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --type
- Type of the data source (swift, hdfs, maprfs, manila, s3) [REQUIRED]
Possible choices: swift, hdfs, maprfs, manila, s3
- --url
- URL for the data source [REQUIRED]
- --username
- Username for accessing the data source URL
- --access-key
- S3 access key for accessing the data source URL
- --password
- Password for accessing the data source URL
- --secret-key
- S3 secret key for accessing the data source URL
- --s3-endpoint
- S3 endpoint for accessing the data source URL (ignored if data source not in S3)
- --enable-s3-ssl=False
- Enable access to S3 endpoint using SSL (ignored if data source not in S3)
- --disable-s3-ssl=True
- Disable access to S3 endpoint using SSL (ignored if data source not in S3)
- --enable-s3-bucket-in-path=False
- Access S3 endpoint using bucket name in path (ignored if data source not in S3)
- --disable-s3-bucket-in-path=True
- Do not access S3 endpoint using bucket name in path (ignored if data source not in S3)
- --description
- Description of the data source
- --public=False
- Make the data source public
- --protected=False
- Make the data source protected
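For example, the following sketch registers a Swift object as a data source (the container, object, credentials and data source name are illustrative):
  openstack dataprocessing data source create --type swift --url swift://mycontainer/input.txt --username demo --password secret my-input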
This command is provided by the python-saharaclient plugin.
dataprocessing data source delete
Delete data source
usage: dataprocessing data source delete [-h]
<data-source> [<data-source> ...]
This command is provided by the python-saharaclient plugin.
dataprocessing data source list
Lists data sources
usage: dataprocessing data source list [-h] [-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent] [--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN] [--long]
[--type <type>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
- --type
- List data sources of specific type (swift, hdfs, maprfs, manila, s3)
Possible choices: swift, hdfs, maprfs, manila, s3
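For example, to list only Swift data sources with additional fields:
  openstack dataprocessing data source list --type swift --long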
This command is provided by the python-saharaclient plugin.
dataprocessing data source show
Display data source details
usage: dataprocessing data source show [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
<data-source>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
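For example (the data source name my-input is illustrative):
  openstack dataprocessing data source show my-input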
This command is provided by the python-saharaclient plugin.
dataprocessing data source update
Update data source
usage: dataprocessing data source update [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty] [--name <name>]
[--type <type>] [--url <url>]
[--username <username> | --access-key <accesskey>]
[--password <password> | --secret-key <secretkey>]
[--s3-endpoint <endpoint>]
[--enable-s3-ssl | --disable-s3-ssl]
[--enable-s3-bucket-in-path | --disable-s3-bucket-in-path]
[--description <description>]
[--public | --private]
[--protected | --unprotected]
<data-source>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- New name of the data source
- --type
- Type of the data source (swift, hdfs, maprfs, manila, s3)
Possible choices: swift, hdfs, maprfs, manila, s3
- --url
- URL for the data source
- --username
- Username for accessing the data source URL
- --access-key
- S3 access key for accessing the data source URL
- --password
- Password for accessing the data source URL
- --secret-key
- S3 secret key for accessing the data source URL
- --s3-endpoint
- S3 endpoint for accessing the data source URL (ignored if data source not in S3)
- --enable-s3-ssl=False
- Enable access to S3 endpoint using SSL (ignored if data source not in S3)
- --disable-s3-ssl=True
- Disable access to S3 endpoint using SSL (ignored if data source not in S3)
- --enable-s3-bucket-in-path=False
- Access S3 endpoint using bucket name in path (ignored if data source not in S3)
- --disable-s3-bucket-in-path=True
- Do not access S3 endpoint using bucket name in path (ignored if data source not in S3)
- --description
- Description of the data source
- --public
- Make the data source public (Visible from other projects)
- --private
- Make the data source private (Visible only from this tenant)
- --protected
- Make the data source protected
- --unprotected
- Make the data source unprotected
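For example, the following illustrative invocation changes the description and makes the data source private:
  openstack dataprocessing data source update --description "Updated input data" --private my-input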
This command is provided by the python-saharaclient plugin.
Job Binaries¶
dataprocessing job binary create
Creates job binary
usage: dataprocessing job binary create [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty] [--name <name>]
[--data <file> | --url <url>]
[--description <description>]
[--username <username> | --access-key <accesskey>]
[--password <password> | --secret-key <secretkey> | --password-prompt | --secret-key-prompt]
[--s3-endpoint <endpoint>] [--public]
[--protected] [--json <filename>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- Name of the job binary [REQUIRED if JSON is not provided]
- --data
- File that will be stored in the internal DB [REQUIRED if JSON and URL are not provided]
- --url
- URL for the job binary [REQUIRED if JSON and file are not provided]
- --description
- Description of the job binary
- --username
- Username for accessing the job binary URL
- --access-key
- S3 access key for accessing the job binary URL
- --password
- Password for accessing the job binary URL
- --secret-key
- S3 secret key for accessing the job binary URL
- --password-prompt=False
- Prompt interactively for password
- --secret-key-prompt=False
- Prompt interactively for S3 secret key
- --s3-endpoint
- S3 endpoint for accessing the job binary URL (ignored if binary not in S3)
- --public=False
- Make the job binary public
- --protected=False
- Make the job binary protected
- --json
- JSON representation of the job binary. Other arguments will not be taken into account if this one is provided
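For example, the following sketch registers a job binary stored in Swift (the URL, credentials and binary name are illustrative):
  openstack dataprocessing job binary create --name my-wordcount --url swift://mycontainer/wordcount.jar --username demo --password secret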
This command is provided by the python-saharaclient plugin.
dataprocessing job binary delete
Deletes job binary
usage: dataprocessing job binary delete [-h] <job-binary> [<job-binary> ...]
This command is provided by the python-saharaclient plugin.
dataprocessing job binary download
Downloads job binary
usage: dataprocessing job binary download [-h] [--file <file>] <job-binary>
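For example, to download a job binary to a local file (the names shown are illustrative):
  openstack dataprocessing job binary download --file ./wordcount.jar my-wordcount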
This command is provided by the python-saharaclient plugin.
dataprocessing job binary list
Lists job binaries
usage: dataprocessing job binary list [-h] [-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent] [--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN] [--long]
[--name <name-substring>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
- --name
- List job binaries with specific substring in the name
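For example, to list job binaries whose names contain "wordcount", with additional fields:
  openstack dataprocessing job binary list --name wordcount --long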
This command is provided by the python-saharaclient plugin.
dataprocessing job binary show
Display job binary details
usage: dataprocessing job binary show [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
<job-binary>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
This command is provided by the python-saharaclient plugin.
dataprocessing job binary update
Updates job binary
usage: dataprocessing job binary update [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty] [--name <name>]
[--url <url>]
[--description <description>]
[--username <username> | --access-key <accesskey>]
[--password <password> | --secret-key <secretkey> | --password-prompt | --secret-key-prompt]
[--s3-endpoint <endpoint>]
[--public | --private]
[--protected | --unprotected]
[--json <filename>]
<job-binary>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- New name of the job binary
- --url
- URL for the job binary [Internal DB URL can not be updated]
- --description
- Description of the job binary
- --username
- Username for accessing the job binary URL
- --access-key
- S3 access key for accessing the job binary URL
- --password
- Password for accessing the job binary URL
- --secret-key
- S3 secret key for accessing the job binary URL
- --password-prompt=False
- Prompt interactively for password
- --secret-key-prompt=False
- Prompt interactively for S3 secret key
- --s3-endpoint
- S3 endpoint for accessing the job binary URL (ignored if binary not in S3)
- --public
- Make the job binary public (Visible from other projects)
- --private
- Make the job binary private (Visible only from this project)
- --protected
- Make the job binary protected
- --unprotected
- Make the job binary unprotected
- --json
- JSON representation of the update object. Other arguments will not be taken into account if this one is provided
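For example, the following illustrative invocation updates the description and makes the binary private:
  openstack dataprocessing job binary update --description "Updated binary" --private my-wordcount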
This command is provided by the python-saharaclient plugin.
Job Types¶
dataprocessing job type configs get
Get job type configs
usage: dataprocessing job type configs get [-h] [--file <file>] <job-type>
- <job-type>
- Type of the job to provide config information about
Possible choices: Hive, Java, MapReduce, Storm, Storm.Pyleus, Pig, Shell, MapReduce.Streaming, Spark
- --file
- Destination file (defaults to job type)
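For example, to save the MapReduce configuration hints to a file (the file name is illustrative):
  openstack dataprocessing job type configs get --file mapreduce-configs.json MapReduce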
This command is provided by the python-saharaclient plugin.
dataprocessing job type list
Lists job types supported by plugins
usage: dataprocessing job type list [-h] [-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent] [--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN]
[--type <type>] [--plugin <plugin>]
[--plugin-version <plugin_version>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --type
- Get information about specific job type
Possible choices: Hive, Java, MapReduce, Storm, Storm.Pyleus, Pig, Shell, MapReduce.Streaming, Spark
- --plugin
- Get only job types supported by this plugin
- --plugin-version
- Get only job types supported by specific version of the plugin. This parameter will be taken into account only if plugin is provided
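For example, to list only the job types supported by a particular plugin (the plugin name vanilla is illustrative):
  openstack dataprocessing job type list --plugin vanilla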
This command is provided by the python-saharaclient plugin.
Job Templates¶
dataprocessing job template create
Creates job template
usage: dataprocessing job template create [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>] [--type <type>]
[--mains <main> [<main> ...]]
[--libs <lib> [<lib> ...]]
[--description <description>]
[--public] [--protected]
[--interface <filename>]
[--json <filename>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- Name of the job template [REQUIRED if JSON is not provided]
- --type
- Type of the job (Hive, Java, MapReduce, Storm, Storm.Pyleus, Pig, Shell,
MapReduce.Streaming, Spark) [REQUIRED if JSON is not provided]
Possible choices: Hive, Java, MapReduce, Storm, Storm.Pyleus, Pig, Shell, MapReduce.Streaming, Spark
- --mains
- Name(s) or ID(s) for job's main job binary(s)
- --libs
- Name(s) or ID(s) for job's lib job binary(s)
- --description
- Description of the job template
- --public=False
- Make the job template public
- --protected=False
- Make the job template protected
- --interface
- JSON representation of the interface
- --json
- JSON representation of the job template
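For example, the following sketch creates a Spark job template from a previously registered main job binary (all names are illustrative):
  openstack dataprocessing job template create --name my-spark-job --type Spark --mains my-spark-app --description "Spark wordcount"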
This command is provided by the python-saharaclient plugin.
dataprocessing job template delete
Deletes job template
usage: dataprocessing job template delete [-h]
<job-template> [<job-template> ...]
This command is provided by the python-saharaclient plugin.
dataprocessing job template list
Lists job templates
usage: dataprocessing job template list [-h] [-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent] [--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN] [--long]
[--type <type>]
[--name <name-substring>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
- --type
- List job templates of specific type
Possible choices: Hive, Java, MapReduce, Storm, Storm.Pyleus, Pig, Shell, MapReduce.Streaming, Spark
- --name
- List job templates with specific substring in the name
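For example, to list Spark job templates with additional fields:
  openstack dataprocessing job template list --type Spark --long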
This command is provided by the python-saharaclient plugin.
dataprocessing job template show
Display job template details
usage: dataprocessing job template show [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
<job-template>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
This command is provided by the python-saharaclient plugin.
dataprocessing job template update
Updates job template
usage: dataprocessing job template update [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>]
[--fit-width] [--print-empty]
[--name <name>]
[--description <description>]
[--public | --private]
[--protected | --unprotected]
<job-template>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --name
- New name of the job template
- --description
- Description of the job template
- --public
- Make the job template public (Visible from other projects)
- --private
- Make the job template private (Visible only from this tenant)
- --protected
- Make the job template protected
- --unprotected
- Make the job template unprotected
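For example (the template name my-spark-job is illustrative):
  openstack dataprocessing job template update --description "Updated template" --public my-spark-job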
This command is provided by the python-saharaclient plugin.
Jobs¶
dataprocessing job delete
Deletes job
usage: dataprocessing job delete [-h] [--wait] <job> [<job> ...]
This command is provided by the python-saharaclient plugin.
dataprocessing job execute
Executes job
usage: dataprocessing job execute [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent] [--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
[--job-template <job-template>]
[--cluster <cluster>] [--input <input>]
[--output <output>]
[--params <name:value> [<name:value> ...]]
[--args <argument> [<argument> ...]]
[--public] [--protected]
[--config-json <filename> | --configs <name:value> [<name:value> ...]]
[--interface <filename>] [--json <filename>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --job-template
- Name or ID of the job template [REQUIRED if JSON is not provided]
- --cluster
- Name or ID of the cluster [REQUIRED if JSON is not provided]
- --input
- Name or ID of the input data source
- --output
- Name or ID of the output data source
- --params
- Parameters to add to the job
- --args
- Arguments to add to the job
- --public=False
- Make the job public
- --protected=False
- Make the job protected
- --config-json
- JSON representation of the job configs
- --configs
- Configs to add to the job
- --interface
- JSON representation of the interface
- --json
- JSON representation of the job. Other arguments will not be taken into account if this one is provided
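For example, the following illustrative invocation launches a job from an existing template on an existing cluster and passes two positional arguments:
  openstack dataprocessing job execute --job-template my-spark-job --cluster my-cluster --args input.txt output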
This command is provided by the python-saharaclient plugin.
dataprocessing job list
Lists jobs
usage: dataprocessing job list [-h] [-f {csv,json,table,value,yaml}]
[-c COLUMN]
[--quote {all,minimal,none,nonnumeric}]
[--noindent] [--max-width <integer>]
[--fit-width] [--print-empty]
[--sort-column SORT_COLUMN] [--long]
[--status <status>]
- -f=table, --format=table
- the output format, defaults to table
Possible choices: csv, json, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --quote=nonnumeric
- when to include quotes, defaults to nonnumeric
Possible choices: all, minimal, none, nonnumeric
- --noindent=False
- whether to disable indenting the JSON
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --sort-column=[]
- specify the column(s) to sort the data (columns specified first have a priority, non-existing columns are ignored), can be repeated
- --long=False
- List additional fields in output
- --status
- List jobs with specific status
Possible choices: done-with-error, failed, killed, pending, running, succeeded, to-be-killed
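For example, to list only running jobs with additional fields:
  openstack dataprocessing job list --status running --long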
This command is provided by the python-saharaclient plugin.
dataprocessing job show
Display job details
usage: dataprocessing job show [-h] [-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty]
<job>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
This command is provided by the python-saharaclient plugin.
dataprocessing job update
Updates job
usage: dataprocessing job update [-h]
[-f {json,shell,table,value,yaml}]
[-c COLUMN] [--noindent]
[--prefix PREFIX]
[--max-width <integer>] [--fit-width]
[--print-empty] [--public | --private]
[--protected | --unprotected]
<job>
- -f=table, --format=table
- the output format, defaults to table
Possible choices: json, shell, table, value, yaml
- -c=[], --column=[]
- specify the column(s) to include, can be repeated
- --noindent=False
- whether to disable indenting the JSON
- --prefix=
- add a prefix to all variable names
- --max-width=0
- Maximum display width, <1 to disable. You can also use the CLIFF_MAX_TERM_WIDTH environment variable, but the parameter takes precedence.
- --fit-width=False
- Fit the table to the display width. Implied if --max-width greater than 0. Set the environment variable CLIFF_FIT_WIDTH=1 to always enable
- --print-empty=False
- Print empty table if there is no data to show.
- --public
- Make the job public (Visible from other projects)
- --private
- Make the job private (Visible only from this project)
- --protected
- Make the job protected
- --unprotected
- Make the job unprotected
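For example, to make a job visible to other projects ($JOB_ID stands for the job's name or ID):
  openstack dataprocessing job update --public $JOB_ID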
This command is provided by the python-saharaclient plugin.
CONTRIBUTING¶
python-saharaclient is part of the Sahara project. It has a separate StoryBoard page, which should be used to report bugs. Like the other projects of the OpenStack community, code contributions are submitted through Gerrit.
Please refer to the Sahara documentation and its How to Participate section for more information on how to contribute to the project.
AUTHOR¶
OpenStack Foundation
COPYRIGHT¶
2019, OpenStack Foundation
March 11, 2019 | 2.0.0 |