Conductor Documentation

Fabric (SSH) Plugin

You can use the Fabric plugin to map operations to SSH commands or Fabric tasks that are included in your blueprint.

The plugin provides an agentless method for running operations on destination hosts. The source code for this plugin is available on GitHub.

Plugin Requirements (1.X):

Execution Methods

There are four modes for working with this plugin: running commands, running tasks, running module tasks, and running scripts.

Running Commands

In the following code, the run_commands plugin task is used and a list of commands is specified to be executed on the destination host.

imports:
    - http://www.getcloudify.org/spec/fabric-plugin/1.3/plugin.yaml

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_commands
          inputs:
            commands:
              - echo "source ~/myfile" >> ~/.bashrc
              - apt-get install -y python-dev git
              - pip install my_module

Running Tasks

In the following code, the tasks file path relative to the blueprint’s directory is specified, together with the task’s name in that file and (optional) task properties that will be used when the task is called.

imports:
    - http://www.getcloudify.org/spec/fabric-plugin/1.3/plugin.yaml

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_task
          inputs:
            tasks_file: my_tasks/tasks.py
            task_name: install_nginx
            task_properties:
              important_prop1: very_important
              important_prop2: 300

Examples

Fabric 1.X:

#my_tasks/tasks.py
from fabric.api import run, put
from cloudify import ctx

def install_nginx(important_prop1, important_prop2):
    ctx.logger.info('Installing nginx. Some important props:'
                    ' prop1: {0}, prop2: {1}'
                    .format(important_prop1, important_prop2))
    run('sudo apt-get install nginx')


def configure_nginx(config_file_path):
    # configure the webserver to run with our premade configuration file.
    conf_file = ctx.download_resource(config_file_path)
    put(conf_file, '/etc/nginx/conf.d/')


def start_nginx(ctx):
    run('sudo service nginx restart')

Fabric 2.X:

#my_tasks/tasks.py
from fabric2 import task
from cloudify import ctx

@task
def install_nginx(connection, important_prop1, important_prop2):
    ctx.logger.info('Installing nginx. Some important props:'
                    ' prop1: {0}, prop2: {1}'
                    .format(important_prop1, important_prop2))
    connection.run('sudo apt-get install nginx')

@task
def configure_nginx(connection, config_file_path):
    # configure the webserver to run with our premade configuration file.
    conf_file = ctx.download_resource(config_file_path)
    connection.put(conf_file, '/etc/nginx/conf.d/')

@task
def start_nginx(connection, ctx):
    connection.run('sudo service nginx restart')

Running Module Tasks

This example is similar to the previous one, except that the Fabric task you want to execute is already installed in the Python environment in which the operation runs, so you specify the Python path to the function instead of a tasks file.

imports:
    - http://www.getcloudify.org/spec/fabric-plugin/1.3/plugin.yaml

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_module_task
          inputs:
            task_mapping: some_package.some_module.install_nginx
            task_properties:
              important_prop1: very_important
              important_prop2: 300
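
For reference, the module referenced by task_mapping can look much like the tasks file shown earlier. The following is an illustrative sketch only; the some_package.some_module path comes from the blueprint above, and the function body is modeled on the Fabric 1.X tasks file:

#some_package/some_module.py (illustrative sketch, Fabric 1.X style)
from fabric.api import run
from cloudify import ctx

def install_nginx(important_prop1, important_prop2):
    # task_properties from the blueprint arrive as keyword arguments
    ctx.logger.info('Installing nginx. Some important props:'
                    ' prop1: {0}, prop2: {1}'
                    .format(important_prop1, important_prop2))
    run('sudo apt-get install nginx')

The only difference from run_task is that the module must already be importable in the Python environment in which the operation runs (for example, installed with pip), so no tasks file is bundled with the blueprint.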

Running Scripts

The Fabric plugin can execute scripts remotely and provides access to the ctx API for interacting with Studio Conductor in the same manner as with the script plugin.

Example:

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_script
          inputs:
            # Path to the script relative to the blueprint directory
            script_path: scripts/start.sh
            MY_ENV_VAR: some-value

Operation Inputs

Operation inputs passed to the run_script task are available as environment variables in the script’s execution environment. Complex data structures such as dictionaries and lists are JSON-encoded when exported as environment variables.
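
For example, if a blueprint passed the MY_ENV_VAR input shown above together with a hypothetical dictionary input named CONFIG, a Python script executed by run_script could read them roughly as follows (a sketch; the CONFIG input, its port key, and the scripts/start.py path are illustrative):

#scripts/start.py (illustrative sketch)
import json
import os

# Simple string inputs arrive as plain environment variables.
my_env_var = os.environ['MY_ENV_VAR']

# Dictionary and list inputs are JSON-encoded, so decode them before use.
config = json.loads(os.environ['CONFIG'])

print('MY_ENV_VAR={0}, port={1}'.format(my_env_var, config.get('port')))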

Both the 1.x and 2.x Fabric plugins support the operation inputs described above. However, the fabric_env and hide_output inputs described below are special cases, because the Fabric 2.x plugin is built on a different Fabric version and API than the one used by the Fabric 1.x plugin.

fabric_env

The Fabric 2.x plugin accepts two forms of fabric_env values:

Fabric 2.x

The fabric_env values supported by the Fabric 2.x plugin are the following:

Fabric 1.x

The following fabric_env values from the Fabric 1.x plugin are backward compatible with the Fabric 2.x plugin:
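
As a rough illustration (based on the SSH Configuration examples later in this document), the two forms look like this; the addresses, usernames, and paths are placeholders:

# Form 1 - native Fabric 2.x keys, which map to Connection arguments
fabric_env:
  host: 192.168.10.13
  user: some_username
  key_filename: /path/to/key/file

# Form 2 - Fabric 1.x-style keys, which the 2.x plugin translates
# (for example, host_string is treated as host)
fabric_env:
  host_string: 192.168.10.13
  user: some_username
  password: some_password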

hide_output

The Fabric 2.x plugin accepts two forms of hide_output values:

Fabric 2.x

The hide_output values supported by the Fabric 2.x plugin are the following:

By default, hide_output is turned off.

Fabric 1.x

The Fabric 1.x plugin supports a range of hide_output values that are not supported by the Fabric 2.x plugin. However, the Fabric 2.x plugin translates these values so that they work with the Fabric 2.x API.

The translation is performed as follows:

Process Configuration

The run_script task accepts a process input that lets you configure the process that runs the script:

Example:

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_script
          inputs:
            script_path: scripts/start.sh
            # Optional
            process:
              # Optional
              cwd: /home/ubuntu
              # Optional
              command_prefix:
              # Optional
              args: [--arg1, --arg2, arg3]
              # Optional
              env:
                MY_VAR_1: my_value_1
                MY_VAR_2: my_value_2

Executing Commands or Scripts with sudo Privileges

The run_commands and run_script execution methods both accept a use_sudo input (which defaults to false). When true, the commands or script are executed using sudo. For Fabric 1.X, this also enables, for example, the use of the sudo_prefix Fabric env property to run an alternative implementation of sudo. See the Fabric documentation for additional sudo-related configuration that you can apply to your Fabric env. Following is an example that uses use_sudo and sudo_prefix:

imports:
    - http://www.getcloudify.org/spec/fabric-plugin/1.3/plugin.yaml

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        create:
          implementation: fabric.fabric_plugin.tasks.run_commands
          inputs:
            commands:
              - apt-get install -y python-dev git
              - echo 'config' > /etc/my_config
            # if `use_sudo` is omitted, it defaults to `false`
            use_sudo: true
            fabric_env:
              host_string: 10.10.1.10
              user: some_username
              password: some_password
              sudo_prefix: 'mysudo -c'

Hiding Output (1.X version only)

Fabric generates output for the commands that it executes. You can hide some of that output, for example to make your execution logs more readable or to ignore irrelevant data. To hide output, use the hide_output input with any of the four execution methods. The hide_output input is a list of the output groups to hide, as specified in the Fabric 1.x documentation.

An example that uses hide_output:

imports:
    - http://www.getcloudify.org/spec/fabric-plugin/1.3/plugin.yaml

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_script
          inputs:
            # Path to the script relative to the blueprint directory
            script_path: scripts/start.sh
            MY_ENV_VAR: some-value
            # If omitted, nothing will be hidden
            hide_output:
              - running
              - warnings

Exception Handling (2.X version only)

Fabric might raise an exception if there is a problem with the execution. By default, all exceptions are considered recoverable. You can change this behavior by providing a list of exit codes that should be treated as non-recoverable. The non_recoverable_error_exit_codes input is a list of such exit codes.

An example that uses non_recoverable_error_exit_codes:

imports:
    - plugin:cloudify-fabric-plugin

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_commands
          inputs:
            commands: [df /mdf]
            non_recoverable_error_exit_codes:
              - 1
              - 2
            fabric_env:
              host_string: 192.168.10.13
              user: some_username
              key_filename: /path/to/key/file            

SSH Configuration

version 1.X:

The Fabric plugin extracts the correct host IP address based on the node's host. You can override it and set additional SSH configuration by passing fabric_env in the operation inputs. This applies to run_commands, run_task and run_module_task. The fabric_env input is passed as-is to the underlying Fabric library; check the Fabric documentation for additional details.

Following is an example that uses fabric_env:

imports:
    - http://www.getcloudify.org/spec/fabric-plugin/1.3/plugin.yaml

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_commands
          inputs:
            commands: [touch ~/my_file]
            fabric_env:
              host_string: 192.168.10.13
              user: some_username
              key_filename: /path/to/key/file

version 2.X:

In Fabric 2.X there is no global env dictionary; a Connection object is used instead. In this version of the plugin, the fabric_env dictionary provides the arguments that are used to create the Connection object.

Example:

imports:
    - plugin:cloudify-fabric-plugin

node_templates:
  example_node:
    type: cloudify.nodes.WebServer
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: fabric.fabric_plugin.tasks.run_commands
          inputs:
            commands: [touch ~/my_file]
            fabric_env:
              host: 192.168.10.13
              user: some_username
              key_filename: /path/to/key/file

Note that a password can be passed as well; the Fabric plugin packs the password/key into connect_kwargs.
Also, for users coming from the 1.X Fabric plugin, host_string is treated as host. If key is provided (which does not exist in Fabric 2), the plugin loads the key into connect_kwargs['pkey'].
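
Conceptually, the handling described above amounts to something like the following sketch (this is not the plugin's actual code; the function name is illustrative):

def translate_fabric_env(fabric_env):
    """Conceptual sketch of the 1.X-to-2.X handling described above;
    not the plugin's actual implementation."""
    env = dict(fabric_env)
    connect_kwargs = env.setdefault('connect_kwargs', {})

    # 1.X-style host_string is treated as host.
    if 'host_string' in env:
        env['host'] = env.pop('host_string')

    # A password is packed into connect_kwargs.
    if 'password' in env:
        connect_kwargs['password'] = env.pop('password')

    # 1.X-style `key` (raw key material, which does not exist in Fabric 2)
    # would similarly be loaded into connect_kwargs['pkey'].

    return env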

Tip

Using a tasks file instead of a list of commands enables you to use Python code to execute commands. In addition, you can use the ctx object to perform actions based on contextual data.

Using a list of commands might be a good solution for very simple cases in which you do not want to maintain a tasks file.

Warning

Using ~ in the key_filename file path is not supported.

ctx for the Fabric Plugin

Studio Conductor supports using ctx in Python scripts executed by the fabric plugin on remote machines. Most of the functionality is similar to how the script plugin exposes the ctx object.

Executing ctx Commands

Previously, to use ctx when executing Python scripts with the Fabric plugin, you had to invoke ctx commands in the following way:

import os

os.system('ctx logger info Hello!')

Now, you can do one of two things to achieve the same result:

from cloudify import ctx

ctx.logger.info("Hello!")

or

from cloudify import ctx

ctx('logger info Hello!')

The first example shows native ctx usage that can be used to perform most of the trivial actions you can perform using the script plugin. For example, using the logger; retrieving runtime properties and setting them for node instances; setting the source/target node instances runtime properties in relationship operations; retrieving node properties; downloading blueprint resources; aborting operations, and so on.

The second example demonstrates that you can still use ctx to execute commands as if you are running it from a bash script.
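
For instance, a Python script executed through run_script could combine several of these native ctx calls as follows (a sketch; the port property, the configured_port runtime property, and the resources/my.conf path are illustrative):

from cloudify import ctx

# Read a node property and record a runtime property on this instance.
port = ctx.node.properties.get_all().get('port', 8080)
ctx.instance.runtime_properties['configured_port'] = port

# Download a resource bundled with the blueprint and log where it landed.
local_conf = ctx.download_resource('resources/my.conf')
ctx.logger.info('Downloaded configuration to {0}'.format(local_conf))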

The most notable difference is that, to get all properties for a node or runtime properties for a node instance, you have to run the following:

from cloudify import ctx

my_node_properties = ctx.node.properties.get_all()
my_instance_runtime_properties = ctx.instance.runtime_properties.get_all()

This is also true for source and target node properties and node instance runtime properties.
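
For relationship operations, a sketch of the equivalent calls:

from cloudify import ctx

# In a relationship operation, the source and target expose the same API.
source_properties = ctx.source.node.properties.get_all()
target_runtime_properties = ctx.target.instance.runtime_properties.get_all()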