Conductor Documentation

WRCP Tester Plugin

Overview

The purpose of this plugin, together with an accompanying blueprint (as in the examples directory), is to provide a simple, field-modifiable testing framework. The plugin assumes that a WRCP Conductor environment already exists, as produced by WRCP plugin discovery.

Theory of operation

The idea behind the framework is that a field team can use a standard blueprint to develop and run tests. The blueprint.yaml file is not intended to be modified. Tests are added in the tests subdirectory of the blueprint. Each subdirectory of tests represents one test and includes a script named main. For local tests (run on the Conductor) the test folder must include a “main.local” file and must not contain a “main” file. These scripts (or executables) rely on the #! first line to determine the interpreter. The test directory can contain any artifacts necessary for the test. At test time, the test directories are copied to the remote hosts and executed. Results are returned on stdout (or stderr for errors) and standard Unix semantics determine pass/fail: a passed test exits with 0. To run commands with sudo inside a test without being prompted for a password, use the following structure (the value assigned to SUDO_PWD is imported from the blueprint.yaml file):

echo $SUDO_PWD | sudo -S bash -c '<command>'
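
As a sketch, a remote test script (for example tests/test1/main) could look like the snippet below; the file it inspects is chosen purely for illustration:

#!/bin/bash
# Illustrative remote test: results go to stdout (stderr for errors)
# and the exit status determines pass (0) or fail (non-zero).

echo "Checking that /etc/hostname is readable"
if cat /etc/hostname; then
    echo "PASS"
    exit 0
else
    echo "FAIL" >&2
    exit 1
fi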

Requirements

[centos@localhost examples]$ tree
.
├── blueprint.yaml
└── tests
    ├── test1
    │   └── main
    ├── test2
    │   └── main.local
    └── test3
        └── main

The blueprint.yaml used in the example:

tosca_definitions_version: cloudify_dsl_1_3

imports:
  - https://cloudify.co/spec/cloudify/6.0.0/types.yaml
  - plugin:wrcp-tester-plugin

blueprint_labels:

  custom-node-type:
    values:
    - wrcp-tester

node_templates:

  test:
    type: windriver.tester.nodes.TestDriver

Node types

windriver.tester.nodes.TestDriver: This node represents the driver that performs tests by connecting to WRCP targets.

Properties

Running using CLI

  1. cfy plugins upload wrcp_tester_plugin.wgn -y plugin.yaml
  2. cfy blueprints upload blueprint.zip -b tester-blueprint
  3. cfy deployments groups create wrcp_group
  4. cfy deployments groups extend -ar "blueprint_id=<WRCP_BLUEPRINT_NAME>" -lr "csys-env-type=Wind-River-Cloud-Platform-System-Controller" wrcp_group
  5. cfy deployments groups create -b tester-blueprint tester_group
  6. cfy deployments groups extend --into-environments wrcp_group tester_group
  7. cfy executions groups start install -g tester_group
  8. cfy executions groups start execute_operation -p '{"operation": "windriver.interfaces.test.run_tests"}' -g tester_group

Running using UI

The dashboard is the easiest way to run tests and check their results. It is the last page in the Conductor left panel.

  1. Go to the Precheck Dashboard
  2. Click the “Execute tests” button
  3. A modal is displayed where the user can select:
    • A “Deployment Group” from a list
    • The timeout, by setting the “Timeout” field
    • The list of tests to execute, by setting the “Tests” field with the name of each test
    • The list of tests to exclude from execution, by setting the “Exclude” field with the name of each test
  4. Start the test and wait until it finishes
    • The results are displayed in the “Results grid”

Precheck Dashboard

(Screenshots: Deployment list, Test Results List, Precheck Dashboard)

Filter

Include or exclude the desired tests by passing them in a list, or by using a regular expression:


# only the remote-bash test will run
cfy executions groups start execute_operation -p '{"operation": "windriver.interfaces.test.run_tests", "operation_kwargs": { "tests": ["remote-bash"]}}' -g tester_group

# the test named local will not run
cfy executions groups start execute_operation -p '{"operation": "windriver.interfaces.test.run_tests", "operation_kwargs": { "exclude": ["local"]}}' -g tester_group

# only the tests whose names start with "remote-" will run
cfy executions groups start execute_operation -p '{"operation": "windriver.interfaces.test.run_tests", "operation_kwargs": { "tests": ["^remote-.*$"] } }' -g tester_group

Timeout

The timeout value defined for the execution determines how long a test may run before an error is raised.


# in this example, a 30-second timeout is set
cfy executions groups start execute_operation -p '{"operation": "windriver.interfaces.test.run_tests", "operation_kwargs": {"timeout": 30}}' -g tester_group

Remote vs Local tests

Frequent questions                    Local                                Remote
Where do the tests run?               local Conductor                      WRCP host
Run Python scripts?                   yes, only tests with WRCP/K8S API    yes
Run Bash scripts?                     no                                   yes
Run tests with WRCP API?              yes                                  no
Run tests with Kubernetes API?        yes                                  no
Need a file name extension to run?    .local                               no
Run sudo passwordless scripts?        -                                    yes

Tests with Kubernetes API

#!/usr/bin/env python
from kubernetes import client
from wrcp_tester.common.decorators import with_kubernetes_client
 
 
@with_kubernetes_client
def list_pods(k8s_client):
    """Sample test using the Kubernetes API to list pods in all namespaces.

    Args:
        k8s_client: authenticated Kubernetes API client (provided by the decorator)
    """
    core_v1 = client.CoreV1Api(k8s_client)
    pod_result = core_v1.list_pod_for_all_namespaces()
    print("{:<18}{:<20}{}".format("pod_id", "namespace", "name"))
    for pod in pod_result.items:
        print("{:<18}{:<20}{}".format(
            pod.status.pod_ip,
            pod.metadata.namespace,
            pod.metadata.name))
 
 
if __name__ == "__main__":
    list_pods()

Tests with WRCP API

There are two decorators that use WRCP API: @with_system_configuration_management and @with_distributed_cloud_connection


#!/usr/bin/env python
 
from wrcp_tester.common.decorators import with_system_configuration_management
 
@with_system_configuration_management
def sample_test(conn):
    """Sample test using WRCP System Configuration Management (cgts-client) API
 
    Args:
        conn: authenticated client connection
    """
    print("RESPONSE", conn.ihost.list())
 
 
if __name__ == "__main__":
    sample_test()


#!/usr/bin/env python
 
from wrcp_tester.common.decorators import with_distributed_cloud_connection
 
@with_distributed_cloud_connection
def sample_test(conn):
    """Sample test using WRCP Distributed Cloud Client (distcloud-client) API
 
    Args:
        conn: authenticated client connection
    """
    alarms = conn.alarm_manager.list_alarms()
    for alarm in alarms:
        print("STATUS ", alarm.status)
 
 
if __name__ == "__main__":
    sample_test()

Tests with Sudo

#!/bin/sh

echo "Test needs sudo: cat /etc/grub2.cfg"

#echo $SUDO_PWD | sudo -S bash -c <command>
echo $SUDO_PWD | sudo -S bash -c 'cat /etc/grub2.cfg'