author    | mrichomme <morgan.richomme@orange.com>       | 2020-04-10 18:22:59 +0200
committer | Morgan Richomme <morgan.richomme@orange.com> | 2020-04-14 17:25:12 +0000
commit    | 28babfa0a7ab8a6c1df9d16c7235d968c1337df3 (patch)
tree      | 2a3bf4bd9aeb83ac2ee44a2e41ee665b3b394931 /test/mocks
parent    | c06ea62716be250a59bd1d3857baa1a3a927a317 (diff)
Fix integration markdown errors for linter
Issue-ID: INT-1523
Signed-off-by: mrichomme <morgan.richomme@orange.com>
Change-Id: I2be0865395b12e1f277834b0c096f5d183cb5056
Diffstat (limited to 'test/mocks')
9 files changed, 312 insertions, 271 deletions
diff --git a/test/mocks/datafilecollector-testharness/auto-test/README.md b/test/mocks/datafilecollector-testharness/auto-test/README.md index b73067dee..f6ccd52cb 100644 --- a/test/mocks/datafilecollector-testharness/auto-test/README.md +++ b/test/mocks/datafilecollector-testharness/auto-test/README.md @@ -1,54 +1,61 @@ -## Running automated test case and test suites +# Running automated test cases and test suites + Test cases run a single test case and test suites run one or more test cases in a sequence. The test cases and test suites can be run on both Ubuntu and Mac-OS. -##Overall structure and setup +## Overall structure and setup + Test cases and test suites are written as bash scripts which call predefined functions in two other bash scripts located in the ../common dir. The functions are described further below. The integration repo is needed as well as docker. -If needed setup the ``DFC_LOCAL_IMAGE`` and ``DFC_REMOTE_IMAGE`` env var in test_env.sh to point to the dfc images (local registry image or next registry image) without the image tag. +If needed, set the `DFC_LOCAL_IMAGE` and `DFC_REMOTE_IMAGE` env vars in test_env.sh to point to the dfc images (local registry image or nexus registry image) without the image tag. The predefined images should be ok for current usage: -``DFC_REMOTE_IMAGE=nexus3.onap.org:10001/onap/org.onap.dcaegen2.collectors.datafile.datafile-app-server`` +`DFC_REMOTE_IMAGE=nexus3.onap.org:10001/onap/org.onap.dcaegen2.collectors.datafile.datafile-app-server` -``DFC_LOCAL_IMAGE=onap/org.onap.dcaegen2.collectors.datafile.datafile-app-server`` +`DFC_LOCAL_IMAGE=onap/org.onap.dcaegen2.collectors.datafile.datafile-app-server` -If the test cases/suites in this dir are not executed in the auto-test dir in the integration repo, then the ``SIM_GROUP`` env var need to point to the ``simulator-group`` dir. 
+If the test cases/suites in this dir are not executed in the auto-test dir in the integration repo, then the `SIM_GROUP` env var needs to point to the `simulator-group` dir. See instructions in the test_env.sh. The ../common dir is needed as well in that case. That is, it is possible to have the auto-test dir (and the common dir) somewhere else than in the integration repo but the simulator-group and common dir need to be available. -##Test cases and test suites naming. -Each file filename should have the format ``<tc-id>.sh`` for test cases and ``<ts-id>.sh`` for test suite. The tc-id and ts-id are the +## Test cases and test suites naming + +Each filename should have the format `<tc-id>.sh` for test cases and `<ts-id>.sh` for test suites. The tc-id and ts-id are the identity of the test case or test suite. For example, in FTC2.sh, FTC2 is the id of the test case. Only the contents of a file determine whether it is a test case or a test suite, so it is good to name the file so that this is easy to see. -A simple way to list all test cases/suite along with the description is to do ``grep ONELINE_DESCR *.sh`` in the shell. +A simple way to list all test cases/suites along with their descriptions is to do `grep ONELINE_DESCR *.sh` in the shell. -##Logs from containers and test cases -All logs from each test cases are stored under ``logs/<tc-id>/``. +## Logs from containers and test cases + +All logs from each test case are stored under `logs/<tc-id>/`. The logs include the application.log and the container log from dfc, the container logs from each simulator and the test case log (same as the screen output). In the test cases the logs are stored with a prefix so the logs can be stored at different steps during the test. All test cases contain an entry to save all logs with prefix 'END' at the end of each test case. 
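The log layout described above can be illustrated with a short shell sketch (the paths follow the README's `logs/<tc-id>/` convention, but the file names here are invented for illustration, not taken from the harness):

```shell
# Hypothetical illustration of the per-test-case log layout.
# 'FTC2' is the example test-case id from this README; file names are invented.
TC_ID=FTC2
mkdir -p "logs/$TC_ID"

# Logs are saved with a prefix marking the save point; 'END' marks the
# final snapshot that every test case stores.
echo "dfc application log contents" > "logs/$TC_ID/END_application.log"
echo "test case screen output"      > "logs/$TC_ID/END_test.log"

ls "logs/$TC_ID"
```

After a real run, inspecting `logs/<tc-id>/` this way shows one set of files per save prefix used during the test.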
-##Execution## -Test cases and test suites are executed by: `` [sudo] ./<tc-id or ts-id>.sh local | remote | remote-remove | manual-container | manual-app``</br> -**local** - uses the dfc image pointed out by ``DFC_LOCAL_IMAGE`` in the test_env, should be the dfc image built locally in your docker registry.</br> -**remote** - uses the dfc image pointed out by ``DFC_REMOTE_IMAGE`` in the test_env, should be the dfc nexus image in your docker registry.</br> -**remote-remove** - uses the dfc image pointed out by ``DFC_REMOTE_IMAGE`` in the test_env, should be the dfc nexus image in your docker registry. Removes the nexus image and pull from remote registry.</br> -**manual-container** - uses dfc in a manually started container. The script will prompt you for manual starting and stopping of the container.</br> -**manual-app** - uses dfc app started as an external process (from eclipse etc). The script will prompt you for manual start and stop of the process.</br> + +## Execution + +Test cases and test suites are executed by: ` [sudo] ./<tc-id or ts-id>.sh local | remote | remote-remove | manual-container | manual-app`</br> + +- **local** - uses the dfc image pointed out by `DFC_LOCAL_IMAGE` in the test_env, should be the dfc image built locally in your docker registry.</br> +- **remote** - uses the dfc image pointed out by `DFC_REMOTE_IMAGE` in the test_env, should be the dfc nexus image in your docker registry.</br> +- **remote-remove** - uses the dfc image pointed out by `DFC_REMOTE_IMAGE` in the test_env, should be the dfc nexus image in your docker registry. Removes the nexus image and pulls it from the remote registry.</br> +- **manual-container** - uses dfc in a manually started container. The script will prompt you for manual starting and stopping of the container.</br> +- **manual-app** - uses the dfc app started as an external process (from eclipse etc). 
The script will prompt you for manual start and stop of the process.</br> When running dfc manually, either as a container or as an app, the ports need to be set to map the instance id of the dfc. Most test cases start dfc with index 0, then the test case expects the ports of dfc to be mapped to the standard port number. However, if a higher instance id than 0 is used then the mapped ports need to add that index to the port number (e.g., if index 2 is used, dfc needs to map ports 8102 and 8435 instead of the standard 8100 and 8433). -##Test case file## +## Test case file + A test case file contains a number of steps to verify a certain functionality. -A description of the test case should be given to the ``TC_ONELINE_DESCR`` var. The description will be printed in the test result. +A description of the test case should be given to the `TC_ONELINE_DESCR` var. The description will be printed in the test result. The empty template for a test case file looks like this: -(Only the parts noted with < and > shall be changed.) +(Only the parts noted with < and > shall be changed.) ------------------------------------------------------------ ``` #!/bin/bash @@ -69,20 +76,18 @@ store_logs END print_result ``` ------------------------------------------------------------ The ../common/testcase_common.sh contains all functions needed for the test case file. See the README.md file in the ../common dir for a description of all available functions. +## Test suite files -##Test suite files## A test suite file contains one or more test cases to run in sequence. -A description of the test case should be given to the ``TS_ONELINE_DESCR`` var. The description will be printed in the test result. +A description of the test suite should be given to the `TS_ONELINE_DESCR` var. The description will be printed in the test result. The empty template for a test suite file looks like this: -(Only the parts noted with ``<`` and ``>`` shall be changed.) 
+(Only the parts noted with `<` and `>` shall be changed.) ------------------------------------------------------------ ``` #!/bin/bash @@ -104,11 +109,11 @@ suite_complete ``` ------------------------------------------------------------ The ../common/testsuite_common.sh contains all functions needed for a test suite file. See the README.md file in the ../common dir for a description of all available functions. -##Known limitations## +## Known limitations + When DFC has polled a new event from the MR simulator, DFC checks each file to see whether it has already been published or not. This check is done per file towards the DR simulator. If the event contains a large number of files, there is a risk that DFC will flood the DR simulator with requests for these checks. The timeout in DFC for the response is currently 4 sec and the DR simulator may not be able to answer all requests within the timeout. The DR simulator is single threaded. This seems to be a problem only for the first polled event. For subsequent events these requests seem to be spread out in time by DFC so the DR simulator can respond in time. @@ -117,4 +122,4 @@ A number of the test script will report failure due to this limitation in the DR The FTP servers may deny connection when too many file download requests are made in a short time from DFC. This is visible in the DFC application log as WARNINGs for failed downloads. However, DFC always retries the failed download a number of times to
\ No newline at end of file +minimize the risk of giving up the download completely for these files. diff --git a/test/mocks/datafilecollector-testharness/common/README.md b/test/mocks/datafilecollector-testharness/common/README.md index bcd345739..31f40ef10 100644 --- a/test/mocks/datafilecollector-testharness/common/README.md +++ b/test/mocks/datafilecollector-testharness/common/README.md @@ -1,4 +1,4 @@ -##Common test scripts and env file for test +## Common test scripts and env file for test **test_env.sh**</br> Common env variables for test in the auto-test dir. Used by the auto test cases/suites but could be used for other test scripts as well. @@ -9,7 +9,7 @@ Common functions for auto test cases in the auto-test dir. A subset of the funct **testsuite_common.sh**</br> Common functions for auto test suites in the auto-test dir. -##Descriptions of functions in testcase_common.sh +## Descriptions of functions in testcase_common.sh The following is a list of the available functions in a test case file. Please see some of the defined test cases for examples. @@ -90,97 +90,97 @@ Sleep for a number of seconds and prints dfc heartbeat output every 30 sec **mr_equal <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the MR simulator is equal to a target value and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is equal to the target or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value becomes equal to the target value or not. 
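A minimal bash sketch of how such a polling comparison could work (a hypothetical reimplementation for illustration only, not the harness's actual code; `read_sim_var` stands in for the curl read-out of a simulator variable):

```shell
#!/bin/bash
# Placeholder for reading a variable from a simulator, e.g. something like:
#   curl -s localhost:2222/<variable-name>
# Hard-coded here so the sketch is self-contained.
read_sim_var() { echo 5; }

# Poll until the variable equals the target, or the optional timeout expires.
poll_equal() {
  local name=$1 target=$2 timeout=${3:-0} waited=0
  while true; do
    if [ "$(read_sim_var "$name")" = "$target" ]; then
      echo "PASS: $name == $target"
      return 0
    fi
    if [ "$waited" -ge "$timeout" ]; then
      echo "FAIL: $name != $target after ${timeout}s"
      return 1
    fi
    sleep 1
    waited=$((waited + 1))
  done
}

poll_equal ctr_events 5 30
```

The two-argument form corresponds to a timeout of 0 (a single immediate check); the three-argument form keeps polling until the value matches or the timeout runs out.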
**mr_greater <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the MR simulator is greater than a target value and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is greater the target or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value is greater than the target value or not. **mr_less <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the MR simulator is less than a target value and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is less than the target or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value is less than the target value or not. **mr_contain_str <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the MR simulator contains a substring target and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable contains +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable contains the target substring or not. 
-</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value contains the target substring or not. **dr_equal <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the DR simulator is equal to a target value and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is equal to the target or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value becomes equal to the target value or not. **dr_greater <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the DR simulator is greater than a target value and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is greater the target or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value is greater than the target value or not. **dr_less <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the DR simulator is less than a target value and an optional timeout. 
-</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is less than the target or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value is less than the target value or not. **dr_contain_str <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the DR simulator contains a substring target and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable contains +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable contains the target substring or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value contains the target substring or not. **drr_equal <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the DR Redir simulator is equal to a target value and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is equal to the target or not. 
-</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value becomes equal to the target value or not. **drr_greater <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the DR Redir simulator is greater than a target value and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is greater the target or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value is greater than the target value or not. **drr_less <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the DR Redir simulator is less than a target value and an optional timeout. -</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable is +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable is less than the target or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value is less than the target value or not. **drr_contain_str <variable-name> <target-value> [<timeout-in-sec>]**</br> Tests if a variable value in the DR Redir simulator contains a substring target and an optional timeout. 
-</br>Arg: ``<variable-name> <target-value>`` - This test set pass or fail depending on if the variable contains +</br>Arg: `<variable-name> <target-value>` - This test set pass or fail depending on if the variable contains the target substring or not. -</br>Arg: ``<variable-name> <target-value> <timeout-in-sec>`` - This test waits up to the timeout seconds +</br>Arg: `<variable-name> <target-value> <timeout-in-sec>` - This test waits up to the timeout seconds before setting pass or fail depending on if the variable value contains the target substring or not. @@ -203,18 +203,17 @@ Print the test result. Only once at the very end of the script. Print all variables from the simulators and the dfc heartbeat. In addition, comments in the file can be added using the normal comment sign in bash '#'. -Comments that shall be visible on the screen as well as in the test case log, use ``echo "<msg>"``. +For comments that shall be visible on the screen as well as in the test case log, use `echo "<msg>"`. - -##Descriptions of functions in testsuite_common.sh +## Descriptions of functions in testsuite_common.sh The following is a list of the available functions in a test suite file. Please see an existing test suite for examples. **suite_setup**</br> Sets up the test suite and prints out a heading. -**run_tc <tc-script> <$1 from test suite script> <$2 from test suite script>**</br> +**run_tc <tc-script> <$1 from test suite script> <$2 from test suite script>**</br> Executes a test case with args from the test suite script **suite_complete**</br> -Print out the overall result of the executed test cases.
\ No newline at end of file +Print out the overall result of the executed test cases. diff --git a/test/mocks/datafilecollector-testharness/dr-sim/README.md b/test/mocks/datafilecollector-testharness/dr-sim/README.md index a258ed46d..4e7273a11 100644 --- a/test/mocks/datafilecollector-testharness/dr-sim/README.md +++ b/test/mocks/datafilecollector-testharness/dr-sim/README.md @@ -1,77 +1,106 @@ -###Run DR simulators as docker container -1. Build docker container with ```docker build -t drsim_common:latest .``` -2. Run the container ```docker-compose up``` +# Run DR simulators as docker container + +1. Build docker container with `docker build -t drsim_common:latest .` +2. Run the container `docker-compose up` 3. For specific behavior of the simulators, add arguments to the `command` entries in the `docker-compose.yml`. + For example `command: node dmaapDR.js --tc no_publish` . (No argument will assume '--tc normal'). Run `node dmaapDR.js --printtc` and `node dmaapDR-redir.js --printtc` for details or see further below for the list of possible args to the simulator -###Run DR simulators and all other simulators as one group +# Run DR simulators and all other simulators as one group + See the README in the 'simulator-group' dir. -###Run DR simulators from cmd line +# Run DR simulators from cmd line + 1. install nodejs 2. install npm + Make sure that you run these commands in the application directory "dr-sim" + 3. `npm install express` 4. `npm install argparse` 5. `node dmaapDR.js` #keep it in the foreground, see below for a list of args to the simulator 6. `node dmaapDR_redir.js` #keep it in the foreground, see below for a list of args to the simulator -###Arg to control the behavior of the simulators +# Arg to control the behavior of the simulators + +## DR + +\--tc tc_normal Normal case, query response based on published files. Publish respond with ok/redirect depending on if file is published or not.</br> + +\--tc tc_none_published Query respond 'ok'. 
Publish respond with redirect.</br> + +\--tc tc_all_published Query respond with filename. Publish respond with 'ok'.</br> + +\--tc tc_10p_no_response 10% no response for query and publish. Otherwise normal case.</br> + +\--tc tc_10first_no_response 10 first queries and requests gives no response for query and publish. Otherwise normal case.</br> + +\--tc tc_100first_no_response 100 first queries and requests gives no response for query and publish. Otherwise normal case.</br> + +\--tc tc_all_delay_1s All responses delayed 1s (both query and publish).</br> + +\--tc tc_all_delay_10s All responses delayed 10s (both query and publish).</br> + +\--tc tc_10p_delay_10s 10% of responses delayed 10s (both query and publish).</br> -**DR** +\--tc tc_10p_error_response 10% error response for query and publish. Otherwise normal case.</br> - --tc tc_normal Normal case, query response based on published files. Publish respond with ok/redirect depending on if file is published or not.</br> - --tc tc_none_published Query respond 'ok'. Publish respond with redirect.</br> - --tc tc_all_published Query respond with filename. Publish respond with 'ok'.</br> - --tc tc_10p_no_response 10% % no response for query and publish. Otherwise normal case.</br> - --tc tc_10first_no_response 10 first queries and requests gives no response for query and publish. Otherwise normal case.</br> - --tc tc_100first_no_response 100 first queries and requests gives no response for query and publish. Otherwise normal case.</br> - --tc tc_all_delay_1s All responses delayed 1s (both query and publish).</br> - --tc tc_all_delay_10s All responses delayed 10s (both query and publish).</br> - --tc tc_10p_delay_10s 10% of responses delayed 10s, (both query and publish).</br> - --tc tc_10p_error_response 10% error response for query and publish. Otherwise normal case.</br> - --tc tc_10first_error_response 10 first queries and requests gives no response for query and publish. 
Otherwise normal case.</br> - --tc tc_100first_error_response 100 first queries and requests gives no response for query and publish. Otherwise normal case.</br> +\--tc tc_10first_error_response 10 first queries and requests gives error response for query and publish. Otherwise normal case.</br> +\--tc tc_100first_error_response 100 first queries and requests gives error response for query and publish. Otherwise normal case.</br> -**DR Redirect** +## DR Redirect - --tc_normal Normal case, all files publish and DR updated.</br> - --tc_no_publish Ok response but no files published.</br> - --tc_10p_no_response 10% % no response (file not published).</br> - --tc_10first_no_response 10 first requests give no response (files not published).</br> - --tc_100first_no_response 100 first requests give no response (files not published).</br> - --tc_all_delay_1s All responses delayed 1s, normal publish.</br> - --tc_all_delay_10s All responses delayed 10s, normal publish.</br> - --tc_10p_delay_10s 10% of responses delayed 10s, normal publish.</br> - --tc_10p_error_response 10% error response (file not published).</br> - --tc_10first_error_response 10 first requests give error response (file not published).</br> - --tc_100first_error_response 100 first requests give error responses (file not published).</br> +\--tc_normal Normal case, all files published and DR updated.</br> +\--tc_no_publish Ok response but no files published.</br> -###Needed environment +\--tc_10p_no_response 10% no response (file not published).</br> -DR +\--tc_10first_no_response 10 first requests give no response (files not published).</br> - DRR_SIM_IP Set to host name of the DR Redirect simulator "drsim_redir" if running the simulators in a docker private network. Otherwise to "localhost" - DR_FEEDS A comma separated list of configured feednames and filetypes. Example "1:A,2:B:C" - Feed 1 for filenames beginning with A and feed2 for filenames beginning with B or C. 
+\--tc_100first_no_response 100 first requests give no response (files not published).</br> + +\--tc_all_delay_1s All responses delayed 1s, normal publish.</br> + +\--tc_all_delay_10s All responses delayed 10s, normal publish.</br> + +\--tc_10p_delay_10s 10% of responses delayed 10s, normal publish.</br> + +\--tc_10p_error_response 10% error response (file not published).</br> + +\--tc_10first_error_response 10 first requests give error response (file not published).</br> + +\--tc_100first_error_response 100 first requests give error responses (file not published).</br> + +# Needed environment + +## DR + +``` DRR_SIM_IP Set to host name of the DR Redirect simulator "drsim_redir" if running the simulators in a docker private network. Otherwise to "localhost" DR_FEEDS A comma-separated list of configured feednames and filetypes. Example "1:A,2:B:C" - Feed 1 for filenames beginning with A and feed 2 for filenames beginning with B or C. ``` `DRR_SIM_IP` is needed for the redirected publish request to be redirected to the DR redirect server. -DR Redirect (DRR for short) +## DR Redirect (DRR for short) - DR_SIM_IP Set to host name of the DR simulator "drsim" if running the simulators in a docker private network. Otherwise to "localhost" - DR_REDIR_FEEDS Same contentd as DR_FEEDS for DR. +``` DR_SIM_IP Set to host name of the DR simulator "drsim" if running the simulators in a docker private network. Otherwise to "localhost" DR_REDIR_FEEDS Same content as DR_FEEDS for DR. +``` The DR Redirect server sends callbacks to the DR server to update the list of successfully published files. When running as a container (using an ip address from the `dfc_net` docker network) the env shall be set to 'drsim'. 
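The `DR_FEEDS`/`DR_REDIR_FEEDS` format described above can be illustrated with a short shell sketch (hypothetical parsing code for illustration only, not part of the simulators):

```shell
# Parse a DR_FEEDS-style value: comma-separated "<feed>:<prefix>[:<prefix>...]"
# entries, e.g. feed 1 for file prefix A, feed 2 for prefixes B or C.
DR_FEEDS="1:A,2:B:C"

for entry in ${DR_FEEDS//,/ }; do
  feed=${entry%%:*}        # feed number before the first ':'
  prefixes=${entry#*:}     # remaining ':'-separated file-name prefixes
  echo "feed $feed handles file prefixes: ${prefixes//:/ }"
done
```

Running the sketch prints one line per feed, mapping the feed number to the file-name prefixes it serves.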
When running the servers from command line, set the env variable `DR_SIM_IP=localhost` -###APIs for statistic readout -The simulator can be queried for statistics (use curl from cmd line or open in browser, curl used below): +# APIs for statistic readout -DR +The simulator can be queried for statistics (use curl from cmd line or open in browser, curl used below): +## DR `curl localhost:3906/` - returns 'ok' @@ -135,9 +164,7 @@ DR `curl localhost:3906/ctr_publish_query_bad_file_prefix/<feed>` - returns a list of the number of publish queries with bad file prefix for a feed - -DR Redirect - +## DR Redirect `curl localhost:3908/` - returns 'ok' @@ -178,6 +205,3 @@ DR Redirect `curl localhost:3908/feeds/dwl_volume` - returns a list of the number of bytes of the published files for each feed `curl localhost:3908/dwl_volume/<feed>` - returns the number of bytes of the published files for a feed - - - diff --git a/test/mocks/datafilecollector-testharness/ftps-sftp-server/README.md b/test/mocks/datafilecollector-testharness/ftps-sftp-server/README.md index 3bd67404a..f20b29698 100644 --- a/test/mocks/datafilecollector-testharness/ftps-sftp-server/README.md +++ b/test/mocks/datafilecollector-testharness/ftps-sftp-server/README.md @@ -1,27 +1,29 @@ -###Deployment of certificates: (in case of update) +# Deployment of certificates: (in case of update) This folder is prepared with a set of keys matching DfC for test purposes. Copy from datafile-app-server/config/keys to the ./tls/ the following files: -* dfc.crt -* ftp.crt -* ftp.key +- dfc.crt +- ftp.crt +- ftp.key -###Docker preparations -Source: https://docs.docker.com/install/linux/linux-postinstall/ +# Docker preparations + +Source: <https://docs.docker.com/install/linux/linux-postinstall/> `sudo usermod -aG docker $USER` then logout-login to activate it. 
-###Prepare files for the simulator +# Prepare files for the simulator + Run `prepare.sh` with an argument found in `test_cases.yml` (or add a new tc in that file) to create files (1MB, 5MB and 50MB files) and a large number of symbolic links to these files to simulate PM files. The file names match the files in the events produced by the MR simulator. The dirs with the files will be mounted by the ftp containers, defined in the docker-compose file, when started. -###Starting/stopping the FTPS/SFTP server(s) +# Starting/stopping the FTPS/SFTP server(s) Start: `docker-compose up` @@ -30,6 +32,6 @@ Stop: Ctrl +C, then `docker-compose down` or `docker-compose down --remove-orph If you experience issues (or port collision), check the currently running other containers by using 'docker ps' and stop them if necessary. +# Cleaning docker structure -###Cleaning docker structure -Deep cleaning: `docker system prune`
\ No newline at end of file +Deep cleaning: `docker system prune` diff --git a/test/mocks/datafilecollector-testharness/mr-sim/README.md b/test/mocks/datafilecollector-testharness/mr-sim/README.md index d3ca91c87..653b47e8f 100644 --- a/test/mocks/datafilecollector-testharness/mr-sim/README.md +++ b/test/mocks/datafilecollector-testharness/mr-sim/README.md @@ -1,45 +1,43 @@ -#MR-simulator -This readme contains: +# MR-simulator -**Introduction** +This readme contains: -**Building and running** +- Introduction +- Building and running +- Configuration -**Configuration** +## Introduction -###Introduction### The MR-sim is a python script delivering batches of events including one or more fileReady for one or more PNFs. It is possible to configure the number of events, PNFs, consumer groups, existing or missing files, file prefixes and change identifier. In addition, MR sim can be configured to deliver file urls for up to 5 FTP servers (simulating the PNFs). -###Building and running### +## Building and running + It is possible to build and run MR-sim manually as a container if needed. In addition MR-sim can be executed as a python script, see instructions further down. Otherwise it is recommended to use the test scripts in the auto-test dir or run all simulators in one go using scripts in the simulator-group dir. To build and run manually as a docker container: -1. Build docker container with ```docker build -t mrsim:latest .``` -2. Run the container ```docker-compose up``` -###Configuration### +1. Build docker container with `docker build -t mrsim:latest .` +2. Run the container `docker-compose up` + +## Configuration + The event pattern, called TC, of the MR-sim is controlled with an arg to the python script. See section TC info for available patterns. All other configuration is done via environment variables. The simulator listens to port 2222. 
The following environment variables are used: -**FTPS_SIMS** - A comma-separated list of hostname:port for the FTP servers to generate ftps file urls for. If not set MR sim will assume 'localhost:21'. Minimum 1 and maximum 5 host-port pairs can be given. - -**SFTP_SIMS** - A comma-separated list of hostname:port for the FTP servers to generate sftp file urls for. If not set MR sim will assume 'localhost:1022'. Minimum 1 and maximum 5 host-port pairs can be given. - -**NUM_FTP_SERVERS** - Number of FTP servers to use out of those specified in the envrioment variables above. The number shall be in the range 1-5. - -**MR_GROUPS** - A comma-separated list of consummer-group:changeId[:changeId]*. Defines which change identifier that should be used for each consumer gropu. If not set the MR-sim will assume 'OpenDcae-c12:PM_MEAS_FILES'. +- **FTPS_SIMS** - A comma-separated list of hostname:port for the FTP servers to generate ftps file urls for. If not set, MR sim will assume 'localhost:21'. Minimum 1 and maximum 5 host-port pairs can be given. +- **SFTP_SIMS** - A comma-separated list of hostname:port for the FTP servers to generate sftp file urls for. If not set, MR sim will assume 'localhost:1022'. Minimum 1 and maximum 5 host-port pairs can be given. +- **NUM_FTP_SERVERS** - Number of FTP servers to use out of those specified in the environment variables above. The number shall be in the range 1-5. +- **MR_GROUPS** - A comma-separated list of consumer-group:changeId[:changeId]\*. Defines which change identifier should be used for each consumer group. If not set, the MR-sim will assume 'OpenDcae-c12:PM_MEAS_FILES'. +- **MR_FILE_PREFIX_MAPPING** - A comma-separated list of changeId:filePrefix. Defines which file prefix to use for each change identifier, needed to distinguish files for each change identifier. If not set, the MR-sim will assume 'PM_MEAS_FILES:A -**MR_FILE_PREFIX_MAPPING** - A comma-separated list of changeId:filePrefix.
Defines which file prefix to use for each change identifier, needed to distinguish files for each change identifiers. If not set the MR-sim will assume 'PM_MEAS_FILES:A +## Statistics read-out and commands - - -###Statistics read-out and commands### The simulator can be queried for statistics and started/stopped (use curl from cmd line or open in browser, curl used below): `curl localhost:2222` - Just returns 'Hello World'. @@ -60,70 +58,63 @@ The simulator can be queried for statistics and started/stopped (use curl from `curl localhost:2222/fileprefixes` - returns the setting of env var MR_FILE_PREFIX_MAPPING. - `curl localhost:2222/ctr_requests` - returns an integer of the number of get requests, for all groups, to the event poll path `curl localhost:2222/groups/ctr_requests` - returns a list of integers of the number of get requests, for each consumer group, to the event poll path `curl localhost:2222/ctr_requests/<consumer-group>` - returns an integer of the number of get requests, for the specified consumer group, to the event poll path - `curl localhost:2222/ctr_responses` - returns an integer of the number of get responses, for all groups, to the event poll path `curl localhost:2222/groups/ctr_responses` - returns a list of integers of the number of get responses, for each consumer group, to the event poll path `curl localhost:2222/ctr_responses/<consumer-group>` - returns an integer of the number of get responses, for the specified consumer group, to the event poll path - `curl localhost:2222/ctr_files` - returns an integer of the number of generated files for all groups `curl localhost:2222/groups/ctr_files` - returns a list of integers of the number of generated files for each group `curl localhost:2222/ctr_files/<consumer-group>` - returns an integer of the number of generated files for the specified group - `curl localhost:2222/ctr_unique_files` - returns an integer of the number of generated unique files for all groups `curl localhost:2222/groups/ctr_unique_files`
- returns a list of integers of the number of generated unique files for each group `curl localhost:2222/ctr_unique_files/<consumer-group>` - returns an integer of the number of generated unique files for the specified group - - `curl localhost:2222/ctr_events` - returns the total number of events for all groups `curl localhost:2222/groups/ctr_events` - returns a list of the total number of events for each group `curl localhost:2222/ctr_events/<consumer-group>` - returns the total number of events for a specified group - `curl localhost:2222/exe_time_first_poll` - returns the execution time in mm:ss from the first poll `curl localhost:2222/groups/exe_time_first_poll` - returns a list of the execution time in mm:ss from the first poll for each group `curl localhost:2222/exe_time_first_poll/<consumer-group>` - returns the execution time in mm:ss from the first poll for the specified group - `curl localhost:2222/ctr_unique_PNFs` - returns the number of unique PNFS in all events. `curl localhost:2222/groups/ctr_unique_PNFs` - returns a list of the number of unique PNFS in all events for each group. `curl localhost:2222/ctr_unique_PNFs/<consumer-group>` - returns the number of unique PNFS in all events for the specified group. +## Alternative to running python (as described below) on your machine, use the docker files -#Alternative to running python (as described below) on your machine, use the docker files. -1. Build docker container with ```docker build -t mrsim:latest .``` -2. Run the container ```docker-compose up``` -The behavior can be changed by argument to the python script in the docker-compose.yml +1. Build docker container with `docker build -t mrsim:latest .` +2. Run the container `docker-compose up` + The behavior can be changed by an argument to the python script in the docker-compose.yml +## Common TC info -##Common TC info File names for 1MB, 5MB and 50MB files -Files in the format: <size-in-mb>MB_<sequence-number>.tar.gz Ex.
for 5MB file with sequence number 12: 5MB_12.tar.gz +Files in the format: <size-in-mb>MB\_<sequence-number>.tar.gz Ex. for 5MB file with sequence number 12: 5MB_12.tar.gz The sequence numbers are stepped so that all files have unique names -Missing files (files that are not expected to be found in the ftp server. Format: MissingFile_<sequence-number>.tar.gz +Missing files (files that are not expected to be found in the ftp server). Format: MissingFile\_<sequence-number>.tar.gz + +When the number of events is exhausted, empty replies are returned '\[]', for the limited test cases. For endless tc no empty replies will be given. -When the number of events are exhausted, empty replies are returned '[]', for the limited test cases. For endless tc no empty replies will be given. Test cases are limited unless noted as 'endless'. TC100 - One ME, SFTP, 1 1MB file, 1 event @@ -140,7 +131,6 @@ TC112 - One ME, SFTP, 5MB files, 100 files per event, 100 events, 1 event per po TC113 - One ME, SFTP, 1MB files, 100 files per event, 100 events. All events in one poll. - TC120 - One ME, SFTP, 1MB files, 100 files per event, 100 events, 1 event per poll. 10% of replies each: no response, empty message, slow response, 404-error, malformed json TC121 - One ME, SFTP, 1MB files, 100 files per event, 100 events, 1 event per poll. 10% missing files @@ -195,36 +185,28 @@ TC8XX is same as TC7XX but with FTPS TC2XXX is same as TC1XXX but with FTPS - ## Developer workflow -1. ```sudo apt install python3-venv``` -2. ```source .env/bin/activate/``` -3. ```pip3 install "anypackage"``` #also include in source code -4. ```pip3 freeze | grep -v "pkg-resources" > requirements.txt``` #to create a req file -5. ```FLASK_APP=mr-sim.py flask run``` - - or - - ```python3 mr-sim.py ``` - -6. Check/lint/format the code before commit/amed by ```autopep8 --in-place --aggressive --aggressive mr-sim.py``` - - -## User workflow on *NIX +1. `sudo apt install python3-venv` +2. `source .env/bin/activate` +3.
`pip3 install "anypackage"` #also include in source code +4. `pip3 freeze | grep -v "pkg-resources" > requirements.txt` #to create a req file +5. `FLASK_APP=mr-sim.py flask run` + or + `python3 mr-sim.py` +6. Check/lint/format the code before commit/amend by `autopep8 --in-place --aggressive --aggressive mr-sim.py` +## User workflow on \*NIX When cloning/fetching from the repository first time: + 1. `git clone` 2. `cd "..." ` #navigate to this folder 3. `source setup.sh ` #setting up virtualenv and install requirements - - you'll get a sourced virtualenv shell here, check prompt + you'll get a sourced virtualenv shell here, check prompt 4. `(env) $ python3 mr-sim.py --help` - - alternatively - - `(env) $ python3 mr-sim.py --tc1` + alternatively + `(env) $ python3 mr-sim.py --tc1` Every time you run the script, you'll need to step into the virtualenv by following step 3 first. @@ -241,4 +223,4 @@ When cloning/fetching from the repository first time: 7. 'pip3 install -r requirements.txt' #this will install in the local environment then 8. 'python3 dfc-sim.py' -Every time you run the script, you'll need to step into the virtualenv by step 2+6.
\ No newline at end of file +Every time you run the script, you'll need to step into the virtualenv by step 2+6. diff --git a/test/mocks/datafilecollector-testharness/simulator-group/README.md b/test/mocks/datafilecollector-testharness/simulator-group/README.md index 55a2467ae..1af9e3e80 100644 --- a/test/mocks/datafilecollector-testharness/simulator-group/README.md +++ b/test/mocks/datafilecollector-testharness/simulator-group/README.md @@ -1,4 +1,5 @@ -###Introduction +# Introduction + The purpose of the "simulator-group" is to run all containers in one go with specified behavior. Mainly this is needed for CSIT tests and for auto test but can also be used for manual testing of dfc both as a java-app or as a manually started container. Instead of running the simulators manually as described below the auto-test cases @@ -13,83 +14,66 @@ In general these steps are needed to run the simulator group and dfc 5. Start the simulators 6. Start dfc -###Overview of the simulators. +# Overview of the simulators. + There are 5 different types of simulators. For further details, see the README.md in each simulator dir. -1. The MR simulator emits fileready events, upon poll requests, with new and historice file references. -It is possible to configire the change identifier and file prefixes for these identifiers and for which consumer groups -these change identifier shall be generated. It is also possible to configure the number of events and files to generate and -from which ftp servers the files shall be fetched from. +1. The MR simulator emits fileready events, upon poll requests, with new and historic file references. + It is possible to configure the change identifier and file prefixes for these identifiers and for which consumer groups + these change identifiers shall be generated. It is also possible to configure the number of events and files to generate and + which ftp servers the files shall be fetched from. 2.
The DR simulator handles the publish queries (to check if a file has previously been published) and the -actual publish request (which results in a redirect to the DR REDIR simulator. It keeps a 'db' of published files updated by the DR REDIR simulator. -It is possible to configure 1 or more feeds along with the accepted filename prefixes for each feed. It is also possible -to configure the responses for the publish queries and publish requests. + actual publish request (which results in a redirect to the DR REDIR simulator). It keeps a 'db' of published files updated by the DR REDIR simulator. + It is possible to configure 1 or more feeds along with the accepted filename prefixes for each feed. It is also possible + to configure the responses for the publish queries and publish requests. 3. The DR REDIR simulator handles the redirect request for publish from the DR simulator. All accepted files will be stored as an empty -file with a file name concatenated from the published file name + file size + feed id. -It is possible to configure 1 or more feeds along with the accepted filename prefixes for each feed. It is also possible -to configure the responses for the publish requests. + file with a file name concatenated from the published file name + file size + feed id. + It is possible to configure 1 or more feeds along with the accepted filename prefixes for each feed. It is also possible + to configure the responses for the publish requests. 4. The SFTP simulator(s) handles the ftp download requests. 5 of these simulators are always started and in the MR sim it is -possible to configure the distrubution of files over these 5 servers (from 1 up to 5 severs). At start of the server, the server is -populated with files to download. + possible to configure the distribution of files over these 5 servers (from 1 up to 5 servers). At start of the server, the server is + populated with files to download. 5.
The FTPS simulator(s) is the same as the SFTP except that it uses the FTPS protocol. +# Build the simulator images -### Build the simulator images Run the script `prepare-images.sh` to build the docker images for MR, DR and FTPS servers. -###Edit simulator env variables - - - - -###Summary of scripts and files -`consul_config.sh` - Convert a json config file to work with dfc when manually started as java-app or container and then add that json to Consul. -`dfc-internal-stats.sh` - Periodically extract jvm data and dfc internal data and print to console/file. -`docker-compose-setup.sh` - Sets environment variables for the simulators and start the simulators with that settings. -`docker-compose-template.yml` - A docker compose template with environment variables setting. Used for producing a docker-compose file to defined the simulator containers. -`prepare-images.sh` - Script to build all needed simulator images. -`setup-ftp-files-for-image.sh` - Script executed in the ftp server to create files for download. +# Edit simulator env variables -`sim-monitor-start.sh` - Script to install needed packages and start the simulator monitor. ## Summary of scripts and files -`sim-monitor.js` - The source file the simulator monitor. +- `consul_config.sh` - Convert a json config file to work with dfc when manually started as java-app or container and then add that json to Consul. +- `dfc-internal-stats.sh` - Periodically extract jvm data and dfc internal data and print to console/file. +- `docker-compose-setup.sh` - Sets environment variables for the simulators and starts the simulators with those settings. +- `docker-compose-template.yml` - A docker compose template with environment variable settings. Used for producing a docker-compose file to define the simulator containers. +- `prepare-images.sh` - Script to build all needed simulator images. +- `setup-ftp-files-for-image.sh` - Script executed in the ftp server to create files for download.
+- `sim-monitor-start.sh` - Script to install needed packages and start the simulator monitor. +- `sim-monitor.js` - The source file of the simulator monitor. +- `simulators-kill.sh` - Script to kill all the simulators +- `simulators-start.sh` - Script to start all the simulators. All env variables need to be set prior to executing the script. -`simulators-kill.sh` - Script to kill all the simulators ## Preparation -`simulators-start.sh` - Script to start all the simulators. All env variables need to be set prior to executing the script. +Do the manual steps to prepare the simulator images: +- Build the mr-sim image. +- cd ../mr-sim +- Run the docker build command to build the image for the MR simulator: `docker build -t mrsim:latest .` +- cd ../dr-sim +- Run the docker build command to build the image for the DR simulators: `docker build -t drsim_common:latest .` +- cd ../ftps-sftp-server +- Check the README.md in the ftps-sftp-server dir in case the cert needs to be updated. +- Run the docker build command to build the image for the FTPS simulators: `docker build -t ftps_vsftpd:latest -f Dockerfile-ftps .` - -###Preparation Do the manual steps to prepare the simulator images Build the mr-sim image. cd ../mr-sim Run the docker build command to build the image for the MR simulator: 'docker build -t mrsim:latest .' cd ../dr-sim Run the docker build command to build the image for the DR simulators: `docker build -t drsim_common:latest .' cd ../ftps-sftp-server Check the README.md in ftps-sftp-server dir in case the cert need to be updated. Run the docker build command to build the image for the DR simulators: `docker build -t ftps_vsftpd:latest -f Dockerfile-ftps .' - -###Execution +## Execution Edit the `docker-compose-setup.sh` (or create a copy) to set up the env variables to the desired test behavior for each simulator. See each simulator to find a description of the available settings (DR_TC, DR_REDIR_TC and MR_TC).
The following env variables shall be set (example values). Note that NUM_FTPFILES and NUM_PNFS control the number of ftp files created in the ftp servers. -A total of NUM_FTPFILES * NUM_PNFS ftp files will be created in each ftp server (4 files in the below example). +A total of NUM_FTPFILES \* NUM_PNFS ftp files will be created in each ftp server (4 files in the below example). Large settings will be time consuming at start of the servers. Note that the number of files must match the number of file references emitted from the MR sim. @@ -122,7 +106,8 @@ So farm, this only works when the simulator python script is started from the co Kill all the containers with `simulators-kill.sh` `simulators_start.sh` is for CSIT test and requires the env variables for test setting to be present in the shell. -`setup-ftp-files.for-image.sh` is for CSIT and executed when the ftp servers are started from the docker-compose-setup.sh`. + +`setup-ftp-files.for-image.sh` is for CSIT and executed when the ftp servers are started from the `docker-compose-setup.sh`. To make DFC able to connect to the simulator containers, DFC needs to run in host mode. Start DFC by the following cmd: `docker run -d --network="host" --name dfc_app <dfc-image> ` @@ -130,9 +115,8 @@ Start DFC by the following cmd: `docker run -d --network="host" --name dfc_app < `<dfc-image>` could be either the locally built image `onap/org.onap.dcaegen2.collectors.datafile.datafile-app-server` or the one in nexus `nexus3.onap.org:10001/onap/org.onap.dcaegen2.collectors.datafile.datafile-app-server`. +# Start the simulator monitor - -###Start the simulator monitor Start the simulator monitor server with `node sim-monitor.js` on the cmd line and then open a browser with the url `localhost:9999/mon` to see the statistics page with data from DFC(ss), MR sim, DR sim and DR redir sim.
If needed, run `npm install express` first. diff --git a/test/mocks/mass-pnf-sim/README.md b/test/mocks/mass-pnf-sim/README.md index 07f74e2b7..f997f56ad 100644 --- a/test/mocks/mass-pnf-sim/README.md +++ b/test/mocks/mass-pnf-sim/README.md @@ -1,48 +1,78 @@ ### Mass PNF simulator + The purpose of this simulator is to mimic the PNF for benchmark purposes. This variant is based on the PNF simulator and uses several components. The modifications focus on the following areas: - -add a script configuring and governing multiple instances of PNF simualtor - -removing parts which are not required for benchmark purposes. - -add functionality which creates and maintains the ROP files - -add functionality to query the actual ROP files and construct VES events based on them +- add a script configuring and governing multiple instances of PNF simulator +- removing parts which are not required for benchmark purposes. +- add functionality which creates and maintains the ROP files +- add functionality to query the actual ROP files and construct VES events based on them +### Pre-configuration -###Pre-configuration The ipstart should align to a /28 IP address range start (e.g. 10.11.0.16, 10.11.0.32) For debug purposes, you can use your own IP address as VES collector, use "ip" command to determine it. Example: + +``` ./mass-pnf-sim.py --bootstrap 2 --urlves http://10.148.95.??:10000/eventListener/v7 --ipfileserver 10.148.95.??? --typefileserver sftp --ipstart 10.11.0.16 +``` Note that the file creator is started at the time of bootstrapping. Stop/start will not re-launch it.
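The /28 alignment rule above can be sanity-checked with a few lines of shell. This is a hypothetical helper, not part of the repo; the `ipstart` value below is only an example:

```shell
# A /28 block spans 16 addresses, so the last octet of --ipstart must be a
# multiple of 16 (e.g. .0, .16, .32). Example value, not taken from the repo.
ipstart="10.11.0.16"
last_octet="${ipstart##*.}"
if [ $((last_octet % 16)) -eq 0 ]; then
  echo "$ipstart is aligned to a /28 boundary"
else
  echo "$ipstart is NOT aligned to a /28 boundary"
fi
```

Running the same check with e.g. `10.11.0.20` would report a misaligned start address before bootstrapping fails in less obvious ways.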
-###Replacing VES for test purposes -`sudo nc -vv -l -k -p 10000` +### Replacing VES for test purposes + +``` +sudo nc -vv -l -k -p 10000 +``` + +### Start -###Start Define the number of simulators to be launched + +``` ./mass-pnf-sim.py --start 2 +``` + +### Trigger -###Trigger +``` ./mass-pnf-sim.py --trigger 2 +``` -###Trigger only a subset of the simulators +### Trigger only a subset of the simulators + +The following command will trigger 0,1,2,3: + +``` ./mass-pnf-sim.py --triggerstart 0 --triggerend 3 -#this will trigger 0,1,2,3 +``` + +The following command will trigger 4 and 5: +``` ./mass-pnf-sim.py --triggerstart 4 --triggerend 5 -#this will trigger 4,5 +``` -###Stop and clean +### Stop and clean + +``` ./mass-pnf-sim.py --stop 2 ./mass-pnf-sim.py --clean +``` + +### Verbose printout from Python -###Verbose printout from Python +``` python3 -m trace --trace --count -C . ./mass-pnf-sim.py ..... +``` + +### Cleaning and recovery after incorrect configuration -###Cleaning and recovery after incorrect configuration +``` docker stop $(docker ps -aq); docker rm $(docker ps -aq) +``` diff --git a/test/mocks/mass-pnf-sim/pnf-sim-lightweight/README.md b/test/mocks/mass-pnf-sim/pnf-sim-lightweight/README.md index 0e2b668a4..927140571 100644 --- a/test/mocks/mass-pnf-sim/pnf-sim-lightweight/README.md +++ b/test/mocks/mass-pnf-sim/pnf-sim-lightweight/README.md @@ -1,16 +1,26 @@ -##Local development shortcuts: -####To start listening on port 10000 for test purposes -`nc -l -k -p 10000` -####Test the command above: -`echo "Hello World" | nc localhost 10000` +## Local development shortcuts: -####Trigger the pnf simulator locally: +To start listening on port 10000 for test purposes: + +``` +nc -l -k -p 10000 +``` + +Test the command above: + +``` +echo "Hello World" | nc localhost 10000 +``` + +Trigger the pnf simulator locally: ``` ~/dev/git/integration/test/mocks/mass-pnf-sim/pnf-sim-lightweight$ curl -s -X POST -H "Content-Type: application/json" -H "X-ONAP-RequestID:
123" -H "X-InvocationID: 456" -d @config/config.json http://localhost:5000/simulator/start ``` -#### VES event sending + +## VES event sending + The default action is to send a VES Message every 15 minutes and the total duration of the VES FileReady Message sending is 1 day (these values can be changed in config/config.json) Message from the stdout of nc: @@ -27,7 +37,8 @@ User-Agent: Apache-HttpClient/4.5.5 (Java/1.8.0_162) Accept-Encoding: gzip,deflate ``` -```javascript +```javascript {"event":{"commonEventHeader":{"startEpochMicrosec":"1551865758690","sourceId":"val13","eventId":"registration_51865758", "nfcNamingCode":"oam","internalHeaderFields":{},"priority":"Normal","version":"4.0.1","reportingEntityName":"NOK6061ZW3", "sequence":"0","domain":"notification","lastEpochMicrosec":"1551865758690","eventName":"pnfRegistration_Nokia_5gDu", @@ -36,4 +47,4 @@ Accept-Encoding: gzip,deflate "arrayOfNamedHashMap":[{"name":"10MB.tar.gz","hashMap":{ "location":"ftpes://10.11.0.68/10MB.tar.gz","fileFormatType":"org.3GPP.32.435#measCollec", "fileFormatVersion":"V10","compression":"gzip"}}]}}} -```
\ No newline at end of file +``` diff --git a/test/mocks/pnf-onboarding/README.md b/test/mocks/pnf-onboarding/README.md index b14b34d95..b75a77631 100644 --- a/test/mocks/pnf-onboarding/README.md +++ b/test/mocks/pnf-onboarding/README.md @@ -1,20 +1,22 @@ -PNF Package for Integration Test -================================ +# PNF Package for Integration Test **NOTE: Requires openssl to be preinstalled.** This module builds 3 PNF packages based on the files in `/src/main/resources/csarContent/` 1. unsigned package: - `sample-pnf-1.0.1-SNAPSHOT.csar` + `sample-pnf-1.0.1-SNAPSHOT.csar` 2. signed packages: - A) `sample-signed-pnf-1.0.1-SNAPSHOT.zip` - B) `sample-signed-pnf-cms-includes-cert-1.0.1-SNAPSHOT.zip` - The signed packages are based on ETSI SOL004 Security Option 2. They contain csar, cert and cms files. In package B cms includes cert. + A) `sample-signed-pnf-1.0.1-SNAPSHOT.zip` + B) `sample-signed-pnf-cms-includes-cert-1.0.1-SNAPSHOT.zip` + The signed packages are based on ETSI SOL004 Security Option 2. They contain csar, cert and cms files. In package B the cms includes the cert. The packages are generated by running the following command in the same directory as this readme file, i.e. the pnf-onboarding directory: -> `$ mvn clean install` + +> ``` +> $ mvn clean install +> ``` The packages will be stored in the maven generated `target` directory. @@ -22,5 +24,7 @@ To be able to use the signed packages in SDC the `src/main/resources/securityCon If SDC is running in containers locally then the following commands could be used to copy the root.cert to the default location in SDC Onboarding Container. It is assumed that the commands are executed from inside the pnf-onboarding directory.
\ No newline at end of file +> ``` +> $ docker exec -it <sdc-onboard-backend-container-id> mkdir -p /var/lib/jetty/cert +> $ docker cp src/main/resources/securityContent/root.cert <sdc-onboard-backend-container-id>:/var/lib/jetty/cert +> ```
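The signed packages above follow ETSI SOL004 Security Option 2, i.e. a CMS signature over the csar verified against a root cert. As a rough sketch of that mechanism (all file names here are made up for illustration; the real packages ship their own csar, cert and cms files), a detached CMS sign-and-verify round trip with plain openssl could look like this:

```shell
# Illustrative only: create a throwaway self-signed cert, produce a detached
# CMS signature over a dummy "csar", then verify it against the cert --
# mirroring the csar/cert/cms layout of the signed packages.
workdir=$(mktemp -d)
cd "$workdir"
echo "dummy csar content" > sample.csar
# Throwaway key and self-signed cert standing in for root.cert
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=pnf-sign-test" -keyout key.pem -out root.cert 2>/dev/null
# Detached CMS signature over the dummy csar
openssl cms -sign -binary -in sample.csar -signer root.cert -inkey key.pem \
  -outform PEM -out sample.cms
# Verify the detached signature against the cert
openssl cms -verify -binary -inform PEM -in sample.cms -content sample.csar \
  -CAfile root.cert -out /dev/null 2>/dev/null && echo "signature OK"
```

The same `openssl cms -verify` pattern, pointed at the real csar, cms and root.cert files, is one plausible way to inspect the generated signed packages by hand.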