Diffstat (limited to 'docs')
-rw-r--r--   docs/development/devtools/apex-s3p.rst | 284
-rwxr-xr-x   docs/development/devtools/csv/20201016-1715-distr-stability-aggregate.csv | 11
-rwxr-xr-x   docs/development/devtools/csv/20201016-1715-distr-stability-summary.csv | 11
-rwxr-xr-x   docs/development/devtools/csv/20201020-1730-distr-performance-aggregate.csv | 10
-rwxr-xr-x   docs/development/devtools/csv/20201020-1730-distr-performance-summary.csv | 10
-rw-r--r--   docs/development/devtools/distribution-s3p.rst | 331
-rw-r--r--   docs/development/devtools/drools-s3p.rst | 157
-rwxr-xr-x   docs/development/devtools/images/20201016-1715-distr-stability-20201018T2040-monitor.png | bin 0 -> 48913 bytes
-rwxr-xr-x   docs/development/devtools/images/20201016-1715-distr-stability-20201018T2040-threads.png | bin 0 -> 64954 bytes
-rwxr-xr-x   docs/development/devtools/images/20201020-1730-distr-performance-20201020T2025-monitor.png | bin 0 -> 50243 bytes
-rwxr-xr-x   docs/development/devtools/images/20201020-1730-distr-performance-20201020T2025-threads.png | bin 0 -> 68058 bytes
-rw-r--r--   docs/development/devtools/images/ControlLoop-vCPE-48f0c2c3-a172-4192-9ae3-052274181b6e.png | bin 19884 -> 67774 bytes
-rw-r--r--   docs/development/devtools/images/ControlLoop-vCPE-Fail.png | bin 21369 -> 67512 bytes
-rw-r--r--   docs/development/devtools/images/ControlLoop-vDNS-6f37f56d-a87d-4b85-b6a9-cc953cf779b3.png | bin 22806 -> 69667 bytes
-rw-r--r--   docs/development/devtools/images/ControlLoop-vDNS-Fail.png | bin 20800 -> 66615 bytes
-rw-r--r--   docs/development/devtools/images/ControlLoop-vFirewall-d0a1dfc6-94f5-4fd4-a5b5-4630b438850a.png | bin 20423 -> 72763 bytes
-rw-r--r--   docs/development/devtools/images/apex_perf_jm_1.PNG | bin 0 -> 202811 bytes
-rw-r--r--   docs/development/devtools/images/apex_perf_jm_2.PNG | bin 0 -> 286406 bytes
-rw-r--r--   docs/development/devtools/images/apex_s3p_jm-1.png | bin 0 -> 78282 bytes
-rw-r--r--   docs/development/devtools/images/apex_s3p_jm-2.png | bin 0 -> 183969 bytes
-rw-r--r--   docs/development/devtools/images/distribution-performance-api-report.png | bin 76255 -> 0 bytes
-rw-r--r--   docs/development/devtools/images/distribution-performance-summary-report.png | bin 98261 -> 0 bytes
-rw-r--r--   docs/development/devtools/images/distribution-results-tree.png | bin 147859 -> 0 bytes
-rw-r--r--   docs/development/devtools/images/distribution-summary-report.png | bin 101801 -> 0 bytes
-rw-r--r--   docs/development/devtools/images/distribution-vvm-monitor.png | bin 103570 -> 0 bytes
-rw-r--r--   docs/development/devtools/images/distribution-vvm-threads.png | bin 159531 -> 0 bytes
-rw-r--r--   docs/development/devtools/zip/frankfurt/apex_s3p_result.tar.gz | bin 686376 -> 4238336 bytes
27 files changed, 301 insertions, 513 deletions
diff --git a/docs/development/devtools/apex-s3p.rst b/docs/development/devtools/apex-s3p.rst
index 6143e165..6c59f655 100644
--- a/docs/development/devtools/apex-s3p.rst
+++ b/docs/development/devtools/apex-s3p.rst
@@ -27,153 +27,6 @@ Setup details
The stability test is performed on VMs running in an OpenStack cloud environment. There are two separate VMs: one runs apex-pdp, and the other runs JMeter to simulate a steady flow of transactions.
-**OpenStack environment details**
-
-Version: Mitaka
-
-**apex-pdp VM details**
-
-OS:Ubuntu 18.04 LTS
-
-CPU: 4 core
-
-RAM: 4 GB
-
-HardDisk: 40 GB
-
-Docker Version: 19.03.8, build afacb8b7f0
-
-Java: openjdk version "11.0.7"
-
-**JMeter VM details**
-
-OS: Ubuntu 18.04 LTS
-
-CPU: 4 core
-
-RAM: 4 GB
-
-HardDisk: 40 GB
-
-Java: openjdk version "11.0.7"
-
-JMeter: 5.2.1
-
-Install JMeter in virtual machine
----------------------------------
-
-Make the etc/hosts entries
-
-.. code-block:: bash
-
- echo $(hostname -I | cut -d\ -f1) $(hostname) | sudo tee -a /etc/hosts
-
-Make the DNS entries
-
-.. code-block:: bash
-
- echo "nameserver <PrimaryDNSIPIP>" >> sudo /etc/resolvconf/resolv.conf.d/head
-
- echo "nameserver <SecondaryDNSIP>" >> sudo /etc/resolvconf/resolv.conf.d/head
-
- resolvconf -u
-
-Update the ubuntu software installer
-
-.. code-block:: bash
-
- apt-get update
-
-Check & Install Java
-
-.. code-block:: bash
-
- apt-get install -y openjdk-11-jdk
-
- java -version
-
-Download & install JMeter
-
-.. code-block:: bash
-
- mkdir jMeter
-
-
- cd jMeter
-
-
- wget http://mirrors.whoishostingthis.com/apache//jmeter/binaries/apache-jmeter-5.2.1.zip
-
-
- unzip apache-jmeter-5.2.1.zip
-
-Install apex-pdp in virtual machine
------------------------------------
-
-We will be running apex-pdp as docker container. So we need to first install docker and then create the container hosting apex-pdp by pulling the image from ONAP repository.
-
-**Docker Installation**
-
-1. Make the etc/hosts entries
-
-.. code-block:: bash
-
- echo $(hostname -I | cut -d\ -f1) $(hostname) | sudo tee -a /etc/hosts
-
-2. Make the DNS entries
-
-.. code-block:: bash
-
- echo "nameserver <PrimaryDNSIPIP>" >> sudo /etc/resolvconf/resolv.conf.d/head
- echo "nameserver <SecondaryDNSIP>" >> sudo /etc/resolvconf/resolv.conf.d/head
- resolvconf -u
-
-3. Update the ubuntu software installer
-
-.. code-block:: bash
-
- apt-get update
-
-4. Check and Install Java
-
-.. code-block:: bash
-
- apt-get install -y openjdk-11-jdk
- java -version
-
-Ensure that the Java version that is executing is OpenJDK version 8
-
-5. Check and install docker
-
-.. code-block:: bash
-
- curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
- sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
- sudo apt-get update
- sudo apt-cache policy docker-ce
- sudo apt-get install -y docker-ce
- sudo systemctl enable docker
- sudo systemctl start docker
- sudo usermod -aG docker <your user id>
-
-6. Logout and re-login to ensure the ``usermod`` command takes effective
-
-7. Check the status of the Docker service and ensure it is running correctly
-
-.. code-block:: bash
-
- docker ps
-
-**Install apex-pdp**
-
-Run the below command to create the container hosting apex-pdp by pulling the image from ONAP repository.
-
-.. code-block:: bash
-
- docker run -d --name apex -p 12561:12561 -p 23324:23324 -p 9911:9911 nexus3.onap.org:10001/onap/policy-apex-pdp:2.3.1 /bin/bash -c "/opt/app/policy/apex-pdp/bin/apexApps.sh jmx-test -c /opt/app/policy/apex-pdp/examples/config/SampleDomain/RESTServerJsonEvent.json"
- docker ps
-
-Note: If you observe that requests from JMeter client is failing due to timeout, then modify the "RESTServerJsonEvent.json" mentioned in the above command and increase the "synchronousTimeout" property as per needed.
Install & Configure VisualVM
----------------------------
@@ -211,7 +64,7 @@ Sample Screenshot of visualVM
Test Plan
---------
-The 72 hours stability test will run the following steps in 20 threaded loop.
+The 72 hour stability test will run the following steps in a 5-threaded loop.
- **Send Input Event** - sends an input message to rest interface of apex-pdp.
- **Assert Response Code** - assert the response code coming from apex-pdp.
@@ -227,7 +80,7 @@ The following steps can be used to configure the parameters of test plan.
**Name** **Description** **Default Value**
================== ============================================================================ ============================
wait Wait time after each request (in milliseconds) 500
-threads Number of threads to run test cases in parallel. 20
+threads Number of threads to run test cases in parallel. 5
threadsTimeOutInMs Synchronization timer for threads running in parallel (in milliseconds). 5000
================== ============================================================================ ============================
@@ -249,27 +102,27 @@ Stability Test Result
**Summary**
-Stability test plan was triggered for 72 hours injecting input events to apex-pdp from 20 client threads running in JMeter.
+The stability test plan was triggered for 72 hours, injecting input events to apex-pdp from 5 client threads.
-After the test stop, we can generate a HTML test report via command
+Once the test has completed, we can generate an HTML test report via the following command:
.. code-block:: bash
~/jMeter/apache-jmeter-5.2.1/bin/jmeter -g stability.log -o ./result/
-============================================== =================================================== ================================ ============= ============
-**Number of Client Threads running in JMeter** **Number of Server Threads running in Apex engine** **Total number of input events** **Success %** **Error %**
-============================================== =================================================== ================================ ============= ============
-20 4 8594220 100% 0%
-============================================== =================================================== ================================ ============= ============
+============================================== ================================ ============= ============ ============================
+**Number of Client Threads running in JMeter** **Total number of input events** **Success %** **Error %** **Average Time per Request (ms)**
+============================================== ================================ ============= ============ ============================
+5 8594220 100% 0% 5518.73
+============================================== ================================ ============= ============ ============================
.. image:: images/stability-jmeter.PNG
:download:`result.zip <zip/frankfurt/apex_s3p_result.tar.gz>`
-Frankfurt release
-^^^^^^^^^^^^^^^^^^
+Stability Test of Apex PDP
+^^^^^^^^^^^^^^^^^^^^^^^^^^
The 72 hour Stability Test for apex-pdp has the goal of introducing a steady flow of transactions using jMeter.
@@ -277,8 +130,8 @@ The input event will be submitted through the rest interface of DMaaP , which th
This test will be performed in an OOM deployment setup, in a multi-threaded environment where 5 threads running in JMeter keep sending events for the duration of 72 hours.
-Test Plan Frankfurt release
----------------------------
+Test Plan
+---------
The 72 hour stability test will run the following steps in a 5-threaded loop.
@@ -302,6 +155,10 @@ The following steps can be used to configure the parameters of the test plan.
wait Wait time after each request (in milliseconds) 120000
threads Number of threads to run test cases in parallel. 5
threadsTimeOutInMs Synchronization timer for threads running in parallel (in milliseconds). 150000
+PAP_PORT Port number of PAP for making REST API calls
+API_PORT Port number of API for making REST API calls
+APEX_PORT Port number of APEX for making REST API calls
+DMAAP_PORT Port number of DMAAP for making REST API calls
================== ============================================================================ ============================
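+
+If these parameters are exposed as JMeter properties (an assumption made here for illustration), they could also be overridden at launch with JMeter's ``-J`` options; the port values below are deployment-specific placeholders, not values taken from this document.
+
+.. code-block:: bash
+
+ # Illustrative only: override the plan parameters at launch time
+ nohup ./jmeter.sh -n -t ~/apexPdpStabilityTestPlan.jmx \
+     -Jthreads=5 -Jwait=120000 \
+     -JPAP_PORT=<pap-port> -JAPI_PORT=<api-port> \
+     -JAPEX_PORT=<apex-port> -JDMAAP_PORT=<dmaap-port> \
+     -l ~/stability.log
+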
@@ -309,114 +166,85 @@ Download and update the jmx file presented in the apex-pdp git repository - `jmx
- HTTPSampler.domain - The ip address of the VM in which the apex container is running
- HTTPSampler.port - The listening port, here is 23324
-- ThreadGroup.druation - Set the duration to 72 hours (in seconds)
+- ThreadGroup.duration - Set the duration to 72 hours (in seconds)
Use the CLI mode to start the test
.. code-block:: bash
- ./jmeter.sh -n -t ~/apexPdpStabilityTestPlan.jmx -Jusers=1 -l ~/stability.log
+ nohup ./jmeter.sh -n -t ~/apexPdpStabilityTestPlan.jmx -Jusers=1 -l ~/stability.log
+
+Stability Test Results
+----------------------
-Stability Test Results Frankfurt release
------------------------------------------
+The stability test plan was triggered for 72 hours, injecting input events into the apex-pdp pod from 5 client threads running in JMeter.
-The stability test plan was triggered for 72 hours, injecting input events to apex-pdp from 5 client threads running in JMeter.
+The stability tests were executed as part of a full ONAP OOM deployment in the Nordix lab.
-After the test stops, we can generate an HTML test report via the command:
+Once the tests complete, we can generate an HTML test report via the command:
.. code-block:: bash
~/jMeter/apache-jmeter-5.2.1/bin/jmeter -g stability.log -o ./result/
-============================================== =================================================== ================================ ============= ============
-**Number of Client Threads running in JMeter** **Number of Server Threads running in Apex engine** **Total number of input events** **Success %** **Error %**
-============================================== =================================================== ================================ ============= ============
-5 4 26766 100% 0%
-============================================== =================================================== ================================ ============= ============
-
-**VisualVM Screenshot**
+============================================== ================================ ============= ============ ============================
+**Number of Client Threads running in JMeter** **Total number of input events** **Success %** **Error %** **Average Time per Request (ms)**
+============================================== ================================ ============= ============ ============================
+5 8594220 100% 0% 5518.73
+============================================== ================================ ============= ============ ============================
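+
+For context, the figures above imply roughly 33 input events per second over the 72 hour window; this is a back-of-the-envelope check derived only from the totals in the table, not an additional reported metric.
+
+.. code-block:: bash
+
+ # 8,594,220 events over 72 hours (72 * 3600 = 259,200 seconds)
+ awk 'BEGIN { printf "%.2f events/second\n", 8594220 / (72 * 3600) }'
+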
-.. image:: images/frankfurt/apex_s3p_vm-1.png
-.. image:: images/frankfurt/apex_s3p_vm-2.png
**JMeter Screenshot**
-.. image:: images/frankfurt/apex_s3p_jm-1.png
-.. image:: images/frankfurt/apex_s3p_jm-2.png
+.. image:: images/apex_s3p_jm-1.png
+.. image:: images/apex_s3p_jm-2.png
:download:`result.zip <zip/frankfurt/apex_s3p_result.tar.gz>`
Setting up Performance Tests in APEX
++++++++++++++++++++++++++++++++++++
-The apex-pdp has built in support for performance testing. A special performance testing REST server is available in the code base for performance testing.
-It is in the module `performance-benchmark-test <https://github.com/onap/policy-apex-pdp/tree/master/testsuites/performance/performance-benchmark-test>`_.
-To execute a benchmark test, you start the REST server, and then configure and run APEX against the server.
-There are example configurations for running tests in the `resources of this module <https://github.com/onap/policy-apex-pdp/tree/master/testsuites/performance/performance-benchmark-test/src/main/resources/examples/benchmark>`_.
+The performance test is performed on a similar setup to the stability test. JMeter will send a large number of REST requests and then retrieve the results of those requests.
-In order to run the test for 72 hours, set the batch count in the `EventGeneratorConfig.json <https://github.com/onap/policy-apex-pdp/blob/master/testsuites/performance/performance-benchmark-test/src/main/resources/examples/benchmark/EventGeneratorConfig.json>`_ file to zero, which causes the REST server to generate batches forever.
+The performance test plan will be the same as the stability test plan, except for the differences listed below (see the example after the list):
-Here is an example of how to do this:
+- Increase the number of threads from 5 to 60.
+- Reduce test time to ninety minutes.
+- Calculate the number of requests handled in the time frame.
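+
+If the performance plan exposes its thread count and duration as JMeter properties (an assumption; the property names below are hypothetical and only shown for illustration), the 60-thread, ninety-minute run could be parameterised at launch.
+
+.. code-block:: bash
+
+ # Hypothetical property names; ninety minutes expressed in seconds (90 * 60 = 5400)
+ nohup ./jmeter.sh -n -t ~/performance.jmx -Jthreads=60 -Jduration=5400 -l ~/perf.log
+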
-1. Clone and build the apex-pdp git repo
+Run Test
+--------
-2. Go into the performance-benchmark-test module and run the REST server
+Running the performance test is the same as running the stability test: launch JMeter pointing to the corresponding *.jmx* test plan. The *API_HOST*, *API_PORT*, *PAP_HOST* and *PAP_PORT* values are already set up in the *.jmx* file.
.. code-block:: bash
- cd testsuites/performance/performance-benchmark-test
- mvn exec:java -Dexec.mainClass="org.onap.policy.apex.testsuites.performance.benchmark.eventgenerator.EventGenerator" -Dexec.args="-c src/main/resources/examples/benchmark/EventGeneratorConfig.json"
+ nohup ./jmeter.sh -n -t ~/performance.jmx -Jusers=1 -l ~/perf.log
-3. Separately, create a local directory and unzip the APEX tarball
+Once the tests have completed, run the following to gather the results.
.. code-block:: bash
- mkdir apex
- cd apex
- tar zxvf ~/git/onap/policy/apex-pdp/packages/apex-pdp-package-full/target/*gz
+ ~/jMeter/apache-jmeter-5.2.1/bin/jmeter -g perf.log -o ./performance_result/
-4. Run APEX with a configuration that runs against the benchmark REST server, select the configuration that is appropriate for the number of threads for the number of cores on the host on which APEX is running. For example on a 32 core machine, select the "32" configuration, on an 8 core machine, select the "08" configuration.
+Performance Test Result
+-----------------------
-.. code-block:: bash
-
- bin/apexApps.sh engine -c ~/git/onap/policy/apex-pdp/testsuites/performance/performance-benchmark-test/src/main/resources/examples/benchmark/Javascript64.json
-
-5. To get the test results, Issue the following command using CURL or from a browser(also can store the result into a file by setting outfile in the `EventGeneratorConfig.json <https://github.com/onap/policy-apex-pdp/blob/master/testsuites/performance/performance-benchmark-test/src/main/resources/examples/benchmark/EventGeneratorConfig.json>`_ file, statistics would be written into this file after event generator terminated)
-
-.. code-block:: bash
-
- curl http://localhost:32801/EventGenerator/Stats
-
-The results are similar to those below:
+**Summary**
-:download:`Example APEX performance metrics <json/example-apex-perf.json>`
+Performance test was triggered for 90 minutes. The results are shown below.
-Performance Test Result Frankfurt
----------------------------------
+**Test Statistics**
-**Summary**
+============================ =========== ========= ==================================
+**Total Number of Requests** **Success** **Error** **Average Time Taken per Request**
+============================ =========== ========= ==================================
+9870 100 % 0 % 5506.09 ms
+============================ =========== ========= ==================================
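+
+As a rough cross-check of the table above (a back-of-the-envelope calculation from the reported totals, not an additional measurement), 9870 requests over ninety minutes is just under two requests per second.
+
+.. code-block:: bash
+
+ # 9,870 requests over 90 minutes (90 * 60 = 5,400 seconds)
+ awk 'BEGIN { printf "%.2f requests/second\n", 9870 / (90 * 60) }'
+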
-Performance test was triggered for 2 hours on a 4 core, 4GB RAM virtual machine.
+**JMeter Screenshot**
-**Test Statistics**
+.. image:: images/apex_perf_jm_1.PNG
-:download:`Attached result log <json/frankfurt-apex-perf.json>`
-
-=============== ============= ================= ============== ===================== ================== ============= ===========
-**batchNumber** **batchSize** **eventsNotSent** **eventsSent** **eventsNotReceived** **eventsReceived** **Success %** **Error %**
-=============== ============= ================= ============== ===================== ================== ============= ===========
--1 431250 0 431250 0 431250 100 % 0 %
-=============== ============= ================= ============== ===================== ================== ============= ===========
-
-======================== ========================= ========================
-**averageRoundTripNano** **shortestRoundTripNano** **longestRoundTripNano**
-======================== ========================= ========================
-148965724 20169907 429339393
-======================== ========================= ========================
-
-============================ ============================= ============================
-**averageApexExecutionNano** **shortestApexExecutionNano** **longestApexExecutionNano**
-============================ ============================= ============================
-62451899 3901010 354528579
-============================ ============================= ============================
+.. image:: images/apex_perf_jm_2.PNG
diff --git a/docs/development/devtools/csv/20201016-1715-distr-stability-aggregate.csv b/docs/development/devtools/csv/20201016-1715-distr-stability-aggregate.csv
new file mode 100755
index 00000000..88ff3a37
--- /dev/null
+++ b/docs/development/devtools/csv/20201016-1715-distr-stability-aggregate.csv
@@ -0,0 +1,11 @@
+Label,# Samples,Average,Median,90% Line,95% Line,99% Line,Min,Max,Error %,Throughput,Received KB/sec,Sent KB/sec
+Remove CSAR,10813,7,6,8,9,21,5,1065,0.000%,.04172,0.02,0.00
+Add CSAR script,10813,7,7,8,9,14,5,2026,0.000%,.04172,0.02,0.00
+Healthcheck,10813,1,1,2,2,3,0,86,0.000%,.04172,0.01,0.01
+Statistics,10813,1,1,2,2,3,0,93,0.000%,.04172,0.01,0.01
+CheckPDPGroupQuery,10813,246,195,395,546,981,33,7297,0.000%,.04172,0.11,0.01
+Check Policy Deployed,10813,20,21,23,24,28,12,255,0.000%,.04172,0.02,0.01
+Undeploy Policy ,10813,357,306,517,678,1118,195,7267,0.000%,.04172,0.01,0.01
+Delete Policy,10813,394,335,587,782,1165,204,7176,0.000%,.04172,5.40,0.02
+CheckPDPGroupQueryForDeletedPolicy,10812,230,180,379,537,930,17,5243,0.000%,.04172,0.11,0.01
+TOTAL,97316,140,21,367,471,863,0,7297,0.000%,.37545,5.72,0.08
diff --git a/docs/development/devtools/csv/20201016-1715-distr-stability-summary.csv b/docs/development/devtools/csv/20201016-1715-distr-stability-summary.csv
new file mode 100755
index 00000000..12547b60
--- /dev/null
+++ b/docs/development/devtools/csv/20201016-1715-distr-stability-summary.csv
@@ -0,0 +1,11 @@
+Label,# Samples,Average,Min,Max,Std. Dev.,Error %,Throughput,Received KB/sec,Sent KB/sec,Avg. Bytes
+Remove CSAR,10813,7,5,1065,20.90,0.000%,.04172,0.02,0.00,548.0
+Add CSAR script,10813,7,5,2026,24.81,0.000%,.04172,0.02,0.00,473.0
+Healthcheck,10813,1,0,86,1.48,0.000%,.04172,0.01,0.01,227.0
+Statistics,10813,1,0,93,2.96,0.000%,.04172,0.01,0.01,323.9
+CheckPDPGroupQuery,10813,246,33,7297,207.07,0.000%,.04172,0.11,0.01,2814.0
+Check Policy Deployed,10813,20,12,255,4.79,0.000%,.04172,0.02,0.01,461.0
+Undeploy Policy ,10813,357,195,7267,206.26,0.000%,.04172,0.01,0.01,260.0
+Delete Policy,10813,394,204,7176,203.89,0.000%,.04172,5.40,0.02,132538.0
+CheckPDPGroupQueryForDeletedPolicy,10812,230,17,5243,179.75,0.000%,.04172,0.11,0.01,2756.0
+TOTAL,97316,140,0,7297,205.30,0.000%,.37545,5.72,0.08,15600.2
diff --git a/docs/development/devtools/csv/20201020-1730-distr-performance-aggregate.csv b/docs/development/devtools/csv/20201020-1730-distr-performance-aggregate.csv
new file mode 100755
index 00000000..4accfca4
--- /dev/null
+++ b/docs/development/devtools/csv/20201020-1730-distr-performance-aggregate.csv
@@ -0,0 +1,10 @@
+Label,# Samples,Average,Median,90% Line,95% Line,99% Line,Min,Max,Error %,Throughput,Received KB/sec,Sent KB/sec
+Remove CSAR,198,4,4,5,5,11,3,13,0.000%,.01380,0.00,0.00
+Add CSAR script,198,16,16,18,20,22,14,23,0.000%,.01380,0.00,0.00
+Healthcheck,71655,1,1,2,2,2,0,205,0.000%,4.97617,1.10,1.14
+Statistics,71652,0,1,1,1,2,0,85,0.000%,4.97609,1.57,1.14
+Check Policy Deployed,198,21,20,23,25,31,16,205,0.000%,.01380,0.03,0.00
+CheckPDPGroupQuery,198,234,193,350,524,915,27,1034,0.000%,.01381,0.05,0.00
+Undeploy/Delete Policies,197,12340,12155,14027,14343,15618,10402,16007,0.000%,.01379,17.94,0.00
+CheckPDPGroupQueryForDeletedPolicy,197,185,161,256,328,584,109,734,0.000%,.01380,0.04,0.00
+TOTAL,144493,18,1,2,2,3,0,16007,0.000%,10.03406,20.59,2.29
diff --git a/docs/development/devtools/csv/20201020-1730-distr-performance-summary.csv b/docs/development/devtools/csv/20201020-1730-distr-performance-summary.csv
new file mode 100755
index 00000000..f4046a57
--- /dev/null
+++ b/docs/development/devtools/csv/20201020-1730-distr-performance-summary.csv
@@ -0,0 +1,10 @@
+Label,# Samples,Average,Min,Max,Std. Dev.,Error %,Throughput,Received KB/sec,Sent KB/sec,Avg. Bytes
+Remove CSAR,198,4,3,13,1.30,0.000%,.01380,0.00,0.00,226.0
+Add CSAR script,198,16,14,23,1.56,0.000%,.01380,0.00,0.00,220.0
+Healthcheck,71655,1,0,205,1.39,0.000%,4.97617,1.10,1.14,227.0
+Statistics,71652,0,0,85,0.88,0.000%,4.97609,1.57,1.14,323.0
+Check Policy Deployed,198,21,16,205,13.29,0.000%,.01380,0.03,0.00,2313.0
+CheckPDPGroupQuery,198,234,27,1034,148.16,0.000%,.01381,0.05,0.00,3396.0
+Undeploy/Delete Policies,197,12340,10402,16007,1205.91,0.000%,.01379,17.94,0.00,1332213.2
+CheckPDPGroupQueryForDeletedPolicy,197,185,109,734,84.16,0.000%,.01380,0.04,0.00,2756.0
+TOTAL,144493,18,0,16007,457.65,0.000%,10.03406,20.59,2.29,2101.3
diff --git a/docs/development/devtools/distribution-s3p.rst b/docs/development/devtools/distribution-s3p.rst
index 093e28c0..1db411c6 100644
--- a/docs/development/devtools/distribution-s3p.rst
+++ b/docs/development/devtools/distribution-s3p.rst
@@ -7,179 +7,215 @@
Policy Distribution component
#############################
-72 Hours Stability Test of Distribution
-+++++++++++++++++++++++++++++++++++++++
+72h Stability and 4h Performance Tests of Distribution
+++++++++++++++++++++++++++++++++++++++++++++++++++++++
-Introduction
-------------
-The 72 hour Stability Test for policy distribution has the goal of introducing a steady flow of transactions initiated from a test client server running JMeter. The policy distribution is configured with a special FileSystemReception plugin to monitor a local directory for newly added csar files to be processed by itself. The input CSAR will be added/removed by the test client(JMeter) and the result will be pulled from the backend(PAP and PolicyAPI) by the test client(JMeter).
+VM Details
+----------
+The stability and performance tests are performed on VMs running in the OpenStack cloud environment in the ONAP integration lab. There are two separate VMs: one runs the backend policy services that policy distribution needs, and the other runs the policy distribution service itself and JMeter.
-The test will be performed in an environment where Jmeter will continuously add/remove a test csar into the special directory where policy distribuion is monitoring and will then get the processed results from PAP and PolicyAPI to verify the successful deployment of the policy. The policy will then be undeployed and the test will loop continuously until 72 hours have elapsed.
+**OpenStack environment details**
-Setup details
--------------
+- Version: Windriver Titanium
-The stability test is performed on VM's running in the OpenStack cloud environment in the ONAP integration lab. There are 2 separate VMs, one for running backend policy services which policy distribution needs, and the other is for policy distribution service itself and Jmeter.
+**Policy Backend VM details (VM1)**
-**OpenStack environment details**
+- OS: Ubuntu 18.04.5 LTS
+- CPU: 8 core, Intel Xeon E3-12xx v2 (Ivy Bridge), 2693.668 MHz, 16384 kB cache
+- RAM: 32 GB
+- HardDisk: 200 GB
+- Docker version 19.03.8, build afacb8b7f0
+- Java: openjdk 11.0.8 2020-07-14
+
+**JMeter and Distribution VM details (VM2)**
-Version: Windriver Titanium
+- OS: Ubuntu 18.04.5 LTS
+- CPU: 8 core, Intel Xeon E3-12xx v2 (Ivy Bridge), 2693.668 MHz, 16384 kB cache
+- RAM: 32 GB
+- HardDisk: 200 GB
+- Docker version 19.03.8, build afacb8b7f0
+- Java: openjdk 11.0.8 2020-07-14
+- JMeter: 5.1.1
-**Policy Backend VM details(VM1)**
-OS:Ubuntu 18.04.4 LTS
+VM1 & VM2: Common Setup
+-----------------------
+Make sure to execute the commands below on both VM1 and VM2.
-CPU: 8 core
+Update the ubuntu software installer
-RAM: 32 GB
+.. code-block:: bash
-HardDisk: 160 GB
+ sudo apt update
-Docker version 19.03.8, build afacb8b7f0
+Install Java
-Java: openjdk version "11.0.7"
+.. code-block:: bash
-**JMeter and Distribution VM details(VM2)**
+ sudo apt install -y openjdk-11-jdk
-OS: Ubuntu 18.04.4 LTS
+Ensure that the Java version that is executing is OpenJDK version 11
-CPU: 8 core
+.. code-block:: bash
-RAM: 32 GB
+ $ java --version
+ openjdk 11.0.8 2020-07-14
+ OpenJDK Runtime Environment (build 11.0.8+10-post-Ubuntu-0ubuntu118.04.1)
+ OpenJDK 64-Bit Server VM (build 11.0.8+10-post-Ubuntu-0ubuntu118.04.1, mixed mode, sharing)
-HardDisk: 160 GB
+Install Docker
-Docker version 19.03.8, build afacb8b7f0
+.. code-block:: bash
-Java: openjdk version "11.0.7"
+ # Add docker repository
+ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
+ sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
+ sudo apt update
-JMeter: 5.1.1
+ # Check available docker versions (if necessary)
+ apt-cache policy docker-ce
-Install Docker in VM1 & VM2
----------------------------
-Make sure to execute below commands in VM1 & VM2 both.
+ # Install docker
+ sudo apt install -y docker-ce=5:19.03.8~3-0~ubuntu-bionic docker-ce-cli=5:19.03.8~3-0~ubuntu-bionic containerd.io
-Update the ubuntu software installer
+Change the permissions of the Docker socket file
.. code-block:: bash
- $ apt-get update
+ sudo chmod 666 /var/run/docker.sock
-Install and check Java
+Check the status of the Docker service and ensure it is running correctly
.. code-block:: bash
- $ apt-get install -y openjdk-11-jdk
- $ java -version
+ $ systemctl status --no-pager docker
+ docker.service - Docker Application Container Engine
+ Loaded: loaded (/lib/systemd/system/docker.service; enabled; vendor preset: enabled)
+ Active: active (running) since Wed 2020-10-14 13:59:40 UTC; 1 weeks 0 days ago
+ # ... (truncated for brevity)
-Ensure that the Java version that is executing is OpenJDK version 11
+ $ docker ps
+ CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
-Install and check Docker
+Clone the policy-distribution repo to access the test scripts
.. code-block:: bash
- $ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add - add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
- $ apt-get update
- $ apt-cache policy docker-ce
- $ apt-get install -y docker-ce
- $ systemctl status docker
- $ docker ps
+ git clone https://gerrit.onap.org/r/policy/distribution
-Change the permissions of the Docker socket file
-.. code-block:: bash
+VM1 Only: Install Simulators, Policy-PAP, Policy-API and MariaDB
+----------------------------------------------------------------
- $ sudo chmod 666 /var/run/docker.sock
+Modify the setup_components.sh script located at:
-Check the status of the Docker service and ensure it is running correctly
+- ~/distribution/testsuites/stability/src/main/resources/simulatorsetup/setup_components.sh
-.. code-block:: bash
+Ensure the correct docker image versions are specified - e.g. for Guilin-RC0:
- $ service docker status
- $ docker ps
+- nexus3.onap.org:10001/onap/policy-api:2.3.2
+- nexus3.onap.org:10001/onap/policy-pap:2.3.2
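+
+A quick, hedged way to confirm which image tags the script will pull before running it is to grep the script referenced above:
+
+.. code-block:: bash
+
+ # Check the policy-api and policy-pap image tags configured in the setup script
+ grep 'nexus3.onap.org:10001/onap/policy-' \
+     ~/distribution/testsuites/stability/src/main/resources/simulatorsetup/setup_components.sh
+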
-Install Simulators, PAP, PolicyAPI and MariaDB in VM1
------------------------------------------------------
+Run the setup_components.sh script to start the test support components:
-To install all the components needed for Distribution, copy over the script and related files found within the simulatorsetup directory within $(REPOPATH)/distribution/testsuites/stability/src/main/resources
+.. code-block:: bash
-Run setup_components.sh script to bring up the required docker containers
+ ~/distribution/testsuites/stability/src/main/resources/simulatorsetup/setup_components.sh
After installation, ensure the following docker containers are up and running:
.. code-block:: bash
- CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
- 11195b01300a nexus3.onap.org:10001/onap/policy-pap:2.2.2-SNAPSHOT "bash ./policy-pap.sh" 13 seconds ago Up 9 seconds 0.0.0.0:7000->6969/tcp policy-pap
- 6266aa6b0137 nexus3.onap.org:10001/onap/policy-api:2.2.3-SNAPSHOT "bash ./policy-api.sh" 25 seconds ago Up 22 seconds 0.0.0.0:6969->6969/tcp policy-api
- 6a85d155aa8a pdp/simulator:latest "bash pdp-sim.sh" About a minute ago Up About a minute pdp-simulator
- 0b41992ccfd7 dmaap/simulator:latest "bash dmaap-sim.sh" About a minute ago Up About a minute 0.0.0.0:3904->3904/tcp message-router
- 595056b2a094 mariadb:10.2.14 "docker-entrypoint.s…" About a minute ago Up About a minute 0.0.0.0:3306->3306/tcp mariadb
+ $ docker ps
+ CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
+ a187cb0ff08a nexus3.onap.org:10001/onap/policy-pap:2.3.2 "bash ./policy-pap.sh" 4 days ago Up 4 days 0.0.0.0:7000->6969/tcp policy-pap
+ 2f7632fe90c3 nexus3.onap.org:10001/onap/policy-api:2.3.2 "bash ./policy-api.sh" 4 days ago Up 4 days 0.0.0.0:6969->6969/tcp policy-api
+ 70fa27d6d992 pdp/simulator:latest "bash pdp-sim.sh" 4 days ago Up 4 days pdp-simulator
+ 3c9ff28ba050 dmaap/simulator:latest "bash dmaap-sim.sh" 4 days ago Up 4 days 0.0.0.0:3904->3904/tcp message-router
+ 60cfcf8cfe65 mariadb:10.2.14 "docker-entrypoint.s…" 4 days ago Up 4 days 0.0.0.0:3306->3306/tcp mariadb
+
-Install Distribution in VM2
----------------------------
+VM2 Only: Install Distribution
+------------------------------
-To install the Distribution service, copy over the script and related files found within the distributionsetup directory within $(REPOPATH)/distribution/testsuites/stability/src/main/resources
+Modify the setup_distribution.sh script located at:
-Run setup_distribution.sh script to install the distribution service, provide the IP of VM1 as the arguments to the script.
+- ~/distribution/testsuites/stability/src/main/resources/distributionsetup/setup_distribution.sh
+
+Ensure the correct docker image version is specified - e.g. for Guilin-RC0:
+
+- nexus3.onap.org:10001/onap/policy-distribution:2.4.2
+
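+A similar hedged check can be run against the distribution setup script before launching it:
+
+.. code-block:: bash
+
+ # Check the policy-distribution image tag configured in the setup script
+ grep 'nexus3.onap.org:10001/onap/policy-distribution' \
+     ~/distribution/testsuites/stability/src/main/resources/distributionsetup/setup_distribution.sh
+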
+Run the setup_distribution.sh script to install the distribution service, providing the IP of VM1 (twice) as the arguments to the script:
-e.g
.. code-block:: bash
- $ ./setup_distribution.sh 10.2.0.24 10.2.0.24
+ ~/distribution/testsuites/stability/src/main/resources/distributionsetup/setup_distribution.sh <vm1-ipaddr> <vm1-ipaddr>
Ensure the distribution container is running.
-Install JMeter in VM2
----------------------
+.. code-block:: bash
-Download and install jMeter
+ $ docker ps
+ CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
+ 9a8db2bad156 nexus3.onap.org:10001/onap/policy-distribution:2.4.2 "bash ./policy-dist.…" 29 hours ago Up 29 hours 0.0.0.0:6969->6969/tcp, 0.0.0.0:9090->9090/tcp policy-distribution
-.. code-block:: bash
- $ mkdir jMeter
- $ cd jMeter
- $ wget https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-5.1.1.zip
- $ unzip apache-jmeter-5.1.1.zip
+VM2 Only: Install JMeter
+------------------------
-Install & configure visualVM in VM2
------------------------------------
-VisualVM needs to be installed in the virtual machine running Distrbution. It will be used to monitor CPU, Memory and GC for Distribution while the stability tests are running.
+Download and install JMeter
.. code-block:: bash
- $ sudo apt-get install visualVM
+ # Install required packages
+ sudo apt install -y wget unzip
-Run these commands to configure permissions
+ # Install JMeter
+ mkdir -p jmeter
+ wget https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-5.1.1.zip
+ unzip -qd jmeter apache-jmeter-5.1.1.zip
+ rm apache-jmeter-5.1.1.zip
+
+
+VM2 Only: Install & configure visualVM
+--------------------------------------
+
+VisualVM needs to be installed in the virtual machine running Distribution (VM2). It will be used to monitor CPU, Memory and GC for Distribution while the stability tests are running.
.. code-block:: bash
- $ cd /usr/lib/jvm/java-11-openjdk-amd64/bin/
- $ sudo touch visualvm.policy
- $ sudo chmod 777 visualvm.policy
+ sudo apt install -y visualvm
- $ vi visualvm.policy
+Run these commands to configure permissions
- Add the following in visualvm.policy
+.. code-block:: bash
- grant codebase "file:/usr/lib/jvm/java-11-openjdk-amd64/lib/tools.jar" {
- permission java.security.AllPermission;
+ # Create Java security policy file for VisualVM
+ sudo cat > /usr/lib/jvm/java-11-openjdk-amd64/bin/visualvm.policy << EOF
+ grant codebase "jrt:/jdk.jstatd" {
+ permission java.security.AllPermission;
+ };
+ grant codebase "jrt:/jdk.internal.jvmstat" {
+ permission java.security.AllPermission;
};
+ EOF
-Run the following commands to start jstatd using port 1111
+ # Set globally accessible permissions on the policy file
+ sudo chmod 777 /usr/lib/jvm/java-11-openjdk-amd64/bin/visualvm.policy
+
+Run the following command to start jstatd using port 1111
.. code-block:: bash
- $ cd /usr/lib/jvm/java-8-openjdk-amd64/bin/
- $ ./jstatd -p 1111 -J-Djava.security.policy=visualvm.policy &
+ /usr/lib/jvm/java-11-openjdk-amd64/bin/jstatd -p 1111 -J-Djava.security.policy=/usr/lib/jvm/java-11-openjdk-amd64/bin/visualvm.policy &
-Using the VM2 Desktop, run visualVM to connect to localhost:9090
-Run the command
+Run visualVM to connect to localhost:9090
.. code-block:: bash
- $ visualvm
+ visualvm &
This will load up the visualVM GUI
@@ -189,16 +225,26 @@ Connect to Distribution JMX Port.
2. Enter the port 9090. This is the JMX port exposed by the distribution container
3. Double click on the newly added nodes under "Local" to start monitoring CPU, Memory & GC.
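+
+As an alternative to the GUI steps above, and assuming the installed VisualVM build supports the ``--openjmx`` option, the same JMX connection can be opened directly from the command line:
+
+.. code-block:: bash
+
+ # Open a JMX connection to the distribution container's exposed JMX port
+ visualvm --openjmx localhost:9090 &
+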
-Example Screenshot
-
-Sample Screenshot of visualVM
+Example Screenshot of visualVM
.. image:: images/distribution-s3p-vvm-sample.png
-Test Plan Setup
----------------
-The 72 hours stability test will run the following steps sequentially in a single threaded loop.
+Stability Test of Policy Distribution
++++++++++++++++++++++++++++++++++++++
+
+Introduction
+------------
+
+The 72 hour Stability Test for policy distribution has the goal of introducing a steady flow of transactions initiated from a test client server running JMeter. The policy distribution is configured with a special FileSystemReception plugin to monitor a local directory for newly added CSAR files to be processed by itself. The input CSAR will be added/removed by the test client (JMeter) and the result will be pulled from the backend (PAP and PolicyAPI) by the test client (JMeter).
+
+The test will be performed in an environment where JMeter will continuously add/remove a test CSAR into the special directory that policy distribution is monitoring, and will then get the processed results from PAP and PolicyAPI to verify the successful deployment of the policy. The policy will then be undeployed and the test will loop continuously until 72 hours have elapsed.
+
+
+Test Plan Sequence
+------------------
+
+The 72h stability test will run the following steps sequentially in a single threaded loop.
- **Delete Old CSAR** - Checks if CSAR already exists in the watched directory, if so it deletes it
- **Add CSAR** - Adds CSAR to the directory that distribution is watching
@@ -230,46 +276,47 @@ Screenshot of Distribution stability test plan
.. image:: images/distribution-s3p-testplan.png
+
Running the Test Plan
---------------------
-Copy the Test Plans folder onto VM2
-Edit the /tmp/ folder permissions to allow the Testplan to insert the CSAR into the /tmp/policydistribution/distributionmount/ folder
+Edit the /tmp folder permissions to allow the testplan to insert the CSAR into the /tmp/policydistribution/distributionmount folder
.. code-block:: bash
- $ sudo chmod a+trwx /tmp
+ sudo mkdir -p /tmp/policydistribution/distributionmount
+ sudo chmod -R a+trwx /tmp
-From the apache jMeter folder run the test, pointing it towards the stabiltiy.jmx file inside the testplans folder
+From the Apache JMeter folder, run the test for 72h, pointing it towards the stability.jmx file inside the testplans folder and specifying a logfile to collect the results
.. code-block:: bash
- $ ./bin/jmeter -n -t /home/rossc/testplans/stability.jmx -Jduration=259200 -l testresults.jtl
+ ~/jmeter/apache-jmeter-5.1.1/bin/jmeter -n -t ~/distribution/testsuites/stability/src/main/resources/testplans/stability.jmx -Jduration=259200 -l ~/20201016-1715-distr-stability.jtl &
+
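+Once the 72h run completes, an HTML dashboard can be generated from the collected ``.jtl`` file using the same report-generation flag shown earlier in this document (the output directory name is an arbitrary example):
+
+.. code-block:: bash
+
+ # Generate an HTML report from the stability run's result log
+ ~/jmeter/apache-jmeter-5.1.1/bin/jmeter -g ~/20201016-1715-distr-stability.jtl -o ~/distr-stability-report/
+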
Test Results
------------
**Summary**
-Stability test plan was triggered for 72 hours.
+- Stability test plan was triggered for 72 hours.
+- No errors were reported
**Test Statistics**
-======================= ================= ================== ==================================
-**Total # of requests** **Success %** **Error %** **Average time taken per request**
-======================= ================= ================== ==================================
-194313 100 % 0 % 145 ms
-======================= ================= ================== ==================================
+.. csv-table:: Stability Results - Summary Report
+ :file: csv/20201016-1715-distr-stability-summary.csv
+ :header-rows: 1
-**VisualVM Screenshot**
+.. csv-table:: Stability Results - Aggregate Report
+ :file: csv/20201016-1715-distr-stability-aggregate.csv
+ :header-rows: 1
-.. image:: images/distribution-vvm-monitor.png
-.. image:: images/distribution-vvm-threads.png
+**VisualVM Screenshots**
-**JMeter Screenshot**
+.. image:: images/20201016-1715-distr-stability-20201018T2040-monitor.png
+.. image:: images/20201016-1715-distr-stability-20201018T2040-threads.png
-.. image:: images/distribution-summary-report.png
-.. image:: images/distribution-results-tree.png
Performance Test of Policy Distribution
+++++++++++++++++++++++++++++++++++++++
@@ -277,56 +324,62 @@ Performance Test of Policy Distribution
Introduction
------------
-Performance test of distribution has the goal of testing the min/avg/max processing time and
-rest call throughput for all the requests when the number of requests are large enough to saturate
-the resource and find the bottleneck.
-It also tests that distribution can handle multiple policy csar's and that these are deployed within 30 seconds consistently.
+The 4h Performance Test of Policy Distribution has the goal of testing the min/avg/max processing time and REST call throughput for all requests when the number of requests is large enough to saturate the resources and find the bottleneck.
+
+It also tests that distribution can handle multiple policy CSARs and that these are deployed within 30 seconds consistently.
+
Setup Details
-------------
The performance test is based on the same setup as the distribution stability tests.
-Test Plan
----------
+
+Test Plan Sequence
+------------------
Performance test plan is different from the stability test plan.
-Instead of handling one policy csar at a time, multiple csar's are deployed within the watched folder at the exact same time.
-We then expect all policies from these csar's to be deployed within 30 seconds.
-Alongside these, there are multithreaded tests running towards the healtchcheck and statistics endpoints of the distribution service.
-Run Test
---------
+- Instead of handling one policy CSAR at a time, multiple CSARs are deployed into the watched folder at exactly the same time.
+- We expect all policies from these CSARs to be deployed within 30 seconds.
+- There are also multithreaded tests running against the healthcheck and statistics endpoints of the distribution service.
-Copy the performance test plans folder onto VM2.
-Edit the /tmp/ folder permissions to allow the Testplan to insert the CSAR into the /tmp/policydistribution/distributionmount/ folder.
+
+Running the Test Plan
+---------------------
+
+Edit the /tmp folder permissions to allow the Testplan to insert the CSAR into the /tmp/policydistribution/distributionmount folder.
.. code-block:: bash
- $ sudo chmod a+trwx /tmp
+ sudo mkdir -p /tmp/policydistribution/distributionmount
+ sudo chmod -R a+trwx /tmp
-From the apache jMeter folder run the test, pointing it towards the stabiltiy.jmx file inside the testplans folder
+From the Apache JMeter folder, run the test for 4h, pointing it towards the performance.jmx file inside the testplans folder and specifying a logfile to collect the results
.. code-block:: bash
- $ ./bin/jmeter -n -t /home/rossc/testplans/performance.jmx -Jduration=259200 -l testresults.jtl
+ ~/jmeter/apache-jmeter-5.1.1/bin/jmeter -n -t ~/distribution/testsuites/performance/src/main/resources/testplans/performance.jmx -Jduration=14400 -l ~/20201020-1730-distr-performance.jtl &
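+
+The performance results can be post-processed the same way (the output directory name is an arbitrary example):
+
+.. code-block:: bash
+
+ # Generate an HTML report from the performance run's result log
+ ~/jmeter/apache-jmeter-5.1.1/bin/jmeter -g ~/20201020-1730-distr-performance.jtl -o ~/distr-performance-report/
+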
Test Results
------------
**Summary**
-Performance test plan was triggered for 4 hours.
+- Performance test plan was triggered for 4 hours.
+- No errors were reported
**Test Statistics**
-======================= ================= ================== ==================================
-**Total # of requests** **Success %** **Error %** **Average time taken per request**
-======================= ================= ================== ==================================
-239819 100 % 0 % 100 ms
-======================= ================= ================== ==================================
+.. csv-table:: Performance Results - Summary Report
+ :file: csv/20201020-1730-distr-performance-summary.csv
+ :header-rows: 1
+
+.. csv-table:: Performance Results - Aggregate Report
+ :file: csv/20201020-1730-distr-performance-aggregate.csv
+ :header-rows: 1
-**JMeter Screenshot**
+**VisualVM Screenshots**
-.. image:: images/distribution-performance-summary-report.png
-.. image:: images/distribution-performance-api-report.png
+.. image:: images/20201020-1730-distr-performance-20201020T2025-monitor.png
+.. image:: images/20201020-1730-distr-performance-20201020T2025-threads.png
diff --git a/docs/development/devtools/drools-s3p.rst b/docs/development/devtools/drools-s3p.rst
index 58c522f6..18bd4898 100644
--- a/docs/development/devtools/drools-s3p.rst
+++ b/docs/development/devtools/drools-s3p.rst
@@ -10,7 +10,7 @@
Policy Drools PDP component
~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Both the Performance and the Stability tests were executed against a default ONAP installation in the PFPP tenant, from an independent VM running the jmeter tool to inject the load.
+Both the Performance and the Stability tests were executed against a default ONAP installation in the policy-k8s tenant in the Windriver lab, from an independent VM running the JMeter tool to inject the load.
General Setup
*************
@@ -51,82 +51,6 @@ PDP-D Setup
The kubernetes charts were modified prior to the installation with
the changes below.
-The oom/kubernetes/policy/charts/drools/resources/configmaps/base.conf was
-modified as follows:
-
-.. code-block:: bash
-
- --- a/kubernetes/policy/charts/drools/resources/configmaps/base.conf
- +++ b/kubernetes/policy/charts/drools/resources/configmaps/base.conf
- @@ -85,27 +85,27 @@ DMAAP_SERVERS=message-router
-
- # AAI
-
- -AAI_HOST=aai.{{.Release.Namespace}}
- -AAI_PORT=8443
- +AAI_HOST=localhost
- +AAI_PORT=6666
- AAI_CONTEXT_URI=
-
- # MSO
-
- -SO_HOST=so.{{.Release.Namespace}}
- -SO_PORT=8080
- -SO_CONTEXT_URI=onap/so/infra/
- -SO_URL=https://so.{{.Release.Namespace}}:8080/onap/so/infra
- +SO_HOST=localhost
- +SO_PORT=6667
- +SO_CONTEXT_URI=
- +SO_URL=https://localhost:6667/
-
- # VFC
-
- -VFC_HOST=
- -VFC_PORT=
- +VFC_HOST=localhost
- +VFC_PORT=6668
- VFC_CONTEXT_URI=api/nslcm/v1/
-
- # SDNC
-
- -SDNC_HOST=sdnc.{{.Release.Namespace}}
- -SDNC_PORT=8282
- +SDNC_HOST=localhost
- +SDNC_PORT=6670
- SDNC_CONTEXT_URI=restconf/operations/
-
-The AAI actor had to be modified to disable https to talk to the AAI simulator.
-
-.. code-block:: bash
-
- ~/oom/kubernetes/policy/charts/drools/resources/configmaps/AAI-http-client.properties
-
- http.client.services=AAI
-
- http.client.services.AAI.managed=true
- http.client.services.AAI.https=false
- http.client.services.AAI.host=${envd:AAI_HOST}
- http.client.services.AAI.port=${envd:AAI_PORT}
- http.client.services.AAI.userName=${envd:AAI_USERNAME}
- http.client.services.AAI.password=${envd:AAI_PASSWORD}
- http.client.services.AAI.contextUriPath=${envd:AAI_CONTEXT_URI}
-
-The SO actor had to be modified similarly.
-
-.. code-block:: bash
-
- oom/kubernetes/policy/charts/drools/resources/configmaps/SO-http-client.properties:
-
- http.client.services=SO
-
- http.client.services.SO.managed=true
- http.client.services.SO.https=false
- http.client.services.SO.host=${envd:SO_HOST}
- http.client.services.SO.port=${envd:SO_PORT}
- http.client.services.SO.userName=${envd:SO_USERNAME}
- http.client.services.SO.password=${envd:SO_PASSWORD}
- http.client.services.SO.contextUriPath=${envd:SO_CONTEXT_URI}
-
The feature-controlloop-utils was started by adding the following script:
.. code-block:: bash
@@ -136,7 +60,6 @@ The feature-controlloop-utils was started by adding the following script:
#!/bin/bash
bash -c "features enable controlloop-utils"
-
Stability Test of Policy PDP-D
******************************
@@ -145,40 +68,17 @@ The 72 hour stability test happened in parallel with the stability run of the AP
Worker Node performance
=======================
-The VM named onap-k8s-07 was monitored for the duration of the two parallel
-stability runs. The table below show the usage ranges:
+The VM named onap-k8s-09 was monitored for the duration of the 72 hour
+stability run. The table below shows the usage ranges:
.. code-block:: bash
- NAME CPU(cores) CPU% MEMORY(bytes) MEMORY%
- onap-k8s-07 <=1374m <=20% <=10643Mi <=66%
+ NAME CPU(cores) CPU%
+ onap-k8s-09 <=1214m <=20%
PDP-D performance
=================
-The PDP-D uses a small configuration:
-
-.. code-block:: bash
-
- small:
- limits:
- cpu: 1
- memory: 4Gi
- requests:
- cpu: 100m
- memory: 1Gi
-
-In practicality, this corresponded to an allocated 3.75G heap for the JVM based.
-
-The PDP-D was monitored during the run and stayed below the following ranges:
-
-.. code-block:: bash
-
- NAME CPU(cores) MEMORY(bytes)
- dev-drools-0 <=142m 684Mi
-
-Garbage collection was monitored without detecting any significant degradation.
-
The test set focused on the following use cases:
- vCPE
@@ -204,7 +104,7 @@ The command executed was
.. code-block:: bash
- jmeter -n -t /home/ubuntu/jhh/s3p.jmx > /dev/null 2>&1
+ ./jmeter -n -t /home/ubuntu/drools-applications/testsuites/stability/src/main/resources/frankfurt/s3p.jmx -l /home/ubuntu/jmeter_result/jmeter.jtl -e -o /home/ubuntu/jmeter_result > /dev/null 2>&1
The results were computed by taking the elapsed time from the audit.log
(this log reports all end-to-end transactions, marking the start, end, and
@@ -222,13 +122,7 @@ ControlLoop-vCPE-48f0c2c3-a172-4192-9ae3-052274181b6e:
.. code-block:: bash
- count 155246.000000
- mean 269.894226
- std 64.556282
- min 133.000000
- 50% 276.000000
- max 1125.000000
-
+ Max: 4323 ms, Min: 143 ms, Average: 380 ms [samples taken for average: 260628]
.. image:: images/ControlLoop-vCPE-48f0c2c3-a172-4192-9ae3-052274181b6e.png
@@ -240,14 +134,7 @@ ControlLoop-vCPE-Fail:
.. code-block:: bash
- ControlLoop-vCPE-Fail :
- count 149621.000000
- mean 280.483522
- std 67.226550
- min 134.000000
- 50% 279.000000
- max 5394.000000
-
+ Max: 3723 ms, Min: 148 ms, Average: 671 ms [samples taken for average: 87888]
.. image:: images/ControlLoop-vCPE-Fail.png
@@ -258,13 +145,7 @@ ControlLoop-vDNS-6f37f56d-a87d-4b85-b6a9-cc953cf779b3:
.. code-block:: bash
- count 293000.000000
- mean 21.961792
- std 7.921396
- min 15.000000
- 50% 20.000000
- max 672.000000
-
+ Max: 6437 ms, Min: 19 ms, Average: 165 ms [samples taken for average: 59259]
.. image:: images/ControlLoop-vDNS-6f37f56d-a87d-4b85-b6a9-cc953cf779b3.png
@@ -275,13 +156,7 @@ ControlLoop-vDNS-Fail:
.. code-block:: bash
- count 59357.000000
- mean 3010.261267
- std 76.599948
- min 0.000000
- 50% 3010.000000
- max 3602.000000
-
+ Max: 1176 ms, Min: 4 ms, Average: 5 ms [samples taken for average: 340810]
.. image:: images/ControlLoop-vDNS-Fail.png
@@ -292,16 +167,6 @@ ControlLoop-vFirewall-d0a1dfc6-94f5-4fd4-a5b5-4630b438850a:
.. code-block:: bash
- count 175401.000000
- mean 184.581251
- std 35.619075
- min 136.000000
- 50% 181.000000
- max 3972.000000
-
+ Max: 4016 ms, Min: 177 ms, Average: 644 ms [samples taken for average: 36460]
.. image:: images/ControlLoop-vFirewall-d0a1dfc6-94f5-4fd4-a5b5-4630b438850a.png
-
-
-
-
diff --git a/docs/development/devtools/images/20201016-1715-distr-stability-20201018T2040-monitor.png b/docs/development/devtools/images/20201016-1715-distr-stability-20201018T2040-monitor.png
new file mode 100755
index 00000000..abdba921
--- /dev/null
+++ b/docs/development/devtools/images/20201016-1715-distr-stability-20201018T2040-monitor.png
Binary files differ
diff --git a/docs/development/devtools/images/20201016-1715-distr-stability-20201018T2040-threads.png b/docs/development/devtools/images/20201016-1715-distr-stability-20201018T2040-threads.png
new file mode 100755
index 00000000..2a9745ae
--- /dev/null
+++ b/docs/development/devtools/images/20201016-1715-distr-stability-20201018T2040-threads.png
Binary files differ
diff --git a/docs/development/devtools/images/20201020-1730-distr-performance-20201020T2025-monitor.png b/docs/development/devtools/images/20201020-1730-distr-performance-20201020T2025-monitor.png
new file mode 100755
index 00000000..8ef443b1
--- /dev/null
+++ b/docs/development/devtools/images/20201020-1730-distr-performance-20201020T2025-monitor.png
Binary files differ
diff --git a/docs/development/devtools/images/20201020-1730-distr-performance-20201020T2025-threads.png b/docs/development/devtools/images/20201020-1730-distr-performance-20201020T2025-threads.png
new file mode 100755
index 00000000..8f4731c4
--- /dev/null
+++ b/docs/development/devtools/images/20201020-1730-distr-performance-20201020T2025-threads.png
Binary files differ
diff --git a/docs/development/devtools/images/ControlLoop-vCPE-48f0c2c3-a172-4192-9ae3-052274181b6e.png b/docs/development/devtools/images/ControlLoop-vCPE-48f0c2c3-a172-4192-9ae3-052274181b6e.png
index 5502fe90..5708502f 100644
--- a/docs/development/devtools/images/ControlLoop-vCPE-48f0c2c3-a172-4192-9ae3-052274181b6e.png
+++ b/docs/development/devtools/images/ControlLoop-vCPE-48f0c2c3-a172-4192-9ae3-052274181b6e.png
Binary files differ
diff --git a/docs/development/devtools/images/ControlLoop-vCPE-Fail.png b/docs/development/devtools/images/ControlLoop-vCPE-Fail.png
index 27601d9c..8c87ddfe 100644
--- a/docs/development/devtools/images/ControlLoop-vCPE-Fail.png
+++ b/docs/development/devtools/images/ControlLoop-vCPE-Fail.png
Binary files differ
diff --git a/docs/development/devtools/images/ControlLoop-vDNS-6f37f56d-a87d-4b85-b6a9-cc953cf779b3.png b/docs/development/devtools/images/ControlLoop-vDNS-6f37f56d-a87d-4b85-b6a9-cc953cf779b3.png
index d4b9e050..763efe76 100644
--- a/docs/development/devtools/images/ControlLoop-vDNS-6f37f56d-a87d-4b85-b6a9-cc953cf779b3.png
+++ b/docs/development/devtools/images/ControlLoop-vDNS-6f37f56d-a87d-4b85-b6a9-cc953cf779b3.png
Binary files differ
diff --git a/docs/development/devtools/images/ControlLoop-vDNS-Fail.png b/docs/development/devtools/images/ControlLoop-vDNS-Fail.png
index 643afea8..bd7302c5 100644
--- a/docs/development/devtools/images/ControlLoop-vDNS-Fail.png
+++ b/docs/development/devtools/images/ControlLoop-vDNS-Fail.png
Binary files differ
diff --git a/docs/development/devtools/images/ControlLoop-vFirewall-d0a1dfc6-94f5-4fd4-a5b5-4630b438850a.png b/docs/development/devtools/images/ControlLoop-vFirewall-d0a1dfc6-94f5-4fd4-a5b5-4630b438850a.png
index 23a543d2..5ba85fb4 100644
--- a/docs/development/devtools/images/ControlLoop-vFirewall-d0a1dfc6-94f5-4fd4-a5b5-4630b438850a.png
+++ b/docs/development/devtools/images/ControlLoop-vFirewall-d0a1dfc6-94f5-4fd4-a5b5-4630b438850a.png
Binary files differ
diff --git a/docs/development/devtools/images/apex_perf_jm_1.PNG b/docs/development/devtools/images/apex_perf_jm_1.PNG
new file mode 100644
index 00000000..14b1a075
--- /dev/null
+++ b/docs/development/devtools/images/apex_perf_jm_1.PNG
Binary files differ
diff --git a/docs/development/devtools/images/apex_perf_jm_2.PNG b/docs/development/devtools/images/apex_perf_jm_2.PNG
new file mode 100644
index 00000000..7a2eebba
--- /dev/null
+++ b/docs/development/devtools/images/apex_perf_jm_2.PNG
Binary files differ
diff --git a/docs/development/devtools/images/apex_s3p_jm-1.png b/docs/development/devtools/images/apex_s3p_jm-1.png
new file mode 100644
index 00000000..b52a46c1
--- /dev/null
+++ b/docs/development/devtools/images/apex_s3p_jm-1.png
Binary files differ
diff --git a/docs/development/devtools/images/apex_s3p_jm-2.png b/docs/development/devtools/images/apex_s3p_jm-2.png
new file mode 100644
index 00000000..66002f84
--- /dev/null
+++ b/docs/development/devtools/images/apex_s3p_jm-2.png
Binary files differ
diff --git a/docs/development/devtools/images/distribution-performance-api-report.png b/docs/development/devtools/images/distribution-performance-api-report.png
deleted file mode 100644
index 12102718..00000000
--- a/docs/development/devtools/images/distribution-performance-api-report.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/distribution-performance-summary-report.png b/docs/development/devtools/images/distribution-performance-summary-report.png
deleted file mode 100644
index 3cea8e99..00000000
--- a/docs/development/devtools/images/distribution-performance-summary-report.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/distribution-results-tree.png b/docs/development/devtools/images/distribution-results-tree.png
deleted file mode 100644
index 324b6c16..00000000
--- a/docs/development/devtools/images/distribution-results-tree.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/distribution-summary-report.png b/docs/development/devtools/images/distribution-summary-report.png
deleted file mode 100644
index 56e75951..00000000
--- a/docs/development/devtools/images/distribution-summary-report.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/distribution-vvm-monitor.png b/docs/development/devtools/images/distribution-vvm-monitor.png
deleted file mode 100644
index 1afaa7d2..00000000
--- a/docs/development/devtools/images/distribution-vvm-monitor.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/distribution-vvm-threads.png b/docs/development/devtools/images/distribution-vvm-threads.png
deleted file mode 100644
index d611d701..00000000
--- a/docs/development/devtools/images/distribution-vvm-threads.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/zip/frankfurt/apex_s3p_result.tar.gz b/docs/development/devtools/zip/frankfurt/apex_s3p_result.tar.gz
index 4e4d6e82..998e1449 100644
--- a/docs/development/devtools/zip/frankfurt/apex_s3p_result.tar.gz
+++ b/docs/development/devtools/zip/frankfurt/apex_s3p_result.tar.gz
Binary files differ