author    jhh <jorge.hernandez-herrero@att.com>  2020-10-23 09:15:35 -0500
committer jhh <jorge.hernandez-herrero@att.com>  2020-10-23 10:02:05 -0500
commit    a4ff7b8a6846734dafa9c55f37b081ee7e5efd24 (patch)
tree      0705d289e2192428fa9e90fae65930bb0f272d84 /docs/development
parent    43ec4a490be81466978a055ef2da1ce96e2cb99d (diff)
document guilin api stability run results
Issue-ID: POLICY-2829
Signed-off-by: jhh <jorge.hernandez-herrero@att.com>
Change-Id: I87603f13e3b5a68157c18f30c32d33fb62b1a9b1
Signed-off-by: jhh <jorge.hernandez-herrero@att.com>
Diffstat (limited to 'docs/development')
-rw-r--r--  docs/development/devtools/api-s3p.rst | 349
-rw-r--r--  docs/development/devtools/images/api-response-time-distribution.png | bin 0 -> 100422 bytes
-rw-r--r--  docs/development/devtools/images/api-response-time-overtime.png | bin 0 -> 468692 bytes
-rw-r--r--  docs/development/devtools/images/result-1.png | bin 521878 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/result-2.png | bin 519467 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/result-3.png | bin 522203 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/result-4.png | bin 514222 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/result-5.png | bin 526566 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/result-6.png | bin 534061 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/results-1.png | bin 937280 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/results-2.png | bin 930783 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/results-3.png | bin 979334 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/results-4.png | bin 979863 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/results-5.png | bin 467675 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/results-6.png | bin 599097 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/summary-1.png | bin 492140 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/summary-2.png | bin 486975 -> 0 bytes
-rw-r--r--  docs/development/devtools/images/summary-3.png | bin 471819 -> 0 bytes
18 files changed, 28 insertions, 321 deletions
diff --git a/docs/development/devtools/api-s3p.rst b/docs/development/devtools/api-s3p.rst
index 982571ba..96656230 100644
--- a/docs/development/devtools/api-s3p.rst
+++ b/docs/development/devtools/api-s3p.rst
@@ -17,252 +17,24 @@ Policy API S3P Tests
Introduction
------------
-The 72 hour stability test of policy API has the goal of verifying the stability of running policy design API REST service by
-ingesting a steady flow of transactions of policy design API calls in a multi-thread fashion to simulate multiple clients' behaviors.
-All the transaction flows are initiated from a test client server running JMeter for the duration of 72+ hours.
+The 72-hour stability test of the policy API has the goal of verifying the stability of the running policy design API REST
+service by ingesting a steady flow of transactions in a multi-threaded fashion to
+simulate the behavior of multiple clients.
+All transaction flows are initiated from a test client running JMeter for the full 72 hours.
Setup Details
-------------
-The stability test is performed on VMs running in the Intel Wind River Lab environment.
-There are 2 separate VMs: one runs the API, while the other runs JMeter and other necessary components, e.g. MariaDB, to simulate a steady flow of transactions.
-For simplicity, let's assume:
-
-VM1 will be running JMeter, MariaDB.
-VM2 will be running API REST service and visualVM.
-
-**Lab Environment**
-
-Intel ONAP Integration and Deployment Labs
-`Physical Labs <https://wiki.onap.org/display/DW/Physical+Labs>`_,
-`Wind River <https://www.windriver.com/>`_
-
-**API VM Details (VM2)**
-
-OS: Ubuntu 18.04 LTS
-
-CPU: 4 core
-
-RAM: 8 GB
-
-HardDisk: 91 GB
-
-Docker Version: 18.09.8
-
-Java: OpenJDK 1.8.0_212
-
-**JMeter VM Details (VM1)**
-
-OS: Ubuntu 18.04 LTS
-
-CPU: 4 core
-
-RAM: 8GB
-
-HardDisk: 91GB
-
-Docker Version: 18.09.8
-
-Java: OpenJDK 1.8.0_212
-
-JMeter: 5.1.1
-
-**Software Installation & Configuration**
-
-**VM1 & VM2 in lab**
-
-**Install Java & Docker**
-
-Add the /etc/hosts entry
-
-.. code-block:: bash
-
- $ echo $(hostname -I | cut -d\ -f1) $(hostname) | sudo tee -a /etc/hosts
-
-Update the Ubuntu software installer
-
-.. code-block:: bash
-
- $ sudo apt-get update
-
-Check and install Java
-
-.. code-block:: bash
-
- $ sudo apt-get install -y openjdk-8-jdk
- $ java -version
-
-Ensure that the Java version executing is OpenJDK version 8
-
-Check and install docker
-
-.. code-block:: bash
-
- $ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
- $ sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
- $ sudo apt-get update
- $ sudo apt-cache policy docker-ce
- $ sudo apt-get install -y unzip docker-ce
- $ systemctl status docker
- $ docker ps
-
-Change the permissions of the Docker socket file
-
-.. code-block:: bash
-
- $ sudo chmod 777 /var/run/docker.sock
-
-Or add the current user to the docker group
-
-.. code-block:: bash
-
- $ sudo usermod -aG docker $USER
-
-Check the status of the Docker service and ensure it is running correctly
-
-.. code-block:: bash
-
- $ service docker status
- $ docker ps
-
-**VM1 in lab**
-
-**Install JMeter**
-
-Download & install JMeter
-
-.. code-block:: bash
-
- $ mkdir jMeter
- $ cd jMeter
- $ wget http://mirrors.whoishostingthis.com/apache//jmeter/binaries/apache-jmeter-5.2.1.zip
- $ unzip apache-jmeter-5.2.1.zip
-
-**Install other necessary components**
-
-Pull api code & run setup components script
-
-.. code-block:: bash
-
- $ cd ~
- $ git clone https://git.onap.org/policy/api
- $ cd api/testsuites/stability/src/main/resources/simulatorsetup
- $ . ./setup_components.sh
-
-After installation, make sure the following mariadb container is up and running
-
-.. code-block:: bash
-
- ubuntu@test:~/api/testsuites/stability/src/main/resources/simulatorsetup$ docker ps
- CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
- 3849ce44b86d mariadb:10.2.14 "docker-entrypoint.s…" 11 days ago Up 11 days 0.0.0.0:3306->3306/tcp mariadb
-
-**VM2 in lab**
-
-**Install policy-api**
-
-Pull api code & run setup api script
-
-.. code-block:: bash
-
- $ cd ~
- $ git clone https://git.onap.org/policy/api
- $ cd api/testsuites/stability/src/main/resources/apisetup
- $ . ./setup_api.sh <host ip running api> <host ip running mariadb>
-
-After installation, make sure the following api container is up and running
-
-.. code-block:: bash
-
- ubuntu@tools-2:~/api/testsuites/stability/src/main/resources/apisetup$ docker ps
- CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
- 4f08f9972e55 nexus3.onap.org:10001/onap/policy-api:2.1.1-SNAPSHOT "bash ./policy-api.sh" 11 days ago Up 11 days 0.0.0.0:6969->6969/tcp, 0.0.0.0:9090->9090/tcp policy-api
-
-**Install & configure visualVM**
-
-VisualVM needs to be installed on the VM where the API is up and running (VM2). It will be used to monitor CPU, memory and GC for the API while the stability test is running.
-
-Install visualVM
-
-.. code-block:: bash
-
- $ sudo apt-get install visualvm
-
-Run a few commands to configure permissions
-
-.. code-block:: bash
-
- $ cd /usr/lib/jvm/java-8-openjdk-amd64/bin/
- $ sudo touch visualvm.policy
- $ sudo chmod 777 visualvm.policy
-
- $ vi visualvm.policy
-
- Add the following in visualvm.policy
-
-
- grant codebase "file:/usr/lib/jvm/java-8-openjdk-amd64/lib/tools.jar" {
- permission java.security.AllPermission;
- };
-
-Run following commands to start jstatd using port 1111
-
-.. code-block:: bash
-
- $ cd /usr/lib/jvm/java-8-openjdk-amd64/bin/
- $ ./jstatd -p 1111 -J-Djava.security.policy=visualvm.policy &
-
-**Local Machine**
-
-**Run & configure visualVM**
-
-Run visualVM by typing
-
-.. code-block:: bash
-
- $ jvisualvm
-
-Connect to jstatd & remote policy-api JVM
-
- 1. Right click on "Remote" in the left panel of the screen and select "Add Remote Host..."
- 2. Enter the IP address of VM2 (running policy-api)
- 3. Right click on IP address, select "Add JMX Connection..."
- 4. Enter the VM2 IP address (from step 2) as <IP address>:9090 (for example, 10.12.6.151:9090) and click OK.
- 5. Double click on the newly added nodes under "Remote" to start monitoring CPU, Memory & GC.
-
-Sample Screenshot of visualVM
-
-.. image:: images/results-5.png
-
-Run Test
---------
-
-**Local Machine**
-
-Connect to lab VPN
-
-.. code-block:: bash
-
- $ sudo openvpn --config <path to lab ovpn key file>
-
-SSH into JMeter VM (VM1)
-
-.. code-block:: bash
-
- $ ssh -i <path to lab ssh key file> ubuntu@<host ip of JMeter VM>
-
-Run JMeter test in background for 72+ hours
+The stability test was performed on a default ONAP OOM installation in the Intel Wind River Lab environment.
+JMeter was installed on a separate VM to inject the traffic defined in the
+`API stability script
+<https://git.onap.org/policy/api/tree/testsuites/stability/src/main/resources/testplans/policy_api_stability.jmx>`_
+with the following command:
.. code-block:: bash
- $ mkdir s3p
- $ nohup ./jMeter/apache-jmeter-5.2.1/bin/jmeter.sh -n -t ~/api/testsuites/stability/src/main/resources/testplans/policy_api_stability.jmx &
+ jmeter.sh --nongui --testfile policy_api_stability.jmx --logfile result.jtl
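+
+If an HTML dashboard of the run is also wanted, JMeter 5.x can build one from the collected samples. The sketch
+below assumes the ``result.jtl`` produced by the command above and an empty output directory named ``report``
+(the directory name is arbitrary):
+
+.. code-block:: bash
+
+   # Generate the JMeter HTML report dashboard from an existing results file:
+   #   -g  path to the results (.jtl) file
+   #   -o  empty folder to write the dashboard into
+   jmeter.sh -g result.jtl -o report
+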
-(Optional) Monitor the JMeter test running in the background (any time after logging back into the JMeter VM, VM1)
-
-.. code-block:: bash
-
- $ tail -f s3p/stability.log nohup.out
Test Plan
---------
@@ -333,108 +105,43 @@ of each entity is set to the running thread number.
- Get Preloaded Policy Types
-Test Results El-Alto
---------------------
+Test Results
+------------
**Summary**
-The Policy API stability test plan was triggered and ran for 72+ hours without any errors occurring.
+No errors were found during the 72 hours of the Policy API stability run.
+The load was run against a default (untuned) ONAP OOM installation.
**Test Statistics**
======================= ============= =========== =============================== =============================== ===============================
-**Total # of requests** **Success %** **Error %** **Avg. time taken per request** **Min. time taken per request** **Max. time taken per request**
+**Total # of requests** **Success %** **TPS** **Avg. time taken per request** **Min. time taken per request** **Max. time taken per request**
======================= ============= =========== =============================== =============================== ===============================
- 49723 100% 0% 86 ms 4 ms 795 ms
+ 176407 100% 0.68 7340 ms 34 ms 49298 ms
======================= ============= =========== =============================== =============================== ===============================
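+
+As a quick cross-check, the TPS column above is consistent with the total request count spread over the test
+window (this assumes the run lasted close to exactly 72 hours; a longer tail would lower the figure slightly):
+
+.. code-block:: bash
+
+   # Total requests divided by the 72-hour window expressed in seconds.
+   awk 'BEGIN { printf "%.2f TPS\n", 176407 / (72 * 3600) }'
+   # prints: 0.68 TPS
+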
-**VisualVM Results**
-
-.. image:: images/results-5.png
-.. image:: images/results-6.png
**JMeter Results**
-.. image:: images/results-1.png
-.. image:: images/results-2.png
-.. image:: images/results-3.png
-.. image:: images/results-4.png
-
-
-Test Results Frankfurt
-----------------------
-
-PFPP ONAP Windriver lab
-
-**Summary**
-
-The Policy API stability test plan was triggered and ran for 72+ hours without
-any real errors occurring. The single failure occurred on teardown and was due to
-simultaneous test plans running concurrently on the lab system.
-
-Compared to El-Alto, 10x the number of API calls were made in the 72 hour run.
-However, the latency increased (most likely due to the synchronization added
-from
-`POLICY-2533 <https://jira.onap.org/browse/POLICY-2533>`_).
-This will be addressed in the next release.
-
-**Test Statistics**
-
-======================= ============= =========== =============================== =============================== ===============================
-**Total # of requests** **Success %** **Error %** **Avg. time taken per request** **Min. time taken per request** **Max. time taken per request**
-======================= ============= =========== =============================== =============================== ===============================
- 514953 100% 0% 2510 ms 336 ms 15034 ms
-======================= ============= =========== =============================== =============================== ===============================
-
-**VisualVM Results**
-
-VisualVM results were not captured as this was run in the PFPP ONAP Windriver
-lab.
+The following graphs show the response time distribution and the response times over the course of the run. The "Get Policy Types"
+API calls are the most expensive, averaging a response time of more than 10 seconds.
-**JMeter Results**
-
-.. image:: images/api-s3p-jm-1_F.png
+.. image:: images/api-response-time-distribution.png
+.. image:: images/api-response-time-overtime.png
Performance Test of Policy API
++++++++++++++++++++++++++++++
-Introduction
-------------
-
-The performance test of policy-api has the goal of measuring the min/avg/max processing time and REST call throughput when the number of requests is large enough to saturate the resources and find the bottleneck.
-
-Setup Details
--------------
+A specific performance test was omitted in Guilin. The JMeter script used in the stability run injected
+back-to-back traffic with 5 parallel threads and no pauses between requests. Since the JMeter threads operate
+in synchronous mode (each waits for a request's response before sending the next request), the JMeter injection rate
+self-regulates because of the backpressure imposed by the response times. Even though the response times are high, the
+"Response over Time" graph above indicates that they remain largely constant throughout the duration of the test.
+This, together with the absence of noticeable spikes in the Kubernetes node CPU utilization, suggests that the API
+component is not strained. A more enlightening set of tests would plot JMeter thread count (increasing load)
+against response times. These tests were not performed in this release.
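+
+For reference, the self-regulated injection rate described above can be derived from the thread count and the
+average response time: with 5 synchronous threads each taking roughly 7.34 seconds per request on average, the
+sustainable rate is approximately threads divided by average response time, matching the 0.68 TPS reported
+earlier (a rough approximation that ignores variance):
+
+.. code-block:: bash
+
+   # Approximate throughput of N synchronous JMeter threads: N / average response time in seconds.
+   awk 'BEGIN { printf "%.2f TPS\n", 5 / 7.34 }'
+   # prints: 0.68 TPS
+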
-The performance test is performed on OOM-based deployment of ONAP Policy framework components in Intel Wind River Lab environment.
-In addition, we use another VM with JMeter installed to generate the transactions.
-The JMeter VM will send a large number of REST requests to the policy-api component and collect the statistics.
-Policy-api component already knows how to communicate with MariaDB component if OOM-based deployment is working correctly.
-
-Test Plan
----------
-
-The performance test plan is the same as the stability test plan above.
-The only differences are that, in the performance test, the number of threads is increased to 20 (simulating 20 users' behavior at the same time) while the test time is reduced to 1 hour.
-
-Run Test
---------
-
-Running/triggering the performance test is the same as for the stability test: launch JMeter pointing to the corresponding *.jmx* test plan. The *API_HOST* and *API_PORT* are already set up in the *.jmx*.
-
-Test Results
-------------
-Test results are shown below. Overall, the test ran smoothly and successfully. We do see some minor failed transactions, especially in POST calls that write into the DB simultaneously in a multi-threaded fashion. All GET calls (reading from the DB) succeeded.
-
-.. image:: images/summary-1.png
-.. image:: images/summary-2.png
-.. image:: images/summary-3.png
-.. image:: images/result-1.png
-.. image:: images/result-2.png
-.. image:: images/result-3.png
-.. image:: images/result-4.png
-.. image:: images/result-5.png
-.. image:: images/result-6.png
diff --git a/docs/development/devtools/images/api-response-time-distribution.png b/docs/development/devtools/images/api-response-time-distribution.png
new file mode 100644
index 00000000..e57ff627
--- /dev/null
+++ b/docs/development/devtools/images/api-response-time-distribution.png
Binary files differ
diff --git a/docs/development/devtools/images/api-response-time-overtime.png b/docs/development/devtools/images/api-response-time-overtime.png
new file mode 100644
index 00000000..c80a6a64
--- /dev/null
+++ b/docs/development/devtools/images/api-response-time-overtime.png
Binary files differ
diff --git a/docs/development/devtools/images/result-1.png b/docs/development/devtools/images/result-1.png
deleted file mode 100644
index 4715cd7a..00000000
--- a/docs/development/devtools/images/result-1.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/result-2.png b/docs/development/devtools/images/result-2.png
deleted file mode 100644
index cd01147d..00000000
--- a/docs/development/devtools/images/result-2.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/result-3.png b/docs/development/devtools/images/result-3.png
deleted file mode 100644
index 01e27a30..00000000
--- a/docs/development/devtools/images/result-3.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/result-4.png b/docs/development/devtools/images/result-4.png
deleted file mode 100644
index 3fc2f36b..00000000
--- a/docs/development/devtools/images/result-4.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/result-5.png b/docs/development/devtools/images/result-5.png
deleted file mode 100644
index 9b7140c6..00000000
--- a/docs/development/devtools/images/result-5.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/result-6.png b/docs/development/devtools/images/result-6.png
deleted file mode 100644
index f07ea59e..00000000
--- a/docs/development/devtools/images/result-6.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/results-1.png b/docs/development/devtools/images/results-1.png
deleted file mode 100644
index 35e1a965..00000000
--- a/docs/development/devtools/images/results-1.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/results-2.png b/docs/development/devtools/images/results-2.png
deleted file mode 100644
index 82092025..00000000
--- a/docs/development/devtools/images/results-2.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/results-3.png b/docs/development/devtools/images/results-3.png
deleted file mode 100644
index 69d430a2..00000000
--- a/docs/development/devtools/images/results-3.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/results-4.png b/docs/development/devtools/images/results-4.png
deleted file mode 100644
index 47c0f5fa..00000000
--- a/docs/development/devtools/images/results-4.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/results-5.png b/docs/development/devtools/images/results-5.png
deleted file mode 100644
index effd062b..00000000
--- a/docs/development/devtools/images/results-5.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/results-6.png b/docs/development/devtools/images/results-6.png
deleted file mode 100644
index 1da1e366..00000000
--- a/docs/development/devtools/images/results-6.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/summary-1.png b/docs/development/devtools/images/summary-1.png
deleted file mode 100644
index a9d3b61e..00000000
--- a/docs/development/devtools/images/summary-1.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/summary-2.png b/docs/development/devtools/images/summary-2.png
deleted file mode 100644
index 2ca0c969..00000000
--- a/docs/development/devtools/images/summary-2.png
+++ /dev/null
Binary files differ
diff --git a/docs/development/devtools/images/summary-3.png b/docs/development/devtools/images/summary-3.png
deleted file mode 100644
index cd288d2b..00000000
--- a/docs/development/devtools/images/summary-3.png
+++ /dev/null
Binary files differ