author    renealr <reneal.rogers@amdocs.com>  2018-11-19 15:01:00 -0500
committer renealr <reneal.rogers@amdocs.com>  2018-11-21 07:50:40 -0500
commit    cf8a7968604dc3e0d9e71547738fa8d497e3ca55 (patch)
tree      214ea8b451d5f35efd25c9180a5ac6148e08f36e
parent    6f99c3307fe05014c23720cc191a086137135651 (diff)
update sparky read the docs files

Update the rst files for the Sparky read-the-docs pages; removed white space.

Issue-ID: AAI-1268
Change-Id: I4ff2ab6d34e1cb4112fb22c46cd20306e44cc834
Signed-off-by: renealr <reneal.rogers@amdocs.com>
-rw-r--r--  docs/index.rst                                 174
-rw-r--r--  docs/platform/architecture.rst                   7
-rw-r--r--  docs/platform/images/aai-ui-view-inspect.jpg   bin 0 -> 26632 bytes
-rw-r--r--  docs/platform/images/sparky-interaction.png    bin 0 -> 17383 bytes
-rw-r--r--  docs/platform/images/view1.png                 bin 0 -> 17630 bytes
-rw-r--r--  docs/platform/images/view2.png                 bin 0 -> 12247 bytes
-rw-r--r--  docs/platform/images/view3.png                 bin 0 -> 104016 bytes
-rw-r--r--  docs/platform/images/view4.png                 bin 0 -> 3904 bytes
-rw-r--r--  docs/platform/images/view5.png                 bin 0 -> 5625 bytes
-rw-r--r--  docs/platform/images/view6.jpg                 bin 0 -> 44408 bytes
-rw-r--r--  docs/platform/index.rst                         14
-rw-r--r--  docs/platform/installation.rst                 164
-rw-r--r--  docs/platform/view_inspect.rst                 100
-rw-r--r--  docs/platform/vnfs.rst                          48
-rw-r--r--  docs/view_inspect.rst                           44
-rw-r--r--  docs/vnfs.rst                                   25
16 files changed, 342 insertions, 234 deletions
diff --git a/docs/index.rst b/docs/index.rst
index 73cef31..da82abd 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -11,173 +11,17 @@ Sparky - Inventory UI Service
:alt: alternate text
:align: right
:target: https://bestpractices.coreinfrastructure.org/projects/1737
-
-Architecture
-============
-Sparky a service that interacts with AAI and provides users with a user interface to view and analyze AAI data. The main goal behind Sparky is to provide a clear and user friendly view of AAI data.
-It is divided into both a front-end (the code that constructs the GUI) and a back-end (the Java code and other technologies that provide the front-end with its data). When Sparky is to be deployed, a .war file containing the front-end needs to be copied into the ``/src/main/resources/extApps`` directory. The back-end will then use this .war to present the front-end to users.
+Overview
+========
+Sparky is a service that interacts with the resources and traversal microservices and provides a user interface for viewing and analyzing AAI data. The main goal behind Sparky is to provide a clear and user-friendly view of AAI data.
-At this time, Sparky has two views available for use:
+The key AAI repos for running the Sparky UI are:
-.. toctree::
- :maxdepth: 1
-
- Graph-based view of entities within AAI <./view_inspect>
- Aggregation-based view of VNFs within AAI VNFs <./vnfs>
-
-Interactions
-------------
-Sparky requires connections to the following additional services:
-
-Front-end:
-
-- A Sparky back-end to serve the front-end
-
-Back-end:
-
-- An AAI instance as the main driver behind data
-- An Elasticsearch instance for data storage (a Synapse service instance is an implicit dependency which populates the Elasticsearch indexes)
-- A Search Data Service instance for search functionality
-- An eCOMP Portal instance for authentication
-
-Logging
-=======
-Sparky uses the Logback framework to generate logs. The logback.xml file can be found under the ``src/main/resources/`` folder
-
-Installation
-============
-
-Steps: Back-end
----------------
-
-Clone Git Repository
-********************
-Clone the Sparky back-end Git repository
-
-Build
-*****
-
-After cloning the project, build the project by executing the following Maven command from the project's top level directory:
-
-.. code-block:: bash
-
- mvn clean install
-
-After a successful install, build the docker image:
-
-.. code-block:: bash
-
- docker build -t openecomp/sparky target
-
-Deploy
-******
-
-Push the Docker image that you have built to your Docker repository and pull it down to the location that you will be running Sparky.
-
-Create the following directories on the host machine:
-
-- /logs
-- /opt/app/sparky/appconfig
-
-You will be mounting these as data volumes when you start the Docker container.
-
-Clone Configuration Repository
-******************************
-
-Clone the "test-config" repo to a seperate directory.
-Navigate to ``[test-config repo location]/sparky/appconfig`` (will contain files such as ``aai.properties``).
-
-Copy the entire contents of ``[test-config repo location]]/sparky/appconfig`` into the ``/opt/app/sparky/appconfig`` directory you created in an above step.
-
-Steps: Front-end
-----------------
-
-Clone Git Repository
-********************
-Clone the ``sparky-fe.git`` Sparky back-end Git repository
-
-Install Required Tools
-**********************
-You will need to install the following tools:
-
-- node.js, including the Node Package Manager (NPM) (if there issues installing the latest version, try 6.10.1)
-- Python 2.7.13
-
-After installing node.js and NPM, you need to install the required node.js packages by executing:
-
-.. code-block:: bash
+- aai/sparky-fe: This holds the code that constructs the GUI.
+- aai/sparky-be: This holds the Java code and other technologies that provide the front-end with its data.
- npm install
-
-Build
-*****
-
-**To build the front-end (generate a .war file)**:
-
-Execute:
-
-.. code-block:: bash
-
- gulp build
-
-The build will create a directory called ``dist`` and add the ``aai.war`` file to it.
-
-If changes to the build flow are required, updating ``webpack.config.js`` and ``gulpfile.js`` will likely provide any build tuning that is required.
-
-**To run the front-end:**
-
-Execute:
-
-.. code-block:: bash
-
- npm start
-
-By default the local instance of the UI will be served to ``https://localhost:8001/aai/#/viewInspect``.
-
-This can be configured in the file ``webpack.devConfig.js``.
-
-Deploy
-******
-
-Push the Docker image that you have built to your Docker repository and pull it down to the location that you will be running Sparky.
-
-**Create the following directories on the host machine:**
-
-- /logs
-- /opt/app/sparky/appconfig
-
-You will be mounting these as data volumes when you start the Docker container.
-
-Configuration
-=============
-
-Steps: Back-end
----------------
-
-Edit property files in /opt/app/sparky/appconfig
-************************************************
-
-Listed below are the values that will need to be updated to make Sparky operate properly. The configuration files contain comments for contents not listed here.
-
-**search-service.properties:**
-
-search-service.ipAddress=*[ip address / hostname of the search-data-service that this instance will use]*
-search-service.httpPort=[http port of the search-data-service that this instance will use]
-
-**aai.properties:**
-
-aai.rest.host= *[ip address / hostname of the aai that this instance will use]*
-
-aai.rest.port= *[rest port of the aai that this instance will use]*
-
-**elasticsearch.properties:**
-
-elasticsearch.ipAddress= *[ip address / hostname of the elasticsearch that this instance will use*]
-elasticsearch.httpPort=*[http port of the elasticsearch that this instance will use*]
-elasticsearch.javaApiPort=*[java api port of the elasticsearch that this instance will use*]
-
-**portal/portal.properties:**
-**portal/portal-authentication.properties:**
+.. toctree::
+ :maxdepth: 2
-If this instance of Sparky will be served in an eCOMP Portal instance, use the two files above to configure against the proper Portal instance.
+ platform/index.rst
diff --git a/docs/platform/architecture.rst b/docs/platform/architecture.rst
new file mode 100644
index 0000000..bb6fbb6
--- /dev/null
+++ b/docs/platform/architecture.rst
@@ -0,0 +1,7 @@
+Architecture
+============
+Sparky Architecture in AAI
+===========================
+The Sparky microservice is built on the Spring platform. The table below highlights Sparky's interaction with the various microservices that exist in the AAI ecosystem.
+
+.. image:: images/sparky-interaction.png
diff --git a/docs/platform/images/aai-ui-view-inspect.jpg b/docs/platform/images/aai-ui-view-inspect.jpg
new file mode 100644
index 0000000..bf520ed
--- /dev/null
+++ b/docs/platform/images/aai-ui-view-inspect.jpg
Binary files differ
diff --git a/docs/platform/images/sparky-interaction.png b/docs/platform/images/sparky-interaction.png
new file mode 100644
index 0000000..892d3cc
--- /dev/null
+++ b/docs/platform/images/sparky-interaction.png
Binary files differ
diff --git a/docs/platform/images/view1.png b/docs/platform/images/view1.png
new file mode 100644
index 0000000..e5f40d9
--- /dev/null
+++ b/docs/platform/images/view1.png
Binary files differ
diff --git a/docs/platform/images/view2.png b/docs/platform/images/view2.png
new file mode 100644
index 0000000..ac66e47
--- /dev/null
+++ b/docs/platform/images/view2.png
Binary files differ
diff --git a/docs/platform/images/view3.png b/docs/platform/images/view3.png
new file mode 100644
index 0000000..2c83f31
--- /dev/null
+++ b/docs/platform/images/view3.png
Binary files differ
diff --git a/docs/platform/images/view4.png b/docs/platform/images/view4.png
new file mode 100644
index 0000000..4ab94f4
--- /dev/null
+++ b/docs/platform/images/view4.png
Binary files differ
diff --git a/docs/platform/images/view5.png b/docs/platform/images/view5.png
new file mode 100644
index 0000000..6813651
--- /dev/null
+++ b/docs/platform/images/view5.png
Binary files differ
diff --git a/docs/platform/images/view6.jpg b/docs/platform/images/view6.jpg
new file mode 100644
index 0000000..68bc264
--- /dev/null
+++ b/docs/platform/images/view6.jpg
Binary files differ
diff --git a/docs/platform/index.rst b/docs/platform/index.rst
new file mode 100644
index 0000000..cd6c133
--- /dev/null
+++ b/docs/platform/index.rst
@@ -0,0 +1,14 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+
+Platform
+--------
+
+Sparky is a user interface within the AAI ecosystem. It provides users with the ability to view and analyze the data present in A&AI.
+
+.. toctree::
+ :maxdepth: 2
+
+ architecture.rst
+ view_inspect.rst
+ vnfs.rst
+ installation.rst
diff --git a/docs/platform/installation.rst b/docs/platform/installation.rst
new file mode 100644
index 0000000..2541303
--- /dev/null
+++ b/docs/platform/installation.rst
@@ -0,0 +1,164 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+
+Installation and Developer Setup
+================================
+
+Project Structure
+-----------------
+
+Sparky is structured as a top-level project that contains two sub-projects (an application project and a service project). The application project contains the configuration and Spring application code that customizes and runs Sparky as an application. The service project contains the core Sparky code that provides the functionality behind sparky-fe requests and synchronization.
+
+As for the front-end (sparky-fe): sparky-be serves the sparky-fe web content and handles all inter-microservice communication.
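+
+A rough sketch of the layout described above (the service module name is an assumption; check the repository for the exact module names):
+
+.. code-block:: bash
+
+   ls sparky-be
+   # pom.xml  sparkybe-onap-application/  sparkybe-onap-service/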
+
+Clone, Build, & Configure
+=========================
+
+Clone & Build
+-------------
+
+Clone the sparky-be repository into a directory of your choice.
+
+.. code-block:: bash
+
+ git clone ssh://<username>@gerrit.onap.org:29418/aai/sparky-be
+
+After cloning sparky-be, build the project by executing the following Maven command from its top-level directory:
+
+.. code-block:: bash
+
+ mvn clean install
+
+Configuration
+-------------
+
+All configuration for running sparky is found in ``<working directory>/sparkybe-onap-application/config``.
+
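+The exact contents vary by release, but based on the sections below the directory will look roughly like this (a sketch, not an authoritative listing):
+
+.. code-block:: bash
+
+   ls sparkybe-onap-application/config
+   # application.properties   auth/   filters/   logging/   spring-beans/
+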
+Profiles
+--------
+
+*application.properties* is the main configuration point for the Sparky application. Within this file the *spring.profiles* are set. Each Spring profile has configuration that will be loaded into an associated Spring bean. The currently available profiles are:
+
+ * camel
+ * http | ssl
+ * portal
+ * fe-dev | fe-prod
+ * oxm-default | oxm-override
+ * resources | gizmo
+ * sync
+ * oxm-schema-dev | oxm-schema-prod
+
+Profile descriptions:
+
+ * camel - Enables spring-boot camel routing rules
+ * http - Sets Sparky's communication protocol to HTTP
+ * ssl - Sets Sparky's communication protocol to HTTPS
+ * portal - Adds ONAP portal processing to Sparky's flow
+ * fe-dev - Exposes the static folder for UI development when running Sparky locally (target/static)
+ * fe-prod - Exposes the standard path for the UI in the docker container
+ * oxm-default - Sets the default version and version list of OXM files to be used
+ * oxm-override - Sets a custom version and version list of OXM files to be used
+ * resources - Sparky will use aai-resources (microservice) as the primary source of inventory information
+ * gizmo - Sparky will use gizmo (microservice) as the primary source of inventory information
+ * sync - Will cause Sparky to run any configured synchronizers to populate index data in a single large transaction
+ * oxm-schema-dev - Sets the location to find the OXM files within a development environment
+ * oxm-schema-prod - Sets the location to find the OXM files within a deployed environment
+
+The idea behind the profiles is to provide a simple way of adjusting runtime behavior without needing to edit large XML files (see **Spring Beans** below). Before running Sparky, some of the profiles will need to be edited to work within your environment (e.g. to set where your custom OXM files are loaded from).
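+
+For example, the profile line in *application.properties* for an HTTPS, production-UI deployment that uses aai-resources might look like the sketch below. The property key and the exact profile combination are illustrative; use the profiles that match your environment:
+
+.. code-block:: properties
+
+   # illustrative only - pick one profile from each mutually exclusive pair listed above
+   spring.profiles.active=camel,ssl,portal,fe-prod,oxm-default,resources,sync,oxm-schema-prod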
+
+Spring Beans
+------------
+
+The *spring-beans* directory contains all the .xml bean representations that will be auto-wired at runtime. Some of the beans are associated with a single profile (see "profile=" in the header of the bean declaration), and others will be loaded with differing values depending on the profile used.
+
+Scanning through the beans and cross-referencing with their associated Java classes is a good way of getting familiar with the startup and runtime of Sparky.
+
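+As a generic illustration only (the bean id and class below are placeholders, not real Sparky beans), a bean file scoped to a single profile declares it in the ``<beans>`` header:
+
+.. code-block:: xml
+
+   <beans xmlns="http://www.springframework.org/schema/beans"
+          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+          xsi:schemaLocation="http://www.springframework.org/schema/beans
+                              http://www.springframework.org/schema/beans/spring-beans.xsd"
+          profile="ssl">
+
+       <!-- placeholder bean; the real definitions live in the spring-beans directory -->
+       <bean id="exampleSslBean" class="org.example.ExampleSslBean" />
+
+   </beans>
+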
+Authorization
+-------------
+
+The "auth" directory contains any certificates needed to communicate within the environment you are currently working in (remember that Sparky can also be configured to run over plain HTTP).
+
+Filters
+-------
+
+The "filters" directory contains the JSON descriptor files that describe the filters used in the VNFs view.
+
+Logging
+-------
+
+Sparky uses the Logback framework to generate logs. The logback.xml file is contained in the "logging" directory.
+
+Running Locally
+===============
+
+The configuration described in this section refers to running Sparky through Eclipse. The same steps can be applied to running via bash/cmd with minor tweaks.
+
+Sparky should be built ahead of running (``mvn clean install``). It's useful to add a build configuration to Eclipse to build Sparky.
+
+The run configuration should contain the following:
+
+* The configuration should be created based off of the "Maven Build" template
+* "Main" tab
+ * Build directory - ${workspace_loc:/sparky-be/sparkybe-onap-application}
+ * Goals - spring-boot:run
+ * Parameter table
+ * name: CONFIG_HOME value: ${workspace_loc:/sparky-be/sparkybe-onap-application}/config
+ * name: APP_HOME value: ${workspace_loc:/sparky-be/sparkybe-onap-application}
+
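+A rough command-line equivalent of the Eclipse configuration above (this assumes the two values are passed as ``-D`` system properties, as the Eclipse parameter table does; adjust if your setup reads them as environment variables):
+
+.. code-block:: bash
+
+   cd sparky-be/sparkybe-onap-application
+   mvn spring-boot:run -DCONFIG_HOME=$(pwd)/config -DAPP_HOME=$(pwd)
+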
+Deploying Sparky
+================
+
+At the time of writing (Oct 2018), Sparky is primarily deployed into a Kubernetes environment or a "pure" Docker environment using custom Chef parametrization. How you deploy Sparky is up to you. At a high level, the cleanest approach is to ensure your configured property (profile) files are copied into the Docker container so that the Spring context has access to the values and starts Sparky with your configuration.
+
+See the Dockerfile in ``sparky-be/sparkybe-onap-application/src/main/docker`` for details on how Sparky runs within a Docker container.
+
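+As an illustration only (the image name and container config path below are assumptions; the Dockerfile referenced above is the authoritative source), running the image with your configuration mounted might look like:
+
+.. code-block:: bash
+
+   # illustrative only - substitute your registry, tag, and config paths
+   docker run -d --name sparky-be \
+       -v /opt/app/sparky/config:/opt/app/sparky/config \
+       <your-registry>/onap/sparky-be:latest
+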
+Front-End (sparky-fe) Details
+=============================
+
+Clone, Build, & Configure
+-------------------------
+
+Clone the sparky-fe repository into a directory of your choice.
+
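+For example (mirroring the sparky-be clone command above):
+
+.. code-block:: bash
+
+   git clone ssh://<username>@gerrit.onap.org:29418/aai/sparky-fe
+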
+Dependencies
+------------
+You will need to install the following tools:
+
+* node.js, including the Node Package Manager (NPM) (if there are issues installing the latest version, try 6.10.1)
+* Python 2.7.13
+
+After installing node.js and NPM, you need to install the required node.js packages by navigating to the top-level sparky-fe directory and executing:
+
+.. code-block:: bash
+
+ npm install
+
+Build
+-----
+
+To build sparky-fe (generate a .war file):
+
+Execute:
+
+.. code-block:: bash
+
+ gulp build
+
+The build will create a directory called ``dist`` and add the ``aai.war`` file to it.
+
+If changes to the build flow are required, updating ``webpack.config.js`` and ``gulpfile.js`` will likely provide any build tuning that is required.
+
+Running sparky-fe Locally
+=========================
+
+Execute:
+
+.. code-block:: bash
+
+ npm start
+
+By default the local instance of the UI will be served at ``http(s)://localhost:8001/``.
+
+Deploy sparky-fe
+================
+
+If you have access to an artifact repository (e.g. Nexus), push the .war file that you have built to your repository and configure your sparky-be ``sparkybe-onap-application/pom.xml`` to pull in your sparky-fe artifact.
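+
+As a hypothetical sketch only, the front-end is typically pulled in as a ``war`` dependency along these lines (the Maven coordinates shown are assumptions - check the real sparky-be ``pom.xml`` for the correct groupId, artifactId, and version property):
+
+.. code-block:: xml
+
+   <!-- illustrative coordinates only -->
+   <dependency>
+       <groupId>org.onap.aai.sparky-fe</groupId>
+       <artifactId>sparky-fe</artifactId>
+       <version>${sparky-fe.version}</version>
+       <type>war</type>
+   </dependency>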
diff --git a/docs/platform/view_inspect.rst b/docs/platform/view_inspect.rst
new file mode 100644
index 0000000..81a9c58
--- /dev/null
+++ b/docs/platform/view_inspect.rst
@@ -0,0 +1,100 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+
+Sparky - View & Inspect
+=======================
+
+*View & Inspect* Overview
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+*View & Inspect* provides a graph-based view of elements within AAI. A
+single entity is the entry point into each graph, and from that base
+element a graph is generated based on its relationships.
+
+.. image:: images/aai-ui-view-inspect.jpg
+ :scale: 100 %
+ :alt: alternate text
+ :align: center
+
+*View & Inspect* Features
+~~~~~~~~~~~~~~~~~~~~~~~~~
+With the *View & Inspect* UI, users can:
+ * Search for a network or service object using any part of a key attribute name, ID, or object type
+ * Select a matching node suggestion returned in the drop-down list, and view a visual representation of its service hierarchy and related nodes
+ * View and search for multiple object types
+ * View the specific attributes of a node or any of its related nodes
+
+Navigation to *View & Inspect*
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The *View & Inspect* view can be reached by two means:
+
+1. Main navigation menu
+2. Selecting a search result related to an entity instance (e.g. an
+ entity called readme-entity)
+
+Using *View & Inspect*
+~~~~~~~~~~~~~~~~~~~~~~
+
+*View & Inspect* is driven by using the search bar at the top of the UI
+to find and select entity instances. Once an instance has been selected,
+a request is processed in *Sparky's* backend component that generates a
+graph representation of the selected entity. The graph data is returned
+to *View & Inspect* and rendered on screen.
+
+Node Details
+^^^^^^^^^^^^
+
+Upon node selection, the selected graph node details will appear in a
+panel to the right of the graph titled, *Node Details*.
+
+Interacting with the Graph
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The graph can be panned by clicking and holding empty space around the
+graph and moving the mouse. This will pan the entire graph. The graph
+can be zoomed in and out by using a mouse scroll wheel. Nodes in the
+graph can be selected by clicking on them. Nodes in the graph can be moved
+by clicking, holding, and dragging them using the mouse.
+
+How to use *View & Inspect*
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+1. Start typing into the "Search Network" bar and search suggestions will be displayed, as shown below:
+
+.. image:: images/view1.png
+ :height: 152px
+ :width: 250 px
+ :scale: 100 %
+ :alt: alternate text
+ :align: left
+
+.. image:: images/view2.png
+ :height: 150px
+ :width: 269 px
+ :scale: 100 %
+ :alt: alternate text
+ :align: center
+
+.. image:: images/view3.png
+ :height: 150px
+ :width: 245 px
+ :scale: 100 %
+ :alt: alternate text
+ :align: center
+
+Note: The OXM schema defines the services and resources archetypes and mappings to Java types that are used by A&AI to define the REST endpoints for reading and manipulating the inventory data. The OXM file has been annotated with searchable attributes. Sparky communicates with Synapse (data-router) to update Elasticsearch as entities are created, updated, and deleted.
+
+The related objects are displayed as a graph. The attributes of the selected object are shown in the "Node Details" tab beside the graph display.
+
+.. image:: images/view4.png
+ :height: 165px
+ :width: 298 px
+ :scale: 100 %
+ :alt: alternate text
+ :align: left
+
+.. image:: images/view5.png
+ :height: 165px
+ :width: 280 px
+ :scale: 100 %
+ :alt: alternate text
+ :align: center
diff --git a/docs/platform/vnfs.rst b/docs/platform/vnfs.rst
new file mode 100644
index 0000000..b047ee4
--- /dev/null
+++ b/docs/platform/vnfs.rst
@@ -0,0 +1,48 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+
+Sparky - VNFs
+==============
+
+*VNFs* Overview
+~~~~~~~~~~~~~~~
+
+*VNFs* is an aggregation-based view that provides aggregate counts of
+VNFs based on provisioning status and orchestration status.
+
+*VNFs* Features
+~~~~~~~~~~~~~~~
+With the *VNFs* UI, users can:
+ * Search for generic-vnfs by attribute value(s) specified in the search box using suggested search terms.
+ * View search results aggregated to display:
+   - A count of the total VNFs returned by the search
+   - A count of the VNFs for each prov-status value
+ * Filter the results to reduce the amount of data displayed. Users can filter based on:
+ - Orchestration status
+ - Provisioning status
+ - Network function type
+ - Network function role
+
+Navigation to *VNFs*
+~~~~~~~~~~~~~~~~~~~~
+
+The *VNFs* view can be reached by two means:
+
+1. Main navigation menu
+2. Selecting a search result related to an aggregation result (e.g. VNFs)
+
+Using *VNFs*
+~~~~~~~~~~~~
+
+*VNFs* is driven by using the search bar at the top of the UI to find
+and select aggregation queries. Once selected, the aggregation queries
+will be sent to the *Sparky* backend component for processing. When a
+result set has been determined, *VNFs* will render the data, as shown below:
+
+Note: The OXM schema defines the services and resources archetypes and mappings to Java types that are used by A&AI to define the REST endpoints for reading and manipulating the inventory data. The OXM file has been annotated with suggestible attributes. Sparky communicates with Synapse (data-router) to update Elasticsearch as entities are created, updated, and deleted.
+
+
+.. image:: images/view6.jpg
+ :height: 150px
+ :width: 270 px
+ :scale: 100 %
+ :alt: alternate text
+ :align: center
diff --git a/docs/view_inspect.rst b/docs/view_inspect.rst
deleted file mode 100644
index d25f95f..0000000
--- a/docs/view_inspect.rst
+++ /dev/null
@@ -1,44 +0,0 @@
-.. This work is licensed under a Creative Commons Attribution 4.0 International License.
-
-Sparky - Inventory UI Service
-=============================
-
-*View & Inspect* Overview
-~~~~~~~~~~~~~~~~~~~~~~~~~
-
-*View & Inspect* provides a graph based view of elements within AAI. A
-single entity is the entry point into each graph, and from that base
-element a graph is generated based off relationships.
-
-Navigation to *View & Inspect*
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-The *View & Inspect* view can be reached by two means:
-
-1. Main navigation menu
-2. Selecting a search result related to an entity instance (e.g. an
- entity called readme-entity)
-
-Using *View & Inspect*
-~~~~~~~~~~~~~~~~~~~~~~
-
-*View & Inspect* is driven by using the search bar at the top of the UI
-to find and select entity instances. Once an instance has been slected,
-a request is proccessed in *Sparky's* backend component that generates a
-graph representation of the selected entity. The graph data is returned
-to *View & Inspect* and rendered on screen.
-
-Node Details
-^^^^^^^^^^^^
-
-Upon node selection, the selected graph node details will appear in a
-panel to the right of the graph titled, *Node Details*.
-
-Interacting with the Graph
-~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-The graph can be panned by clicking and holding empty space amongst the
-graph and moving the mouse. This will pan the entire graph. The graph
-can be zoomed in and out by using a mouse scroll wheel. Nodes in the
-graph can be select by clicking on them. Nodes in the graph can be moved
-by clicking, holding, and dragging them using the mouse.
diff --git a/docs/vnfs.rst b/docs/vnfs.rst
deleted file mode 100644
index e7f39ef..0000000
--- a/docs/vnfs.rst
+++ /dev/null
@@ -1,25 +0,0 @@
-.. This work is licensed under a Creative Commons Attribution 4.0 International License.
-
-Sparky - Inventory UI Service
-=============================
-
-*VNFs* Overview
-~~~~~~~~~~~~~~~
-
-*VNFs* is an aggregation based view that provides aggregate counts of
-VNFs based off of provsioning status and orchestration status.
-
-Navigation to *VNFs*
-~~~~~~~~~~~~~~~~~~~~
-
-1. Main navigation menu
-2. Selecting a search result related to an aggregation result (e.g. and
- VNFs)
-
-Using *VNFs*
-~~~~~~~~~~~~
-
-*VNFs* is driven by using the search bar at the top of the UI to find
-and select aggregation queries. Once selected, the aggregation queries
-will be sent to the *Sparky* backend component for processing. When a
-result set has been determined *VNFs* will render the data.