author    VENKATESH KUMAR <vv770d@att.com>  2020-11-12 15:00:02 -0500
committer VENKATESH KUMAR <vv770d@att.com>  2020-11-12 22:30:51 -0500
commit    e21c848562599340a99fc216bdaae97686d92a72 (patch)
tree      65a3446cccde840220e3e2be2b53ee16d0bad7b4 /docs/sections/services/datalake-handler
parent    2cf33d0d1cecad5a309d428aa44f7740ca48032e (diff)
DCAE doc cleanup warnings

Change-Id: I39daca57d4754abe0db56a7f85e7274a5f45ad74
Issue-ID: DCAEGEN2-2141
Signed-off-by: VENKATESH KUMAR <vv770d@att.com>
Diffstat (limited to 'docs/sections/services/datalake-handler')
-rw-r--r--  docs/sections/services/datalake-handler/index.rst         |  2
-rw-r--r--  docs/sections/services/datalake-handler/installation.rst  | 18
-rw-r--r--  docs/sections/services/datalake-handler/overview.rst      |  8
-rw-r--r--  docs/sections/services/datalake-handler/userguide.rst     | 26
4 files changed, 31 insertions, 23 deletions
diff --git a/docs/sections/services/datalake-handler/index.rst b/docs/sections/services/datalake-handler/index.rst
index 56ada5f2..3b445a55 100644
--- a/docs/sections/services/datalake-handler/index.rst
+++ b/docs/sections/services/datalake-handler/index.rst
@@ -3,7 +3,7 @@
DataLake-Handler MS
-==============
+===================
**DataLake-Handler MS** is a software component of ONAP that can systematically persist the events from DMaaP into supported Big Data storage systems.
It has an Admin UI, where a system administrator configures which Topics are to be monitored and to which data storage the data should be written.
diff --git a/docs/sections/services/datalake-handler/installation.rst b/docs/sections/services/datalake-handler/installation.rst
index 5d8b3341..16294b98 100644
--- a/docs/sections/services/datalake-handler/installation.rst
+++ b/docs/sections/services/datalake-handler/installation.rst
@@ -3,17 +3,17 @@ Deployment Steps
DL-handler consists of two pods: the feeder and the admin UI. It can be deployed using a cloudify blueprint through the DCAE cloudify manager. The following steps guide you through launching Datalake via the cloudify manager.
Pre-requisite
-----------------
+-------------
- Make sure mariadb-galera from OOM is properly deployed and functional.
- An external database, such as Elasticsearch and MongoDB is deployed.
After datalake is deployed, the admin UI can be used to configure the sink database address and credentials.
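A quick sanity check for these prerequisites (a sketch; the pod name patterns are assumptions and vary by deployment):

.. code-block :: bash

   #kubectl get pods -n onap | grep mariadb-galera
   #kubectl get pods -n onap | grep -e elasticsearch -e mongo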
Log-in to the DCAE Bootstrap POD
----------------------------------------------------
+--------------------------------
First, we should find the bootstrap pod name through the following command and make sure that the DCAE cloudify manager is properly deployed.
- .. image :: .images/bootstrap-pod.png
+ .. image :: ./images/bootstrap-pod.png
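A typical way to locate the pod (a sketch; the name filter is an assumption and may differ per deployment):

.. code-block :: bash

   #kubectl get pods -n onap | grep bootstrap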
Login to the DCAE bootstrap pod through the following command.
.. code-block :: bash
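   # NOTE: a plausible form of the login command; the exact pod name is an
   # assumption and varies per deployment
   kubectl exec -it <dcae-bootstrap-pod> -n onap -- /bin/bash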
@@ -38,17 +38,17 @@ After validating, we can start to proceed blueprints uploading.
Verify Uploaded Blueprints
--------------------------
-Using "cft blueprint list" to varify your work.
+Using "cfy blueprint list" to verify your work.
.. code-block :: bash
#cfy blueprint list
The returned message below shows that the blueprints have been correctly uploaded.
- .. image :: ./imagesblueprint-list.png
+ .. image :: ./images/blueprint-list.png
Verify Plugin Versions
-------------------------------------------------------------------------------
+----------------------
If the version of the plugin used is different, update the blueprint import to match.
.. code-block :: bash
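   # list the plugins registered in cloudify manager, with their versions
   # (assumption: a standard way to check which plugin versions are installed)
   cfy plugins list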
@@ -74,11 +74,13 @@ Next, we are going to launch the datalake.
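A typical launch sequence, for reference (a sketch; the deployment names are assumptions based on the blueprint names used above):

.. code-block :: bash

   #cfy deployments create -b datalake-feeder datalake-feeder
   #cfy executions start install -d datalake-feeder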
Verify the Deployment Result
-----------------------------
The following command can be used to list the datalake logs.
+
.. code-block :: bash
+
#kubectl logs <datalake-pod> -n onap
The output should look like the following.
- .. image :: ./feeder-log.png
+ .. image :: ./images/feeder-log.png
If you find any Java exceptions in the log, make sure that the external database and datalake configuration are properly set up.
The Admin UI can be used to update the external database configuration.
@@ -97,4 +99,4 @@ Delete Blueprint
.. code-block :: bash
#cfy blueprints delete datalake-feeder
- #cfy blueprints deltet datalake-admin-ui
+ #cfy blueprints delete datalake-admin-ui
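If a blueprint still has an active deployment, deleting it will fail; the deployment typically needs to be uninstalled and removed first (a sketch; the deployment names are assumed to match the blueprint names):

.. code-block :: bash

   #cfy executions start uninstall -d datalake-feeder
   #cfy deployments delete datalake-feeder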
diff --git a/docs/sections/services/datalake-handler/overview.rst b/docs/sections/services/datalake-handler/overview.rst
index 51dab104..09e41a5b 100644
--- a/docs/sections/services/datalake-handler/overview.rst
+++ b/docs/sections/services/datalake-handler/overview.rst
@@ -30,6 +30,7 @@ Note that not all data storage systems in the picture are supported. In R6, the
- Couchbase
- Elasticsearch and Kibana
- HDFS
+
Depending on demand, new systems may be added to the supported list. In the following we use the term database for the storage,
even though HDFS is a file system (with simple settings, it can be treated as a database, e.g. via Hive).
@@ -61,12 +62,9 @@ Features
- Read data directly from Kafka for performance.
- Support for pluggable databases. To add a new database, we only need to implement its corresponding service.
- - Support REST API for inter-component communications. Besides managing DatAlake settings in MariaDB,
- Admin UI also use this API to start/stop Feeder, query Feeder status and statistics.
+ - Support REST API for inter-component communications. Besides managing DataLake settings in MariaDB, Admin UI also uses this API to start/stop Feeder and query Feeder status and statistics.
- Use MariaDB to store settings.
- - Support data processing features. Before persisting data, data can be massaged in Feeder.
- Currently two features are implemented: Correlate Cleared Message (in org.onap.datalake.feeder.service.db.ElasticsearchService)
- and Flatten JSON Array (org.onap.datalake.feeder.service.StoreService).
+ - Support data processing features. Before persisting data, data can be massaged in Feeder. Currently two features are implemented: Correlate Cleared Message (in org.onap.datalake.feeder.service.db.ElasticsearchService) and Flatten JSON Array (org.onap.datalake.feeder.service.StoreService).
- Connections to Kafka and DBs are secured
diff --git a/docs/sections/services/datalake-handler/userguide.rst b/docs/sections/services/datalake-handler/userguide.rst
index b3be9491..f1de54d0 100644
--- a/docs/sections/services/datalake-handler/userguide.rst
+++ b/docs/sections/services/datalake-handler/userguide.rst
@@ -1,8 +1,8 @@
Admin UI User Guide
----------------------
+-------------------
Introduction
-~~~~~~~~
+~~~~~~~~~~~~
DataLake Admin UI aims to provide a user-friendly dashboard to easily monitor and
manage DataLake configurations for the involved components, ONAP topics, databases,
and 3rd-party tools. The Admin UI portal can be accessed
@@ -10,8 +10,9 @@ via http://datalake-admin-ui:30479
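A quick reachability check from inside the cluster network (a sketch; assumes the service name and port shown above):

.. code-block :: bash

   #curl -I http://datalake-admin-ui:30479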
DataLake Feeder Management
-******************
+**************************
.. image:: ./images/adminui-feeder.png
+
Click the "DataLake Feeder" on the menu bar, and the dashboard will show
the overview DataLake Feeder information, such as the numbers of topics.
Also, you can enable or disable DataLake Feeder process backend process
@@ -19,12 +20,14 @@ by using the toggle switch.
Kafka Management
-******************
+****************
.. image:: ./images/adminui-kafka.png
+
Click the "Kafka" on the menu bar, and it provides the kafka resource settings
including add, modify and delete in the page to fulfill your management demand.
.. image:: ./images/adminui-kafka-edit.png
+
You can modify a Kafka resource by clicking its card,
and click the plus button to add a new Kafka resource.
Then, you will need to fill in the required information, such as an identifying name,
@@ -32,11 +35,12 @@ message router address and zookeeper address, and so on to build it up.
Topics Management
-******************
+*****************
.. image:: ./images/adminui-topics.png
.. image:: ./images/adminui-topic-edit1.png
.. image:: ./images/adminui-topic-edit2.png
.. image:: ./images/adminui-topic-edit3.png
+
The Topic page lists all the topics that have been configured
through topic management. You can edit a topic's settings by double-clicking its row.
The settings include the DataLake Feeder status (whether or not to catch the topic),
@@ -45,37 +49,41 @@ And choose one or more Kafka items as topic resource
and define the databases that will store the topic data.
.. image:: ./images/adminui-topic-config.png
+
For the default configuration of Topics, click the "Default configurations" button
to set it up. When you add a new topic, these configurations will be filled into the form automatically.
.. image:: ./images/adminui-topic-new.png
+
To add a new topic for the DataLake Feeder, click the "plus icon" button
and the topic's data will be caught into the 3rd-party database.
Please note that only topics that already exist in Kafka can be added.
Database Management
-******************
+*******************
.. image:: ./images/adminui-dbs.png
.. image:: ./images/adminui-dbs-edit.png
+
The Database Management page allows you to add, modify, and delete the database resources
where the messages from topics will be stored.
DataLake supports several databases, including Couchbase DB, Apache Druid, Elasticsearch, HDFS, and MongoDB.
3rd-Party Tools Management
-******************
+**************************
.. image:: ./images/adminui-tools.png
+
The Tools page allows you to manage the resources of 3rd-party tools for data visualization.
Currently, DataLake supports two tools: Kibana and Apache Superset.
3rd-Party Design Tools Management
-******************
+*********************************
.. image:: ./images/adminui-design.png
.. image:: ./images/adminui-design-edit.png
+
After setting up the 3rd-party tools, you can import templates in JSON, YAML, or other formats
for data exploration, data visualization, and dashboarding. DataLake supports Kibana dashboarding,
Kibana searching, Kibana visualization, Elasticsearch field mapping templates,
and the Apache Druid Kafka indexing service.
-