path: root/docs/sections/services/datalake-handler
author     Hansen, Tony (th1395) <th1395@att.com>    2023-05-23 21:27:17 +0000
committer  Vijay Venkatesh Kumar <vv770d@att.com>    2023-06-01 14:01:59 +0000
commit     35b3b01521cef7c4f89931e8e89841196f82e1ad (patch)
tree       2e8dbdd4b29bafb82a8cf45b0981e76fbb81b299 /docs/sections/services/datalake-handler
parent     45520e94a2811b3485edf6b36cbfaf36ca34ec76 (diff)
clean up some sphinx warnings
Change-Id: I8c2d925e8b27b4740227af0be3ab5d6f7153ec38
Signed-off-by: Hansen, Tony (th1395) <th1395@att.com>
Issue-ID: DCAEGEN2-3399
Signed-off-by: Hansen, Tony (th1395) <th1395@att.com>
Signed-off-by: Vijay Venkatesh Kumar <vv770d@att.com>
Signed-off-by: Hansen, Tony (th1395) <th1395@att.com>
(cherry picked from commit 663df2c1b8d9176094a62b00b7e80de096180621)
Signed-off-by: Vijay Venkatesh Kumar <vv770d@att.com>
Diffstat (limited to 'docs/sections/services/datalake-handler')
-rw-r--r--  docs/sections/services/datalake-handler/index.rst               8
-rw-r--r--  docs/sections/services/datalake-handler/installation-helm.rst   1
-rw-r--r--  docs/sections/services/datalake-handler/overview.rst           22
-rw-r--r--  docs/sections/services/datalake-handler/userguide.rst          48
4 files changed, 41 insertions, 38 deletions
diff --git a/docs/sections/services/datalake-handler/index.rst b/docs/sections/services/datalake-handler/index.rst
index e4f1c905..b57b55dc 100644
--- a/docs/sections/services/datalake-handler/index.rst
+++ b/docs/sections/services/datalake-handler/index.rst
@@ -5,10 +5,10 @@
DataLake-Handler MS
===================
-**DataLake-Handler MS** is a software component of ONAP that can systematically persist the events from DMaaP into supported Big Data storage systems.
-It has a Admin UI, where a system administrator configures which Topics to be monitored, and to which data storage to store the data.
-It is also used to manage the settings of the storage and associated data analytics tool.
-The second part is the Feeder, which does the data transfer work and is horizontal scalable.
+**DataLake-Handler MS** is a software component of ONAP that can systematically persist the events from DMaaP into supported Big Data storage systems.
+It has an Admin UI, where a system administrator configures which topics are to be monitored and which data storage the data is written to.
+It is also used to manage the settings of the storage and the associated data analytics tool.
+The second part is the Feeder, which does the data transfer work and is horizontally scalable.
The third part, the Data Extraction Service (DES), exposes the data in the data storage via a REST API for other ONAP components and external systems to consume.
.. image:: DL-DES.PNG
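As a minimal sketch of how an external consumer might query the DES REST API described above (the host name, endpoint path, and parameter names here are hypothetical placeholders, not the documented DES API):

```python
# Minimal sketch of building a query against the DES REST API.
# NOTE: the host, endpoint path, and parameter names below are
# hypothetical placeholders, not the documented DES interface.
from urllib.parse import urlencode

def build_des_query_url(base_url: str, topic: str, start_ts: str, end_ts: str) -> str:
    """Build a URL for fetching the stored events of one topic over a time window."""
    params = urlencode({"topic": topic, "from": start_ts, "to": end_ts})
    return f"{base_url}/datalake/v1/exposure/events?{params}"

url = build_des_query_url(
    "http://datalake-des-host",
    "unauthenticated.VES_MEASUREMENT_OUTPUT",
    "2023-05-01T00:00:00Z",
    "2023-05-02T00:00:00Z",
)
```

Other ONAP components would issue such a request against DES rather than querying the underlying databases through DataLake Handler itself.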
diff --git a/docs/sections/services/datalake-handler/installation-helm.rst b/docs/sections/services/datalake-handler/installation-helm.rst
index 015094cf..e52f2fa2 100644
--- a/docs/sections/services/datalake-handler/installation-helm.rst
+++ b/docs/sections/services/datalake-handler/installation-helm.rst
@@ -104,4 +104,3 @@ Datalake-Des:
+-------------------------------+------------------------------------------------+
|PG_DB                         | PostgreSQL database name                       |
+-------------------------------+------------------------------------------------+
-
diff --git a/docs/sections/services/datalake-handler/overview.rst b/docs/sections/services/datalake-handler/overview.rst
index fc14f995..f2d361a2 100644
--- a/docs/sections/services/datalake-handler/overview.rst
+++ b/docs/sections/services/datalake-handler/overview.rst
@@ -1,6 +1,6 @@
-.. This work is licensed under a Creative Commons Attribution 4.0
- International License. http://creativecommons.org/licenses/by/4.0
-
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+ http://creativecommons.org/licenses/by/4.0
+
.. _docs_Datalake_Handler_MS:
Architecture
@@ -9,12 +9,12 @@ Architecture
Background
~~~~~~~~~~
-There are large amount of data flowing among ONAP components, mostly via DMaaP and Web Services.
-For example, all events/feed collected by DCAE collectors go through DMaaP.
-DMaaP is backed by Kafka, which is a system for Publish-Subscribe,
-where data is not meant to be permanent and gets deleted after certain retention period.
+There is a large amount of data flowing among ONAP components, mostly via DMaaP and Web Services.
+For example, all events/feeds collected by DCAE collectors go through DMaaP.
+DMaaP is backed by Kafka, a publish-subscribe system
+where data is not meant to be permanent and is deleted after a certain retention period.
Kafka is not a database, which means that the data there is not meant for querying.
-Though some components may store processed result into their local databases, most of the raw data will eventually lost.
+Though some components may store processed results in their local databases, most of the raw data will eventually be lost.
We should provide a systematic way to store this raw data, and even the processed results,
which will serve as the source for data analytics and machine learning, providing insight into the network operation.
@@ -31,15 +31,15 @@ Note that not all data storage systems in the picture are supported. In R6, the
- Elasticsearch and Kibana
- HDFS
-Depending on demands, new systems may be added to the supported list. In the following we use the term database for the storage,
+Depending on demand, new systems may be added to the supported list. In the following, we use the term database for the storage,
even though HDFS is a file system (with simple settings, it can be treated as a database, e.g. via Hive).
-Note that once the data is stored in databases, other ONAP components and systems will directly query data from the databases,
+Note that once the data is stored in databases, other ONAP components and systems will directly query data from the databases,
without interacting with DataLake Handler.
Description
~~~~~~~~~~~
-DataLake Handler's main function is to monitor and persist data flow through DMaaP and provide a query API for other component or external services. The databases are outside of ONAP scope,
+DataLake Handler's main function is to monitor and persist data flows through DMaaP and to provide a query API for other components or external services. The databases are outside of ONAP's scope,
since the data is expected to be huge, and a database may be a complicated cluster consisting of thousands of nodes.
Admin UI
diff --git a/docs/sections/services/datalake-handler/userguide.rst b/docs/sections/services/datalake-handler/userguide.rst
index f1de54d0..0a9a4222 100644
--- a/docs/sections/services/datalake-handler/userguide.rst
+++ b/docs/sections/services/datalake-handler/userguide.rst
@@ -1,10 +1,14 @@
+.. This work is licensed under a
+ Creative Commons Attribution 4.0 International License.
+ http://creativecommons.org/licenses/by/4.0
+
Admin UI User Guide
-------------------
Introduction
~~~~~~~~~~~~
-DataLake Admin UI aims to provide a user-friendly dashboard to easily monitor and
-manage DataLake configurations for the involved components, ONAP topics, databases,
+The DataLake Admin UI aims to provide a user-friendly dashboard to easily monitor and
+manage DataLake configurations for the involved components, ONAP topics, databases,
and 3rd-party tools. The Admin UI portal is accessible
via http://datalake-admin-ui:30479
@@ -13,9 +17,9 @@ DataLake Feeder Management
**************************
.. image:: ./images/adminui-feeder.png
-Click the "DataLake Feeder" on the menu bar, and the dashboard will show
-the overview DataLake Feeder information, such as the numbers of topics.
-Also, you can enable or disable DataLake Feeder process backend process
+Click "DataLake Feeder" on the menu bar, and the dashboard will show
+overview information for the DataLake Feeder, such as the number of topics.
+You can also enable or disable the DataLake Feeder backend process
by using the toggle switch.
@@ -23,14 +27,14 @@ Kafka Management
****************
.. image:: ./images/adminui-kafka.png
-Click the "Kafka" on the menu bar, and it provides the kafka resource settings
-including add, modify and delete in the page to fulfill your management demand.
+Click "Kafka" on the menu bar; the page provides the Kafka resource settings,
+allowing you to add, modify, and delete resources as needed.
.. image:: ./images/adminui-kafka-edit.png
-You can modify the kafka resource via clicking the card,
-and click the plus button to add a new Kafka resource.
-Then, you will need to fill the required information such as identifying name,
+You can modify a Kafka resource by clicking its card,
+and click the plus button to add a new Kafka resource.
+You will then need to fill in the required information, such as an identifying name,
the message router address, and the ZooKeeper address.
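The fields collected for a new Kafka resource can be sketched as follows (the field names are illustrative, not the actual DataLake API schema):

```python
# Sketch of the information the Admin UI collects for a new Kafka resource.
# The field names are illustrative placeholders, not the actual DataLake schema.
REQUIRED_FIELDS = ("name", "message_router_address", "zookeeper_address")

def validate_kafka_resource(resource: dict) -> list:
    """Return the required fields that are missing from a Kafka resource entry."""
    return [f for f in REQUIRED_FIELDS if not resource.get(f)]

resource = {
    "name": "main-kafka",  # identifying name shown on the card
    "message_router_address": "message-router-kafka:9092",
    "zookeeper_address": "message-router-zookeeper:2181",
}
missing = validate_kafka_resource(resource)  # empty when all fields are filled
```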
@@ -41,22 +45,22 @@ Topics Management
.. image:: ./images/adminui-topic-edit2.png
.. image:: ./images/adminui-topic-edit3.png
-The Topic page lists down all the topics which you have been configured
-by topic management. You can edit the topic setting via double click the specific row.
-The setting includes DataLake feeder status - catch the topic or not,
-data format, and the numbers of time to live for the topic.
-And choose one or more Kafka items as topic resource
+The Topic page lists all the topics that have been configured
+through topic management. You can edit a topic's settings by double-clicking its row.
+The settings include the DataLake Feeder status (whether the topic is captured or not),
+the data format, and the time-to-live for the topic.
+You must also choose one or more Kafka items as the topic source
and define the databases in which the topic data is stored.
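Taken together, a topic configuration can be sketched like this (the field names and values are illustrative placeholders, not the actual DataLake topic schema):

```python
# Sketch of a topic configuration as described above. Field names and
# values are illustrative placeholders, not the actual DataLake schema.
topic_config = {
    "name": "unauthenticated.SEC_FAULT_OUTPUT",  # must already exist in Kafka
    "enabled": True,                 # Feeder status: capture the topic or not
    "data_format": "JSON",           # format of the messages on the topic
    "ttl_days": 30,                  # time-to-live for the stored topic data
    "kafkas": ["main-kafka"],        # one or more Kafka items as the source
    "databases": ["elasticsearch"],  # where the topic data is stored
}
```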
.. image:: ./images/adminui-topic-config.png
-For the default configuration of Topics, you can click the "Default configurations" button
+To set the default configuration for topics, you can click the "Default configurations" button.
When you add a new topic, these configurations will be filled into the form automatically.
.. image:: ./images/adminui-topic-new.png
-To add a new topic for the DataLake Feeder, you can click the "plus icon" button
-to catch the data into the 3rd-party database.
+To add a new topic for the DataLake Feeder, you can click the "plus icon" button
+so that the topic's data is captured into the 3rd-party database.
Please note that only topics that already exist in Kafka can be added.
@@ -65,7 +69,7 @@ Database Management
.. image:: ./images/adminui-dbs.png
.. image:: ./images/adminui-dbs-edit.png
-In the Database Management page, it allows you to add, modify and delete the database resources
+The Database Management page allows you to add, modify, and delete the database resources
in which the messages from topics will be stored.
DataLake supports several databases, including Couchbase DB, Apache Druid, Elasticsearch, HDFS, and MongoDB.
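A database resource entry on this page can be sketched as follows (the field names and the helper are illustrative, not the actual DataLake schema):

```python
# Sketch of a database resource entry as managed in the Database Management
# page. Field names and the helper are illustrative, not the actual schema.
SUPPORTED_DB_TYPES = {"Couchbase DB", "Apache Druid", "Elasticsearch", "HDFS", "MongoDB"}

def make_db_resource(name: str, db_type: str, host: str, port: int) -> dict:
    """Create a database resource entry, rejecting unsupported database types."""
    if db_type not in SUPPORTED_DB_TYPES:
        raise ValueError(f"unsupported database type: {db_type}")
    return {"name": name, "type": db_type, "host": host, "port": port}

es = make_db_resource("es-cluster", "Elasticsearch", "elasticsearch", 9200)
```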
@@ -83,7 +87,7 @@ Currently, DataLake supports two Tools which are Kibana and Apache Superset.
.. image:: ./images/adminui-design.png
.. image:: ./images/adminui-design-edit.png
-After setting up the 3rd-party tools, you can import the template as the JSON, YAML or other formats
-for data exploration, data visualization and dashboarding. DataLake supports Kibana dashboarding,
-Kibana searching, Kibana visualization, Elasticsearch field mapping template,
+After setting up the 3rd-party tools, you can import templates in JSON, YAML, or other formats
+for data exploration, data visualization, and dashboarding. DataLake supports Kibana dashboards,
+Kibana searches, Kibana visualizations, Elasticsearch field mapping templates,
and the Apache Druid Kafka indexing service.