Apache Gravitino web UI
This document primarily outlines how users can manage metadata within Apache Gravitino using the web UI. The graphical interface is accessible through a web browser as an alternative to writing code or using the REST interface.
Currently, you can integrate OAuth settings to view, add, modify, and delete metalakes, create catalogs, and view catalogs, schemas, and tables, among other functions.
Build and deploy the Gravitino web UI and open it in a browser at http://<gravitino-host>:<gravitino-port>; by default, this is http://localhost:8090.
Initial page
The web UI homepage displayed in Gravitino depends on the configuration parameter for OAuth mode; see the details in Security.
Set the parameter gravitino.authenticators to simple or oauth. Simple mode is the default authentication option. If multiple authenticators are set, the first one is taken by default.
After changing the configuration, make sure to restart the Gravitino server.
<path-to-gravitino>/bin/gravitino.sh restart
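To confirm the server is reachable after the restart, you can query the REST version endpoint before opening the UI; a minimal check, assuming the default host and port:

```shell
# Returns the Gravitino server version information as JSON
curl http://localhost:8090/api/version
```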
Simple mode
gravitino.authenticators = simple
Set the configuration parameter gravitino.authenticators to simple, and the web UI displays the homepage (Metalakes).
At the top-right, the UI displays the current Gravitino version.
The main content displays the existing metalake list.
OAuth mode
gravitino.authenticators = oauth
Set the configuration parameter gravitino.authenticators to oauth, and the web UI displays the login page.
If both OAuth and HTTPS are enabled, browsers apply different security permission rules; to avoid cross-origin errors, it is recommended to use the Chrome browser for access and operation. Other browsers such as Safari require enabling the developer menu and selecting Disable Cross-Origin Restrictions from the Develop menu.
- Enter the values corresponding to your specific configuration. For detailed instructions, please refer to Security.
- Clicking on the LOGIN button takes you to the homepage.
At the top-right, there is an icon button that takes you to the login page when clicked.
Manage metadata
All management actions are performed using the REST API.
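For example, the metalake list shown on the homepage corresponds to a plain REST call; a minimal sketch, assuming the default host and port:

```shell
# List all metalakes known to the Gravitino server
curl -H "Accept: application/vnd.gravitino.v1+json" \
  http://localhost:8090/api/metalakes
```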
Metalake
Create metalake
On the homepage, clicking on the CREATE METALAKE button displays a dialog to create a metalake.
Creating a metalake needs these fields:
- Name(required): the name of the metalake.
- Comment(optional): the comment of the metalake.
- Properties (optional): Click on the ADD PROPERTY button to add custom properties. (An equivalent REST call is sketched below.)
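A hedged curl sketch of the corresponding REST call (the name, comment, and host/port are placeholders; check the REST API reference for the exact payload):

```shell
# Create a metalake named my_metalake (placeholder values)
curl -X POST \
  -H "Accept: application/vnd.gravitino.v1+json" \
  -H "Content-Type: application/json" \
  -d '{"name": "my_metalake", "comment": "demo metalake", "properties": {}}' \
  http://localhost:8090/api/metalakes
```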
There are 3 actions you can perform on a metalake.
Show metalake details
Click on the action icon in the table cell.
You can see the detailed information of this metalake in the drawer component on the right.
Edit metalake
Click on the action icon in the table cell.
Displays the dialog for modifying fields of the selected metalake.
Disable metalake
A metalake defaults to in-use after successful creation.
Mouse over the switch next to the metalake's name to see the 'In-use' tip.
Clicking on the switch disables the metalake; mouse over the switch again to see the 'Not in-use' tip.
Drop metalake
Click on the action icon in the table cell.
Displays a confirmation dialog; clicking on the DROP button drops this metalake.
Catalog
Clicking on a metalake name in the table shows the catalogs in that metalake.
If this is the first time, it shows no data until after creating a catalog.
Clicking on the left arrow icon button takes you back to the metalake page.
Clicking on the DETAILS tab shows the details of the metalake on the metalake catalogs page.
On the left side of the page is a tree list, and the icons of the catalog correspond to their type and provider.
- Catalog (e.g. iceberg catalog)
- Schema
- Table
Hover your mouse over the icon of the selected data and it changes to a reload icon. Click on this icon to reload the currently selected data.
Create catalog
Clicking on the CREATE CATALOG button displays the dialog to create a catalog.
Creating a catalog requires these fields:
- Catalog name(required): the name of the catalog
- Type (required): relational / fileset / messaging / model; the default value is relational
- Provider (required):
  - Type relational - hive / iceberg / mysql / postgresql / doris / paimon / hudi / oceanbase
  - Type fileset - hadoop
  - Type messaging - kafka
  - Type model has no provider
- Comment(optional): the comment of this catalog
- Properties (each provider requires its own specific property fields, listed below)
Providers
Required properties in various providers
1. Type relational
- Hive
- Iceberg
- MySQL
- PostgreSQL
- Doris
- Paimon
- Hudi
- OceanBase
Follow the Apache Hive catalog document.
Key | Description |
---|---|
metastore.uris | The Hive metastore URIs e.g. thrift://127.0.0.1:9083 |
Follow the Lakehouse Iceberg catalog document.
The parameter catalog-backend provides two values: hive and jdbc.
Key | Description |
---|---|
catalog-backend | hive , or jdbc |
hive
Key | Description |
---|---|
uri | Iceberg catalog URI config |
warehouse | Iceberg catalog warehouse config |
jdbc
Key | Description |
---|---|
uri | Iceberg catalog URI config |
warehouse | Iceberg catalog warehouse config |
jdbc-driver | "com.mysql.jdbc.Driver" or "com.mysql.cj.jdbc.Driver" for MySQL, "org.postgresql.Driver" for PostgreSQL |
jdbc-user | jdbc username |
jdbc-password | jdbc password |
The parameter authentication.type provides two values: simple and Kerberos.
Kerberos
Key | Description |
---|---|
authentication.type | The type of authentication for the Iceberg catalog backend. Currently, Gravitino only supports Kerberos and simple. |
authentication.kerberos.principal | The principal of the Kerberos authentication. |
authentication.kerberos.keytab-uri | The URI of the keytab for the Kerberos authentication. |
Follow the JDBC MySQL catalog document.
Key | Description |
---|---|
jdbc-driver | The driver class of the JDBC connection, e.g. com.mysql.jdbc.Driver or com.mysql.cj.jdbc.Driver |
jdbc-url | e.g. jdbc:mysql://localhost:3306 |
jdbc-user | The JDBC user name |
jdbc-password | The JDBC password |
Follow the JDBC PostgreSQL catalog document.
Key | Description |
---|---|
jdbc-driver | e.g. org.postgresql.Driver |
jdbc-url | e.g. jdbc:postgresql://localhost:5432/your_database |
jdbc-user | The JDBC user name |
jdbc-password | The JDBC password |
jdbc-database | e.g. pg_database |
Follow the JDBC Doris catalog document.
Key | Description |
---|---|
jdbc-driver | The driver class of the JDBC connection, e.g. com.mysql.jdbc.Driver |
jdbc-url | e.g. jdbc:mysql://localhost:9030 |
jdbc-user | The JDBC user name |
jdbc-password | The JDBC password |
Follow the lakehouse-paimon-catalog document.
The parameter catalog-backend provides three values: filesystem, hive, and jdbc.
Key | Description |
---|---|
catalog-backend | filesystem , hive , or jdbc |
filesystem
Key | Description |
---|---|
warehouse | Paimon catalog warehouse config |
hive
Key | Description |
---|---|
uri | Paimon catalog URI config |
warehouse | Paimon catalog warehouse config |
jdbc
Key | Description |
---|---|
uri | Paimon catalog URI config |
warehouse | Paimon catalog warehouse config |
jdbc-driver | "com.mysql.jdbc.Driver" or "com.mysql.cj.jdbc.Driver" for MySQL, "org.postgresql.Driver" for PostgreSQL |
jdbc-user | jdbc username |
jdbc-password | jdbc password |
The parameter authentication.type provides two values: simple and Kerberos.
Kerberos
Key | Description |
---|---|
authentication.type | The type of authentication for the Paimon catalog backend. Currently, Gravitino only supports Kerberos and simple. |
authentication.kerberos.principal | The principal of the Kerberos authentication. |
authentication.kerberos.keytab-uri | The URI of the keytab for the Kerberos authentication. |
Follow the lakehouse-hudi-catalog document.
Key | Description |
---|---|
catalog-backend | hms |
uri | Hudi catalog URI config |
Follow the jdbc-oceanbase-catalog document.
Key | Description |
---|---|
jdbc-driver | e.g. com.mysql.jdbc.Driver or com.mysql.cj.jdbc.Driver or com.oceanbase.jdbc.Driver |
jdbc-url | e.g. jdbc:mysql://localhost:2881 or jdbc:oceanbase://localhost:2881 |
jdbc-user | The JDBC user name |
jdbc-password | The JDBC password |
Due to the current limitation of the web interface, which only allows viewing, the functionality to create or modify schemas, tables, or filesets is not available. Please refer to the documentation and use the REST API for these operations.
2. Type fileset
- Hadoop
Follow the Hadoop catalog document.
3. Type messaging
- Kafka
Follow the Kafka catalog document.
Key | Description |
---|---|
bootstrap.servers | The Kafka broker(s) to connect to, allowing for multiple brokers by comma-separating them |
After verifying the values of these fields, clicking on the CREATE button creates a catalog.
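As a concrete reference, creating a Hive catalog directly through the REST API looks roughly like the following (a hedged sketch; the metalake name, catalog name, and metastore URI are placeholders, and the exact payload should be checked against the REST API reference):

```shell
# Create a relational catalog backed by a Hive metastore (placeholder values)
curl -X POST \
  -H "Accept: application/vnd.gravitino.v1+json" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "hive_catalog",
        "type": "RELATIONAL",
        "provider": "hive",
        "comment": "demo Hive catalog",
        "properties": {"metastore.uris": "thrift://127.0.0.1:9083"}
      }' \
  http://localhost:8090/api/metalakes/my_metalake/catalogs
```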
Show catalog details
Click on the action icon in the table cell.
You can see the detailed information of this catalog in the drawer component on the right.
Edit catalog
Click on the action icon in the table cell.
Displays the dialog for modifying fields of the selected catalog.
Only the name, comment, and custom fields in properties can be modified; other fields such as type, provider, and the default fields in properties cannot be modified.
The fields that are not allowed to be modified cannot be selected and modified in the web UI.
Disable catalog
A catalog defaults to in-use after successful creation.
Mouse over the switch next to the catalog's name to see the 'In-use' tip.
Clicking on the switch disables the catalog; mouse over the switch again to see the 'Not in-use' tip.
Delete catalog
Click on the action icon in the table cell.
Displays a confirmation dialog; clicking on the SUBMIT button deletes this catalog.
Schema
Click the catalog tree node on the left sidebar or the catalog name link in the table cell.
Displays the list of schemas in the catalog.
Create schema
Clicking on the CREATE SCHEMA button displays the dialog to create a schema.
Creating a schema needs these fields:
- Name(required): the name of the schema.
- Comment(optional): the comment of the schema.
- Properties (optional): Click on the ADD PROPERTY button to add custom properties. (An equivalent REST call is sketched below.)
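A hedged curl sketch of the corresponding REST call (the metalake and catalog names are placeholders; check the REST API reference for the exact payload):

```shell
# Create a schema under an existing catalog (placeholder values)
curl -X POST \
  -H "Accept: application/vnd.gravitino.v1+json" \
  -H "Content-Type: application/json" \
  -d '{"name": "my_schema", "comment": "demo schema", "properties": {}}' \
  http://localhost:8090/api/metalakes/my_metalake/catalogs/hive_catalog/schemas
```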
Show schema details
Click on the action icon in the table cell.
You can see the detailed information of this schema in the drawer component on the right.
Edit schema
Click on the action icon in the table cell.
Displays the dialog for modifying fields of the selected schema.
Drop schema
Click on the action icon in the table cell.
Displays a confirmation dialog; clicking on the DROP button drops this schema.
Table
Click the hive schema tree node on the left sidebar or the schema name link in the table cell.
Displays the list of tables in the schema.
Create table
Clicking on the CREATE TABLE button displays the dialog to create a table.
Creating a table needs these fields:
- Name(required): the name of the table.
- Columns (required):
  - The name and type of each column are required.
  - Only simple types are supported in the UI; complex types can be created via the API.
- Comment(optional): the comment of the table.
- Properties (optional): Click on the ADD PROPERTY button to add custom properties. (An equivalent REST call is sketched below.)
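A hedged curl sketch of the corresponding REST call (names, column types, and path segments are placeholders; check the REST API reference for the exact payload and supported type strings):

```shell
# Create a table with two simple-typed columns (placeholder values)
curl -X POST \
  -H "Accept: application/vnd.gravitino.v1+json" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "my_table",
        "comment": "demo table",
        "columns": [
          {"name": "id", "type": "integer", "comment": "id column", "nullable": true},
          {"name": "name", "type": "varchar(255)", "comment": "name column", "nullable": true}
        ],
        "properties": {}
      }' \
  http://localhost:8090/api/metalakes/my_metalake/catalogs/hive_catalog/schemas/my_schema/tables
```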
Show table details
Click on the action icon in the table cell.
You can see the detailed information of this table in the drawer component on the right.
Click the table tree node on the left sidebar or the table name link in the table cell.
You can see the columns and detailed information on the right page.
Edit table
Click on the action icon in the table cell.
Displays the dialog for modifying fields of the selected table.
Drop table
Click on the action icon in the table cell.
Displays a confirmation dialog; clicking on the DROP button drops this table.
Fileset
Click the fileset schema tree node on the left sidebar or the schema name link in the table cell.
Displays the list of filesets in the schema.
Create fileset
Clicking on the CREATE FILESET button displays the dialog to create a fileset.
Creating a fileset needs these fields:
- Name(required): the name of the fileset.
- Type (required): managed / external; the default value is managed.
- Storage Location (optional):
  - It is optional if the fileset is 'Managed' type and a storage location is already specified at the parent catalog or schema level.
  - It becomes mandatory if the fileset type is 'External' or no storage location is defined at the parent level.
- Comment(optional): the comment of the fileset.
- Properties (optional): Click on the ADD PROPERTY button to add custom properties. (An equivalent REST call is sketched below.)
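A hedged curl sketch of the corresponding REST call (names, the storage location, and path segments are placeholders; the exact field names, such as storageLocation, should be verified against the REST API reference):

```shell
# Create a managed fileset (placeholder values)
curl -X POST \
  -H "Accept: application/vnd.gravitino.v1+json" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "my_fileset",
        "type": "managed",
        "comment": "demo fileset",
        "storageLocation": "hdfs://namenode:8020/user/gravitino/my_fileset",
        "properties": {}
      }' \
  http://localhost:8090/api/metalakes/my_metalake/catalogs/fileset_catalog/schemas/my_schema/filesets
```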
Show fileset details
Click on the action icon in the table cell.
You can see the detailed information of this fileset in the drawer component on the right.
Click the fileset tree node on the left sidebar or the fileset name link in the table cell.
You can see the detailed information on the right page.
Edit fileset
Click on the action icon in the table cell.
Displays the dialog for modifying fields of the selected fileset.
Drop fileset
Click on the action icon in the table cell.
Displays a confirmation dialog; clicking on the DROP button drops this fileset.
Topic
Click the kafka schema tree node on the left sidebar or the schema name link in the table cell.
Displays the list of topics in the schema.
Create topic
Clicking on the CREATE TOPIC button displays the dialog to create a topic.
Creating a topic needs these fields:
- Name(required): the name of the topic.
- Comment(optional): the comment of the topic.
- Properties (optional): Click on the ADD PROPERTY button to add custom properties. (An equivalent REST call is sketched below.)
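A hedged curl sketch of the corresponding REST call (names and path segments are placeholders; a Kafka catalog typically exposes a default schema, and the exact payload should be checked against the REST API reference):

```shell
# Create a topic in a Kafka catalog (placeholder values)
curl -X POST \
  -H "Accept: application/vnd.gravitino.v1+json" \
  -H "Content-Type: application/json" \
  -d '{"name": "my_topic", "comment": "demo topic", "properties": {}}' \
  http://localhost:8090/api/metalakes/my_metalake/catalogs/kafka_catalog/schemas/default/topics
```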
Show topic details
Click on the action icon in the table cell.
You can see the detailed information of this topic in the drawer component on the right.
Click the topic tree node on the left sidebar or the topic name link in the table cell.
You can see the detailed information on the right page.
Edit topic
Click on the action icon in the table cell.
Displays the dialog for modifying fields of the selected topic.
Drop topic
Click on the action icon in the table cell.
Displays a confirmation dialog; clicking on the DROP button drops this topic.
Model
Click the model schema tree node on the left sidebar or the schema name link in the table cell.
Displays the list of models in the schema.
Register model
Clicking on the REGISTER MODEL button displays the dialog to register a model.
Registering a model needs these fields:
- Name(required): the name of the model.
- Comment(optional): the comment of the model.
- Properties (optional): Click on the ADD PROPERTY button to add custom properties. (An equivalent REST call is sketched below.)
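A hedged curl sketch of the corresponding REST call (names and path segments are placeholders; check the REST API reference for the exact payload):

```shell
# Register a model (placeholder values)
curl -X POST \
  -H "Accept: application/vnd.gravitino.v1+json" \
  -H "Content-Type: application/json" \
  -d '{"name": "my_model", "comment": "demo model", "properties": {}}' \
  http://localhost:8090/api/metalakes/my_metalake/catalogs/model_catalog/schemas/my_schema/models
```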
Show model details
Click on the action icon in the table cell.
You can see the detailed information of this model in the drawer component on the right.
Drop model
Click on the action icon in the table cell.
Displays a confirmation dialog; clicking on the DROP button drops this model.
Version
Click the model tree node on the left sidebar or the model name link in the table cell.
Displays the list of versions of the model.
Link version
Clicking on the LINK VERSION button displays the dialog to link a version.
Linking a version needs these fields:
- URI (required): the URI of the version.
- Aliases (required): the aliases of the version; an alias cannot be a number or a numeric string.
- Comment (optional): the comment of the version.
- Properties (optional): Click on the ADD PROPERTY button to add custom properties. (An equivalent REST call is sketched below.)
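A hedged curl sketch of the corresponding REST call (the URI, alias, and path segments are placeholders; the versions endpoint and payload shape should be verified against the REST API reference):

```shell
# Link a new version to an existing model (placeholder values)
curl -X POST \
  -H "Accept: application/vnd.gravitino.v1+json" \
  -H "Content-Type: application/json" \
  -d '{
        "uri": "hdfs://namenode:8020/models/my_model/v1",
        "aliases": ["alpha"],
        "comment": "first linked version",
        "properties": {}
      }' \
  http://localhost:8090/api/metalakes/my_metalake/catalogs/model_catalog/schemas/my_schema/models/my_model/versions
```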
Show version details
Click on the action icon in the table cell.
You can see the detailed information of this version in the drawer component on the right.
Drop version
Click on the action icon in the table cell.
Displays a confirmation dialog; clicking on the DROP button drops this version.
Feature capabilities
Page | Capabilities |
---|---|
Metalake | View ✔ / Create ✔ / Edit ✔ / Delete ✔ |
Catalog | View ✔ / Create ✔ / Edit ✔ / Delete ✔ |
Schema | View ✔ / Create ✔ / Edit ✔ / Delete ✔ |
Table | View ✔ / Create ✔ / Edit ✔ / Delete ✔ |
Fileset | View ✔ / Create ✔ / Edit ✔ / Delete ✔ |
Topic | View ✔ / Create ✔ / Edit ✔ / Delete ✔ |
Model | View ✔ / Create ✔ / Edit ✘ / Delete ✔ |
Version | View ✔ / Create ✔ / Edit ✘ / Delete ✔ |
E2E test
End-to-end testing for web frontends is conducted using the Selenium testing framework, which is Java-based.
Test cases can be found in the project directory integration-test/src/test/java/org/apache/gravitino/integration/test/web/ui, where the pages directory stores the definitions of frontend elements, among other things.
The root directory contains the actual steps for the test cases.
While writing test cases, running them in a local environment may not pose any issues.
However, due to the limited performance capabilities of GitHub Actions, scenarios involving delayed DOM loading—such as the time taken for a popup animation to open—can result in test failures.
To circumvent this issue, it is necessary to manually insert a delay, for instance by adding Thread.sleep(sleepTimeMillis).
This ensures that the test waits for the completion of the delay animation before proceeding with the next operation, thereby avoiding the problem.
It is advisable to use the wait mechanisms provided by Selenium as a substitute for Thread.sleep().