Version: 0.7.0-incubating

Apache Gravitino Python client

Apache Gravitino is a high-performance, geo-distributed, and federated metadata lake. It manages metadata directly in different sources, types, and regions, and provides users with unified metadata access for data and AI assets.

The Gravitino Python client helps data scientists easily manage metadata using Python.


Usage guidance

You can use the Gravitino Python client library with Spark, PyTorch, TensorFlow, Ray, or a plain Python environment.

First of all, you must have a Gravitino server set up and running. Refer to How to install Gravitino to build the Gravitino server from source code and install it locally.

Apache Gravitino Python client API

Install the client from PyPI:

pip install apache-gravitino
  1. Manage metalake using Gravitino Python API
  2. Manage fileset metadata using Gravitino Python API
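To give a feel for the API, here is a minimal sketch of creating and listing metalakes. It assumes a Gravitino server running locally on the default REST port 8090; the metalake name and comment are placeholders, and exact method signatures may vary between client versions, so treat the linked documents above as the authoritative reference.

    from gravitino import GravitinoAdminClient, GravitinoClient

    # Connect to a locally running Gravitino server (default REST port 8090 assumed).
    admin_client = GravitinoAdminClient(uri="http://localhost:8090")

    # Create a metalake; the name and comment are placeholder values.
    admin_client.create_metalake("example_metalake", "An example metalake", {})

    # List existing metalakes to verify the creation.
    print([metalake.name() for metalake in admin_client.list_metalakes()])

    # Work inside the new metalake with the regular (non-admin) client.
    client = GravitinoClient(uri="http://localhost:8090", metalake_name="example_metalake")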

Apache Gravitino Fileset Example

We offer a playground environment to help you quickly understand how to use the Gravitino Python client to manage non-tabular data on HDFS via filesets in Gravitino. Refer to the document How to use the playground to launch a Gravitino server, HDFS, and a Jupyter notebook environment in your local Docker environment.

Once the playground Docker environment has started, you can open http://localhost:18888/lab/tree/gravitino-fileset-example.ipynb directly in your browser and run the example.

The gravitino-fileset-example notebook contains the following code snippets (a condensed Python sketch of steps 6 to 9 follows the list):

  1. Install the HDFS Python client.
  2. Create an HDFS client, connect to HDFS, and run some test operations.
  3. Install the Gravitino Python client.
  4. Initialize the Gravitino admin client and create a Gravitino metalake.
  5. Initialize the Gravitino client and list metalakes.
  6. Create a Gravitino catalog with type Catalog.Type.FILESET and provider hadoop.
  7. Create a Gravitino schema with its location pointing to an HDFS path, and use the HDFS client to check that the schema location was successfully created in HDFS.
  8. Create a fileset of type Fileset.Type.MANAGED, and use the HDFS client to check that the fileset location was successfully created in HDFS.
  9. Drop this Fileset.Type.MANAGED fileset and check that its location was successfully deleted from HDFS.
  10. Create a fileset of type Fileset.Type.EXTERNAL with its location pointing to an existing HDFS path.
  11. Drop this Fileset.Type.EXTERNAL fileset and check that its location was not deleted from HDFS.
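For orientation, the condensed sketch below mirrors steps 6 to 9 of the notebook: creating a fileset catalog, a schema with an HDFS location, and a managed fileset, then dropping the fileset. The names (example_catalog, example_schema, example_fileset), the HDFS location, and the argument order are illustrative assumptions and may differ slightly between client versions; the notebook itself is the authoritative example.

    from gravitino import Catalog, Fileset, GravitinoClient, NameIdentifier

    client = GravitinoClient(uri="http://localhost:8090", metalake_name="example_metalake")

    # Step 6: create a fileset catalog backed by the "hadoop" provider.
    catalog = client.create_catalog(
        "example_catalog", Catalog.Type.FILESET, "hadoop", "An example fileset catalog", {}
    )

    # Step 7: create a schema whose location points to an HDFS path.
    catalog.as_schemas().create_schema(
        "example_schema", "An example schema", {"location": "hdfs://localhost:9000/user/example"}
    )

    # Steps 8 and 9: create a MANAGED fileset (Gravitino manages its storage location), then drop it.
    fileset_ident = NameIdentifier.of("example_schema", "example_fileset")
    catalog.as_fileset_catalog().create_fileset(
        fileset_ident, "An example managed fileset", Fileset.Type.MANAGED, None, {}
    )
    catalog.as_fileset_catalog().drop_fileset(fileset_ident)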

How to develop the Apache Gravitino Python client

You can use any IDE to develop the Gravitino Python client. Simply open the client-python module project in your IDE.

Prerequisites

Build and testing

  1. Clone the Gravitino project.

    git clone git@github.com:apache/gravitino.git
  2. Build the Gravitino Python client module

    ./gradlew :clients:client-python:build
  3. Run unit tests

    ./gradlew :clients:client-python:test -PskipITs
  4. Run integration tests

Because the Python client connects to a Gravitino server to run integration tests, the ./gradlew compileDistribution -x test command runs automatically to compile the Gravitino project into the distribution directory. When you run integration tests via the Gradle command or your IDE, the Gravitino integration test framework (integration_test_env.py) starts and stops the Gravitino server automatically.

    ./gradlew :clients:client-python:test
  5. Distribute the Gravitino Python client module

    ./gradlew :clients:client-python:distribution
  6. Deploy the Gravitino Python client to https://pypi.org/project/apache-gravitino/

    ./gradlew :clients:client-python:deploy

Resources

License

Gravitino is licensed under the Apache License, Version 2.0. See the LICENSE file for details.

ASF Incubator disclaimer

Apache Gravitino is an effort undergoing incubation at The Apache Software Foundation (ASF), sponsored by the Apache Incubator. Incubation is required of all newly accepted projects until a further review indicates that the infrastructure, communications, and decision making process have stabilized in a manner consistent with other successful ASF projects. While incubation status is not necessarily a reflection of the completeness or stability of the code, it does indicate that the project has yet to be fully endorsed by the ASF.