
Camel K Transformations Example

This example demonstrates how to transform data with Camel K by showing how to deal with common formats like XML and JSON and how to connect to databases.

[Flow diagram of the transformation pipeline]

We will start by reading a CSV file and looping over each row independently. For each row, we will query an XML API and a database, and use all the collected data to build a JSON document. Finally, we will collect and aggregate all rows to build a final JSON document that is stored in a database. The final JSON is also valid GeoJSON.
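
For reference, a GeoJSON document is a JSON object of type FeatureCollection whose features each carry a geometry and a set of properties. A minimal sketch of the kind of structure the final aggregation produces could look like the following (the coordinates and property names here are illustrative, not the exact fields emitted by the example):

{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": { "type": "Point", "coordinates": [ -3.70, 40.42 ] },
      "properties": { "id": "NO2", "info": "Toxic gas" }
    }
  ]
}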

Before you begin

Make sure you check out this repository from Git and open it with VS Code.

The instructions are based on VS Code Didact, so make sure it is installed from the VS Code extensions marketplace.

From the VSCode UI, right-click on the readme.didact.md file and select "Didact: Start Didact tutorial from File". A new Didact tab will be opened in VS Code.

Make sure you've opened this readme file with Didact before jumping to the next section.

Preparing the cluster

This example can be run on any OpenShift 4.3+ cluster or a local development instance (such as CRC). Ensure that you have a cluster available and log in to it using the OpenShift oc command line tool.
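
If you are not sure which cluster you are currently logged in to, you can print the API server URL of the current session with:

oc whoami --show-server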

You can use the following section to check if your environment is configured properly.

Checking requirements

Validate all Requirements at Once!

OpenShift CLI ("oc")

The OpenShift CLI tool ("oc") will be used to interact with the OpenShift cluster.

Check if the OpenShift CLI ("oc") is installed

Status: unknown

Connection to an OpenShift cluster

You need to connect to an OpenShift cluster in order to run the examples.

Check if you're connected to an OpenShift cluster

Status: unknown

We are going to create and use a new project on your cluster to start on a clean environment. This project will be removed at the end of the example.

To create the project, we can use the oc tool we just checked:

oc new-project camel-transformations


Now we can proceed with the next requirement.

Apache Camel K CLI ("kamel")

You need to install the Camel K operator in the camel-transformations project. To do so, go to the OpenShift 4.x web console, log in with a cluster admin account and use the OperatorHub menu item on the left to find and install "Red Hat Integration - Camel K". You will be given the option to install it globally on the cluster or on a specific namespace.

If using a specific namespace, make sure you select the camel-transformations project from the dropdown list. This completes the installation of the Camel K operator (it may take a couple of minutes).

When the operator is installed, from the OpenShift Help menu ("?") at the top of the web console, you can access the "Command Line Tools" page, where you can download the "kamel" CLI, which is required for running this example. The CLI must be installed in your system path.

Refer to the "Red Hat Integration - Camel K" documentation for a more detailed explanation of the installation steps for the operator and the CLI.
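
If you prefer to drive the installation from the command line instead of the web console, a namespace-scoped installation can also be expressed as OLM resources. The manifest below is only a sketch: the package name and channel are assumptions and should be checked against what OperatorHub reports for "Red Hat Integration - Camel K" on your cluster.

apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: camel-transformations-og
  namespace: camel-transformations
spec:
  targetNamespaces:
    - camel-transformations
---
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: red-hat-camel-k
  namespace: camel-transformations
spec:
  name: red-hat-camel-k            # assumed package name
  channel: stable                  # assumed channel, verify in OperatorHub
  source: redhat-operators
  sourceNamespace: openshift-marketplace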

Check if the Apache Camel K CLI ("kamel") is installed

Status: unknown

Optional Requirements

The following requirements are optional. They don't prevent the execution of the demo, but may make it easier to follow.

VS Code Extension Pack for Apache Camel

The VS Code Extension Pack for Apache Camel by Red Hat provides a collection of useful tools for Apache Camel K developers, such as code completion and integrated lifecycle management. It is recommended for this tutorial, but not required.

You can install it from the VS Code Extensions marketplace.

Check if the VS Code Extension Pack for Apache Camel by Red Hat is installed

Status: unknown

1. Preparing the project

First, make sure we are on the right project:

oc project camel-transformations


Before you continue, you should ensure that the Camel K operator is installed:

oc get csv


When Camel K is installed, you should find an entry related to red-hat-camel-k-operator in phase Succeeded.

You can now proceed to the next section.

2. Setting up the complementary database

This example uses a PostgreSQL database, which we will install in the camel-transformations project. Go to the OpenShift 4.x web console and use the OperatorHub menu item on the left-hand side to find and install "PostgreSQL Operator by Dev4Ddevs.com". Installing the operator may take a couple of minutes.

Once the operator is installed, we can create a new database using:

oc create -f test/resources/postgres.yaml


Next, connect to the database pod to create the tables and add the data that will be extracted later:

oc rsh $(oc get pods -l cr=mypostgres -o name)


psql -U camel-k-example example \
-c "CREATE TABLE descriptions (id varchar(10), info varchar(30));
CREATE TABLE measurements (id serial, geojson varchar);
INSERT INTO descriptions (id, info) VALUES ('SO2', 'Nitric oxide is a free radical');
INSERT INTO descriptions (id, info) VALUES ('NO2', 'Toxic gas');"


exit

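If you want to double-check that the seed data is in place, you can reconnect to the pod, run a simple query, and exit again:

oc rsh $(oc get pods -l cr=mypostgres -o name)

psql -U camel-k-example example -c "SELECT * FROM descriptions;"

exit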

3. Running the integration

The integration is all contained in a single file named Transformations.java.

Additional generic support classes (customizers) are present in the customizers directory, to simplify the configuration of PostgreSQL and the CSV dataformat.
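
As a rough sketch of what such a Camel route can look like in Java, the snippet below reads a CSV file, splits it into rows and turns each row into JSON. It is an illustrative assumption only, not the code in Transformations.java, and it presumes the CSV and JSON dataformats are available on the classpath (Camel K resolves such dependencies automatically in most cases):

import org.apache.camel.builder.RouteBuilder;

public class TransformationsSketch extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // Hypothetical sketch of the overall shape: the real Transformations.java
        // additionally enriches each row with data from an XML API and the PostgreSQL
        // database and aggregates all rows into a final GeoJSON document.
        from("file:data?fileName=measurements.csv&noop=true")
            .unmarshal().csv()                 // parse the CSV text into rows
            .split(body())                     // process each row independently
                .marshal().json()              // turn the row into a JSON document
                .log("row as JSON: ${body}")
            .end();
    }
}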

We're ready to run the integration on our camel-transformations project in the cluster.

Use the following command to run it in "dev mode", in order to see the logs in the integration terminal:

kamel run Transformations.java --dev


If everything is OK, after the build phase finishes you should see the Camel integration running and printing the output of each step in the terminal window.

To exit dev mode and terminate the execution, hit Ctrl+C in the terminal window.

Note: when you terminate a "dev mode" execution, the remote integration is deleted as well. This gives the experience of a local program execution, but the integration is actually running in the remote cluster.

To keep the integration running without tying it to the terminal, run it without "dev mode":

kamel run Transformations.java


After executing the command, you should be able to see it among running integrations:

oc get integrations


An integration named transformations should be present in the list and it should be in status Running. There's also a kamel get command which is an alternative way to list all running integrations.

Note: the first time you run the integration, an IntegrationKit (basically, a container image) is created for it, and this phase takes some time to finish. When you run the integration again, the existing IntegrationKit is reused (if possible) and the integration reaches the Running state much faster.
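
IntegrationKits are custom resources themselves, so, assuming your user is allowed to read them, you can list the kits built in the project with:

oc get integrationkits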

Even if it's not running in dev mode, you can still see the logs of the integration using the following command:

kamel log transformations


The last parameter ("transformations") is the name of the running integration for which you want to display the logs.

To terminate the log stream, hit Ctrl+C in the terminal window.

Closing the log does not terminate the integration. It is still running, as you can see with:

oc get integrations


Note: Your IDE may provide an "Apache Camel K Integrations" panel where you can see the list of running integrations and also open a window to display the logs.

4. Uninstall

To clean up everything, execute the following command, which removes the project from OpenShift and drops all resources related to it.

oc delete project camel-transformations

