
Commit

Updated README, CHANGELOG
rk13 committed Aug 16, 2020
1 parent b49c013 commit 6e84836
Showing 3 changed files with 39 additions and 39 deletions.
12 changes: 2 additions & 10 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -5,19 +5,11 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

## [1.0] - 2020-08-16

### Added

- Initial 1.0 release

<!-- Markdown link dfn's -->
[unreleased]: https://github.com/klarna-incubator/TODO/compare/v1.1.0...HEAD
63 changes: 35 additions & 28 deletions README.md
@@ -1,46 +1,53 @@
# flink-connector-jdbc-1.8
> A Java library providing an [Apache Flink](https://flink.apache.org/) JDBC sink connector that can be used with the Flink 1.8 runtime.

The connector code is backported from the latest Flink version (1.11) so that it can be used in [Amazon Kinesis Data Analytics](https://aws.amazon.com/kinesis/data-analytics/) applications.
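Once the artifact is published to a repository your build can resolve (or installed locally), it can be declared as a Maven dependency. The coordinates below are taken from this commit's `pom.xml`:

```xml
<dependency>
    <groupId>com.klarna</groupId>
    <artifactId>flink-connector-jdbc-1.8</artifactId>
    <version>1.0</version>
</dependency>
```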

[![Build Status][ci-image]][ci-url]
[![License][license-image]][license-url]
[![Developed at Klarna][klarna-image]][klarna-url]

At Klarna we use streaming applications extensively. Amazon Kinesis Data Analytics with Flink 1.8 is becoming one of the preferred choices for developing new streaming analytics applications at Klarna. Unfortunately, some of the features added to Apache Flink after version 1.8 are not yet available in Amazon Kinesis Data Analytics.

## First steps

`flink-connector-jdbc-1.8` is a Java library that contains code backported from the `flink-connector-jdbc` library of the latest Flink version (1.11), so that it can be used with Amazon Kinesis Data Analytics / Flink 1.8.

## Usage example

_For more examples and usage, please refer to the [Docs](TODO)._
```java
import com.klarna.org.apache.flink.api.java.io.jdbc.JDBCOptions;
import com.klarna.org.apache.flink.api.java.io.jdbc.JDBCUpsertOutputFormat;
import com.klarna.org.apache.flink.api.java.io.jdbc.JDBCUpsertSinkFunction;

...
env.addSource(createConsumer())
.addSink(new JDBCUpsertSinkFunction(JDBCUpsertOutputFormat.builder()
.setFieldNames(new String[]{
"event_id",
"created_at"
})
.setFieldTypes(new int[]{
Types.VARCHAR,
Types.TIMESTAMP
})
.setFlushIntervalMills(10000)
.setFlushMaxSize(5000)
.setKeyFields(new String[]{ "event_id" })
.setMaxRetryTimes(3)
.setOptions(JDBCOptions.builder()
.setDBUrl(dbUrl)
.setDriverName(Driver.class.getName())
.setUsername(dbUsername)
.setPassword(dbPassword)
.setTableName(tableName)
.build())
.build()));
```
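The `setFlushMaxSize` and `setFlushIntervalMills` parameters above control how the sink batches writes: buffered rows are flushed to the database when either the batch size limit or the flush interval is reached. The sketch below illustrates that buffering policy in plain Java; the class and method names are hypothetical and this is a simplified illustration, not the connector's actual implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the flush policy used by a buffered upsert sink:
// rows accumulate in a buffer and are flushed when either the batch size
// limit or the time interval is exceeded. Not the connector's real code.
class BufferedUpsertSink {
    private final int flushMaxSize;
    private final long flushIntervalMillis;
    private final List<String> buffer = new ArrayList<>();
    private long lastFlushTime = System.currentTimeMillis();
    private int flushCount = 0;

    BufferedUpsertSink(int flushMaxSize, long flushIntervalMillis) {
        this.flushMaxSize = flushMaxSize;
        this.flushIntervalMillis = flushIntervalMillis;
    }

    void invoke(String row) {
        buffer.add(row);
        boolean sizeLimitReached = buffer.size() >= flushMaxSize;
        boolean intervalElapsed =
                System.currentTimeMillis() - lastFlushTime >= flushIntervalMillis;
        if (sizeLimitReached || intervalElapsed) {
            flush();
        }
    }

    void flush() {
        // In the real connector this would execute a batched upsert statement.
        buffer.clear();
        lastFlushTime = System.currentTimeMillis();
        flushCount++;
    }

    int getFlushCount() { return flushCount; }
    int getBufferSize() { return buffer.size(); }
}
```

With `flushMaxSize = 2`, writing three rows triggers exactly one flush after the second row, leaving the third buffered until the next size or interval trigger.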

## Development setup

This project uses [Maven](https://maven.apache.org/) to set up the development environment. The recommended way to build and install the library is:

```sh
mvn clean install
```

## How to contribute
3 changes: 2 additions & 1 deletion pom.xml
@@ -6,14 +6,15 @@

<groupId>com.klarna</groupId>
<artifactId>flink-connector-jdbc-1.8</artifactId>
<version>1.0-SNAPSHOT</version>
<version>1.0</version>

<packaging>jar</packaging>

<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<scala.binary.version>2.12</scala.binary.version>
<flink.version>1.8.2</flink.version>
</properties>

<dependencies>
