Source Kyriba: Updates CDK, Increases Testing Coverage, Fixes Acceptance Test Config, Adds Expected Records (#34545)
pnilan authored and xiaohansong committed Feb 13, 2024
1 parent 07ebba0 commit e95363e
Showing 19 changed files with 539 additions and 1,081 deletions.
38 changes: 0 additions & 38 deletions airbyte-integrations/connectors/source-kyriba/Dockerfile

This file was deleted.

66 changes: 58 additions & 8 deletions airbyte-integrations/connectors/source-kyriba/README.md
@@ -8,7 +8,7 @@ For information about how to use this connector within Airbyte, see [the documen
### Prerequisites
**To iterate on this connector, make sure to complete this prerequisites section.**

-#### Minimum Python version required `= 3.7.0`
+#### Minimum Python version required `= 3.10.0`

#### Build & Activate Virtual Environment and install dependencies
From this connector directory, create a virtual environment:
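The setup commands themselves sit in the collapsed part of this hunk; a minimal sketch of the usual flow, where the `.venv` name and the editable install with test extras are assumptions rather than part of this diff:

```bash
# From airbyte-integrations/connectors/source-kyriba
python -m venv .venv              # create the virtual environment
source .venv/bin/activate         # activate it (POSIX shells)
pip install -e ".[tests]"         # install the connector plus test extras (assumed packaging layout)
```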
@@ -50,19 +50,70 @@ python main.py read --config secrets/config.json --catalog integration_tests/con
### Locally running the connector docker image


-#### Build
-**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**


#### Use `airbyte-ci` to build your connector
The Airbyte way of building this connector is to use our `airbyte-ci` tool.
You can follow install instructions [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#L1).
Then running the following command will build your connector:

```bash
-airbyte-ci connectors --name=source-kyriba build
+airbyte-ci connectors --name source-kyriba build
```
Once the command is done, you will find your connector image in your local docker registry: `airbyte/source-kyriba:dev`.

##### Customizing our build process
When contributing to our connector, you might need to customize the build process to add a system dependency or set an env var.
You can customize our build process by adding a `build_customization.py` module to your connector.
This module should contain `pre_connector_install` and `post_connector_install` async functions that will mutate the base image and the connector container respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:
```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Feel free to check the dagger documentation for more information on the Container object and its methods.
    # https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
```

-An image will be built with the tag `airbyte/source-kyriba:dev`.

#### Build your own connector image
This connector is built using our dynamic build process in `airbyte-ci`.
The base image used to build it is defined within the metadata.yaml file under the `connectorBuildOptions`.
The build logic is defined using [Dagger](https://dagger.io/) [here](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/pipelines/builds/python_connectors.py).
It does not rely on a Dockerfile.

If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image.
```Dockerfile
FROM airbyte/source-kyriba:latest

COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code

# The entrypoint and default env vars are already set in the base image
# ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
# ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
```
Please use this as an example. This is not optimized.

-**Via `docker build`:**

2. Build your image:
```bash
docker build -t airbyte/source-kyriba:dev .
# Running the spec command against your patched connector
docker run airbyte/source-kyriba:dev spec
```

#### Run
Then run any of the connector commands as follows:
```
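# A sketch of the typical connector commands; the exact invocations are collapsed in
# this diff view, and the mounted paths below are illustrative assumptions.
docker run --rm airbyte/source-kyriba:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-kyriba:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-kyriba:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-kyriba:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json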
```

@@ -97,4 +148,3 @@ You've checked out the repo, implemented a million dollar feature, and you're re
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

@@ -1,30 +1,38 @@
```diff
 # See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference)
 # for more information about how to configure these tests
 connector_image: airbyte/source-kyriba:dev
-tests:
+acceptance_tests:
   spec:
-  - spec_path: "source_kyriba/spec.json"
+    tests:
+      - spec_path: "source_kyriba/spec.json"
   connection:
-  - config_path: "secrets/config.json"
-    status: "succeed"
-  - config_path: "integration_tests/invalid_config.json"
-    status: "failed"
+    tests:
+      - config_path: "secrets/config.json"
+        status: "succeed"
+      - config_path: "integration_tests/invalid_config.json"
+        status: "failed"
   discovery:
-  - config_path: "secrets/config.json"
+    tests:
+      - config_path: "secrets/config.json"
   basic_read:
-  - config_path: "secrets/config.json"
-    configured_catalog_path: "integration_tests/configured_catalog.json"
-    empty_streams: []
-    # TODO uncomment this block to specify that the tests should assert the connector outputs the records provided in the input file a file
-    # expect_records:
-    #   path: "integration_tests/expected_records.jsonl"
-    #   extra_fields: no
-    #   exact_order: no
-    #   extra_records: yes
-  incremental: # TODO if your connector does not implement incremental sync, remove this block
-  - config_path: "secrets/config.json"
-    configured_catalog_path: "integration_tests/configured_catalog.json"
-    future_state_path: "integration_tests/abnormal_state.json"
+    tests:
+      - config_path: "secrets/config.json"
+        timeout_seconds: 1200
+        expect_records:
+          path: "integration_tests/expected_records.jsonl"
+          extra_fields: no
+          exact_order: no
+          extra_records: yes
+        fail_on_extra_columns: true
+  incremental:
+    tests:
+      - config_path: "secrets/config.json"
+        timeout_seconds: 2400
+        configured_catalog_path: "integration_tests/configured_catalog.json"
+        future_state:
+          future_state_path: "integration_tests/abnormal_state.json"
   full_refresh:
-  - config_path: "secrets/config.json"
-    configured_catalog_path: "integration_tests/configured_catalog.json"
+    tests:
+      - config_path: "secrets/config.json"
+        timeout_seconds: 2400
+        configured_catalog_path: "integration_tests/configured_catalog.json"
```
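The acceptance suite configured above is normally driven through `airbyte-ci`; a hedged sketch of the invocation, assuming valid credentials are present in `secrets/config.json`:

```bash
# Run the connector acceptance tests defined in acceptance-test-config.yml
airbyte-ci connectors --name source-kyriba test
```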