diff --git a/.github/workflows/unit_tests.yaml b/.github/workflows/unit_tests.yaml
index 481ced2..f4019fd 100644
--- a/.github/workflows/unit_tests.yaml
+++ b/.github/workflows/unit_tests.yaml
@@ -12,6 +12,18 @@ jobs:
python-version: [ "3.9" ]
os: [ ubuntu-latest ]
runs-on: ${{ matrix.os }}
+ services:
+ postgres:
+ image: postgres:13.8-alpine
+ env:
+ POSTGRES_PASSWORD: postgres
+ options: >-
+ --health-cmd pg_isready
+ --health-interval 10s
+ --health-timeout 5s
+ --health-retries 5
+ ports:
+ - 5432:5432
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
@@ -24,4 +36,6 @@ jobs:
- name: Install dependencies
run: poetry install
- name: Unit tests
- run: poetry run pytest tests/unit -p no:warnings --asyncio-mode=strict
+ run: |
+ ISTORE_PG_SCHEMA_IGNORE_THIS=unit_test_$(date +%s) \
+ poetry run pytest tests/unit -p no:warnings --asyncio-mode=strict
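The appended environment variable derives a unique Postgres schema name for each CI run from the current Unix timestamp, mirroring `unit_test_$(date +%s)` in the workflow. A minimal Python sketch of the same naming scheme (the helper name here is hypothetical, not part of the project):

```python
import time

def unique_schema(prefix: str = "unit_test") -> str:
    # A Unix-timestamp suffix keeps repeated CI runs from colliding
    # in the shared Postgres service container.
    return f"{prefix}_{int(time.time())}"

print(unique_schema())
```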
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 7626647..0ae360f 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -1,2 +1,81 @@
# Contributing to Keystone
+## Development using Docker
+
+### Build the Image
+
+Requirements:
+
+* **A WSL2/macOS/Linux shell**: You'll need a Unix-like shell to build and run the container locally.
+  A standard Linux/Unix shell includes GNU Make, which is required to use the shortcut commands
+  in this project's Makefile.
+* **Docker**: You'll need a Docker daemon such as the one included with
+ [Docker Desktop](https://www.docker.com/products/docker-desktop/). This is required to build the container
+ image locally.
+
+To build the image, run the following command:
+
+```shell
+make build-image
+```
+
+### Run the Container Locally
+
+In short, you can use the following Make command to run the container locally:
+
+```shell
+make docker-run-dev
+```
+
+This Make command will expand to the following `docker run` command:
+
+```shell
+docker run \
+ --rm -it --name scim-2-api-python-dev \
+ -p 5001:5001 \
+ --mount type=bind,source=$(pwd)/config/dev.yaml,target=/tmp/config.yaml \
+ --env CONFIG_PATH=/tmp/config.yaml scim-2-api-python-dev:latest
+```
+
+Running the container with the default values will expose the API on port 5001.
+You should now be able to inspect the OpenAPI specification (Swagger docs) by opening [http://localhost:5001](http://localhost:5001)
+in your browser.
+
+## Development using bare Python
+
+This project uses [Poetry](https://python-poetry.org/). Installing Poetry alongside
+[pyenv](https://github.com/pyenv/pyenv) will make it very easy for you to get started
+with local development that doesn't rely on Docker.
+
+After installing Python 3.9 and Poetry, you can follow the steps below to run
+the API locally:
+
+1. Run the following command to spawn a new virtual environment:
+
+ ```shell
+ poetry shell
+ ```
+
+2. Run the following command to install the project dependencies in the virtual environment:
+
+ ```shell
+ poetry install
+ ```
+
+ This will also install the dev dependencies, which you can use for running the tests.
+
+3. Running `poetry install` inside the virtual environment also registers the project's scripts and their aliases,
+   so you can run the project with the following command:
+
+ ```shell
+ LOG_LEVEL=info CONFIG_PATH=./config/dev.yaml aad-scim2-api
+ ```
+
+You should now be able to inspect the OpenAPI specification (Swagger docs) by opening
+[http://localhost:5001](http://localhost:5001) in your browser.
+
+## Implementing a Store
+
+You can implement your own store by extending the [`BaseStore`](./keystone/store/__init__.py)
+class. See the [`CosmosDbStore`](./keystone/store/cosmos_db_store.py) and
+[`MemoryStore`](./keystone/store/memory_store.py) classes for reference implementations.
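The store interface amounts to a handful of async CRUD methods. A self-contained sketch of the shape such an implementation takes (a toy dict-backed class; it only mirrors the method names of the project's `BaseStore` and is not the real class):

```python
import asyncio
from typing import Dict, Optional

class DictStore:
    """Toy store keeping resources in a plain dict, keyed by id."""

    def __init__(self):
        self._data: Dict[str, Dict] = {}

    async def create(self, resource: Dict) -> Dict:
        self._data[resource["id"]] = resource
        return resource

    async def get_by_id(self, resource_id: str) -> Optional[Dict]:
        return self._data.get(resource_id)

    async def delete(self, resource_id: str) -> None:
        self._data.pop(resource_id, None)

async def demo() -> str:
    store = DictStore()
    await store.create({"id": "u1", "userName": "jdoe"})
    user = await store.get_by_id("u1")
    return user["userName"]

print(asyncio.run(demo()))  # prints "jdoe"
```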
diff --git a/Makefile b/Makefile
index ed66809..8bf556b 100644
--- a/Makefile
+++ b/Makefile
@@ -16,7 +16,7 @@ build-image:
.PHONY: unit-tests
unit-tests: export CONFIG_PATH=./config/unit-tests.yaml
unit-tests:
- $(POETRY_BIN) run pytest tests/unit -p no:warnings --cov=keystone --asyncio-mode=strict
+ $(POETRY_BIN) run pytest tests/unit -p no:warnings --cov-report=html --cov=keystone --asyncio-mode=strict
.PHONY: integration-tests-mem-store
integration-tests-mem-store: export CONFIG_PATH=./config/integration-tests-memory-store.yaml
diff --git a/README.md b/README.md
index 9b89437..2b7322a 100644
--- a/README.md
+++ b/README.md
@@ -1,34 +1,50 @@
-# Keystone
-
-
-
-
-
-
-
+
**Keystone** is a fully containerized lightweight SCIM 2.0 API implementation.
-## Getting started
+## Getting Started
-Run the service with zero config to test it:
+Run the container with zero config to test it:
```shell
# Pull the image:
-docker pull yuvalherziger/keystone:latest
+docker pull keystone-scim/keystone:latest
# Run the container:
-docker run -p 5001:5001 yuvalherziger/keystone:latest
+docker run -p 5001:5001 keystone-scim/keystone:latest
```
-See also [Keystone configuration](./config).
+Read the [Keystone documentation](https://keystone-scim.github.io) to understand how you can configure Keystone with
+its different backends.
**What's Keystone?**
**Keystone** implements the SCIM 2.0 REST API. If you run your identity management
operations with an identity manager that supports user provisioning (e.g., Azure AD, Okta, etc.),
-you can use **Keystone** to persist directory changes. Keystone v0.1.0 supports two practical
-persistence layers: Azure Cosmos DB and PostgreSQL.
+you can use **Keystone** to persist directory changes. Keystone v0.1.0 supports two
+persistence layers: PostgreSQL and Azure Cosmos DB.
Key features:
@@ -36,166 +52,19 @@ Key features:
implementation for Users and Groups.
* Stateless container - deploy it anywhere you want (e.g., Kubernetes).
* Pluggable store for users and groups. Current supported storage technologies:
- * Azure Cosmos DB
- * PostgreSQL (>= v.10)
- * In-Memory (for testing purposes only)
+ * [Azure Cosmos DB](https://docs.microsoft.com/en-us/azure/cosmos-db/introduction)
+ * [PostgreSQL](https://www.postgresql.org) (version 10 or higher)
+ * In-Memory (for testing purposes only).
* Azure Key Vault bearer token retrieval.
* Extensible stores.
-TODO: CITEXT caveat: https://docs.microsoft.com/en-us/azure/postgresql/flexible-server/concepts-extensions
-
Can't use Cosmos DB or PostgreSQL? Open an issue and/or consider
[becoming a contributor](./CONTRIBUTING.md).
-A containerized, Python-based [SCIM 2.0 API](https://datatracker.ietf.org/doc/html/rfc7644) implementation in asynchronous
-Python 3.9, using [asyncio](https://docs.python.org/3/library/asyncio.html)
-on top of an [aiohttp](https://docs.aiohttp.org/en/stable/) Server.
-
-This SCIM 2.0 API implementation includes a pluggable user and group store module. Store types are pluggable
-in the sense that the REST API is separated from the caching layer, allowing you to extend this project to include any
-type of user and group store implementation. This namely allows you to have the SCIM 2.0 API persist users and groups
-in any type of store.
-
-**Please note:** The API is currently experimental. Please take that into consideration before
-integrating it into production workloads.
-
-Currently, the API implements the following stores:
-
-* **Azure Cosmos DB Store**: This store implementation reads and writes to an
- [Azure Cosmos DB](https://docs.microsoft.com/en-us/azure/cosmos-db/introduction) container for users and groups.
-* **In-memory Store**: This store implementation reads and writes to an in-memory store.
- Given its ephemeral nature, the in-memory store is meant to be used _strictly_ for development purposes.
- Inherently, the in-memory store shouldn't and cannot be used in a replicated deployment, since
- each node running the container will have its own store.
-
## Configure the API
-See [Keystone Configuration Reference](./config).
-
-You can configure the API in two ways, whilst both can be used in conjunction with one another:
-
-1. **YAML file:** You can mount a volume with a YAML file adhering to the configuration schema (see table below)
- and instruct the container to load the file from a path specified in the `CONFIG_PATH` environment variable.
-
- The following example uses a Cosmos DB store and loads the API bearer token from an Azure key vault,
- both interacting with their respective Azure service with a managed identity (default credentials):
-
- ```yaml
- store:
- type: CosmosDB
- cosmos_account_uri: https://mycosmosdbaccount.documents.azure.com:443/
- authentication:
- akv:
- vault_name: mykeyvault
- secret_name: scim2bearertoken
- ```
-2. **Environment variables:** You can populate some, all, or none of the configuration keys using environment
- variables. All configuration keys can be represented by an environment variable by
- the capitalizing the entire key name and replacing the nesting dot (`.`) annotation with
- an underscore (`_`).
-
- For example, `store.cosmos_account_key` can be populated with the
- `STORE_COSMOS_ACCOUNT_KEY` environment variable in the container the API is running in.
-
-**Please note:**
-
-| **Key** | **Type** | **Description** | **Default Value** |
-|-----------------------------------------------------------------------------------|----------|--------------------------------------------------------------------------------------|------------------------|
-| store.type | string | The persistence layer type. Supported values: `CosmosDB`, `InMemory` | `CosmosDB` |
-| store.tenant_id | string | Azure Tenant ID, if using a Cosmos DB store with Client Secret Credentials auth. | - |
-| store.client_id | string | Azure Client ID, if using a Cosmos DB store with Client Secret Credentials auth. | - |
-| store.secret | string | Azure Client Secret, if using a Cosmos DB store with Client Secret Credentials auth. | - |
-| store.cosmos_account_uri | string | Cosmos Account URI, if using a Cosmos DB store | - |
-| store.cosmos_account_key | string | Cosmos DB account key, if using a Cosmos DB store with Account Key auth. | - |
-| store.cosmos_db_name | string | Cosmos DB database name, if using a Cosmos DB store | `scim_2_identity_pool` |
-| authentication.secret | string | Plain secret bearer token | - |
-| authentication.akv.vault_name | string | AKV name, if bearer token is stored in AKV. | - |
-| authentication.akv.secret_name | string | AKV secret name, if bearer token is stored in AKV. | `scim-2-api-token` |
-| authentication.akv.credentials_client | string | Credentials client type, if bearer token is stored in AKV. | `default` |
-| authentication.akv.force_create | bool | Try to create an AKV secret on startup, if bearer token to be stored in AKV. | `false` |
-
-## Deploy the API
-
-TBA.
+See [Keystone Documentation](https://keystone-scim.github.io).
## Development
-### Development using Docker
-
-#### Build the Image
-
-Requirements:
-
-* **A WSL2/MacOS/Linux shell**: You'll need a Linux shell to build and run the container locally.
- A standard Linux/Unix shell includes GNU Make, which is required in order to use the shortcut commands
- in this project's Makefile.
-* **Docker**: You'll need a Docker daemon such as the one included with
- [Docker Desktop](https://www.docker.com/products/docker-desktop/). This is required to build the container
- image locally.
-
-To build the image, run the following command:
-
-```shell
-make build-image
-```
-
-#### Run the Container Locally
-
-In short, you can use the following Make command to run the container locally:
-
-```shell
-make docker-run-dev
-```
-
-This command will the following expanded command:
-
-```shell
-docker run \
- --rm -it --name scim-2-api-python-dev \
- -p 5001:5001 \
- --mount type=bind,source=$(pwd)/config/dev.yaml,target=/tmp/config.yaml \
- --env CONFIG_PATH=/tmp/config.yaml scim-2-api-python-dev:latest
-```
-
-Running the container with the default values will expose the API on port 5001.
-You should now be able to inspect the OpenAPI specifications (Swagger docs) opening [http://localhost:5001](http://localhost:5001)
-in your browser.
-
-### Development using bare Python
-
-This project uses [Poetry](https://python-poetry.org/). Installing Poetry alongside
-[pyenv](https://github.com/pyenv/pyenv) will make it very easy for you to get started
-with local development that doesn't rely on Docker.
-
-After installing Python 3.9 and Poetry, you can follow the instructions above to run
-the API locally:
-
-1. Run the following command to spawn a new virtual environment:
-
- ```shell
- poetry shell
- ```
-
-2. Run the following command to install the project dependencies in the virtual environment:
-
- ```shell
- poetry install
- ```
-
- This will also install the dev dependencies, which you can use for running the tests.
-
-3. Running `poetry install` inside the virtual environment also registers scripts and their triggering aliases:
- For this reason, you can run the project using the following command:
-
- ```shell
- LOG_LEVEL=info CONFIG_PATH=./config/dev.yaml aad-scim2-api
- ```
-
-You should now be able to inspect the OpenAPI specifications (Swagger docs) opening
-[http://localhost:5001](http://localhost:5001) in your browser.
-
-### Implementing a Store
-
-Implementing your store is possible implementing the [`BaseStore`](./keystone/store/__init__.py)
-class. See [`CosmosDbStore`](./keystone/store/cosmos_db_store.py) and
-[`MemoryStore`](./keystone/store/memory_store.py) classes for implementation references.
+Please see the [Contribution Guide](./CONTRIBUTING.md) to get started.
\ No newline at end of file
diff --git a/keystone/__init__.py b/keystone/__init__.py
index 6005b91..63d080d 100644
--- a/keystone/__init__.py
+++ b/keystone/__init__.py
@@ -6,27 +6,27 @@
_nc = "\033[0m"
VERSION = "0.1.0-rc.1"
-LOGO = f"""
- {_red}7P555555555555555P!{_nc}
- .^!7{_red}^55YYYYYYYYYYYYY55^{_nc}?!^.
- .^!7??7?!{_red}7PYYYYYYYYYYYYYP!{_nc}7?77?7!^.
- .^!7??7!!!!7J{_red}^55YYYYYYYYYYY5Y{_nc}~J7!!!!7??7!^.
- ^~7??7!!!!!!!!!?7{_red}7PYYYYYYYYYYYP!{_nc}7?!!!!!!!!!7??7!^
- ~!7J?!!!!!!!!!!!!!J{_red}~55YYYYYYYYY5Y{_nc}~J!!!!!!!!!!!!!?J7!~
- :????!7?7!!!!!!!!!!~?7{_red}7PYYYYYYYYYP{_nc}!?7~!!!!!!!!!!7?7!????:
- ~J7!!7??7??7!!!!!!!!~!J{_red}~55YYYYYYY5Y{_nc}~J~~!!!!!!!!7?77??7!!7J~
- .7J7!!!!!7?77??7!!!!!~~~??{_red}!5YYYYYYY5{_nc}!??~~~!!!!!7??7??7!!!!!7J7.
- ^??!!!!!!!!!!?77??!!!!777!^...........^!777!!!!??77?!!!!!!!!!!??^
- !J7!!!!!!!!!!~~!?77??7!~: :^!7??77?!~!!!!!!!!!!!7J!
- :?J!!!!!!!!!!!!!~~~7J^: .^J7~~~!!!!!!!!!!!!!J?:
- .77777777!!!!!!!!!~7?^ ^?7~!!!!!!!!!77777777.
- ~J777777777777!!!!?7. .7?!!!!777777777777J~
- 7J!!!!77777777777?~ ~?77777777777!!!!J!
- ??!!!!!!!!!!!!77?^ ^?77!!!!!!!!!!!!??
-.J7!!!!!!!!!!~~~!J: :J!~~~!!!!!!!!!!7J.
-^J7!!!!!!!!!!!!~7? ---------- .J7!!!!!!!!!!!!!7J:
-!J!!!!!!!!!!!!!!?7 KEYSTONE ??!!!!!!!!!!!!!!J~
-J??77!!~~^^::.. ---------- ...::^^~!!777??
+LOGO = """
+ ..............
+ .--------------. :
+ :=#%. -------------: -@%*=.
+ .=*%@@@@+ :------------. #@@@@@%+-.
+ .-*%@@@@@@@@%. ------------ -@@@@@@@@@@#+:
+ *@@@@@@@@@@@@@+ :----------. #@@@@@@@@@@@@@%:
+ -#- -%@@@@@@@@@@@% ---------- :@@@@@@@@@@@@%= :*#.
+ +@@@#. =%@@@@@@@@@@= :--------. *@@@@@@@@@@@+..*@@@%:
+ .#@@@@@@*..*@@@@@@@@@% -------- .@@@@@@@@@@#: +%@@@@@@=
+ -%@@@@@@@@%+ :#@@@@@@@%- ...... :#@@@@@@@#- =%@@@@@@@@@*.
+ +@@@@@@@@@@@@%= -#@%*=: -+#@%= -#@@@@@@@@@@@@%:
+ .#@@@@@@@@@@@@@@@%- : . :#@@@@@@@@@@@@@@@@+
+ .-+*%@@@@@@@@@@@@%= .#@@@@@@@@@@@@%#+-:
+ =#*=:..:=*%@@@@@@%: =%@@@@@%*+-. :-+#%
+ *@@@@@%*=-. :=+#* :#*=-..:=*#@@@@@@:
+ #@@@@@@@@@@%#+-. -+*%@@@@@@@@@@@=
+ @@@@@@@@@@@@@@@+ @@@@@@@@@@@@@@@*
+ :@@@@@@@@@@@@@@@- #@@@@@@@@@@@@@@#
+ =@@@@@@@@@@@@@@@: +@@@@@@@@@@@@@@@
+ -+++++++++++++++ :+++++++++++++++
"""
diff --git a/keystone/store/__init__.py b/keystone/store/__init__.py
index 8c4d27c..37695c4 100644
--- a/keystone/store/__init__.py
+++ b/keystone/store/__init__.py
@@ -22,9 +22,6 @@ async def create(self, resource: Dict):
async def delete(self, resource_id: str):
raise NotImplementedError("Method 'delete' not implemented")
- async def parse_filter_expression(self, expr: str):
- raise NotImplementedError("Method 'parse_filter' not implemented")
-
def clean_up_store(self):
raise NotImplementedError("Method 'clean_up_store' not implemented")
diff --git a/keystone/store/cosmos_db_store.py b/keystone/store/cosmos_db_store.py
index 0c7c0ea..cc313f8 100644
--- a/keystone/store/cosmos_db_store.py
+++ b/keystone/store/cosmos_db_store.py
@@ -197,9 +197,6 @@ async def delete(self, resource_id: str):
pass
return
- async def parse_filter_expression(self, expr: str):
- pass
-
def init_client(self):
account_uri = CONFIG.get("store.cosmos_account_uri")
if not account_uri:
diff --git a/keystone/store/pg_sql_queries.py b/keystone/store/pg_sql_queries.py
index a9013e9..bfd722c 100644
--- a/keystone/store/pg_sql_queries.py
+++ b/keystone/store/pg_sql_queries.py
@@ -1,5 +1,5 @@
-citext_extension = "CREATE EXTENSION IF NOT EXISTS citext WITH SCHEMA {};"
scim_schema = "CREATE SCHEMA IF NOT EXISTS {};"
+citext_extension = "CREATE EXTENSION IF NOT EXISTS citext WITH SCHEMA {};"
users_tbl = """
CREATE TABLE IF NOT EXISTS {}.users (
@@ -50,5 +50,5 @@
CREATE INDEX IF NOT EXISTS user_emails_value_index ON {}.user_emails("value");
"""
-ddl_queries = [citext_extension, scim_schema, users_tbl, users_idx, groups_tbl, groups_idx, users_groups_tbl,
+ddl_queries = [scim_schema, citext_extension, users_tbl, users_idx, groups_tbl, groups_idx, users_groups_tbl,
user_emails_tbl, user_emails_idx]
diff --git a/keystone/store/postgresql_store.py b/keystone/store/postgresql_store.py
index 052a32e..860147c 100644
--- a/keystone/store/postgresql_store.py
+++ b/keystone/store/postgresql_store.py
@@ -1,3 +1,4 @@
+import asyncio
import re
from datetime import datetime
import logging
@@ -25,28 +26,28 @@
CONN_REFRESH_INTERVAL_SEC = 1 * 60 * 60
-def build_dsn():
- host = CONFIG.get("store.pg_host")
- port = CONFIG.get("store.pg_port", 5432)
- username = CONFIG.get("store.pg_username")
- password = CONFIG.get("store.pg_password")
- database = CONFIG.get("store.pg_database")
- ssl_mode = CONFIG.get("store.pg_ssl_mode")
+def build_dsn(**kwargs):
+ host = kwargs.get("host", CONFIG.get("store.pg_host"))
+ port = kwargs.get("port", CONFIG.get("store.pg_port", 5432))
+ username = kwargs.get("username", CONFIG.get("store.pg_username"))
+ password = kwargs.get("password", CONFIG.get("store.pg_password"))
+ database = kwargs.get("database", CONFIG.get("store.pg_database"))
+ ssl_mode = kwargs.get("ssl_mode", CONFIG.get("store.pg_ssl_mode"))
cred = username
if password:
cred = f"{cred}:{urllib.parse.quote(password)}"
return f"postgres://{cred}@{host}:{port}/{database}?sslmode={ssl_mode}"
-def set_up_schema():
+def set_up_schema(**kwargs):
conn = psycopg2.connect(
- dsn=build_dsn()
+ dsn=build_dsn(**kwargs)
)
- schema = CONFIG.get("store.pg_schema")
+ schema = CONFIG.get("store.pg_schema", "public")
cursor = conn.cursor()
for q in ddl_queries:
cursor.execute(q.format(schema))
- conn.commit()
+ conn.commit()
conn.close()
@@ -75,35 +76,45 @@ async def _transform_user(user_record: RowProxy) -> Dict:
class PostgresqlStore(RDBMSStore):
- engine: aiopg.sa.Engine
+ engine: aiopg.sa.Engine = None
schema: str
entity_type: str
nested_store_attr: str
last_conn = None
attr_map = {
- ('userName', None, None): 'users."userName"',
- ('displayName', None, None): 'users."displayName"',
- ('externalId', None, None): 'users."externalId"',
- ('id', None, None): 'users.id',
- ('active', None, None): 'users.active',
- ('emails', None, None): 'c.emails.value',
- ('emails', 'value', None): 'c.emails.value',
- ('value', None, None): 'users_groups."userId"',
+ ("userName", None, None): "users.\"userName\"",
+ ("displayName", None, None): "users.\"displayName\"",
+ ("externalId", None, None): "users.\"externalId\"",
+ ("id", None, None): "users.id",
+ ("active", None, None): "users.active",
+ ("locale", None, None): "users.locale",
+ ("name", None, None): "users.name.formatted",
+ ("emails", None, None): "user_emails.value",
+ ("emails", "value", None): "user_emails.value",
+ ("value", None, None): "users_groups.\"userId\"",
}
- def __init__(self, entity_type: str):
+ def __init__(self, entity_type: str, **conn_args):
self.schema = CONFIG.get("store.pg_schema")
self.entity_type = entity_type
+ self.conn_args = conn_args
async def get_engine(self):
- if not self.last_conn or (datetime.now() - self.last_conn).total_seconds() > CONN_REFRESH_INTERVAL_SEC:
+ if not self.engine or not self.last_conn or (
+ datetime.now() - self.last_conn).total_seconds() > CONN_REFRESH_INTERVAL_SEC:
LOGGER.debug("Establishing new PostgreSQL connection")
self.last_conn = datetime.now()
- self.engine = await create_engine(dsn=build_dsn())
+ self.engine = await create_engine(dsn=build_dsn(**self.conn_args))
LOGGER.debug("Established new PostgreSQL connection")
return self.engine
+ async def term_connection(self):
+ engine = await self.get_engine()
+ if engine:
+ engine.close()
+ return await engine.wait_closed()
+
async def _get_user_by_id(self, user_id: str) -> Dict:
em_agg = text("""
array_agg(json_build_object(
@@ -159,7 +170,6 @@ async def get_by_id(self, resource_id: str):
return await self._get_user_by_id(resource_id)
if self.entity_type == "groups":
return await self._get_group_by_id(resource_id)
- return
async def search(self, _filter: str, start_index: int = 1, count: int = 100) -> tuple[list[Dict], int]:
where = ""
@@ -189,7 +199,6 @@ async def search(self, _filter: str, start_index: int = 1, count: int = 100) ->
where = insensitive_like.sub(" ILIKE ", where)
q = q.where(text(where))
q = q.group_by(text("1,2,3,4,5,6,7,8,9")).offset(start_index - 1).fetch(count)
-
engine = await self.get_engine()
async with engine.acquire() as conn:
users = []
@@ -205,10 +214,9 @@ async def update(self, resource_id: str, **kwargs: Dict):
return await self._update_user(resource_id, **kwargs)
if self.entity_type == "groups":
return await self._update_group(resource_id, **kwargs)
- return
async def _update_user(self, user_id: str, **kwargs: Dict) -> Dict:
- # "id" should be immutable, and "groups" are updated through the groups API:
+ # "id" is immutable, and "groups" are updated through the groups API:
immutable_cols = ["id", "groups"]
for immutable_col in immutable_cols:
if immutable_col in kwargs:
@@ -252,14 +260,13 @@ async def _update_group(self, group_id: str, **kwargs: Dict) -> Dict:
engine = await self.get_engine()
async with engine.acquire() as conn:
_ = await conn.execute(q)
- return await self._get_group_by_id(group_id)
+ return await self._get_group_by_id(group_id)
async def create(self, resource: Dict):
if self.entity_type == "users":
return await self._create_user(resource)
if self.entity_type == "groups":
return await self._create_group(resource)
- return
async def _create_group(self, resource: Dict) -> Dict:
group_id = resource.get("id") or str(uuid.uuid4())
@@ -267,7 +274,7 @@ async def _create_group(self, resource: Dict) -> Dict:
insert_members = None
if len(members) > 0:
insert_members = insert(tbl.users_groups).values([
- {"value": u.get("id") for u in members}
+ {"userId": u.get("value"), "groupId": group_id} for u in members
])
insert_group = insert(tbl.groups).values(
id=group_id,
@@ -277,7 +284,7 @@ async def _create_group(self, resource: Dict) -> Dict:
engine = await self.get_engine()
async with engine.acquire() as conn:
_ = await conn.execute(insert_group)
- if insert_members:
+ if insert_members is not None:
_ = await conn.execute(insert_members)
return await self._get_group_by_id(group_id)
@@ -319,7 +326,6 @@ async def delete(self, resource_id: str):
return await self._delete_user(resource_id)
if self.entity_type == "groups":
return await self._delete_group(resource_id)
- return
async def _delete_user(self, user_id: str):
del_q = [
@@ -354,9 +360,6 @@ async def _delete_group(self, group_id: str):
_ = await conn.execute(del_q)
return {}
- async def parse_filter_expression(self, expr: str):
- raise NotImplementedError("Method 'parse_filter_expression' not implemented")
-
async def remove_users_from_group(self, user_ids: List[str], group_id: str):
user_ids_s = ",".join([f"'{uid}'" for uid in user_ids])
q = delete(tbl.users_groups).where(
@@ -396,7 +399,7 @@ async def search_members(self, _filter: str, group_id: str):
parsed_params = parsed_q.params_dict
for k in parsed_params.keys():
where = where.replace(f"{{{k}}}", f"'{parsed_params[k]}'")
- q = select([tbl.users_groups]).join(tbl.users, tbl.users.c.id == tbl.users_groups.c.userId).\
+ q = select([tbl.users_groups]).join(tbl.users, tbl.users.c.id == tbl.users_groups.c.userId). \
where(and_(tbl.users_groups.c.groupId == group_id, text(where)))
engine = await self.get_engine()
async with engine.acquire() as conn:
@@ -408,5 +411,10 @@ async def search_members(self, _filter: str, group_id: str):
async def clean_up_store(self) -> None:
engine = await self.get_engine()
async with engine.acquire() as conn:
- _ = await conn.execute(text(f"DROP SCHEMA IF EXISTS {self.schema} CASCADE"))
- return
\ No newline at end of file
+ _ = await conn.execute(delete(tbl.users))
+ _ = await conn.execute(delete(tbl.user_emails))
+ _ = await conn.execute(delete(tbl.users_groups))
+ _ = await conn.execute(delete(tbl.groups))
+ if self.schema != "public":
+ _ = await conn.execute(text(f"DROP SCHEMA IF EXISTS {self.schema} CASCADE"))
+ return
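The `build_dsn` change above threads per-call connection overrides through to the DSN. The core URL assembly, including percent-encoding of the password, can be sketched in isolation (a simplified stand-in taking plain arguments rather than the project's `CONFIG` lookups):

```python
import urllib.parse

def build_dsn(host: str, port: int, username: str, password: str,
              database: str, ssl_mode: str) -> str:
    cred = username
    if password:
        # Percent-encode so characters like '@' or ':' in the password
        # don't break the URL structure.
        cred = f"{cred}:{urllib.parse.quote(password)}"
    return f"postgres://{cred}@{host}:{port}/{database}?sslmode={ssl_mode}"

print(build_dsn("localhost", 5432, "postgres", "p@ss:word", "postgres", "disable"))
```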
diff --git a/logo/logo.ascii b/logo/logo.ascii
index 5d502ce..d17961d 100644
--- a/logo/logo.ascii
+++ b/logo/logo.ascii
@@ -1,20 +1,27 @@
- 7P555555555555555P!
- .^!7^55YYYYYYYYYYYYY55^?!^.
- .^!7??7?!7PYYYYYYYYYYYYYP!7?77?7!^.
- .^!7??7!!!!7J^55YYYYYYYYYYY5Y~J7!!!!7??7!^.
- ^~7??7!!!!!!!!!?77PYYYYYYYYYYYP!7?!!!!!!!!!7??7!^
- ~!7J?!!!!!!!!!!!!!J~55YYYYYYYYY5Y~J!!!!!!!!!!!!!?J7!~
- :????!7?7!!!!!!!!!!~?77PYYYYYYYYYP!?7~!!!!!!!!!!7?7!????:
- ~J7!!7??7??7!!!!!!!!~!J~55YYYYYYY5Y~J~~!!!!!!!!7?77??7!!7J~
- .7J7!!!!!7?77??7!!!!!~~~??!5YYYYYYY5!??~~~!!!!!7??7??7!!!!!7J7.
- ^??!!!!!!!!!!?77??!!!!777!^...........^!777!!!!??77?!!!!!!!!!!??^
- !J7!!!!!!!!!!~~!?77??7!~: :^!7??77?!~!!!!!!!!!!!7J!
- :?J!!!!!!!!!!!!!~~~7J^: .^J7~~~!!!!!!!!!!!!!J?:
- .77777777!!!!!!!!!~7?^ ^?7~!!!!!!!!!77777777.
- ~J777777777777!!!!?7. .7?!!!!777777777777J~
- 7J!!!!77777777777?~ ~?77777777777!!!!J!
- ??!!!!!!!!!!!!77?^ ^?77!!!!!!!!!!!!??
-.J7!!!!!!!!!!~~~!J: :J!~~~!!!!!!!!!!7J.
-^J7!!!!!!!!!!!!~7? .J7!!!!!!!!!!!!!7J:
-!J!!!!!!!!!!!!!!?7 ??!!!!!!!!!!!!!!J~
-J??77!!~~^^::.. ...::^^~!!777??
+ #####################
+ &?77777777777777777777Y
+ P!!!!!!!!!!!!!!!!!!!!# #G#&
+ &BGPPP# &7!7777777777777777!Y GPPPPG#&
+ &&BGPPPPPPPG P!777777777777777!7# #PPPPPPPPPGB&
+ BGPPPPPPPPPPPP# &?!77777777777777!5 GPPPPPPPPPPPPPGB&
+ BGPPPPPPPPPPPPPPPPP P!7777777777777!7# #PPPPPPPPPPPPPPPPPPGB#
+ #PPPPPPPPPPPPPPPPPPPPB &?!777777777777!5 GPPPPPPPPPPPPPPPPPPPPB
+ BG# #GPPPPPPPPPPPPPPPPPP& P!77777777777!7& #PPPPPPPPPPPPPPPPPPPB &GP#
+ &GPPPG# #PPPPPPPPPPPPPPPPPB &?!7777777777!5 GPPPPPPPPPPPPPPPPPB &GPPPPG&
+ BPPPPPPPG# BPPPPPPPPPPPPPPPP& P!777777777!7& #PPPPPPPPPPPPPPPPB& &GPPPPPPPP#
+ &GPPPPPPPPPPG& BPPPPPPPPPPPPPPB &?!!!!!!!!!!P GPPPPPPPPPPPPPPB& &GPPPPPPPPPPPB
+ #PPPPPPPPPPPPPPG& BPPPPPPPPPPPPG #GBBBBBBBGB& &GPPPPPPPPPPPPB& &BPPPPPPPPPPPPPPG#
+ &GPPPPPPPPPPPPPPPPPG& &BPPPPPPPGB& &&BGPPPPPPG& &BPPPPPPPPPPPPPPPPPPB
+ #PPPPPPPPPPPPPPPPPPPPPG& &BPGB& &BGG& &BPPPPPPPPPPPPPPPPPPPPPG&
+ BPPPPPPPPPPPPPPPPPPPPPPPPG& BPPPPPPPPPPPPPPPPPPPPPPPPP#
+ BGGPPPPPPPPPPPPPPPPPPPPB &GPPPPPPPPPPPPPPPPPPPPGB#&&
+ #G#&& &BGPPPPPPPPPPPPPP# BPPPPPPPPPPPPPGGB#& &BG&
+ BPPPPPGB#&& BGGPPPPPPG& &GPPPPPGGB#& BGGPPPPP#
+ GPPPPPPPPPPGGB#&& &BG# BB#&& BGGPPPPPPPPPPPB
+ &PPPPPPPPPPPPPPPPPGGB#& BGPPPPPPPPPPPPPPPPPG
+ &PPPPPPPPPPPPPPPPPPPPPP& #PPPPPPPPPPPPPPPPPPPPPP
+ #PPPPPPPPPPPPPPPPPPPPPP &PPPPPPPPPPPPPPPPPPPPPP&
+ BPPPPPPPPPPPPPPPPPPPPPG &PPPPPPPPPPPPPPPPPPPPPP#
+ GPPPPPPPPPPPPPPPPPPPPPB GPPPPPPPPPPPPPPPPPPPPPB
+ &GGGGGGGGGGGGGGGGGGGGGG# BGGGGGGGGGGGGGGGGGGGGGB
+
diff --git a/logo/logo.png b/logo/logo.png
index 4968225..d42d59a 100644
Binary files a/logo/logo.png and b/logo/logo.png differ
diff --git a/poetry.lock b/poetry.lock
index f271150..2ce126a 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -172,7 +172,7 @@ six = ">=1.12.0"
[[package]]
name = "azure-keyvault-secrets"
-version = "4.4.0"
+version = "4.5.1"
description = "Microsoft Azure Key Vault Secrets Client Library for Python"
category = "main"
optional = false
@@ -182,6 +182,7 @@ python-versions = ">=3.6"
azure-common = ">=1.1,<2.0"
azure-core = ">=1.20.0,<2.0.0"
msrest = ">=0.6.21"
+six = ">=1.11.0"
[[package]]
name = "bandit"
@@ -561,7 +562,7 @@ python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7"
[[package]]
name = "pbr"
-version = "5.9.0"
+version = "5.10.0"
description = "Python Build Reasonableness"
category = "dev"
optional = false
@@ -1112,10 +1113,7 @@ azure-identity = [
{file = "azure-identity-1.10.0.zip", hash = "sha256:656e5034d9cef297cf9b35376ed620085273c18cfa52cea4a625bf0d5d2d6409"},
{file = "azure_identity-1.10.0-py3-none-any.whl", hash = "sha256:b386f1ccbea6a48b9ab7e7f162adc456793c345193a7c1a713959562b08dcbbd"},
]
-azure-keyvault-secrets = [
- {file = "azure-keyvault-secrets-4.4.0.zip", hash = "sha256:c0b732db9de855d9c39766067cf43e2f66cdbb28b859f03c787e33924cca82d7"},
- {file = "azure_keyvault_secrets-4.4.0-py3-none-any.whl", hash = "sha256:7143c6e83398a7aba048e44413f7f26b6ce43505afb3e3c89ba62b25f06dd729"},
-]
+azure-keyvault-secrets = []
bandit = [
{file = "bandit-1.7.4-py3-none-any.whl", hash = "sha256:412d3f259dab4077d0e7f0c11f50f650cc7d10db905d98f6520a95a18049658a"},
{file = "bandit-1.7.4.tar.gz", hash = "sha256:2d63a8c573417bae338962d4b9b06fbc6080f74ecd955a092849e1e65c717bd2"},
@@ -1423,10 +1421,7 @@ pathspec = [
{file = "pathspec-0.9.0-py2.py3-none-any.whl", hash = "sha256:7d15c4ddb0b5c802d161efc417ec1a2558ea2653c2e8ad9c19098201dc1c993a"},
{file = "pathspec-0.9.0.tar.gz", hash = "sha256:e564499435a2673d586f6b2130bb5b95f04a3ba06f81b8f895b651a3c76aabb1"},
]
-pbr = [
- {file = "pbr-5.9.0-py2.py3-none-any.whl", hash = "sha256:e547125940bcc052856ded43be8e101f63828c2d94239ffbe2b327ba3d5ccf0a"},
- {file = "pbr-5.9.0.tar.gz", hash = "sha256:e8dca2f4b43560edef58813969f52a56cef023146cbb8931626db80e6c1c4308"},
-]
+pbr = []
platformdirs = [
{file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
{file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
diff --git a/tests/unit/conftest.py b/tests/unit/conftest.py
index 2e4976e..9b129dd 100644
--- a/tests/unit/conftest.py
+++ b/tests/unit/conftest.py
@@ -1,15 +1,36 @@
import random
import uuid
+import asyncio
import names
import pytest
from aiohttp import web
from keystone.store.memory_store import MemoryStore
+from keystone.store.postgresql_store import PostgresqlStore, set_up_schema
from keystone.util.config import Config
from keystone.util.store_util import init_stores
+@pytest.fixture
+def postgresql_stores(event_loop):
+ conn_args = dict(
+ host="localhost",
+ port=5432,
+ username="postgres",
+ password="postgres",
+ ssl_mode="disable",
+ database="postgres",
+ )
+ set_up_schema(**conn_args)
+ user_store = PostgresqlStore("users", **conn_args)
+ group_store = PostgresqlStore("groups", **conn_args)
+ yield user_store, group_store
+ event_loop.run_until_complete(user_store.clean_up_store())
+ event_loop.run_until_complete(user_store.term_connection())
+ event_loop.run_until_complete(group_store.term_connection())
+
+
def generate_random_user():
first_name = names.get_first_name()
last_name = names.get_last_name()
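The `postgresql_stores` fixture added above follows the yield-fixture pattern: everything before `yield` is setup, and the code after it runs at teardown, driving async cleanup on the event loop. The same control flow, stripped of pytest and the real stores (all names below are toy stand-ins):

```python
import asyncio

class FakeStore:
    def __init__(self):
        self.cleaned = False

    async def clean_up_store(self):
        self.cleaned = True

# Generator-based setup/teardown, the same shape as a pytest
# yield-fixture: code before `yield` is setup; code after it
# runs at teardown and awaits the async cleanup on a loop.
def store_fixture():
    loop = asyncio.new_event_loop()
    store = FakeStore()
    yield store
    loop.run_until_complete(store.clean_up_store())
    loop.close()

gen = store_fixture()
store = next(gen)      # setup: obtain the fixture value
try:
    next(gen)          # teardown: resumes past the yield
except StopIteration:
    pass
print(store.cleaned)   # prints "True"
```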
diff --git a/tests/unit/store/test_postgresql_store.py b/tests/unit/store/test_postgresql_store.py
new file mode 100644
index 0000000..fdacdc0
--- /dev/null
+++ b/tests/unit/store/test_postgresql_store.py
@@ -0,0 +1,329 @@
+import uuid
+from random import choice
+
+import asyncio
+import pytest
+from psycopg2.errors import UniqueViolation
+
+from keystone.util.exc import ResourceNotFound
+
+
+class TestPostgreSQLStore:
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_get_user_by_id_fails_for_nonexistent_user(postgresql_stores):
+        user_store, _ = postgresql_stores
+        exc_thrown = False
+        try:
+            _ = await user_store.get_by_id(str(uuid.uuid4()))
+        except ResourceNotFound:
+            exc_thrown = True
+        assert exc_thrown
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_delete_user_by_id_fails_for_nonexistent_user(postgresql_stores):
+        user_store, _ = postgresql_stores
+        exc_thrown = False
+        try:
+            _ = await user_store.delete(str(uuid.uuid4()))
+        except ResourceNotFound:
+            exc_thrown = True
+        assert exc_thrown
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_create_user_success(postgresql_stores, single_user):
+        user_store, _ = postgresql_stores
+        returned_user = await user_store.create(single_user)
+        user_id = returned_user.get("id")
+        looked_up_user = await user_store.get_by_id(user_id)
+        assert looked_up_user.get("id") == user_id
+        assert looked_up_user.get("userName") == returned_user.get("userName")
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_create_user_fails_on_duplicate_username(postgresql_stores, single_user):
+        user_store, _ = postgresql_stores
+        _ = await user_store.create(single_user)
+        duplicate_user = {**single_user}
+        del duplicate_user["id"]
+        exc_thrown = False
+        try:
+            _ = await user_store.create(duplicate_user)
+        except UniqueViolation:
+            exc_thrown = True
+        assert exc_thrown
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_delete_user_success(postgresql_stores, single_user):
+        user_store, _ = postgresql_stores
+        user = await user_store.create(single_user)
+        user_id = user.get("id")
+        _ = await user_store.delete(user_id)
+        exc_thrown = False
+        try:
+            _ = await user_store.get_by_id(user_id)
+        except ResourceNotFound:
+            exc_thrown = True
+        assert exc_thrown
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_create_user_fails_on_duplicate_id(postgresql_stores, users):
+        user_store, _ = postgresql_stores
+        returned_user = await user_store.create(users[0])
+        duplicate_user = {**users[1], "id": returned_user.get("id")}
+        exc_thrown = False
+        try:
+            _ = await user_store.create(duplicate_user)
+        except UniqueViolation:
+            exc_thrown = True
+        assert exc_thrown
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_search_user_by_username(postgresql_stores, single_user):
+        user_store, _ = postgresql_stores
+        username = single_user.get("userName")
+        _ = await user_store.create(single_user)
+        _filter = f"userName Eq \"{username}\""
+        res, count = await user_store.search(_filter)
+        assert 1 == count == len(res)
+        assert res[0].get("userName") == username
+
+        mixed_case_username = "".join(choice((str.upper, str.lower))(c) for c in username)
+        _filter = f"userName Eq \"{mixed_case_username}\""
+        res, count = await user_store.search(_filter)
+        assert 1 == count == len(res)
+        assert res[0].get("userName") == username
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_search_user_by_id(postgresql_stores, single_user):
+        user_store, _ = postgresql_stores
+        user_id = single_user.get("id")
+        _ = await user_store.create(single_user)
+        _filter = f"id Eq \"{user_id}\""
+        res, count = await user_store.search(_filter)
+        assert 1 == count == len(res)
+        assert res[0].get("userName") == single_user.get("userName")
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_search_user_by_email(postgresql_stores, single_user):
+        user_store, _ = postgresql_stores
+        email = single_user.get("userName")  # fixture users use their userName as the primary email value
+        _ = await user_store.create(single_user)
+        _filter = f"emails.value Eq \"{email}\""
+        res, count = await user_store.search(_filter)
+        assert 1 == count == len(res)
+        assert res[0].get("userName") == single_user.get("userName")
+
+        _filter = f"emails Co \"{email}\""
+        res, count = await user_store.search(_filter)
+        assert 1 == count == len(res)
+        assert res[0].get("userName") == single_user.get("userName")
+
+        email_username = email.split("@")[0]
+        _filter = f"emails.value Sw \"{email_username}\""
+        res, count = await user_store.search(_filter)
+        assert 1 == count == len(res)
+        assert res[0].get("userName") == single_user.get("userName")
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_search_user_pagination(postgresql_stores, users):
+        user_store, _ = postgresql_stores
+        _ = await asyncio.gather(*[user_store.create(u) for u in users])
+        email = users[0].get("userName")
+        email_domain = email.split("@")[1]
+        _filter = f"emails.value co \"{email_domain}\""
+        res, count = await user_store.search(_filter, start_index=1, count=3)
+        assert len(users) == count
+        assert 3 == len(res)
+
+        res, count = await user_store.search(_filter, start_index=4, count=3)
+        assert len(users) == count
+        assert 2 == len(res)
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_update_user_success(postgresql_stores, single_user):
+        user_store, _ = postgresql_stores
+        res = await user_store.create(single_user)
+        user_id = res.get("id")
+        update_attr = {
+            "groups": [],  # To be ignored
+            "id": user_id,  # To be ignored
+            "invalidAttribute": "foo",  # To be ignored
+            "name": {
+                "formatted": "John Doe",
+                "givenName": "John",
+                "familyName": "Doe"
+            },
+            "locale": "pt-BR",
+            "displayName": "John Doe",
+            "emails": single_user.get("emails") + [{
+                "value": "johndoe@emailprovider.com",
+                "primary": False,
+                "type": "home"
+            }],
+        }
+        updated_user = await user_store.update(user_id, **update_attr)
+
+        assert "pt-BR" == updated_user.get("locale")
+        assert 2 == len(updated_user.get("emails"))
+        assert "John Doe" == updated_user.get("displayName") == updated_user.get("name").get("formatted")
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_create_group_success(postgresql_stores, single_group):
+        _, group_store = postgresql_stores
+        res = await group_store.create(single_group)
+        group_id = res.get("id")
+
+        group = await group_store.get_by_id(group_id)
+
+        assert single_group.get("displayName") == group.get("displayName")
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_create_group_with_members_success(postgresql_stores, single_group, users):
+        user_store, group_store = postgresql_stores
+        user_res = await asyncio.gather(*[user_store.create(u) for u in users])
+        group_payload = {**single_group, "members": [{
+            "value": u.get("id"),
+            "display": u.get("userName"),
+        } for u in user_res]}
+        res = await group_store.create(group_payload)
+        group_id = res.get("id")
+        group = await group_store.get_by_id(group_id)
+        assert len(users) == len(group.get("members"))
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_get_group_by_id_fails_for_nonexistent_group(postgresql_stores):
+        _, group_store = postgresql_stores
+        exc_thrown = False
+        try:
+            _ = await group_store.get_by_id(str(uuid.uuid4()))
+        except ResourceNotFound:
+            exc_thrown = True
+        assert exc_thrown
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_update_group_metadata_success(postgresql_stores, single_group):
+        _, group_store = postgresql_stores
+        res = await group_store.create(single_group)
+        group_id = res.get("id")
+        update_attr = {
+            "id": group_id,  # To be ignored
+            "displayName": "New Group Name"
+        }
+        group = await group_store.update(group_id, **update_attr)
+        assert "New Group Name" == group.get("displayName")
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_delete_group_success(postgresql_stores, single_group):
+        _, group_store = postgresql_stores
+        group = await group_store.create(single_group)
+        group_id = group.get("id")
+        _ = await group_store.delete(group_id)
+        exc_thrown = False
+        try:
+            _ = await group_store.get_by_id(group_id)
+        except ResourceNotFound:
+            exc_thrown = True
+        assert exc_thrown
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_delete_group_fails_for_nonexistent_group(postgresql_stores):
+        _, group_store = postgresql_stores
+        exc_thrown = False
+        try:
+            _ = await group_store.delete(str(uuid.uuid4()))
+        except ResourceNotFound:
+            exc_thrown = True
+        assert exc_thrown
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_add_users_to_group(postgresql_stores, single_group, users):
+        user_store, group_store = postgresql_stores
+        _ = await asyncio.gather(*[user_store.create(u) for u in users])
+        res = await group_store.create(single_group)
+        group_id = res.get("id")
+        _ = await asyncio.gather(*[group_store.add_user_to_group(u.get("id"), group_id) for u in users])
+        group = await group_store.get_by_id(group_id)
+        assert len(users) == len(group.get("members"))
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_remove_users_from_group(postgresql_stores, single_group, users):
+        user_store, group_store = postgresql_stores
+        _ = await asyncio.gather(*[user_store.create(u) for u in users])
+        res = await group_store.create(single_group)
+        group_id = res.get("id")
+        _ = await asyncio.gather(*[group_store.add_user_to_group(
+            user_id=u.get("id"),
+            group_id=group_id
+        ) for u in users])
+        group = await group_store.get_by_id(group_id)
+        assert len(users) == len(group.get("members"))
+
+        _ = await group_store.remove_users_from_group(
+            user_ids=[u.get("id") for u in users],
+            group_id=group_id
+        )
+        group = await group_store.get_by_id(group_id)
+        assert 0 == len(group.get("members"))
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_set_group_members(postgresql_stores, single_group, users):
+        user_store, group_store = postgresql_stores
+        cohort_1 = users[:2]
+        cohort_2 = users[2:]
+        _ = await asyncio.gather(*[user_store.create(u) for u in users])
+        res = await group_store.create(single_group)
+        group_id = res.get("id")
+        _ = await asyncio.gather(*[group_store.add_user_to_group(u.get("id"), group_id) for u in cohort_1])
+        group = await group_store.get_by_id(group_id)
+        assert 2 == len(group.get("members"))
+        _ = await group_store.set_group_members(
+            user_ids=[u.get("id") for u in cohort_2],
+            group_id=group_id
+        )
+        group = await group_store.get_by_id(group_id)
+        assert len(users) - 2 == len(group.get("members"))
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_search_group_members(postgresql_stores, single_group, users):
+        user_store, group_store = postgresql_stores
+        _ = await asyncio.gather(*[user_store.create(u) for u in users])
+        res = await group_store.create(single_group)
+        first_user_id = users[0].get("id")
+        second_user_id = users[1].get("id")
+        nonexistent_user_id = str(uuid.uuid4())
+        group_id = res.get("id")
+        _ = await asyncio.gather(*[group_store.add_user_to_group(u.get("id"), group_id) for u in users])
+        _filter = f"value eq \"{first_user_id}\" OR value eq \"{second_user_id}\" OR value eq \"{nonexistent_user_id}\""
+        res = await group_store.search_members(_filter=_filter, group_id=group_id)
+        assert 2 == len(res)
+        user_ids = [u["value"] for u in res]
+        assert first_user_id in user_ids
+        assert second_user_id in user_ids
+        assert nonexistent_user_id not in user_ids
+
+    @staticmethod
+    @pytest.mark.asyncio
+    async def test_search_groups(postgresql_stores, single_group):
+        # TODO: implement
+        assert True