Commit

Merge branch 'djeck1432:main' into test/dbconnector
tosoham authored Oct 25, 2024
2 parents 5b3c49d + 73bd4a4 commit 97da42d
Showing 10 changed files with 241 additions and 7 deletions.
6 changes: 5 additions & 1 deletion .env.dev
@@ -4,4 +4,8 @@ DB_USER=postgres
DB_PASSWORD=password
DB_NAME=spotnet
DB_HOST=db
DB_PORT=5432
DB_PORT=5432

# Redis
REDIS_HOST=redis
REDIS_PORT=6379
31 changes: 31 additions & 0 deletions README.md
@@ -79,3 +79,34 @@ If you have made changes to the code or Docker configuration, rebuild the contai
docker-compose -f docker-compose.dev.yaml up --build
```

## About Celery

This project uses Celery to handle asynchronous tasks. The Celery worker, the Beat scheduler, and the Redis broker are defined in the Docker Compose setup.

### Services Overview

- **Celery Worker**: Executes tasks in the background.
- **Celery Beat**: Schedules periodic tasks.
- **Redis**: Used as the message broker for Celery.

### Running Celery

To start the Celery worker and Celery Beat services, run the following command from your project directory:

```bash
docker-compose up -d celery celery_beat
```

### Stopping Celery

To stop the Celery worker and Celery Beat services, run:

```bash
docker-compose stop celery celery_beat
```

### Purging Celery Tasks

To purge all tasks from the Celery queue, run:

```bash
docker-compose run --rm celery celery -A spotnet_tracker.celery_config purge
```
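
### Verifying the Setup

One way to check that everything is wired together (assuming the services above are already running) is to tail the worker logs and ping the Redis broker; with the current beat schedule, the `test_task` log line should show up roughly every 10 seconds:

```bash
# Follow the Celery worker logs and watch for the periodic test_task output
docker-compose logs -f celery

# Confirm the Redis broker responds (expects "PONG")
docker-compose exec redis redis-cli ping
```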

33 changes: 31 additions & 2 deletions docker-compose.dev.yaml
@@ -56,7 +56,36 @@ services:
    depends_on:
      - backend

volumes:
  postgres_data_dev:

  celery:
    build: .
    command: celery -A spotnet_tracker.celery_config worker --loglevel=INFO
    volumes:
      - .:/app
    depends_on:
      - redis
    networks:
      - app_network

  celery_beat:
    build: .
    command: celery -A spotnet_tracker.celery_config beat --loglevel=INFO
    volumes:
      - .:/app
    depends_on:
      - redis
    networks:
      - app_network

  redis:
    image: redis:latest
    restart: always
    ports:
      - "6379:6379"
    volumes:
      - redis_data_dev:/data
    networks:
      - app_network

volumes:
  postgres_data_dev:
  redis_data_dev:
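
With these services defined, a minimal way to exercise the development stack (a sketch, assuming the dev compose file is used as described in the README) is:

```bash
# Start only the broker and the Celery services from the dev compose file
docker-compose -f docker-compose.dev.yaml up -d redis celery celery_beat

# Rebuild and restart them after changing the code or the Docker configuration
docker-compose -f docker-compose.dev.yaml up -d --build celery celery_beat
```
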
33 changes: 32 additions & 1 deletion docker-compose.yaml
@@ -53,5 +53,36 @@ services:
      timeout: 5s
      retries: 5

  celery:
    build: .
    command: celery -A spotnet_tracker.celery_config worker --loglevel=INFO
    volumes:
      - .:/app
    depends_on:
      - redis
    networks:
      - app_network

  celery_beat:
    build: .
    command: celery -A spotnet_tracker.celery_config beat --loglevel=INFO
    volumes:
      - .:/app
    depends_on:
      - redis
    networks:
      - app_network

  redis:
    image: redis:latest
    restart: always
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    networks:
      - app_network

volumes:
  postgres_data:
  postgres_data:
  redis_data:
4 changes: 3 additions & 1 deletion requirements.txt
@@ -15,4 +15,6 @@ pytest==8.3.3
pytest-asyncio==0.24.0
pytest-env==1.1.5
pytest-mock==3.14.0
httpx==0.27.2
httpx==0.27.2
celery==5.4.0
redis==5.2.0
Empty file added spotnet_tracker/__init__.py
Empty file.
46 changes: 46 additions & 0 deletions spotnet_tracker/celery_config.py
@@ -0,0 +1,46 @@
"""
This module configures the Celery application for the project.
It sets up the Celery instance using Redis as both the message broker
and result backend. The Redis connection details (host and port) are
loaded from environment variables using the `dotenv` library.
Additionally, this module defines a scheduled task configuration that
periodically executes scheduled tasks.
Key Components:
- Loads environment variables using `load_dotenv`.
- Configures Redis connection settings for Celery.
- Defines a Celery beat schedule for recurring tasks.
Usage:
- The Celery app can be imported and used in other parts of the application
to execute tasks or manage workers.
"""

import os

from celery import Celery
from dotenv import load_dotenv

load_dotenv()

# Redis credentials
REDIS_HOST = os.environ.get("REDIS_HOST", "")
REDIS_PORT = os.environ.get("REDIS_PORT", 6379)

app = Celery(
    main="spotnet",
    broker=f"redis://{REDIS_HOST}:{REDIS_PORT}/0",
    backend=f"redis://{REDIS_HOST}:{REDIS_PORT}/0",
)

app.conf.beat_schedule = {
    "test-celery-and-redis": {
        "task": "test_task",
        "schedule": 10,  # run every 10 seconds
    },
}

# Imported after the Celery app is defined so the task module can register
# against it without a circular import.
from .tasks import test_task
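
This configuration registers a single periodic entry, `test-celery-and-redis`, which runs the task named `test_task` every 10 seconds. A quick way to confirm that a worker actually sees the task (a sketch, assuming the Compose services are up and following the same pattern as the purge command in the README) is:

```bash
# List the tasks registered on running workers; "test_task" should appear
docker-compose run --rm celery celery -A spotnet_tracker.celery_config inspect registered
```
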
26 changes: 26 additions & 0 deletions spotnet_tracker/tasks.py
@@ -0,0 +1,26 @@
"""
This module contains Celery tasks for the application.
It imports the logging module to facilitate logging operations and the
Celery app instance from the `celery_config` module.
Tasks:
- test_task: A simple test task that logs a confirmation message.
"""

import logging

from .celery_config import app

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)


@app.task(name="test_task")
def test_task() -> None:
    """
    A task scheduled to check that everything is working as expected.
    :return: None
    """
    # TODO: remove in production
    logger.info("Running test_task. All is working as expected.")
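
Besides the beat schedule, the task can also be queued on demand by name (a sketch, assuming the Compose services are running); a worker should pick it up and log its confirmation message:

```bash
# Send a one-off test_task to the broker
docker-compose run --rm celery celery -A spotnet_tracker.celery_config call test_task
```
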
57 changes: 56 additions & 1 deletion web_app/api/position.py
@@ -4,18 +4,73 @@

from fastapi import APIRouter, HTTPException

from pydantic import BaseModel
from web_app.api.serializers.transaction import (
    LoopLiquidityData,
    RepayTransactionDataResponse,
)
from web_app.api.serializers.position import PositionFormData
from web_app.contract_tools.constants import TokenParams
from web_app.contract_tools.constants import (
    TokenParams,
    TokenMultipliers,
)
from web_app.contract_tools.mixins.deposit import DepositMixin
from web_app.db.crud import PositionDBConnector

router = APIRouter() # Initialize the router
position_db_connector = PositionDBConnector() # Initialize the PositionDBConnector

class TokenMultiplierResponse(BaseModel):
    """
    This class defines the structure of the response for the token multiplier
    endpoint, encapsulating a dictionary where each token symbol
    (e.g., "ETH", "STRK") is mapped to its respective multiplier value.
    ### Parameters:
    - **multipliers**: A dictionary containing token symbols as keys
      (e.g., "ETH", "STRK", "USDC") and their respective multipliers as values.
    ### Returns:
    A structured JSON response with each token and its multiplier.
    """
    multipliers: dict[str, float]

    class Config:
        """
        Metadata for TokenMultiplierResponse
        with an example JSON response format in **schema_extra**.
        """
        schema_extra = {
            "example": {
                "multipliers": {
                    "ETH": 5.0,
                    "STRK": 2.5,
                    "USDC": 5.0,
                }
            }
        }


@router.get(
    "/api/get-multipliers",
    tags=["Position Operations"],
    response_model=TokenMultiplierResponse,
    summary="Get token multipliers",
    response_description="Returns token multipliers",
)
async def get_multipliers() -> TokenMultiplierResponse:
    """
    This endpoint retrieves the multipliers for tokens such as ETH, STRK, and USDC.
    """
    multipliers = {
        "ETH": TokenMultipliers.ETH,
        "STRK": TokenMultipliers.STRK,
        "USDC": TokenMultipliers.USDC,
    }
    return TokenMultiplierResponse(multipliers=multipliers)
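
Once the backend is running, the endpoint can be exercised directly. This is a sketch that assumes the API is served on localhost port 8000; the actual port mapping is not shown in this diff:

```bash
# Fetch the token multipliers; the shape matches the schema_extra example above
curl http://localhost:8000/api/get-multipliers
# {"multipliers": {"ETH": 5.0, "STRK": 2.5, "USDC": 5.0}}
```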


@router.post(
"/api/create-position",
12 changes: 11 additions & 1 deletion web_app/contract_tools/constants.py
@@ -21,12 +21,22 @@ class TokenConfig:
"""
Class to hold the token configuration for the pools.
"""

address: str
decimals: int
name: str


@dataclass(frozen=True)
class TokenMultipliers:
    """
    Class to hold the predefined multipliers for supported tokens.
    """
    ETH: float = 5.0
    STRK: float = 2.5
    USDC: float = 5.0


class TokenParams:
"""
Class to hold the token configurations for tokens as class-level variables.
