This project lets users play the classic Snake game online. It integrates Redis for connection-data management, a Flask API for game-data retrieval, Kafka for event processing, Flink SQL for real-time analysis, and ClickHouse for data storage. A dashboard built with Chart.js displays player rankings, refreshed every 5 seconds.
- Online Snake game with score recording.
- Real-time analytics with Kafka, Flink SQL, and ClickHouse.
- Interactive dashboard with automatic updates.
- Confluent Cloud account: If you do not have one, you can create it here. The trial is free for more than 30 days, and no credit card is required.
- ClickHouse Cloud: You can also try ClickHouse for free at ClickHouse Free Trial.
- Redis: You can also try Redis for free at Redis Free Trial.
Alternatively, if you prefer, you can deploy a local Kafka and Redis cluster using Docker Compose.
- Clone the repository:
git clone https://github.com/Stefen-Taime/Real-Time-Data-Pipeline-Snake-Game.git
- Navigate to the cloned directory:
cd Real-Time-Data-Pipeline-Snake-Game
- Your directory should look like this:
```
.
├── app.py
├── dashboard
│   ├── index.html
│   ├── package.json
│   ├── package-lock.json
│   ├── scoreboard.css
│   ├── scoreboard.js
│   └── unnamed.png
├── Dockerfile
├── flink-cluster
│   ├── docker-compose.yml
│   ├── jobs
│   │   └── job.sql
│   ├── LICENSE
│   ├── README.md
│   └── sql-client
│       └── Dockerfile
├── requirements.txt
├── static
│   ├── img.jpg
│   ├── snake.js
│   └── style.css
└── templates
    └── index.html
```
To configure ClickHouse to import real-time data from Kafka, follow these steps:
- Access the ClickHouse web console.
- Open the SQL console.
- On the left-hand side of the interface, select the 'Import' option.
- Choose 'Kafka' as the import source.
- Enter the necessary credentials:
- API Key
- API Secret
- Servers
- Integration Name
- In the next step, select the 'SUMMARY_STATS_TOPIC' topic in JSON format.
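Once the import is running, you can spot-check that rows are actually landing in ClickHouse. The following is a minimal sketch using the clickhouse-connect Python client; the host, credentials, and the table name `summary_stats` are placeholders for whatever your ClickHouse Cloud service and import created.

```python
# Hedged sketch: verify that the Kafka import is landing rows in ClickHouse.
# The host, credentials, and table name "summary_stats" are placeholders --
# substitute the values from your own ClickHouse Cloud service and import.
import clickhouse_connect

client = clickhouse_connect.get_client(
    host="your-instance.clickhouse.cloud",  # placeholder host
    username="default",
    password="your-password",
    secure=True,
)

# Count the rows ingested so far and peek at a few of them.
row_count = client.query("SELECT count() FROM summary_stats").result_rows[0][0]
print(f"rows ingested: {row_count}")

for row in client.query("SELECT * FROM summary_stats LIMIT 5").result_rows:
    print(row)
```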
- Go to Confluent Cloud and create two topics: `game_over_topic` and `SUMMARY_STATS_TOPIC`.
- In the `app.py` file, fill in the connection values for Redis and Kafka (a minimal sketch of these values follows the Docker commands below).
- Build and start the Flask API and game server:

```
docker build -t my-flask-app .
docker run -p 5000:5000 my-flask-app
```

Once done, navigate to `localhost:5000` to see the game interface.
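The exact structure of `app.py` is in the repository; the snippet below is only a hedged sketch of what the Redis and Kafka connection values typically look like, using the `redis` and `confluent-kafka` Python packages. Every hostname, key, secret, and field name here is a placeholder, not the project's actual configuration.

```python
# Hedged sketch of the connection values expected in app.py.
# All hosts, keys, and secrets below are placeholders, not real credentials.
import json
import redis
from confluent_kafka import Producer

# Redis connection (Redis Cloud or the local Docker Compose instance).
redis_client = redis.Redis(
    host="your-redis-host",
    port=6379,
    password="your-redis-password",
    decode_responses=True,
)

# Kafka producer configured for Confluent Cloud (SASL_SSL + PLAIN).
producer = Producer({
    "bootstrap.servers": "your-cluster.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "your-api-key",
    "sasl.password": "your-api-secret",
})

# Example: publish a game-over event (field names are illustrative only).
producer.produce("game_over_topic", value=json.dumps({"player": "alice", "score": 42}))
producer.flush()
```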
- In a separate terminal, navigate to the `flink-cluster` directory and start the Flink cluster locally:

```
docker-compose up --build -d
```

- Submit the Flink job:

```
docker exec -it <container_id> /opt/flink/bin/sql-client.sh embedded -f job.sql
```

You can check `localhost:8081` to see if the job is running correctly (a consumer sketch for spot-checking the output topic follows below).
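To confirm that the Flink job is actually writing aggregated results, you can also consume a few messages from `SUMMARY_STATS_TOPIC`. This is a hedged verification sketch with the confluent-kafka Python client; the connection values are placeholders, and the message contents depend on whatever `job.sql` emits.

```python
# Hedged sketch: read a few messages from SUMMARY_STATS_TOPIC to confirm the
# Flink job is producing output. Connection values are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "your-cluster.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "your-api-key",
    "sasl.password": "your-api-secret",
    "group.id": "summary-stats-check",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["SUMMARY_STATS_TOPIC"])

try:
    for _ in range(10):
        msg = consumer.poll(timeout=5.0)
        if msg is None or msg.error():
            continue
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```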
- Navigate to the `dashboard` directory and execute the dashboard application to view real-time player rankings (a hedged endpoint sketch follows at the end of this guide):

```
npm install chart.js
python -m http.server
```

- Access it on port 8000.
- Refresh the page to switch users when playing the game on port 5000.
- You can also check your topics after each game over to view the data.
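The dashboard polls player rankings every 5 seconds; how it fetches them is defined in `scoreboard.js` and `app.py`. The snippet below is only a hedged sketch of what such a Flask endpoint could look like if it reads rankings from ClickHouse. The route name, table, and column names are assumptions, not the repository's actual implementation.

```python
# Hedged sketch of a Flask endpoint the dashboard could poll for rankings.
# The route, table name, and columns are illustrative assumptions.
import clickhouse_connect
from flask import Flask, jsonify

app = Flask(__name__)
ch = clickhouse_connect.get_client(
    host="your-instance.clickhouse.cloud",  # placeholder host
    username="default",
    password="your-password",
    secure=True,
)

@app.route("/rankings")
def rankings():
    # Return the top 10 players by best score (schema is assumed).
    result = ch.query(
        "SELECT player, max(score) AS best FROM summary_stats "
        "GROUP BY player ORDER BY best DESC LIMIT 10"
    )
    return jsonify([{"player": p, "best": b} for p, b in result.result_rows])

if __name__ == "__main__":
    app.run(port=5000)
```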