Important: for both dev and prod, first connect with the psql CLI client (remember to use the host flag to specify localhost) and create the database with CREATE DATABASE cruddur; this is a required one-time setup step.
psql -U postgres --host localhost
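For example, the one-time setup can also be done as a psql one-liner (a minimal sketch, assuming the local Postgres container described further down is already running and prompts for the password):
psql -U postgres --host localhost -c "CREATE DATABASE cruddur;"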
Common PSQL Commands:
\x on -- expanded display when looking at data
\q -- Quit PSQL
\l -- List all databases
\c database_name -- Connect to a specific database
\dt -- List all tables in the current database
\d table_name -- Describe a specific table
\du -- List all users and their roles
\dn -- List all schemas in the current database
CREATE DATABASE database_name; -- Create a new database
DROP DATABASE database_name; -- Delete a database
CREATE TABLE table_name (column1 datatype1, column2 datatype2, ...); -- Create a new table
DROP TABLE table_name; -- Delete a table
SELECT column1, column2, ... FROM table_name WHERE condition; -- Select data from a table
INSERT INTO table_name (column1, column2, ...) VALUES (value1, value2, ...); -- Insert data into a table
UPDATE table_name SET column1 = value1, column2 = value2, ... WHERE condition; -- Update data in a table
DELETE FROM table_name WHERE condition; -- Delete data from a table
CREATE EXTENSION IF NOT EXISTS "uuid-ossp"; -- Enable the extension Postgres uses to generate UUIDs
I learned about the RDS PostgreSQL database this week. I first connected to psql via the CLI, then created tables and inserted data into them by running some of the bash scripts. I set up the same connection in both environments, Dev and Prod.
So firstly, I installed the Postgres container, which is defined in my docker-compose.yml file.
db:
  image: postgres:13-alpine
  restart: always
  environment:
    - POSTGRES_USER=postgres
    - POSTGRES_PASSWORD=<enteryourpassword>
  ports:
    - '5432:5432'
  volumes:
    - db:/var/lib/postgresql/data
volumes:
  db:
    driver: local
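With this compose service defined, the database can also be brought up on its own for local work (a minimal usage example; the service name db matches the compose file above):
docker compose up -d db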
Then I connect to psql in my terminal by running psql -U postgres --host localhost;
it asks for the password, and then I am connected to Postgres in the terminal.
As I mentioned above, we have certain bash scripts to create tables, drop tables, and insert data into tables. Before this I had set env vars in Gitpod for the connection URL and the prod connection URL.
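A sketch of how those env vars can be set (the exact values are placeholders and the prod variable name is my assumption here; gp env makes them persist across Gitpod workspaces):
export CONNECTION_URL="postgresql://postgres:<password>@localhost:5432/cruddur"
gp env CONNECTION_URL="postgresql://postgres:<password>@localhost:5432/cruddur"
export PROD_CONNECTION_URL="postgresql://<master-user>:<password>@<rds-endpoint>:5432/cruddur"
gp env PROD_CONNECTION_URL="postgresql://<master-user>:<password>@<rds-endpoint>:5432/cruddur"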
And to connect to the PROD environment, you can suffix the command with prod: ./bin/db-connect prod
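The script itself isn't reproduced in this journal; roughly it boils down to something like this (a sketch, assuming the CONNECTION_URL / PROD_CONNECTION_URL env vars set above):
#! /usr/bin/bash
# use the prod URL when called as ./bin/db-connect prod, otherwise the local one
if [ "$1" = "prod" ]; then
  URL=$PROD_CONNECTION_URL
else
  URL=$CONNECTION_URL
fi
psql $URL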
I created a database instance in the Amazon RDS service. But since it costs money, I stopped it temporarily and ran it only when required. I took the endpoint of that instance for the connection URL, plus the security group ID and security group rule ID, and added those to my ./bin/rds-update-sg-rules
shell script. I also set an inbound rule allowing Postgres (port 5432) from Custom: (my Gitpod IP).
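What that script does, roughly (DB_SG_ID and DB_SG_RULE_ID are my placeholder names for the two IDs mentioned above; the rule simply whitelists the current Gitpod IP on port 5432):
export GITPOD_IP=$(curl -s ifconfig.me)
aws ec2 modify-security-group-rules \
  --group-id $DB_SG_ID \
  --security-group-rules "SecurityGroupRuleId=$DB_SG_RULE_ID,SecurityGroupRule={Description=GITPOD,IpProtocol=tcp,FromPort=5432,ToPort=5432,CidrIpv4=$GITPOD_IP/32}"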
The RDS instance is provisioned with the command below (you need to wait around 10 minutes for it to become available):
aws rds create-db-instance \
--db-instance-identifier cruddur-db-instance \
--db-instance-class db.t3.micro \
--engine postgres \
--engine-version 14.6 \
--master-username root \
--master-user-password huEE33z2Qvl383 \
--allocated-storage 20 \
--availability-zone ca-central-1a \
--backup-retention-period 0 \
--port 5432 \
--no-multi-az \
--db-name cruddur \
--storage-type gp2 \
--publicly-accessible \
--storage-encrypted \
--enable-performance-insights \
--performance-insights-retention-period 7 \
--no-deletion-protection
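Since the instance was only running when required (as noted above), stopping and starting it again is just the following (note that RDS automatically restarts a stopped instance after about 7 days):
aws rds stop-db-instance --db-instance-identifier cruddur-db-instance
aws rds start-db-instance --db-instance-identifier cruddur-db-instance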
All of these tasks allowed the IP we were working from to reach the database for creating tables and inserting data. We used the psycopg3
driver, as you can see at the end of requirements.txt,
to help backend-flask
get the data from Postgres.
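For reference, installing psycopg 3 with the extras this setup typically needs looks like the following (the exact extras listed in my requirements.txt are from memory, so treat them as an assumption):
pip install "psycopg[binary]" "psycopg[pool]"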
NB1: more draft details are in the original repo: https://github.com/omenking/aws-bootcamp-cruddur-2023/blob/main/journal/week04.md?plain=1
NB2: some details can be found only in the video (like how to update the sg_rule, conversion to JSON via PSQL, using the db service
name from docker-compose as the host in the connection URL, ...)
Post Confirmation Lambda: here I added some code that runs when a new user is confirmed on the cruddur app, logging the event and inserting the user into the database. I created the Lambda function using the psycopg2 library (which the code below imports).
Lambda function
import json
import psycopg2
import os

def lambda_handler(event, context):
    # Cognito passes the confirmed user's attributes in the trigger event
    user = event['request']['userAttributes']
    print('userAttributes')
    print(user)

    user_display_name = user['name']
    user_email        = user['email']
    user_handle       = user['preferred_username']
    user_cognito_id   = user['sub']

    conn = None
    try:
        print('entered-try')
        sql = """
            INSERT INTO public.users (
                display_name,
                email,
                handle,
                cognito_user_id
            )
            VALUES(%s, %s, %s, %s)
        """
        print('SQL Statement ----')
        print(sql)
        conn = psycopg2.connect(os.getenv('CONNECTION_URL'))
        cur = conn.cursor()
        params = [
            user_display_name,
            user_email,
            user_handle,
            user_cognito_id
        ]
        # pass params as a single sequence so psycopg2 binds the %s placeholders
        cur.execute(sql, params)
        conn.commit()
    except (Exception, psycopg2.DatabaseError) as error:
        print(error)
    finally:
        if conn is not None:
            cur.close()
            conn.close()
            print('Database connection closed.')
    return event
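The function reads CONNECTION_URL from its environment, so that value has to be configured on the Lambda itself; a sketch (the function name cruddur-post-confirmation is a placeholder, and the value should be the prod connection URL):
aws lambda update-function-configuration \
  --function-name cruddur-post-confirmation \
  --environment "Variables={CONNECTION_URL=$PROD_CONNECTION_URL}"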
In order to make the communication work between the Lambda, Cognito, and RDS (PostgreSQL):
1- Add the trigger (in our case, the post-confirmation trigger that fires when a sign-up is confirmed)
2- Add the VPC (for network access)
3- Make sure we have a good SGR (security group rule) to access RDS
4- Add the permission (...) -- see the sketch below
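For point 4, one way to grant the network permissions a VPC-attached Lambda needs (it has to create and delete ENIs) is to attach the AWS managed policy to the function's execution role; a sketch, with the role name as a placeholder:
aws iam attach-role-policy \
  --role-name <lambda-execution-role> \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaVPCAccessExecutionRole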