Merge branch 'master' into dependabot/pip/src/recommendations/src/recommendations-service/flask-cors-5.0.0
BastLeblanc authored Oct 2, 2024
2 parents af3f745 + 49f2070 commit c457180
Showing 12 changed files with 502 additions and 133 deletions.
10 changes: 9 additions & 1 deletion aws/cloudformation-templates/base/cloudfront.yaml
@@ -257,7 +257,15 @@ Resources:
s3 = boto3.resource('s3')
target_bucket = s3.Bucket(target_bucket_name)
# Sync custom images from SourceBucket to TargetBucket
source_bucket = s3.Bucket(source_bucket_name)
for obj in source_bucket.objects.filter(Prefix=source_path + 'images/'):
    source = { 'Bucket': source_bucket_name, 'Key': obj.key }
    target_key = obj.key.replace(source_path, '')
    print(f'Copying {source} to {target_bucket_name}/{target_key}')
    target_bucket.copy(source, target_key)
# For all files in tmpdirname
for root, dirs, files in os.walk('/tmp/images'):
    for filename in files:
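For context, a minimal standalone sketch of the same S3-to-S3 copy pattern the Lambda snippet above adds (the bucket names and staging path below are hypothetical, not values from this commit):

```python
import boto3

def sync_images(source_bucket_name, target_bucket_name, source_path):
    """Copy every object under <source_path>images/ to the target bucket,
    stripping the staging prefix from the key, as the Lambda code above does."""
    s3 = boto3.resource('s3')
    source_bucket = s3.Bucket(source_bucket_name)
    target_bucket = s3.Bucket(target_bucket_name)
    for obj in source_bucket.objects.filter(Prefix=source_path + 'images/'):
        copy_source = {'Bucket': source_bucket_name, 'Key': obj.key}
        target_key = obj.key.replace(source_path, '')
        print(f'Copying {copy_source} to {target_bucket_name}/{target_key}')
        target_bucket.copy(copy_source, target_key)

# Hypothetical invocation, for illustration only.
sync_images('my-staging-bucket', 'my-webui-bucket', 'retail-demo-store/')
```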
1 change: 1 addition & 0 deletions aws/cloudformation-templates/deployment-support.yaml
@@ -326,6 +326,7 @@ Resources:
              - personalize:Describe*
            Resource:
              - !Sub 'arn:aws:personalize:${AWS::Region}:${AWS::AccountId}:*/retaildemo*'
              - arn:aws:personalize:::recipe/aws*
          - Effect: Allow
            Action:
              - personalize:DescribeEventTracker
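The added `arn:aws:personalize:::recipe/aws*` resource matches the built-in Amazon Personalize recipe ARNs, which carry no region or account ID. A hedged sketch of a call this permission covers (the recipe ARN is one of the standard AWS recipes, used here only as an example):

```python
import boto3

personalize = boto3.client('personalize')
# Built-in recipe ARNs have empty region and account fields, hence the
# bare arn:aws:personalize:::recipe/aws* pattern in the policy above.
response = personalize.describe_recipe(
    recipeArn='arn:aws:personalize:::recipe/aws-user-personalization'
)
print(response['recipe']['name'], response['recipe']['status'])
```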
3 changes: 3 additions & 0 deletions aws/cloudformation-templates/template.yaml
@@ -198,6 +198,8 @@ Parameters:
    Description: >
      S3 bucket name where the Retail Demo Store deployment resources are staged (product images, nested CloudFormation templates, source code snapshot,
      notebooks, deployment Lambda code, etc).
      REQUIRED if you plan to use S3 as the deployment type (the default). The S3 bucket must have versioning enabled for the pipeline to work.
  ResourceBucketRelativePath:
    Type: String
@@ -236,6 +238,7 @@ Parameters:
      Retail Demo Store in your own fork. S3 is useful when you just want to get up and going quickly for a demo or
      evaluation or for workshop scenarios, such as Event Engine, where you want attendees to have their own source
      repositories provisioned.
      REQUIRED: the S3 bucket must have versioning enabled for the pipeline to work.
    AllowedValues:
      - "GitHub"
      - "S3"
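Both parameter descriptions above now state that the staging bucket must have versioning enabled. A minimal boto3 sketch for checking and enabling it (the bucket name is a placeholder):

```python
import boto3

def ensure_versioning_enabled(bucket_name):
    """Enable S3 versioning on the staging bucket if it is not already on."""
    versioning = boto3.resource('s3').BucketVersioning(bucket_name)
    if versioning.status != 'Enabled':
        versioning.enable()
        versioning.reload()  # refresh the cached status after the change
    return versioning.status

print(ensure_versioning_enabled('my-retail-demo-store-staging'))  # placeholder name
```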
File renamed without changes.
31 changes: 17 additions & 14 deletions docs/Deployment/getting-started.md
@@ -42,16 +42,8 @@ git clone https://github.com/aws-samples/retail-demo-store
```

!!! Note
    If you plan to customize the demo, we recommend using your fork
    If you plan to customize the demo, we recommend using your fork instead of the aws-samples one (see [fork this repo](#optional-fork-this-repo))

## Install and configure the AWS CLI:

```bash
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
aws configure
```

## Packages required for building and staging:

@@ -63,6 +55,16 @@ sudo apt install nodejs
sudo apt install npm
```

## Install and configure the AWS CLI:

```bash
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
aws configure
```


## (optional) Fork this Repo

We recommend creating a fork of the Retail Demo Store repository in your own GitHub account. That enables you to customize the code before deployment.
@@ -77,10 +79,10 @@ Save your access token in a secure location, you will use it in the CloudFormation

## Step 2: Create an S3 Staging Bucket

We recommend to create a dedicated bucket for deployment with **versioning enabled.**
Create a dedicated S3 bucket specifically for staging/deployment, and ensure that [**versioning is enabled**](https://docs.aws.amazon.com/AmazonS3/latest/userguide/Versioning.html) for this bucket.

!!! Note
Bucket Region: Your staging bucket must be in the **region** in which you plan to deploy the Retail Demo Store.
> [!IMPORTANT]
> Your staging bucket must be in the **region** in which you plan to deploy the Retail Demo Store.
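A hedged sketch of creating the staging bucket in the intended deployment region with versioning turned on (region and bucket name are placeholders):

```python
import boto3

region = 'eu-west-1'                     # placeholder: your deployment region
bucket = 'my-retail-demo-store-staging'  # placeholder: your staging bucket name

s3 = boto3.client('s3', region_name=region)
# Note: for us-east-1, omit CreateBucketConfiguration entirely.
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={'LocationConstraint': region},
)
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={'Status': 'Enabled'},
)
```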
### Enabling Event Notifications

@@ -156,8 +158,9 @@ All the others will work by default, take the time to read and decide which para

!!! Note

You can also use the command line below. (replace REGION, MY_CUSTOM_BUCKET and S3_PATH value)
You can also use the command line below (replace the REGION, MY_CUSTOM_BUCKET and S3_PATH values).
This script deploys the Retail Demo Store with standard options; you cannot change any parameters directly.

```bash
./scripts/deploy-cloudformation-stacks.sh DEPLOYMENT_S3_BUCKET REGION STACK_NAME
./scripts/deploy-cloudformation-stacks.sh DEPLOYMENT_S3_BUCKET S3_PATH REGION STACK_NAME
```
@@ -1,6 +1,6 @@
# Personalization

[1 - Creating the Account](1-Creating-account.md) > 2 - Personalization
[1 - Creating the Account](../../Deployment/1-Creating-account.md) > 2 - Personalization

Personalized user experiences are implemented across several features within the Retail Demo Store web user interface that demonstrate three core use-cases of Amazon Personalize as well as real-time recommendations.

12 changes: 7 additions & 5 deletions scripts/deploy-cloudformation-stacks.sh
@@ -4,7 +4,7 @@
# You can use the following flags to pre-create resources
#
# Example usage
# ./scripts/deploy-cloudformation-stacks.sh S3_BUCKET REGION [--pre-create-personalize] [--pre-index-elasticsearch]
# ./scripts/deploy-cloudformation-stacks.sh S3_BUCKET S3_PATH REGION STACK_NAME [--pre-create-personalize] [--pre-index-elasticsearch]
#

set -e
@@ -13,7 +13,7 @@ set -e
# Parse arguments and flag
########################################################################################################################################
# The script parses the command line argument and extract these variables:
# 1. "args" contains an array of arguments (e.g. args[0], args[1], etc.) In this script, we use only 2 arguments (S3_BUCKET, REGION)
# 1. "args" contains an array of arguments (e.g. args[0], args[1], args[2], args[3], etc.) In this script, we use only 3 arguments (S3_BUCKET, S3_PATH, REGION, STACK_NAME)
# 2. "pre_create_personalize" contains a boolean value whether "--pre-create-personalize" is presented
# 3. "pre_index_elasticsearch" contains a boolean value whether "--pre-index-elasticsearch" is presented
########################################################################################################################################
@@ -56,13 +56,15 @@
done

S3_BUCKET=${args[0]}
REGION=${args[1]}
STACK_NAME=${args[2]}
S3_PATH=${args[1]}
REGION=${args[2]}
STACK_NAME=${args[3]}

echo "=============================================="
echo "Executing the script with following arguments:"
echo "=============================================="
echo "S3_BUCKET = ${S3_BUCKET}"
echo "S3_PATH = ${S3_PATH}"
echo "pre_create_personalize = ${pre_create_personalize}"
echo "pre_index_elasticsearch = ${pre_index_elasticsearch}"
echo "=============================================="
@@ -88,7 +90,7 @@ aws cloudformation deploy \
SourceDeploymentType="S3" \
AlexaSkillId="" \
AlexaDefaultSandboxEmail="" \
ResourceBucketRelativePath="store/" \
ResourceBucketRelativePath="${S3_PATH}/" \
mParticleSecretKey="" \
AmazonPayPublicKeyId="" \
mParticleApiKey="" \
@@ -1,10 +1,10 @@
boto3==1.34.44
botocore==1.34.44
boto3==1.35.28
botocore==1.35.28
aws-lambda-powertools==2.33.1
jmespath==1.0.1
python-dateutil==2.8.2
s3transfer==0.10.0
six==1.16.0
typing_extensions==4.9.0
urllib3==2.0.7
crhelper
urllib3==2.2.2
crhelper
@@ -1,9 +1,9 @@
opensearch-py==2.4.2
pillow==10.3.0
aws-lambda-powertools==2.33.1
boto3==1.34.44
botocore==1.34.44
certifi==2024.2.2
boto3==1.35.28
botocore==1.35.28
certifi==2024.7.4
charset-normalizer==3.3.2
idna==3.7
jmespath==1.0.1
@@ -15,4 +15,4 @@ typing_extensions==4.9.0
urllib3==2.0.7
pydantic==2.6.2
pydantic_core==2.16.3
crhelper
crhelper
9 changes: 7 additions & 2 deletions src/products/src/products_service/db.py
@@ -112,8 +112,13 @@ def gets(self, ids: list):
        return response['Responses'][self.table.name]

    def get_all(self):
        items = []
        response = self.table.scan()
        return response['Items'] if 'Items' in response else []
        while 'LastEvaluatedKey' in response:
            items.extend(response['Items'])
            response = self.table.scan(ExclusiveStartKey=response['LastEvaluatedKey'])
        items.extend(response['Items'])
        return items

    def upsert(self, item):
        self.table.put_item(Item=item)
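For readability, here is a hedged sketch of what the reworked `get_all` amounts to after this hunk: keep scanning while DynamoDB returns a `LastEvaluatedKey`, since a single `Scan` call returns at most about 1 MB of items (the table name is illustrative):

```python
import boto3

table = boto3.resource('dynamodb').Table('products')  # illustrative table name

def get_all():
    items = []
    response = table.scan()
    # Each Scan call returns at most ~1 MB; follow LastEvaluatedKey to page through.
    while 'LastEvaluatedKey' in response:
        items.extend(response['Items'])
        response = table.scan(ExclusiveStartKey=response['LastEvaluatedKey'])
    items.extend(response['Items'])  # items from the final (or only) page
    return items
```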
@@ -256,4 +261,4 @@ def create_table(self):
            key_schema=[
                {"AttributeName": "id", "KeyType": "HASH"},
            ]
        )
        )

0 comments on commit c457180
