The aim of this project is to replicate the results of the paper "Beer Label Classification for Mobile Applications". We attempt to automatically identify beer types using SIFT-based image matching of bottle labels.
The dataset for this project was created by scraping and manually downloading images from Bing and Google.
The dataset directory structure is as follows:
```
images
├── database
├── query
└── samuel_adams
    ├── database
    └── query
```
- The `database` folder consists of images with clean labels.
- The `query` folder contains images of beer bottles with the labels visible.
- Each query image has a corresponding database image, i.e. each beer bottle has its corresponding clean-label image (see the pairing sketch after this list).
- The primary dataset consists of 100 database images and the corresponding 100 query images.
- The `samuel_adams` folder consists of 30 database images and the corresponding 30 query images for the Samuel Adams brewery.
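Because every query image has exactly one matching database image, the correspondence can be enumerated directly from the folder contents. A minimal sketch of that idea, assuming query and database images share identical filenames (the helper below is illustrative and not part of the repo):

```python
import os

def list_pairs(root="images"):
    """Pair each query image with its clean-label database counterpart by filename."""
    db_dir = os.path.join(root, "database")
    q_dir = os.path.join(root, "query")
    db_names = set(os.listdir(db_dir))
    pairs = []
    for name in sorted(os.listdir(q_dir)):
        if name in db_names:  # assumption: identical filenames in both folders
            pairs.append((os.path.join(q_dir, name), os.path.join(db_dir, name)))
    return pairs

print(len(list_pairs()))  # expected: 100 pairs for the primary dataset
```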
Python 3.6+
- Clone the repo to your local machine (~85 MB).
```
git clone https://github.com/Digital-Image-Processing-IIITH/project-dipsum
```
- Move to the project directory.
```
cd project-dipsum
```
- Install the required dependencies.
```
pip install -r requirements.txt
```
- Create a folder for downloading/saving the descriptor file.
```
mkdir lookup
```
Follow these steps to run a quick demo on a single beer bottle image:

- Download the pre-computed SIFT descriptors from the given link (`main_sift.pkl`, ~69 MB) and save them in the `lookup` folder.
- Run the following command:
```
python src/demo.py -l lookup/main_sift.pkl -q images/query/amstel_light.jpg
```
You can use any query image from `images/query/`.

NOTE: This might take approximately 3-5 minutes on a personal laptop.
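Conceptually, the demo matches SIFT descriptors of the query image against the pre-computed database descriptors and reports the closest label. A minimal sketch of this idea, assuming the pickle stores a dict mapping database image names to SIFT descriptor arrays (the exact file layout used by `src/demo.py` may differ):

```python
import pickle
import cv2

# Load pre-computed database descriptors (assumed layout: {image_name: descriptor array}).
with open("lookup/main_sift.pkl", "rb") as f:
    database = pickle.load(f)

# Compute SIFT descriptors for the query image.
sift = cv2.SIFT_create()  # OpenCV >= 4.4; older builds use cv2.xfeatures2d.SIFT_create()
query = cv2.imread("images/query/amstel_light.jpg", cv2.IMREAD_GRAYSCALE)
_, query_des = sift.detectAndCompute(query, None)

# Match against every database image and keep the one with the most ratio-test matches.
bf = cv2.BFMatcher()
best_name, best_score = None, 0
for name, db_des in database.items():
    pairs = bf.knnMatch(query_des, db_des, k=2)
    good = [p for p in pairs if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) > best_score:
        best_name, best_score = name, len(good)

print(f"Best match: {best_name} ({best_score} good matches)")
```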
There are two ways to generate the results for all the query images:

- Execute the end-to-end code `src/main.py`. It first generates SIFT descriptors for all the images in `images/database/`, then iteratively generates SIFT descriptors for the query images in `images/query/` and matches them against the database descriptors (a sketch of the descriptor-caching step follows this list). For this, run the following command:
```
python src/main.py -load N
```
- Download the pre-computed SIFT descriptors from the given link (`main_sift.pkl`, ~69 MB), save them in the `lookup` folder, and run the following command:
```
python src/main.py
```
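The descriptor-generation step can be thought of as building a lookup from each database image to its SIFT descriptors and caching it to disk, which is what the pre-computed `.pkl` files let you skip. A hedged sketch of that idea (the function name, dict layout, and output path are assumptions, not the exact format written by `src/main.py`):

```python
import os
import pickle
import cv2

def build_descriptor_lookup(database_dir="images/database", out_path="lookup/main_sift.pkl"):
    """Compute SIFT descriptors for every database image and cache them in a pickle."""
    sift = cv2.SIFT_create()
    lookup = {}
    for name in sorted(os.listdir(database_dir)):
        img = cv2.imread(os.path.join(database_dir, name), cv2.IMREAD_GRAYSCALE)
        if img is None:
            continue  # skip files OpenCV cannot read
        _, des = sift.detectAndCompute(img, None)
        lookup[name] = des  # descriptors (NumPy arrays) pickle cleanly; keypoints do not
    with open(out_path, "wb") as f:
        pickle.dump(lookup, f)
    return lookup
```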
We also provide an end-to-end script for generating results on the Samuel Adams dataset. Similar to `src/main.py`, this can be done in two ways:
- Execute the end-to-end code `src/samuel_adams_end2end.py`. For this, run the following command:
```
python src/samuel_adams_end2end.py -load N
```
- Download the pre-computed SIFT descriptors from the given link (`samadams_sift.pkl`, ~31 MB), save them in the `lookup` folder, and run the following command:
```
python src/samuel_adams_end2end.py
```
For `src/main.py` and `src/samuel_adams_end2end.py` we provide the following arguments (an illustrative argument-parsing sketch follows the list):

- `-d` : path to the folder containing database images
- `-q` : path to the folder containing query images
- `-l` : path to the pre-computed descriptors file
- `-load` : if Y/y, the pre-computed descriptors are loaded from the file provided via `-l`; otherwise, the end-to-end code is run
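For reference, this interface can be reproduced with a few lines of `argparse`; the defaults below are illustrative guesses, not necessarily the ones used by the scripts:

```python
import argparse

parser = argparse.ArgumentParser(description="SIFT-based beer label matching")
parser.add_argument("-d", default="images/database/", help="folder containing database images")
parser.add_argument("-q", default="images/query/", help="folder containing query images")
parser.add_argument("-l", default="lookup/main_sift.pkl", help="pre-computed descriptors file")
parser.add_argument("-load", default="Y", help="Y/y: load descriptors from -l; otherwise run end-to-end")
args = parser.parse_args()

if args.load.lower() == "y":
    print(f"Loading descriptors from {args.l}")
else:
    print(f"Computing descriptors for images in {args.d}")
```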
NOTE: Running these scripts might take more than a couple of hours on a personal laptop.
| Dataset | No. of Database Images | No. of Query Images | Accuracy (%) |
|---|---|---|---|
| Primary Dataset | 100 | 100 | 100 |
| Samuel Adams | 30 | 30 | 100 |
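Accuracy here is top-1 retrieval accuracy: the fraction of query images whose best-matching database image is the correct clean label. A minimal sketch of the metric, assuming predictions and ground truth are lists of database image names:

```python
def top1_accuracy(predictions, ground_truth):
    """Percentage of queries whose best-matching database image is the correct one."""
    correct = sum(p == g for p, g in zip(predictions, ground_truth))
    return 100.0 * correct / len(ground_truth)

# e.g. 100 queries, all matched to the right label -> 100.0
```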
We provide a Python script to visualize the feature mapping using SIFT:
```
python src/sift_visualization.py
```
It takes two arguments:
- `-d` : path to the database image
- `-q` : path to the query image
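The visualization is conceptually what OpenCV's `drawMatchesKnn` produces for the ratio-test matches between a query and a database image. A minimal sketch of that idea (the image paths are hypothetical, and this is not necessarily how `src/sift_visualization.py` renders its output):

```python
import cv2

db_img = cv2.imread("images/database/amstel_light.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical paths
q_img = cv2.imread("images/query/amstel_light.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(db_img, None)
kp2, des2 = sift.detectAndCompute(q_img, None)

# Keep only matches that pass Lowe's ratio test, then draw them side by side.
pairs = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [[p[0]] for p in pairs if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
vis = cv2.drawMatchesKnn(db_img, kp1, q_img, kp2, good, None,
                         flags=cv2.DrawMatchesFlags_NOT_DRAW_SINGLE_POINTS)
cv2.imwrite("sift_matches.png", vis)
```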
To study the effect of camera motion, an IPython notebook, `CameraMotion.ipynb`, is provided. It examines how camera motion affects the SIFT algorithm; the accompanying graph summarizes the results obtained.
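One way to probe this effect offline is to simulate camera motion with a motion-blur kernel of increasing length and count how many ratio-test matches survive against the unblurred image. A hedged sketch of that experiment (an illustration of the idea, not the code in `CameraMotion.ipynb`; the example image path is hypothetical):

```python
import cv2
import numpy as np

def good_matches(des1, des2, ratio=0.75):
    """Count SIFT matches that pass Lowe's ratio test."""
    pairs = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    return sum(1 for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance)

sift = cv2.SIFT_create()
img = cv2.imread("images/query/amstel_light.jpg", cv2.IMREAD_GRAYSCALE)
_, des_ref = sift.detectAndCompute(img, None)

# Simulate increasing horizontal camera motion with a normalized motion-blur kernel.
for k in (3, 9, 15, 25):
    kernel = np.zeros((k, k), dtype=np.float32)
    kernel[k // 2, :] = 1.0 / k
    blurred = cv2.filter2D(img, -1, kernel)
    _, des_blur = sift.detectAndCompute(blurred, None)
    print(f"blur length {k}: {good_matches(des_ref, des_blur)} surviving matches")
```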
To study the effect of the distance of the beer bottle from the camera:

- Download the pre-computed SIFT descriptors from the given link (`distance_sift.pkl`, ~8.1 MB) and save them in the `lookup` folder.
- Run the following command:
```
python src/main.py -d images/distances/database/ -q images/distances/query/ -l lookup/distance_sift.pkl
```