ROBE-Z in DLRM

Running UMA/RMA

1. Create the Kaggle/Criteo Terabyte dataset. This generally happens automatically on the first run; a first-run sketch follows below.
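A minimal first-run sketch, assuming this fork keeps the upstream facebookresearch/dlrm data flags (`--data-generation`, `--data-set`, `--raw-data-file`, `--processed-data-file` are upstream DLRM conventions and have not been verified against these bench scripts; the paths are illustrative):

```sh
# First run: point the trainer at the raw Criteo Kaggle file. The
# preprocessed .npz is cached, so later runs reuse it instead of
# rebuilding the dataset.
python3 dlrm_s_pytorch.py \
    --data-generation=dataset \
    --data-set=kaggle \
    --raw-data-file=./input/train.txt \
    --processed-data-file=./input/kaggleAdDisplayChallenge_processed.npz
```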

2. Download the repo https://github.com/apd10/universal_memory_allocation and run python3 setup.py install in that repo, e.g. the sketch below.
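A minimal install sketch (assuming the package ships a standard setup.py; the clone location is illustrative):

```sh
# Clone and install the universal memory allocation package that the
# RMA/ROBE-Z embeddings depend on.
git clone https://github.com/apd10/universal_memory_allocation
cd universal_memory_allocation
python3 setup.py install   # use a virtualenv, or add --user
```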
3. A) Run bench/train_tb_rma_final.sh to train with 1000x compression on the official MLPerf DLRM model, or bench/train_tb_rma_final.chunk32.sh for the same model with chunk size Z = 32.

   B) Run bench/train_kaggle_rma.sh to train with 1000x compression on the Kaggle model (embedding size 16).

   • You can change --rma-size to set the memory budget; see the sketch after this list.
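The invocations below restate the script names from this README; treating --rma-size as a value set inside each bench script (rather than on its command line) is an assumption:

```sh
# Criteo Terabyte, 1000x compression, official MLPerf DLRM model
bash bench/train_tb_rma_final.sh

# Same model, with chunk size Z = 32
bash bench/train_tb_rma_final.chunk32.sh

# Criteo Kaggle model (embedding size 16), 1000x compression
bash bench/train_kaggle_rma.sh

# To change the memory budget, edit the --rma-size value where it is
# passed to the trainer inside the chosen script.
```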

Deep Learning Recommendation Model for Personalization and Recommendation Systems: refer to the original DLRM repo (https://github.com/facebookresearch/dlrm) for a description of DLRM.


This source code is licensed under the MIT license found in the LICENSE file in the root directory of this source tree.
