nhasabni/lshmatmul-op (forked from tensorflow/custom-op)

Guide for building custom op for TensorFlow

LSH-based implementations of various TensorFlow layers

This repository contains implementations of various TensorFlow layers that can benefit from locality-sensitive hashing (LSH).
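For background, locality-sensitive hashing maps similar inputs to the same hash code with high probability. Below is a minimal random-hyperplane (SimHash) sketch in plain NumPy; it illustrates the general LSH idea only and assumes nothing about this repository's kernels:

```python
import numpy as np

def simhash(x, planes):
    # Take the sign of the projection onto each random hyperplane.
    # Vectors with high cosine similarity fall on the same side of most
    # planes, so they receive the same bits with high probability.
    return tuple(bool(b) for b in (planes @ x) >= 0)

rng = np.random.default_rng(0)
planes = rng.standard_normal((8, 4))  # 8 random hyperplanes in R^4

v = np.array([1.0, 2.0, 3.0, 4.0])
# SimHash depends only on the direction of v, so any positive scaling
# of v produces exactly the same signature.
```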

Quick build and test of the default zero_out op

Build the pip package with make as:

   make zero_out_pip_pkg

Install the pip package as:

   pip3 install artifacts/*.whl

Test zero_out op as:

   cd ..
   python3 -c "import tensorflow as tf; import tensorflow_zero_out; print(tensorflow_zero_out.zero_out([[1, 2], [3, 4]]))"

You should see that the op zeroed out all input elements except the first one:

[[1 0]
 [0 0]]
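For reference, the op's behavior can be mimicked in plain NumPy. This is only an illustration of the expected semantics, not the compiled kernel:

```python
import numpy as np

def zero_out_reference(x):
    # Zero every element except the first one (in flattened order),
    # matching the expected output of the zero_out op shown above.
    x = np.asarray(x)
    out = np.zeros_like(x)
    out.flat[0] = x.flat[0]
    return out
```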

LSH-MatMul build and test (WIP)

Build the pip package with make as:

   make lsh_matmul_pip_pkg

Install the pip package as:

   pip3 install artifacts/*.whl

Test LSHMatMul Keras layer as:

   python examples/lsh_matmul_keras_test.py

You should see the layer's output printed.
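Since the LSH-MatMul kernel is still a work in progress, here is a toy NumPy sketch of the general idea behind LSH-accelerated matrix multiplication: hash the weight rows and the input, and compute dot products only where the hashes collide. This is my illustration of the concept, not this repository's algorithm:

```python
import numpy as np

def lsh_matmul_sketch(x, W, planes):
    # Hash each weight row and the input with random hyperplanes, then
    # compute dot products only for rows whose signature matches the
    # input's, leaving all other output entries at zero.
    x_sig = (planes @ x) >= 0
    out = np.zeros(W.shape[0])
    for i, row in enumerate(W):
        if np.array_equal((planes @ row) >= 0, x_sig):
            out[i] = row @ x
    return out

rng = np.random.default_rng(0)
planes = rng.standard_normal((6, 4))
x = np.array([1.0, 2.0, 3.0, 4.0])
W = rng.standard_normal((5, 4))
```

Every nonzero entry of the result agrees exactly with the dense product `W @ x`; the savings come from skipping rows in non-matching buckets.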
