
# MacGraph with Population-based Training

The population-based training code is imported from the Genetic Curriculum experiment conducted at Octavian.AI. The idea of population-based training originates from DeepMind.
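For context, population-based training runs a population of models in parallel, periodically replacing the weakest members' weights and hyper-parameters with perturbed copies of the strongest. The sketch below illustrates that exploit/explore loop only; the `Worker`, `train_some_steps`, and `exploit_and_explore` names are hypothetical stand-ins, not this repository's API.

```python
import copy
import random

# Illustrative sketch of population-based training, under assumed names.
# Not the code imported into this repository.

class Worker:
    def __init__(self, hparams):
        self.hparams = hparams      # e.g. {"lr": 1e-3}
        self.weights = {}           # model parameters (stand-in)
        self.score = 0.0            # validation accuracy

def train_some_steps(worker):
    # Stand-in for a few thousand optimizer steps plus evaluation.
    worker.score = random.random()

def exploit_and_explore(population, frac=0.2):
    population.sort(key=lambda w: w.score)
    cutoff = max(1, int(len(population) * frac))
    for loser in population[:cutoff]:
        winner = random.choice(population[-cutoff:])
        loser.weights = copy.deepcopy(winner.weights)          # exploit
        loser.hparams = {k: v * random.choice([0.8, 1.2])      # explore
                         for k, v in winner.hparams.items()}

population = [Worker({"lr": 10 ** random.uniform(-4, -2)}) for _ in range(10)]
for generation in range(5):
    for worker in population:
        train_some_steps(worker)
    exploit_and_explore(population)
```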

The core codebase implements graph question answering (GQA), using CLEVR-graph as the dataset and MACnets as the reasoning architecture.
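For readers unfamiliar with the architecture, a MAC cell alternates between a control unit (attending over the question), a read unit (attending over a knowledge base, here graph node features), and a write unit (updating a memory state). Below is a minimal NumPy sketch of one such recurrence; all shapes, initializations, and update rules are simplified stand-ins, not the network implemented in this repository.

```python
import numpy as np

# Minimal sketch of one MAC recurrence (after Hudson & Manning's
# "Compositional Attention Networks for Machine Reasoning").
# Everything here is an illustrative stand-in, not the macgraph code.

d = 8                                   # hidden size
rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def mac_step(control, memory, question_words, knowledge):
    # Control unit: attend over question words to pick the current operation.
    control = softmax(question_words @ control) @ question_words

    # Read unit: attend over knowledge items, conditioned on memory and control.
    scores = (knowledge * (memory * control)).sum(axis=1)
    read = softmax(scores) @ knowledge

    # Write unit: integrate the retrieved information into memory.
    memory = 0.5 * memory + 0.5 * read
    return control, memory

question_words = rng.normal(size=(5, d))    # encoded question tokens
knowledge = rng.normal(size=(20, d))        # encoded graph nodes
control, memory = rng.normal(size=d), np.zeros(d)

for _ in range(4):                          # a handful of reasoning steps
    control, memory = mac_step(control, memory, question_words, knowledge)
```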

## Project status

Apologies that the training data isn't available - I've yet to find a quick solution to hosting it. Once the system works on more question types, I'll publish a stable "all questions" dataset with 1M items. For now, you can easily build your own data - ask David for help.

| Objective | Status | Notes |
|-----------|--------|-------|
| Basic MAC cell structure | Complete | Implemented as per the paper, now diverging to achieve the objectives below |
| Recall station (node) properties | Complete | 99.9% accuracy after 10k training steps |
| Answer if stations are adjacent | Complete | 99% accuracy after 20k training steps |
| Stations N apart | Semi-complete | 98% accuracy up to ~9 apart after 25k training steps |
| Station existence | Complete | 99.9% accuracy after 30k training steps |
| Station with property adjacent | Complete | 98.8% accuracy after 30k training steps `164ddc2` |
| Station adjacent to two other stations | In progress | 98% accuracy after 20k training steps |

For more in-depth information about what works/doesn't work, check out the experiment log.

## Running the code

See RUNNING.md for how to run the network and how to use a cluster to optimize its hyper-parameters.

The short summary of how to train locally:

```shell
$ pipenv install
$ pipenv shell
(mac-graph-sjOzWQ6Y) $ python -m macgraph.train
```

## AOB

### Acknowledgments

Thanks to Drew Hudson and Christopher Manning for publishing their work, *Compositional Attention Networks for Machine Reasoning*, upon which this is based. Thanks also to DeepMind for publishing their Differentiable Neural Computer results in Nature, with a demonstration of that architecture solving graph problems; it is reassuring that this endeavor is not ill-founded.

### A limerick

Since you're here.

There once was an old man of Esser,
Whose knowledge grew lesser and lesser,
It at last grew so small
He knew nothing at all
And now he's a college professor.
