Implementation of a Policy Iteration for a simple Markov Decision Process example.

awkbr549/markov-decision-process


To build and run properly, use:
	$ ./run.sh

The Makefile is only used for cleaning (e.g. `make clean`); it does not build the project.
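
Policy iteration alternates two steps: policy evaluation (computing the state-value function of the current policy) and greedy policy improvement, repeating until the policy stops changing. The sketch below illustrates the algorithm on a hypothetical two-state, two-action MDP; it is not the repository's code, and the transition table `P` and discount `GAMMA` are made-up assumptions for the example.

```python
# Illustrative policy iteration on a toy MDP (NOT the repository's example).
# P[s][a] is a list of (probability, next_state, reward) transitions.
GAMMA = 0.9

P = {
    0: {0: [(1.0, 0, 0.0)], 1: [(1.0, 1, 5.0)]},
    1: {0: [(1.0, 0, 0.0)], 1: [(1.0, 1, 1.0)]},
}


def evaluate(policy, theta=1e-8):
    """Iterative policy evaluation: sweep until V changes by less than theta."""
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s in P:
            v = sum(p * (r + GAMMA * V[s2]) for p, s2, r in P[s][policy[s]])
            delta = max(delta, abs(v - V[s]))
            V[s] = v
        if delta < theta:
            return V


def improve(V):
    """Greedy improvement: pick the action maximizing the one-step lookahead."""
    return {
        s: max(P[s], key=lambda a: sum(p * (r + GAMMA * V[s2])
                                       for p, s2, r in P[s][a]))
        for s in P
    }


def policy_iteration():
    """Alternate evaluation and improvement until the policy is stable."""
    policy = {s: 0 for s in P}
    while True:
        V = evaluate(policy)
        new_policy = improve(V)
        if new_policy == policy:
            return policy, V
        policy = new_policy


if __name__ == "__main__":
    policy, V = policy_iteration()
    print("optimal policy:", policy)
    print("values:", V)
```

In this toy MDP the stable policy takes the high-reward transition from state 0 and then cycles back from state 1, since the discounted return of returning to state 0 exceeds the small self-loop reward.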
