So I am trying to implement this RNN word generator model in a Jupyter notebook. When I try to use the trained model to generate some words:
```python
with open(os.path.join(cfgs['save_dir'], 'config.pkl'), 'rb') as f:
    saved_args = cPickle.load(f)
with open(os.path.join(cfgs['save_dir'], 'words_vocab.pkl'), 'rb') as f:
    words, vocab = cPickle.load(f)

with tf.Session() as sess:
    model = Model(saved_args, True)
    tf.global_variables_initializer().run()
    saver = tf.train.Saver(tf.global_variables())
    ckpt = tf.train.get_checkpoint_state(cfgs['save_dir'])
    if ckpt and ckpt.model_checkpoint_path:
        saver.restore(sess, ckpt.model_checkpoint_path)
        print(model.sample(sess, words, vocab, cfgs['n'], cfgs['prime'],
                           cfgs['sample'], cfgs['pick'], cfgs['width']))
```
It works the first time, but if I run the cell again I get an error:
ValueError: Variable rnnlm/softmax_w already exists, disallowed. Did you mean to set reuse=True in VarScope?
Right now I have to shut down and restart the notebook to get a new sample. Any idea how to change the code to avoid this?
You could also try adding the line below before the new session is created; it clears the variables defined by the previous run from the default graph, so the model can be rebuilt without a name clash.
tf.reset_default_graph()
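For example, the sampling cell could look roughly like this. This is a minimal sketch: it assumes `cfgs` is defined in an earlier cell as in the original snippet, that `Model` comes from the repo's `model.py`, and that `cPickle` is taken from `six.moves`.

```python
import os
import tensorflow as tf
from six.moves import cPickle

from model import Model  # assumption: the repo's Model class lives in model.py

# Clear variables left over from a previous run of this cell, so that
# rnnlm/softmax_w (and the other model variables) can be created again.
tf.reset_default_graph()

with open(os.path.join(cfgs['save_dir'], 'config.pkl'), 'rb') as f:
    saved_args = cPickle.load(f)
with open(os.path.join(cfgs['save_dir'], 'words_vocab.pkl'), 'rb') as f:
    words, vocab = cPickle.load(f)

with tf.Session() as sess:
    model = Model(saved_args, True)
    tf.global_variables_initializer().run()
    saver = tf.train.Saver(tf.global_variables())
    ckpt = tf.train.get_checkpoint_state(cfgs['save_dir'])
    if ckpt and ckpt.model_checkpoint_path:
        saver.restore(sess, ckpt.model_checkpoint_path)
        print(model.sample(sess, words, vocab, cfgs['n'], cfgs['prime'],
                           cfgs['sample'], cfgs['pick'], cfgs['width']))
```

Alternatively, building the model inside a `with tf.Graph().as_default():` block gives each run its own fresh graph without resetting the global default.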