Add curiosity-driven exploration to PPO.
It's been shown [citation needed] that curiosity-driven exploration improves agents' performance in sparse-reward environments.
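For concreteness, a minimal sketch of the ICM-style idea from the Pathak et al. paper: a forward model's prediction error in a learned feature space is added as an intrinsic bonus to the environment reward before the PPO update. This is not an existing implementation in this repo; the PyTorch module below, its layer sizes, and the coefficient names (`eta`, `beta`) are illustrative assumptions only.

```python
# Illustrative ICM-style curiosity bonus (after Pathak et al., 2017).
# All module names, sizes, and coefficients here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ICM(nn.Module):
    """Intrinsic Curiosity Module: the forward model's prediction error in a
    learned feature space is used as a bonus reward for the policy."""

    def __init__(self, obs_dim: int, n_actions: int, feat_dim: int = 64):
        super().__init__()
        # phi(s): encodes observations into a feature space shaped by the inverse model.
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim, 128), nn.ReLU(), nn.Linear(128, feat_dim)
        )
        # Forward model: predicts phi(s') from phi(s) and a one-hot action.
        self.forward_model = nn.Sequential(
            nn.Linear(feat_dim + n_actions, 128), nn.ReLU(), nn.Linear(128, feat_dim)
        )
        # Inverse model: predicts the action from (phi(s), phi(s')), keeping the
        # features focused on what the agent can actually influence.
        self.inverse_model = nn.Sequential(
            nn.Linear(2 * feat_dim, 128), nn.ReLU(), nn.Linear(128, n_actions)
        )
        self.n_actions = n_actions

    def forward(self, obs, next_obs, actions):
        phi, phi_next = self.encoder(obs), self.encoder(next_obs)
        one_hot = F.one_hot(actions, self.n_actions).float()
        phi_next_pred = self.forward_model(torch.cat([phi, one_hot], dim=-1))
        action_logits = self.inverse_model(torch.cat([phi, phi_next], dim=-1))
        # Per-transition forward prediction error = intrinsic reward.
        pred_error = 0.5 * (phi_next_pred - phi_next.detach()).pow(2).sum(dim=-1)
        forward_loss = pred_error.mean()
        inverse_loss = F.cross_entropy(action_logits, actions)
        return pred_error.detach(), forward_loss, inverse_loss


if __name__ == "__main__":
    # Tiny smoke test with random transitions (obs_dim=8, 4 discrete actions).
    icm = ICM(obs_dim=8, n_actions=4)
    obs, next_obs = torch.randn(32, 8), torch.randn(32, 8)
    actions = torch.randint(0, 4, (32,))
    r_int, fwd_loss, inv_loss = icm(obs, next_obs, actions)
    # In a PPO update (hypothetical wiring): rewards = r_ext + eta * r_int, and the
    # ICM loss beta * fwd_loss + (1 - beta) * inv_loss is optimized alongside the
    # PPO objective (beta is about 0.2 in the ICM paper; eta scales the bonus).
    print(r_int.shape, fwd_loss.item(), inv_loss.item())
```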
Citation: Deepak Pathak, Pulkit Agrawal, Alexei A. Efros, and Trevor Darrell. Curiosity-driven Exploration by Self-supervised Prediction. ICML 2017; Yuri Burda, Harri Edwards, Deepak Pathak, Amos Storkey, Trevor Darrell, and Alexei A. Efros. Large-Scale Study of Curiosity-Driven Learning. ICLR 2019?
Yes :)