===========================
 Announcing Theano 0.5
===========================

## You can select and adapt one of the following templates.

## Basic text for major version release:

This is a release for a major version, with lots of new features, bug fixes,
and some interface changes (deprecated or potentially misleading features were
removed).

Upgrading to Theano 0.5 is recommended for everyone, but you should first make
sure that your code does not raise deprecation warnings with Theano 0.4.1. If
it does, in one case the results can change; in other cases, the warnings are
turned into errors (see below for details).

For those using the bleeding edge version in the git repository, we encourage
you to update to the `0.5` tag.

## Basic text for major version release candidate:

This is a release candidate for a major version, with lots of new features,
bug fixes, and some interface changes (deprecated or potentially misleading
features were removed).

The upgrade is recommended for developers who want to help test and report
bugs, or who want to use the new features now.

If you have updated to 0.5rc1, you are highly encouraged to update to 0.5rc2.

For those using the bleeding edge version in the git repository, we encourage
you to update to the `0.5rc2` tag.

## Basic text for minor version release: TODO

## Basic text for minor version release candidate: TODO

What's New
----------

[Include the content of NEWS.txt here]

Download and Install
--------------------

You can download Theano from http://pypi.python.org/pypi/Theano

Installation instructions are available at
http://deeplearning.net/software/theano/install.html

Description
-----------

Theano is a Python library that allows you to define, optimize, and
efficiently evaluate mathematical expressions involving multi-dimensional
arrays. It is built on top of NumPy. Theano features:

 * tight integration with NumPy: the interface is similar to NumPy's, and
   numpy.ndarrays are used internally in Theano-compiled functions.
 * transparent use of a GPU: perform data-intensive computations much faster
   than on a CPU.
 * efficient symbolic differentiation: Theano can compute derivatives for
   functions of one or many inputs.
 * speed and stability optimizations: avoid nasty bugs when computing
   expressions such as log(1 + exp(x)) for large values of x.
 * dynamic C code generation: evaluate expressions faster.
 * extensive unit-testing and self-verification: includes tools for detecting
   and diagnosing bugs and/or potential problems.

(A minimal usage sketch illustrating these features is included at the end of
this announcement.)

Theano has been powering large-scale computationally intensive scientific
research since 2007, but it is also approachable enough to be used in the
classroom (IFT6266 at the University of Montreal).

Resources
---------

About Theano:
http://deeplearning.net/software/theano/

Theano-related projects:
http://github.com/Theano/Theano/wiki/Related-projects

About NumPy:
http://numpy.scipy.org/

About SciPy:
http://www.scipy.org/

Machine Learning Tutorial with Theano on Deep Architectures:
http://deeplearning.net/tutorial/

Acknowledgments
---------------

I would like to thank all contributors to Theano. For this particular release,
many people have helped, notably (in alphabetical order):
[Generate the list of committers: git shortlog -s ... | cut -c8-]

I would also like to thank the users who submitted bug reports, notably:
[TODO]

Also, thank you to all NumPy and SciPy developers, as Theano builds on their
strengths.

All questions and comments are always welcome on the Theano mailing lists
( http://deeplearning.net/software/theano/#community )
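
Example
-------

The feature list under "Description" is abstract; as a hint of what typical
usage looks like, here is a minimal sketch in the spirit of the Theano
tutorial (the variable names are illustrative, not part of this release):

    import theano
    import theano.tensor as T

    # Declare a symbolic scalar input.
    x = T.dscalar('x')

    # Build a symbolic expression. During optimization, Theano rewrites
    # log(1 + exp(x)) into a numerically stable form.
    y = T.log(1 + T.exp(x))

    # Symbolic differentiation: dy/dx.
    gy = T.grad(y, x)

    # Compile callable functions; C code is generated behind the scenes,
    # and a GPU can be used transparently if one is configured.
    f = theano.function([x], y)
    g = theano.function([x], gy)

    print(f(2.0))   # log(1 + exp(2))
    print(g(2.0))   # the derivative, 1 / (1 + exp(-2))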