
Open access
Author
Date
2016-03-03
Type
Journal Article
Citations
Cited 32 times in Web of Science
Cited 31 times in Scopus
ETH Bibliography
yes
Abstract
Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise, then finding the ground state maximises the likelihood that the solution is correct. The maximum entropy solution, on the other hand, takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem, which we simulate as a random Ising model in a field. We show experimentally that finite-temperature maximum entropy decoding can give slightly better bit-error rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore, we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition.
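The two decoding strategies contrasted in the abstract can be illustrated with a small toy calculation. The sketch below is not the paper's method: instead of sampling with a programmable annealer, it enumerates a small random Ising model in a field exactly and decodes each bit twice, once from the ground state (maximum likelihood) and once from the sign of its Boltzmann-averaged magnetisation (finite-temperature maximum entropy decoding). The system size N, couplings J, fields h and inverse temperature beta are illustrative assumptions, not values from the paper.

```python
import itertools
import numpy as np

# Toy illustration only: exact enumeration of a small random Ising model in a
# field, comparing ground-state (maximum likelihood) decoding with
# finite-temperature (maximum entropy) decoding. All parameters are assumed.

rng = np.random.default_rng(0)
N = 10                        # number of spins (bits); small enough to enumerate
beta = 1.0                    # inverse temperature of the Boltzmann distribution
J = np.triu(rng.normal(0, 1, (N, N)), 1)  # random couplings, upper triangle i < j
h = rng.normal(0, 1, N)       # random local fields (the "field" in the Ising model)

def energy(s):
    """Cost function E(s) = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i."""
    return -s @ J @ s - h @ s

states = np.array(list(itertools.product([-1, 1], repeat=N)))
energies = np.array([energy(s) for s in states])

# Maximum likelihood decoding: take the ground state (minimum energy configuration).
ml_bits = states[np.argmin(energies)]

# Maximum entropy decoding: weight every state by its Boltzmann factor and
# decode each bit from the sign of its thermal average <s_i>.
weights = np.exp(-beta * (energies - energies.min()))
weights /= weights.sum()
magnetisations = weights @ states
maxent_bits = np.sign(magnetisations).astype(int)

print("maximum likelihood decoding:", ml_bits)
print("maximum entropy decoding:  ", maxent_bits)
```

For weak noise the two decodings typically agree; the paper's point is that when they differ, the thermally averaged (maximum entropy) bits can achieve a slightly lower bit-error rate, which is why useful information survives in the excited states sampled by the annealer.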
Permanent link
https://doi.org/10.3929/ethz-b-000114222
Publication status
published
Journal / series
Scientific Reports
Publisher
Nature Publishing Group
Subject
Information theory and computation; Mathematics and computing; Quantum information