Links to Useful and Interesting Research Papers


Papers of Interest

  • Ho, C.-L.; Ide, Y.; Konno, N.; Segawa, E.; Takumi, K. 2017. A spectral analysis of discrete-time quantum walks related to birth and death chains. arXiv:1706.01005 [quant-ph]
  • LaMont, C.H.; Wiggins, P.A. 2017. A correspondence between thermodynamics and inference. arXiv:1706.01428 [math.ST]
  • Małkiewicz, P.; Miroszewski, A. 2017. Internal clock formulation of quantum mechanics. arXiv:1706.00743 [gr-qc]
  • Melnikov, A.A.; Nautrup, H.P.; Krenn, M.; Dunjko, V.; Tiersch, M.; Zeilinger, A.; Briegel, H.J. 2017. Active learning machine learns to create new quantum experiments. arXiv:1706.00868 [quant-ph]

Experimental Design and Active Learning

  • Bell, A.J. 2003. The co-information lattice. In Proc. 4th International Symposium on Independent Component Analysis and Blind Source Separation (ICA2003), pp. 921–926. http://www.menem.com/~ilya/digital_library/dependence/bell-02.pdf
  • Cox R.T. 1979. Of inference and inquiry. In: The Maximum Entropy Formalism (eds. R.D. Levine & M. Tribus), MIT Press, Cambridge, pp. 119–167.
  • Fedorov V.V. 1972. Theory of Optimal Experiments. New York: Academic Press.
  • Fry R.L. 2002. The engineering of cybernetic systems. In: R.L. Fry (ed.) Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Baltimore MD, USA, AIP Conf. Proc. 617, Melville NY: AIP, pp. 497–528.
  • Lindley D.V. 1956. On the measure of information provided by an experiment. Ann. Math. Statist. 27, 986–1005.
  • Loredo T.J. 2003. Bayesian adaptive exploration. In: G.J. Erickson, Y. Zhai (eds.) Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Jackson Hole WY, USA, AIP Conf. Proc. 707, Melville NY: AIP, pp. 330–346.
  • MacKay D.J.C. 1992. Information-based objective functions for active data selection. Neural Computation, 4(4), 590–604. http://www.mitpressjournals.org/doi/pdf/10.1162/neco.1992.4.4.590
  • Olsson L., Nehaniv C.L., Polani D. 2005. Sensor adaptation and development in robots by entropy maximization of sensory data. In Proceedings of the 6th IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA 2005).
  • Polani D., Kim J.T., and Martinetz T. 2001. An information-theoretic approach for the quantification of relevance. In: J. Kelemen and P. Sosik (eds.), Advances in Artificial Life (Proc. 6th European Conference on Artificial Life, Prague, Sept 10–14), LNCS, Springer.
  • Sebastiani P. and Wynn H.P. 1997. Bayesian experimental design and Shannon information. In Proceedings of the Section on Bayesian Statistical Science, pp. 176–181. American Statistical Association. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.56.6037&rep=rep1&type=pdf
  • Sebastiani P. and Wynn H.P. 2000. Maximum entropy sampling and optimal Bayesian experimental design. J. Roy. Stat. Soc. B, 62, 145–157.
  • Shannon C.E., Weaver W. 1949. The Mathematical Theory of Communication, University of Illinois Press, Urbana IL.
  • Thrun S., Burgard W., Fox D. 2005. Probabilistic Robotics. Cambridge: MIT Press.
  • Wiener N. 1948. Cybernetics: or Control and Communication in the Animal and the Machine. Cambridge: MIT Press.

Information Physics and Bayesian Data Analysis