
PAPER REVIEW

Bayesian inference is a framework applied in machine learning to reason about and quantify uncertainty, with a particular focus on non-linear models. In machine learning, most methods to date search for maximum a posteriori sparse solutions and neglect to represent posterior uncertainty. Regression is one of the simplest settings in which to relate these ideas to practice. In the probabilistic regression framework, a plain least-squares fit corresponds to the maximum-likelihood solution; with flexible models this chases the noise in the data and results in overfitting. Placing a Bayesian prior over the parameters overcomes this problem.
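
To make the overfitting point concrete, here is a minimal Python sketch (my own illustration, not code from the paper): a plain least-squares polynomial fit through noisy data chases the noise, while the MAP solution under an assumed zero-mean Gaussian prior on the weights (equivalent to ridge regression) stays much tamer. The toy data, polynomial degree, and `alpha` value are all illustrative assumptions.

```python
import numpy as np

# Toy data: noisy samples of a smooth function (illustrative assumption).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, size=x.shape)

Phi = np.vander(x, 10, increasing=True)  # degree-9 polynomial features

# Maximum-likelihood / least-squares weights: interpolates the noise (overfitting).
w_ml = np.linalg.lstsq(Phi, y, rcond=None)[0]

# MAP weights under a zero-mean Gaussian prior with precision alpha:
# minimises ||y - Phi w||^2 + alpha ||w||^2, i.e. ridge regression.
alpha = 1e-3
w_map = np.linalg.solve(Phi.T @ Phi + alpha * np.eye(10), Phi.T @ y)

print("ML weight norm: ", np.linalg.norm(w_ml))   # typically huge
print("MAP weight norm:", np.linalg.norm(w_map))  # far smaller, smoother fit
```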
In regularisation, a common and generally very reasonable assumption is that data is generated from smooth, rather than overly complex, functions.
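
In symbols, this smoothness preference is commonly encoded as a penalty on the weights; a standard formulation (not necessarily the paper's exact notation) is:

```latex
% Regularised least-squares / MAP objective: the alpha term comes from a
% zero-mean Gaussian prior on w and penalises large, non-smooth weights.
E(\mathbf{w}) = \frac{1}{2}\sum_{n=1}^{N}\bigl(t_n - \mathbf{w}^\top \boldsymbol{\phi}(x_n)\bigr)^2 + \frac{\alpha}{2}\,\|\mathbf{w}\|^2
```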
The concepts of marginalisation and Ockham's Razor are realised in Bayesian inference by integrating out nuisance variables, which automatically penalises unnecessary complexity.
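
A short sketch of how marginalisation yields this automatic Ockham's Razor, under assumed conjugate Gaussian choices: integrating the weights out of a linear model gives a closed-form marginal likelihood (evidence), which typically favours the simplest model that still explains the data. The precisions `alpha` and `beta`, the toy data, and the helper `log_evidence` are assumptions for illustration, not from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 15)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, size=x.shape)

alpha, beta = 1.0, 25.0  # prior and noise precisions (assumed values)

def log_evidence(degree):
    """log p(y | model): the weights are integrated out analytically,
    leaving a zero-mean Gaussian over the observed targets."""
    Phi = np.vander(x, degree + 1, increasing=True)
    cov = np.eye(len(x)) / beta + Phi @ Phi.T / alpha
    return multivariate_normal(mean=np.zeros(len(x)), cov=cov).logpdf(y)

for d in (1, 3, 9):
    print(f"degree {d}: log evidence = {log_evidence(d):.1f}")
# Typically the middle model wins: flexible enough to fit the data,
# but not needlessly complex -- Ockham's Razor by marginalisation.
```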
In my view, the paper surveys a number of practices introduced to overcome the problems of earlier approaches. I think it explains very clearly how Bayesian inference addresses these machine-learning problems and removes much of the complexity.
Marginalisation is the key element of Bayesian inference, and hopefully some of the examples have persuaded the reader that it can be an exceedingly powerful one.
Marginalisation is a valuable component of the Bayesian paradigm, offering a number of advantageous features applicable to many data-modelling tasks. On the downside, the integrations required for full Bayesian inference can often be analytically intractable, although approximations for simple linear models can be very effective.
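
The tractable linear case mentioned above is worth a sketch: under an assumed conjugate Gaussian prior and Gaussian noise model, the posterior over the weights and the predictive distribution both come out in closed form, so no numerical integration is needed; it is non-linear likelihoods that force approximation. All constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 15)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, size=x.shape)

alpha, beta = 1.0, 25.0                 # prior / noise precisions (assumed)
Phi = np.vander(x, 4, increasing=True)  # cubic polynomial basis

# Conjugate update: the posterior over weights is exactly N(m, S), with
#   S^{-1} = alpha*I + beta * Phi^T Phi   and   m = beta * S Phi^T y.
S = np.linalg.inv(alpha * np.eye(4) + beta * Phi.T @ Phi)
m = beta * S @ Phi.T @ y

# The predictive mean and variance at a new input are also closed-form.
phi_new = np.vander([0.5], 4, increasing=True).ravel()
pred_mean = phi_new @ m
pred_var = 1 / beta + phi_new @ S @ phi_new
print(f"prediction at x=0.5: {pred_mean:.2f} +/- {np.sqrt(pred_var):.2f}")
```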
Finally, Bayesian inference as presented here focuses on non-linear models, and the worked examples keep the material simple to follow.
