
Q. A portfolio manager is considering making an investment in ABC Limited.

To calculate the risk of his investment he uses Variance, Standard Deviation and VaR (Value at Risk) as measures of risk. Which of these tools do you think is the most effective for calculating the risk of an investment, and why?
Of the three tools mentioned for risk analysis and used by the portfolio manager, the most appropriate is VaR (Value at Risk). Variance only tells us how far the data are dispersed from the mean, and standard deviation likewise measures how much the data deviate from the mean; it is the better of the two because it is expressed in the same units as the returns. Both tools measure the deviation of the data from the mean, but neither is, on its own, well suited to measuring the risk of an investment.
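As an illustration, here is a minimal sketch of what variance and standard deviation actually summarize: the scatter of returns around their mean. The return series is hypothetical and is not data from ABC Limited.

```python
import statistics

# Hypothetical monthly returns, for illustration only (not ABC Limited data)
returns = [0.02, -0.01, 0.03, 0.015, -0.025, 0.01]

mean_return = statistics.mean(returns)      # average return
variance = statistics.pvariance(returns)    # average squared deviation from the mean
std_dev = statistics.pstdev(returns)        # square root of the variance, in return units

print(f"Mean return:        {mean_return:.4f}")
print(f"Variance:           {variance:.6f}")
print(f"Standard deviation: {std_dev:.4f}")
```

Both numbers describe dispersion; neither says how much money could actually be lost over a given horizon, which is the question a portfolio manager ultimately wants answered.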
Value at Risk is used to estimate the probability of portfolio losses based on statistical analysis of historical price trends and volatilities, and it is a widely used measure of the risk of loss on a specific portfolio of financial assets. As for why it is the most effective of the three: VaR defines risk as the mark-to-market loss on a fixed portfolio over a fixed time horizon, assuming normal markets, so it directly answers the question the manager cares about, namely how much could be lost, at a given confidence level, over a given period.
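As a sketch of how such an estimate can be produced, the following uses historical simulation, one common way to compute VaR from past returns. The daily return series, the portfolio value, and the 95% confidence level are all hypothetical assumptions, not figures from the question.

```python
# Historical-simulation VaR: take the empirical loss quantile of past returns.
portfolio_value = 1_000_000  # hypothetical portfolio size
confidence = 0.95            # assumed confidence level

# Hypothetical daily returns; in practice these would come from historical prices.
daily_returns = [0.004, -0.012, 0.007, -0.003, 0.010, -0.020,
                 0.002, -0.008, 0.005, -0.015, 0.001, -0.006]

# Sort returns from worst to best and pick the cutoff in the (1 - confidence) tail.
sorted_returns = sorted(daily_returns)
tail_index = int((1 - confidence) * len(sorted_returns))
cutoff_return = sorted_returns[tail_index]

# VaR is the loss (a positive number) not expected to be exceeded at the confidence level.
var_1day = -cutoff_return * portfolio_value
print(f"1-day 95% VaR: {var_1day:,.0f}")
```

The output is a single figure in currency terms, which is what makes VaR easier to act on than a variance or a standard deviation expressed as a dispersion statistic.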
