Name : R. Divya
In February 2003, following the repeal of the Unit Trust of India Act 1963,
UTI was bifurcated into two separate entities. One is the Specified Undertaking
of the Unit Trust of India, with assets under management of Rs. 29,835 crores
as at the end of January 2003, representing broadly the US 64 scheme, assured
return and certain other schemes. The Specified Undertaking of Unit Trust of
India functions under an administrator and under the rules framed by the
Government of India, and does not come under the purview of the Mutual Fund
Regulations. The second is the UTI Mutual Fund Ltd, sponsored by SBI, PNB,
BOB and LIC. It is registered with SEBI and functions under the Mutual Fund
Regulations. With the bifurcation of the erstwhile UTI, which in March 2000
had more than Rs. 76,000 crores of assets under management, with the setting
up of the UTI Mutual Fund conforming to the SEBI Mutual Fund Regulations,
and with recent mergers taking place among different private sector funds, the
mutual fund industry has entered its current phase of consolidation and growth.
As at the end of September 2004, there were 29 funds, which managed assets of
Rs. 1,53,108 crores under 421 schemes.
While the Indian mutual fund industry has grown in size by about 320% from
March 1993 (Rs. 470 billion) to December 2004 (Rs. 1,505 billion) in terms of
AUM, the AUM of the sector excluding UTI has grown over 8 times, from
Rs. 152 billion in March 1999 to $148 billion as at March 2008.
Many nationalized banks got into the mutual fund business in the early
nineties and got off to a good start due to the stock market boom prevailing
at the time. These banks did not really understand the mutual fund business,
and they just viewed it as another kind of banking activity. Few hired
specialized staff, generally choosing instead to transfer staff from the parent
organizations. The performance of most of the schemes floated by these funds
was not good. Some schemes had offered guaranteed returns, and their parent
organizations had to bail out these AMCs by paying large amounts of money
as the difference between the guaranteed and actual returns. The service levels
were also very bad. Most of these AMCs have not been able to retain staff,
float new schemes, etc.
Components of the Study
Research can be classified into two categories: basic research, which is done
in a lab or a clinical setting, and applied research, which is done with real
subjects in real-world situations. From these categories of research, we have
the following general types of studies:
Animal Study: An animal or in vivo study is a study in which animals are used
as subjects. A common use of an animal study is with a clinical trial (see
below) and as a precursor to evaluating a medical intervention on humans.
However, it is critical to recognize that results from animal studies should not
be extrapolated to draw conclusions on what WILL happen in humans.
Case Study: A case study provides significant and detailed information about a
single participant or a small group of participants. “Case studies are often
referred to interchangeably with ethnography, field study, and participant
observation.” Unlike other studies which rely heavily on statistical analysis,
the case study is often undertaken to identify areas for additional research and
exploration.
Clinical Trial Study: A clinical trial study is often used in the areas of health
and medical treatments that will presumably yield a positive effect. Typically a
small group of people or animals are selected based upon the presence of a
specific medical condition. This group is used to evaluate the effectiveness of a
new medication or treatment, differing dosages, or new applications of existing
treatments. Due to the risk involved with many new medical treatments, the
initial subjects in a clinical trial may be animals and not humans. After positive
outcomes are obtained, research can then proceed to a human study where the
new treatment is compared against results from the existing treatment.
Benchmark
It provides a yardstick against which you can measure fund performance.
It indicates how much return the fund has generated as against how
much it should have delivered. As per SEBI's mandate, each fund
declares its benchmark and considers it a target for analysing
performance. If the index rises by 12% but the NAV of the fund rises by 14%,
then the fund is said to have outperformed the index. Conversely, if the
index falls by 10% but the fund loses 12%, then the fund is said to
have under-performed the index. Basically, the comparison should be made
to look for a fund which gains more in a market rally and loses less in a
slump.
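As a minimal illustration of this comparison (the percentages are the hypothetical figures from the example above, and the function name is ours, not from any library), a short Python sketch could flag out- or under-performance against the benchmark:

    # Compare a fund's return with its benchmark return (hypothetical figures).
    def relative_performance(fund_return_pct, benchmark_return_pct):
        """Describe the fund's performance relative to its benchmark."""
        excess = fund_return_pct - benchmark_return_pct
        if excess > 0:
            return f"Outperformed the benchmark by {excess:.1f} percentage points"
        if excess < 0:
            return f"Under-performed the benchmark by {abs(excess):.1f} percentage points"
        return "Matched the benchmark"

    # Rally: index +12%, fund +14%. Slump: index -10%, fund -12%.
    print(relative_performance(14.0, 12.0))
    print(relative_performance(-12.0, -10.0))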
Investment Horizon
Your investment horizon becomes a driving factor for fund selection and
comparison. Investment horizon relates to the time period for which you
stay invested in the given fund. The type of fund chosen for comparison
should be according to the investment horizon. Equity funds are suitable
for a long term horizon of at least 7 years or more. The fund objective in
this time period is wealth accumulation at a relatively high risk.
In this context, the fund returns selected for comparison should match
the investment horizon. This means that while comparing two equity funds,
you may consider fund returns over the past 5 to 10 years. In the same way, for
comparing two liquid funds, consider fund returns over the past 6 months to 1
year. Select the fund which has given superior performance across
different time intervals.
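To put returns over different horizons on a comparable footing, one simple approach (a sketch with made-up NAVs, not actual fund data) is to annualise each point-to-point return as a compound annual growth rate (CAGR):

    # Annualised (CAGR) return from a start value, an end value and a holding period in years.
    def cagr(start_value, end_value, years):
        return ((end_value / start_value) ** (1.0 / years) - 1.0) * 100.0

    # Hypothetical NAVs: an equity fund held for 7 years and a liquid fund held for 1 year.
    print(f"Equity fund: {cagr(10.0, 24.0, 7):.2f}% per annum over 7 years")
    print(f"Liquid fund: {cagr(10.0, 10.65, 1):.2f}% per annum over 1 year")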
Riskiness
Whenever you invest in any mutual fund, you undertake some kind of
risk. This risk relates to the variability of the fund's NAV in line with overall
market movements. According to the investment thumb rule, higher risk needs
to be rewarded with higher returns. But plain vanilla returns do not reflect this
aspect of a mutual fund. Thus, you need to use a better measure for
comparing two funds on the grounds of risk-adjusted returns.
You may use alpha and beta for this. These are financial ratios which tell
you about the rewarding potential of a mutual fund. Beta tells you how much
risk is involved in investing in a fund. Alpha tells you how much extra
return the fund will generate over and above the underlying benchmark.
Remember, your target is to beat the benchmark, not to imitate it. Suppose
two funds have the same beta, i.e. 1.5, and Fund A and Fund B have
alphas of 2 and 2.5 respectively. Fund B is better because it gives higher
risk-adjusted returns.
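One common way to estimate beta and alpha is to regress the fund's periodic returns on the benchmark's returns. The sketch below does this with invented monthly figures; it only illustrates the idea and is not a prescribed methodology:

    import numpy as np

    # Invented monthly returns (%) for a benchmark index and a fund.
    benchmark = np.array([1.2, -0.8, 2.5, 0.4, -1.5, 3.0])
    fund = np.array([1.8, -1.1, 3.6, 0.7, -2.0, 4.2])

    # Slope of the fitted line is beta; the intercept is alpha (excess return per period).
    beta, alpha = np.polyfit(benchmark, fund, 1)
    print(f"Beta:  {beta:.2f}  (sensitivity to market movements)")
    print(f"Alpha: {alpha:.2f}  (extra return over the benchmark)")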
Expense Ratio
Your investment in mutual funds comes at a cost called the expense ratio.
It is an annual fee which the fund house charges to the unitholders to
manage the portfolio on their behalf. The level of the expense ratio has a direct
impact on the level of fund returns earned by the investors. This is because
the expense ratio is charged as a percentage of the fund's assets under
management. A higher expense ratio ultimately dents the profits earned
by the investors. Look for a fund which has the lowest expense ratio in
the given category.
While using the expense ratio, you need to keep a few things in mind.
The expense ratio of direct plans is lower than that of regular plans due to
the absence of distributor commission. Compare one regular plan with another
and one direct plan with another. Do not compare an index fund with an
actively managed fund; the expense ratio of an index fund is lower due to
its low fund management fee. Compare active funds with active funds. Do
not compare an equity fund with a debt fund: owing to higher transaction costs
and brokerage, equity funds have a higher expense ratio than debt funds.
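To see how even a small difference in expense ratio dents long-term returns, the following sketch compounds a hypothetical investment net of annual expenses (the 12% gross return and both expense ratios are assumptions chosen only for illustration):

    # Growth of Rs. 1,00,000 over 10 years at a 12% gross return, net of the expense ratio.
    def final_value(principal, gross_return, expense_ratio, years):
        net_return = gross_return - expense_ratio
        return principal * (1 + net_return) ** years

    low_cost = final_value(100_000, 0.12, 0.005, 10)   # 0.5% expense ratio (e.g. a direct plan)
    high_cost = final_value(100_000, 0.12, 0.020, 10)  # 2.0% expense ratio (e.g. a regular plan)
    print(f"Low-cost fund:  Rs. {low_cost:,.0f}")
    print(f"High-cost fund: Rs. {high_cost:,.0f}")
    print(f"Return lost to higher expenses: Rs. {low_cost - high_cost:,.0f}")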
Advantages
The Securities Act of 1933 requires that all investments sold to the public,
including mutual funds, be registered with the SEC and that they provide
prospective investors with a prospectus that discloses essential facts about
the investment.
The Securities and Exchange Act of 1934 requires that issuers of securities,
including mutual funds, report regularly to their investors. This act also
created the Securities and Exchange Commission, which is the principal
regulator of mutual funds.
The Revenue Act of 1936 established guidelines for the taxation of mutual
funds.
The Investment Company Act of 1940 established rules specifically
governing mutual funds.
These new regulations encouraged the development of open-end mutual funds
(as opposed to closed-end funds).
Growth in the U.S. mutual fund industry remained limited until the 1950s, when
confidence in the stock market returned. By 1970, there were approximately
360 funds with $48 billion in assets.[3]
The introduction of money market funds in the high interest rate environment of
the late 1970s boosted industry growth dramatically. The first retail index fund,
First Index Investment Trust, was formed in 1976 by The Vanguard Group,
headed by John Bogle; it is now called the "Vanguard 500 Index Fund" and is
one of the world's largest mutual funds. Fund industry growth continued into the
1980s and 1990s.
According to Pozen and Hamacher, growth was the result of three factors:
The purpose of analysing data is to obtain usable and useful information. The
analysis, irrespective of whether the data is qualitative or quantitative, may:
• describe and summarise the data
• identify relationships between variables
• compare variables
• identify the difference between variables
• forecast outcomes
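As a small, purely illustrative sketch of the first few purposes listed above (the data set is invented and pandas is assumed as the tooling), a quantitative analysis might describe and summarise the data and identify relationships between variables:

    import pandas as pd

    # Invented survey-style data set.
    data = pd.DataFrame({
        "age": [25, 32, 41, 29, 36, 48],
        "job_satisfaction": [3.5, 4.0, 2.5, 3.0, 4.5, 2.0],
        "work_life_score": [3.0, 4.2, 2.8, 3.1, 4.6, 2.2],
    })

    print(data.describe())   # describe and summarise the data
    print(data.corr())       # identify relationships between variables
    print(data.mean())       # compare variables on their averages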
An assumption of the qualitative researcher is that the human instrument is
capable of ongoing fine-tuning in order to generate the most fertile array of
data. Morgan and Krueger (1998:Vol. 6:3-17), on the other hand, provide
important views when they reiterate that the analysis of qualitative methods
must be systematic, sequential, verifiable and continuous. It requires time, is
jeopardised by delay, is a process of comparison, is improved by feedback,
seeks to enlighten and should entertain alternative explanations. As with
qualitative methods for data analysis, the purpose of conducting a quantitative
study is to produce findings; but whereas qualitative methods use words
(concepts, terms, symbols, etc.) to construct a framework for communicating
the essence of what the data reveal, quantitative methods use procedures and
techniques to analyse data numerically (Sesay, 2011:74). On the whole,
regardless of the method (qualitative or quantitative; cf. par. 1.4.2, p. 13; 1.4.5,
p. 16; 1.4.6, p. 17; 5.4.2, p. 318), the purpose of conducting a study is to
produce findings, and in order to do so, data should be analysed to transform
them into findings. In this study, data will be analysed using both qualitative
and quantitative methods. At this point, one has to take a closer look at both
methods of analysis.
Regarding qualitative and quantitative analysis of data, Kreuger and Neuman
(2006:434) offer a useful outline of the differences and similarities between
qualitative (cf. par. 6.2.1, p. 358) and quantitative methods (cf. par. 6.2.2, p.
367) of data analysis. According to these authors, qualitative and quantitative
analyses are similar in four ways. Both forms of data analysis involve:
• Inference - the use of reasoning to reach a conclusion based on evidence;
• A public method or process - revealing their study design in some way;
• Comparison as a central process - identification of patterns or aspects that
  are similar; and
• Striving to avoid errors, false conclusions and misleading inferences.
Table Title: Balance in Personal & Professional Life

Sr. No   Attributes        No. of Respondents   Percentage
1        Yes               09                   45%
2        No                03                   15%
3        To some extent    08                   40%
         TOTAL             20                   100%
[Pie chart: No. of Respondents by response (Yes 45%, No 15%, To some extent 40%)]
Interpretation:
Here 45% of the employees say yes, 40% say to some extent and 15% say no.
We can infer that some of the employees are not able to balance their personal
and professional lives.
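The percentage column in tables such as the one above is simply each respondent count divided by the total; a minimal sketch using the counts shown (9, 3 and 8 out of 20):

    # Convert respondent counts into the percentages reported in the table above.
    responses = {"Yes": 9, "No": 3, "To some extent": 8}
    total = sum(responses.values())

    for attribute, count in responses.items():
        print(f"{attribute:<15} {count:>2}  {count / total * 100:.0f}%")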
13.The people here are pleasant and co-operative to work with.
a. Yes b. No c. To some extent.
[Pie chart: No. of Respondents by response (Yes 80%, No 5%, To some extent 15%)]
Interpretation:
Here 80% of the employees say yes, 15% say to some extent and 5% say no.
14.There is someone at work who encourages my development.
a. Yes b. No c. To some extent.
[Pie chart: No. of Respondents by response (Yes 65%, No 15%, To some extent 20%)]
Interpretation:
Here 65% of the employees say yes, 20% say to some extent and 15% say no.
15.Even if I had the opportunity to get a similar job with
another organization, I would stay with my present company.
a. Yes b. No c. To some extent.
[Pie chart: No. of Respondents by response (Yes 35%, No 5%, To some extent 60%)]
Interpretation:
Here 60% of the employees say to some extent, 35% say yes and 5% say no.
The core differences between qualitative (cf. par. 6.2.1, p. 358) and
quantitative data (cf. par. 6.2.2, p. 367) analysis are as follows (Kreuger &
Neuman, 2006:434-435): Qualitative data analysis is less standardised, with
the wide variety of approaches to qualitative research matched by the many
approaches to data analysis, while quantitative researchers choose from a
specialised, standard set of data analysis techniques.
Elaborating a set of generalisations, which suggest that certain relationships
hold firm in the setting being examined, and affirming that these cover all the
known eventualities in the data set. Formalizing these theoretical constructs and
making inferences from them to other cases in
place and time. As we have seen so far from our discussion of qualitative data
analysis, there are always variations in the number and description of steps for
the same process by different authors. To the preceding body of knowledge,
outlined by different authors, one can add the views of Watling and James
(2012:385-395). According to these authors, the process of qualitative data
analysis consists of six stages (steps), namely: Defining and identifying data.
From the outset, it is crucial to obtain a clear understanding of
the meaning of data, and fundamentally, even more importantly, the data
required in accordance with the research question and aims. Collecting and
storing data. When collecting data, most researchers start to form opinions and
judgement, which result in theories being developed, in the mind of the
researcher, and as such one has to consider not only ways to collect data, but
also to store data to make them accessible for analysis. So the interviews for
instance can be recorded by means of a digital recorder, transcribed and stored
(loaded) on a computer programme such as Atlas.ti™ Version 6 (Atlas.ti™).
Data reduction and sampling. During the data collection process (cf. par. 5.8.4,
p. 330),
reaching a point of saturation implies that all data were reduced, filtered and
sampled through the process of analysis. It is therefore critical for the researcher
when analysing data to determine what one already knows to be important or
relevant, in accordance with the intended purpose of the investigation. Stated
differently, the researcher needs to establish, on the one hand, which data are
not relevant, and on the other hand, which data encapsulate the essence and
evidence one wishes to focus on for a more detailed analysis. Hence, from the
preceding it can be inferred that it is important to establish incidences and
similarities in the respective interviews. In addition, one should establish
whether the expected reactions (responses) were obtained and if there are still
deficiencies regarding certain questions. Structuring and coding data.
Structuring and coding of data underpin the key research
outcomes and can be used to shape the data to test, refine or confirm
established theory, apply theory to new circumstances or use it to generate a
new theory or model, or even, in the case of this study, develop a new
measurement instrument, such as a questionnaire (cf. par. 5.9.3, p. 339).
During coding, the
corpus of data has to be divided into segments and these segments are assigned
codes which relate to analytic themes being developed (Fielding, 2002:163) and
applied consistently over the period of analysis and over a range of data. Basic
coding, carried out as a first step in the analysis of data, is both useful in itself
and acts as a preparation of the data for more advanced analysis at higher levels
of abstraction (Punch, 2011:175). It can therefore be deduced that structuring
and coding signify an analytical process of elaborating data, for instance data
obtained from semi-structured interviews, into related themes by means of
codes and structures, to form (establish) an understandable framework and
associations derived from the language of participants. The process of coding
for this study will be considered in a later paragraph (cf. par. 6.2.2.2, p. 370).
Theory building and testing. An important purpose of research is to generate
new knowledge (Watling & James, 2012:392). To this end, it might be helpful to take into
consideration the set of tactics for generating meaning from qualitative data,
described by Miles and Huberman (1994:245-246), commented on in an
ensuing paragraph. More specifically in relation to theory building and testing
as part of the process of data analysis, it can be said that based upon the created
framework, relevant deductions can be made and insight into the
research question under investigation can be obtained. In building and testing
theory, it is important to view the reactions of respondents and whether they
correspond or not, and also to ensure that a point of saturation of data is
reached. Reporting and writing up research. In brief, the reporting and writing
up of research entails putting words on paper, in the form of a report,
constructing an argument based on
the findings of what you have done, what you have seen and heard, participants
you interviewed and the information that comes forth from the process of data
analysis. Ultimately, the conclusions drawn from the information should
contribute to the body of knowledge and represent new meaning and insight
into the research question.
Data interpretation refers to the implementation of processes through which data is
reviewed for the purpose of arriving at an informed conclusion. The interpretation of
data assigns a meaning to the information analyzed and determines its significance
and implications.
Yet, before any serious data interpretation inquiry can begin, it should be
understood that visual presentations of data findings are irrelevant unless a
sound decision is made regarding scales of measurement. Before any serious
data analysis can begin, the scale of measurement must be decided for the data
as this will have a long-term impact on data interpretation ROI. The varying
scales include:
• Nominal scale: non-numeric categories that cannot be ranked or compared quantitatively
• Ordinal scale: exclusive categories that can be ranked in a meaningful order
• Interval scale: ordered values with an equal, measurable distance between them
• Ratio scale: interval data with a true zero point
For a more in-depth review of scales of measurement, read our article on data
analysis questions. Once scales of measurement have been selected, it is time to
select which of the two broad interpretation processes will best suit your data
needs. Let’s take a closer look at those specific data interpretation methods and
possible data interpretation problems.
Interviews: one of the best collection methods for narrative data. Enquiry
responses can be grouped by theme, topic or category. The interview approach
allows for highly-focused data segmentation.
• Regression analysis (a minimal sketch follows below)
• Cohort analysis
• Predictive and prescriptive analysis
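As a hedged illustration of the first of these methods, the sketch below fits a simple linear regression to invented advertising-spend and sales figures; it is a toy example of the technique, not a recommended model:

    import numpy as np

    # Invented monthly figures: advertising spend and resulting sales (both in Rs. '000).
    ad_spend = np.array([10, 15, 20, 25, 30, 35])
    sales = np.array([95, 118, 132, 160, 175, 198])

    # Fit a straight line: sales = slope * ad_spend + intercept.
    slope, intercept = np.polyfit(ad_spend, sales, 1)
    print(f"Estimated relationship: sales = {slope:.2f} * ad_spend + {intercept:.2f}")

    # Use the fitted line to forecast an outcome for a new level of spend.
    print(f"Forecast sales at a spend of 40: {slope * 40 + intercept:.1f}")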
Now that we have seen how to interpret data, let’s move on and ask ourselves
some questions: what are some data interpretation benefits? Why do all
industries engage in data research and analysis? These are basic questions, but
they often don't receive adequate attention.
Data analysis and interpretation, in the end, helps improve processes and
identify problems. It is difficult to grow and make dependable improvements
without, at the very least, minimal data collection and interpretation. What is the
key word? Dependable. Vague ideas regarding performance enhancement exist
within all institutions and industries. Yet, without proper research and analysis,
an idea is likely to remain in a stagnant state forever (i.e., minimal growth).
So… what are a few of the business benefits of digital age data analysis and
interpretation? Let’s take a look!
If institutions only follow that simple order, one that we should all be familiar
with from grade school science fairs, then they will be able to solve issues as
they emerge in real time. Informed decision making has a tendency to be
cyclical. This means there is really no end, and eventually, new questions and
conditions arise within the process that need to be studied further. The
monitoring of data results will inevitably return the process to the start with new
data and insights.
When industry trends are identified, they can then serve a greater industry
purpose. For example, the insights from Shazam's monitoring benefit not only
Shazam in understanding how to meet consumer needs, but also grant music
executives and record label companies an insight into the pop-culture scene of
the day. Data gathering and interpretation processes can allow for industry-wide
climate prediction and result in greater revenue streams across the market. For
this reason, all institutions should follow the basic data cycle of collection,
interpretation, decision making and monitoring.
3) Cost efficiency: Proper implementation of data analysis processes can
provide businesses with profound cost advantages within their industries. A
recent data study performed by Deloitte vividly demonstrates this in finding that
data analysis ROI is driven by efficient cost reductions. Often, this benefit is
overlooked because making money is typically viewed as “sexier” than saving
money. Yet, sound data analyses have the ability to alert management to cost-
reduction opportunities without any significant exertion of effort on the part of
human capital.
A great example of the potential for cost efficiency through data analysis is
Intel. Prior to 2012, Intel would conduct over 19,000 manufacturing function
tests on their chips before they could be deemed acceptable for release. To cut
costs and reduce test time, Intel implemented predictive data analyses. By using
historic and current data, Intel now avoids testing each chip 19,000 times by
focusing on specific and individual chip tests. After its implementation in 2012,
Intel saved over $3 million in manufacturing costs. Cost reduction may not be
as “sexy” as data profit, but as Intel proves, it is a benefit of data analysis that
should not be neglected.
4) Clear foresight: companies that collect and analyze their data gain better
knowledge about themselves, their processes and performance. They can
identify performance challenges when they arise and take action to overcome
them. Data interpretation through visual representations lets them process their
findings faster and make better-informed decisions on the future of the
company.
The oft-repeated mantra of those who fear data advancements in the digital age
is “big data equals big trouble.” While that statement is not accurate, it is safe to
say that certain data interpretation problems or “pitfalls” exist and can occur
when analyzing data, especially at the speed of thought. Let’s identify three of
the most common data misinterpretation risks and shed some light on how they
can be avoided:
2) Confirmation bias: our second data interpretation problem occurs when you
have a theory or hypothesis in mind, but are intent on only discovering data
patterns that provide support, while rejecting those that do not.
Digital age example: your boss asks you to analyze the success of a recent
multi-platform social media marketing campaign. While analyzing the potential
data variables from the campaign (one that you ran and believe performed well),
you see that the share rate for Facebook posts was great, while the share rate
for Twitter tweets was not. Using only the Facebook posts to prove your
hypothesis that the campaign was successful would be a perfect manifestation
of confirmation bias.
Remedy: as this pitfall is often based on subjective desires, one remedy would
be to analyze data with a team of objective individuals. If this is not possible,
another solution is to resist the urge to make a conclusion before data
exploration has been completed. Remember to always try to disprove a
hypothesis, not prove it.
3) Irrelevant data: the third and final data misinterpretation pitfall is especially
important in the digital age. As large data is no longer centrally stored, and as it
continues to be analyzed at the speed of thought, it is inevitable that analysts
will focus on data that is irrelevant to the problem they are trying to correct.
2) Mobile Data. Related to the notion of “connected and blended data” is that
of mobile data. In today’s digital world, employees are spending less time at
their desks and simultaneously increasing production. This is made possible by
the fact that mobile solutions for analytical tools are no longer standalone.
Today, mobile analysis applications seamlessly integrate with everyday
business tools. In turn, both quantitative and qualitative data are now available
on demand where they’re needed, when they’re needed and how they’re needed.
3) Visualization. Data dashboards are merging the data gap between qualitative
and quantitative methods of interpretation of data, through the science of
visualization. Dashboard solutions come “out of the box” well-equipped to
create easy-to-understand data demonstrations. Modern online data
visualization tools provide a variety of color and filter patterns, encourage user
interaction and are engineered to help enhance future trend predictability. All of
these visual characteristics make for an easy transition among data methods –
you only need to find the right types of data visualization to tell your data story
the best way possible.
To give you an idea of how a market research dashboard fulfils the need of
bridging quantitative and qualitative analysis, and helps in understanding how
to interpret data in research thanks to visualization, have a look at the following
one. It brings together both qualitative and quantitative data knowledgeably.
Chapter-V
Stock Total Return
We begin our illustration with a share of XYZ Company that is bought for $30
at the beginning of the year. During the year, its price fluctuates, but it closes
the year at $33, which represents a nice percentage return on the investment of
10% ($3/$30).
But, things get even better because XYZ paid an annual dividend of $1 per
share. This dividend equals an additional 3.3% return ($1/$30). Adding together
the capital appreciation (price increase) of 10% and the income return
(dividend) of 3.3% gives us a one-year total return for XYZ Company stock of
13.3%. However, remember that unless you sell XYZ stock, the price
appreciation gain remains in the stock price, or is unrealized. With mutual
funds, explaining total return is a bit more complicated. We begin with a share
of the ABC Fund, which is purchased at its
net asset value (price) of $16 per share. A fund's NAV is derived by dividing the
value of its portfolio securities (the fund's assets), less any accrued fees and
expenses (the fund's liabilities), by the number of fund shares outstanding.
Here's an illustration of the computation of net asset value for the ABC Fund:
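The original table for this illustration is not reproduced here; as a sketch of the formula just described, the figures below are hypothetical and are chosen only so that the computation works out to the $16 NAV used in the example:

    # NAV = (value of portfolio securities - accrued fees and expenses) / shares outstanding.
    # Hypothetical figures, chosen to produce the $16 NAV used in the text.
    portfolio_value = 16_200_000.0    # the fund's assets
    liabilities = 200_000.0           # accrued fees and expenses
    shares_outstanding = 1_000_000

    nav = (portfolio_value - liabilities) / shares_outstanding
    print(f"ABC Fund NAV: ${nav:.2f} per share")   # $16.00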
The ABC Fund passed along all the earnings and capital appreciation it
generated - $4 ($1 in dividend distributions and $3 in a capital gains
distribution) to its shareholders for a total return of 25% ($4/$16). Here again,
unlike a stock, by paying out all its capital gains, the ABC Fund's price, or
NAV, remains at or close to $16. In this scenario, if a fund investor only
focused on the movement in ABC's NAV, the results would not look very good.
It's even possible for a fund's NAV to decline, but still have good
income/capital gain distributions, which will be reflected in a positive total
return.
Obviously, a fund's NAV does not tell the whole mutual fund performance
story, but its total return does. It captures a fund's changes in NAV, its income
distribution and capital gains distribution, which, as a whole, are the true test of
a fund's return on investment.
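Putting the two worked examples above into a single sketch (using exactly the figures from the text), total return combines the price or NAV change with income and capital gains distributions:

    # Total return = (price change + distributions) / starting price, as a percentage.
    def total_return(start_price, end_price, distributions):
        return (end_price - start_price + distributions) / start_price * 100.0

    # XYZ stock: bought at $30, closed at $33, paid a $1 dividend.
    print(f"XYZ stock total return: {total_return(30.0, 33.0, 1.0):.1f}%")   # 13.3%

    # ABC Fund: NAV stays near $16 after distributing $1 of income and $3 of capital gains.
    print(f"ABC Fund total return: {total_return(16.0, 16.0, 4.0):.1f}%")    # 25.0%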
Conclusions
Let's recap what we've learned in this mutual fund tutorial:
A mutual fund brings together a large group of people and invests their
aggregated money in stocks, bonds, and other securities.
The advantages of mutual funds are professional
management, diversification, economies of scale, and wide range of
offerings.
The disadvantages of mutual funds are high costs, over-diversification,
possible tax consequences, liquidity concerns, and the inability of
management to guarantee a superior return.
There are many, many types of mutual funds. You can classify funds
based on asset class, investing strategy, region, etc.
Mutual funds have expenses that can be broken down generally into
ongoing fees (represented by the expense ratio) and transaction fees
(loads).
Some funds carry no broker fee, known as no-load mutual funds.
One of the biggest problems with mutual funds is their costs and fees.
Mutual funds are easy to buy and sell. You can either buy them directly
from the fund company or through a third party.
It is important to compare fund returns across a number of dimensions, such as
over time, against the fund's benchmark, and against other funds in its
peer group.
Suggestions
1. Expense Ratio
Mutual funds do not run themselves. They need to be managed, and this
management is not free! The expenses of operating a mutual fund can be as
involved as those of a corporation. But all you need to know is that higher
expenses do not always translate into higher mutual fund returns. In fact,
lower expenses usually translate into higher returns, especially over long
periods of time.
But what expense ratio is too high? Which is best? When doing your
research, keep in mind the average expense ratios for mutual funds, and never
buy a fund with an expense ratio higher than the category average. Notice
that the average expenses change by fund category. The fundamental
reason for this is that research costs for portfolio management are higher
for certain niche areas, such as small-cap stocks and foreign stocks,
where information is not as readily available compared to large domestic
companies. Also, index funds are passively managed. Therefore costs can
be kept extremely low.
2. Manager Tenure
Manager tenure refers to the amount of time, usually measured in years,
a mutual fund manager or management team has been managing a
particular mutual fund.
Manager tenure is most important to know when investing in actively-
managed mutual funds. Managers of actively managed funds are actively
trying to outperform a particular benchmark, such as the S&P 500;
whereas the manager of a passively-managed fund is only investing in the
same securities as the benchmark.
When looking at a mutual fund's historical performance, be sure to
confirm the manager or management team has been managing the fund
for the time frame you are reviewing. For example, if you are attracted to
the 5-year return of a mutual fund but the manager tenure is only one
year, the 5-year return is not meaningful in making the decision to buy
this fund.
BIBLIOGRAPHY
References
Books
Journals
Magazines
Websites
www.imasion.com
www.imasionindia.com
www.managementstudyguide.com
www.citehr.com