
International Journal of Market Research Vol. 54 Issue 3

Did you tell me the truth?
The influence of online community on eWOM

Jun Yang
University of Houston-Victoria

Enping (Shirley) Mai
East Carolina University

Joseph Ben-Ur
University of Houston-Victoria

With the rapid development of online communities and social networks, marketers have started to use online opinion leaders to influence their social circles. In this study, we use a review dataset generated from an online forum to empirically investigate social influence on reviewers' eWOM motives and readers' feedback. Our results show that, first, community members' reviews are not influenced by their forum involvement; their evaluations depend mainly on product attributes. Second, reviews from those who have established their expertise in the community generate more buzz and more trust among online forum readers than reviews from those with less expertise. The findings indicate that certain marketing strategies, such as seeding targeted at opinion leaders, may work better than a general buzz marketing strategy aimed at a broad audience. Our results also provide useful guidance on how to identify opinion leaders in an online community.

Introduction
With the rapid development of online communities and social networks, marketers have started to use online opinion leaders to influence their social circles (Watts & Dodds 2007; Kozinets et al. 2010). Marketing strategies such as seeding campaigns (where companies give products to influential consumers and expect those opinion leaders to talk favourably to other consumers) and firms' participation in online communities are commonly adopted in practice. Though the influence of those online communities has been recognised, how consumers interact with one another and utilise those communities is still obscure (Wasko & Faraj 2005; De Valck et al. 2009; Kozinets et al. 2010). Furthermore, most existing studies of electronic word of mouth (hereafter eWOM) focus on its impact on firms' performance (Chevalier & Mayzlin 2006; Liu 2006; Duan et al. 2008). Very few studies investigate the influence of consumers' eWOM in online communities (Wasko & Faraj 2005; Eccleston & Griseri 2008; De Valck et al. 2009; Kozinets et al. 2010). Our study tries to fill this research gap.

We use a review dataset generated from an online forum to empirically investigate the social influence on reviewers' eWOM motives and readers' feedback. We collected individual user review data on seven main massively multiplayer online role-playing games (hereafter MMORPGs) in the market from Gamespot.com, a popular third-party website. Our research makes the following contributions. First, our results show that community members' reviews are not influenced by their forum involvement. Second, online forum readers place more trust in the opinions of those community members who have established expertise than of those who have not. Third, the findings indicate that, besides a game's attributes, the installed base of the online game is also important for both reviewers and potential adopters.

The remainder of the paper is organised as follows. We discuss the related literature and develop our theoretical framework in the next section. We then describe the data collection procedure and the statistical methodology. After that, the empirical results are presented. We conclude with contributions and limitations.

Received (in revised form): 6 October 2011

© 2012 The Market Research Society, DOI: 10.2501/IJMR-54-3-369-389

Conceptual framework

eWOM: product attributes and network externalities

MMORPGs are different from traditional console games. In an MMORPG, a large number of subscribers can play the same game simultaneously online. To play, players usually need to purchase an installation CD first and pay a monthly subscription fee. They can register accounts on the MMORPG's website and create their own avatars in the game's virtual world. Players can then explore the virtual world joined by millions of other subscribers, and build up their avatars' power to advance in the game. Most often, they need to team up with other players to complete a quest in an MMORPG. Besides common game attributes such as the CD price, sound, graphic quality and difficulty level, the total number of players of the game (i.e. the installed base) is also an important predictor of a consumer's


evaluation. With each additional player joining the game, the virtual world expands and the fun of the game increases for all existing users. This phenomenon is called the direct network externalities effect, which refers to the situation where the value of a product increases with a larger installed base of users (Katz & Shapiro 1985; Tirole 1988). Thus we propose that a game's installed base and other product attributes are important for players:

H1a: Product attributes (including the installed base) are important factors in existing users' evaluations.

Product attributes can signal product quality to potential consumers. Potential adopters often rely on WOM to obtain this information (Rogers 1995; Kozinets et al. 2010; Samson 2010). Especially when a consumer considers an experience product, he or she tends to seek others' recommendations and product experiences (Bearden & Etzel 1982; Childers & Rao 1992; Senecal & Nantel 2004; Weathers et al. 2007). The emergence of online review systems provides existing users' reviews for readers to evaluate experience products (Klein 1998; Chevalier & Mayzlin 2006; Liu 2006). The reviews from existing players offer first-hand experiences for potential adopters, and more and more retailers have started to offer user reviews on their e-tailer websites (e.g. Amazon.com, Nordstrom.com). Furthermore, a game with a larger number of subscribers is more attractive for potential adopters. First, a large installed base generates a bigger virtual world because of the direct network externalities effect. Second, a large installed base signals that the game is of high quality. Based on these two effects, we propose the following hypothesis for potential adopters:

H1b: Existing users' evaluations of product attributes (including the game's installed base) are important for potential adopters.

eWOM: community influence


The development of online communities has reshaped consumers' information-seeking and sharing behaviour. Consumers write their opinions and share their consumption experiences on various types of online platform (Hennig-Thurau et al. 2004; Eccleston & Griseri 2008; Ewing 2008; Kozinets et al. 2010). Potential adopters also actively seek recommendations and advice online (Senecal & Nantel 2004; Mathwick et al. 2008; Bronner & de Hoog 2010).


In contrast to offline WOM, where opinions may disappear into thin air, eWOM provides a persistent public record (Dellarocas et al. 2007). Because of this characteristic, eWOM has recently received extensive attention from both academics and practitioners. The number of studies related to online communities, such as users' interaction in online communities (Wasko & Faraj 2005; Kozinets et al. 2010) and the community influence on consumers' decision making (De Valck et al. 2009; Bronner & de Hoog 2010), is also increasing.

We argue that the social capital developed in an online community also influences a consumer's behaviour in articulating and seeking opinions online. Social capital is broadly defined as the resources accumulated through the relationships among people (Coleman 1988; Putnam 1995). The existence of social capital explains why community members would forgo the tendency to free-ride and contribute to the community instead (Coleman 1990; Putnam 1995). The forms of social capital in the traditional offline setting usually include obligations and expectations, information channels and social norms (Coleman 1988). Recent studies in business and social science have extended the social capital construct into online communities (Lin 2001; Mathwick et al. 2008; Miller et al. 2009). Voluntarism, reciprocity and social trust are the commonly found attributes of social capital in an online community (Wasko & Faraj 2005; Mathwick et al. 2008). The interactions among forum users build up consumers' commitment to the community and establish social capital within it (Lin 2001; Mathwick et al. 2008; Miller et al. 2009). Consumers' voluntary sharing of their opinions with others may result from various motives, such as altruism, social benefits, anxiety reduction and self-enhancement (Sundaram et al. 1998; Hennig-Thurau et al. 2004).

Recent studies show that social benefits and commitment to the community are the main driving forces for consumers to visit online communities and articulate their opinions (Hennig-Thurau et al. 2004; Wiertz & De Ruyter 2007). Furthermore, studies have shown that interaction and information exchange among online community members are often generalised in nature rather than dyadic (Wasko et al. 2009). Thus, in a well-established online community, forum members do not expect their contributions to receive direct and immediate reciprocity from other members. Instead, they expect their generosity to help the community and to be repaid in the future (Coleman 1988; Mathwick et al. 2008). For example, they may receive other users' truthful opinions later when they themselves have a question. Thus we believe a community member's review will be quite objective. It will depend on the product


attributes only, and not on the reviewer's forum involvement. Thus we propose:

H2a: In online communities, after controlling for other product attributes, consumers' reviews will not be influenced by their forum involvement.

We expect potential adopters to place more trust in reviews written by well-established community members. In the trust literature, ability (i.e. the trustee's skills and competencies), benevolence (i.e. the trustee's concern about other consumers' interests) and integrity (i.e. the trustee's keeping of his or her commitments) have been proposed as antecedents of trust (Mayer et al. 1995; McKnight et al. 2002). Reviewers' online reputations are often measured by their tenure in the community and their contribution to the forum (De Valck et al. 2009; Wasko et al. 2009). Studies have shown that, often, a small group of active members makes the majority of contributions in an online community (De Valck et al. 2009; Wasko et al. 2009; Trusov et al. 2010). They voluntarily address others' questions and actively express their opinions in the community. On the other hand, many other community visitors are lurkers: they often visit the community only to retrieve information, but seldom contribute (Wasko & Faraj 2005; Mathwick et al. 2007; De Valck et al. 2009). For reputable reviewers, active participation in the community demonstrates their altruism and benevolence towards other community members. The number of reviews they have written also proves their expertise in the gaming field. As we discussed earlier, commitment to the community is the main driving force for many community participants (Wiertz & De Ruyter 2007). Their active participation and knowledge contribution are often targeted towards the community as a generalised norm, not towards specific individuals (Mathwick et al. 2008; Wasko et al. 2009). In addition, support from other community members reinforces the norms of the community (Coleman 1988). This gives contributors a strong motivation to share their knowledge and show their expertise (Wasko & Faraj 2005). Thus we propose that a reviewer whose expertise has been recognised in the community tends to earn more trust from other users:

H2b: In online communities, people tend to place more trust in the evaluations of reputable online reviewers.


Figure 1 summarises our theoretical framework. The solid lines represent our proposed hypotheses to be tested with the empirical data. The interactions among forum users, such as reviewers' contributions and readers' feedback, increase the community value. Furthermore, in an online community, those who write reviews and contribute to the community may expect some social rewards (such as approval and respect) from other forum readers (Blau 1964; Hennig-Thurau et al. 2004; Wasko & Faraj 2005). Those contributors not only want to help others, but also tend to build their online reputation in the community (Donath 1999; Wasko & Faraj 2005). The accumulated social capital further enhances their willingness to contribute. The dashed lines in Figure 1 represent the accumulation of community value and how the increased social capital may reinforce the contribution of the reviewers.
Figure 1 Theoretical framework
[Diagram not reproduced; its elements are Contributors, Evaluation of product attributes, Community involvement, Reviews, Readers and Community value, linked by the hypothesis paths H1a, H1b, H2a and H2b]

Empirical analysis

Data

We have collected individual user review data from a popular third-party website, Gamespot.com.1 This website provides detailed information and tips for all kinds of video games, including console games, PC games and online games, as well as user reviews and discussion forums. Any person who is interested in games can register an account on Gamespot.com, participate in its discussion forums, and post his or her own game reviews
1 Based on the information from Alexa.com, a web traffic monitoring site, www.gamespot.com is the most popular website for video games in the United States.


on the website. Our dataset contains very detailed information on each review and on other readers' feedback on that review. For each review, we collected the reviewer's detailed ratings on product attributes (e.g. difficulty level, graphics, sound), the reviewer's overall involvement in the Gamespot.com community (such as the number of reviews he or she has written, the number of forum posts he or she has contributed, and the number of people who track his or her posts), and other readers' evaluation of the review (the number of readers who rated the focal review helpful or not helpful). Table 1 lists the detailed definitions of the variables we collected. Most of the information came from Gamespot.com, including game attributes, consumer reviews and online community forum information. The installed base data came from Mmogchart.com. To our knowledge, Mmogchart.com is the most comprehensive website providing publicly accessible information about active subscriptions to the major MMORPGs. We used two data collection criteria because the installed base data are not available for all games or all months. First, a game should have at least three consecutive months of installed base numbers. Second, the total number of a game's reviews should be at least 100. Our final dataset contains seven games with 1,695 user reviews covering a 49-month period, from May 2003 to March 2007. The final seven games are World of Warcraft, Final Fantasy XI, EVE Online, Star Wars Galaxies, Lineage II, City of Heroes and EverQuest II, which together accounted for a 72.5% market share in December 2006.
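The two selection criteria can be expressed as a short filter. The sketch below is hypothetical: the helper names and the toy game data are ours, not the study's.

```python
# Hypothetical sketch of the two sample-selection criteria described above.
# Game labels and numbers are illustrative, not the study's data.

def has_consecutive_months(months, k=3):
    """True if `months` (a collection of month indices) contains a run of k consecutive values."""
    months = sorted(set(months))
    if len(months) < k:
        return False
    run = 1
    for prev, cur in zip(months, months[1:]):
        run = run + 1 if cur == prev + 1 else 1
        if run >= k:
            return True
    return False

def select_games(installed_base_months, review_counts, min_reviews=100):
    """Keep games with >= 3 consecutive months of installed-base data
    and at least `min_reviews` user reviews."""
    return sorted(
        game for game, months in installed_base_months.items()
        if has_consecutive_months(months) and review_counts.get(game, 0) >= min_reviews
    )
```

For example, a game with installed-base figures for months 1, 2, 3 and 150 reviews passes both criteria, while one with figures only for months 1, 3, 5 fails the first.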

Variables
Our main research purpose is to study the influence of online communities on forum users' eWOM sharing and seeking behaviours. For reviewers' ratings, the explanatory variables are as follows.

Product attributes of the focal game: these include prices (the monthly subscription fee, MONTHFEE, and the price of the installation CD, CDPRICE); the installed base of the focal MMORPG, to capture the direct network externalities effect (NUMUSERS, the number of users in millions in the month when the review was posted); visual and acoustic design (GRAPHICS and SOUND); and content design (DIFFICULT, the reviewer's evaluation of the game's overall difficulty level). Because the existing literature generally supports the existence of an optimum stimulation level (Raju 1980), we also include


Table 1 List of variables

HELPFUL — The number of readers who rated the focal review helpful (Numeric)
RESPONDENT — The total number of readers who rated the focal review as either helpful or not helpful (Numeric)
RATING — The reviewer's overall evaluation of the game (Numerical within 1–10)
PLAYLEVEL — The level measuring a reviewer's overall performance on the forum (Numeric)
FORUMPOST — The number of posts the reviewer has written in the forum discussion board (Numeric)
REGISTERTIME — The time lag in days between the reviewer's registered date and the date when a specific review was posted on the forum (Numeric)
NUMREVIEWS — The number of reviews the reviewer has written in the online forum (Numeric)
NUMRATINGS — The number of games the reviewer has rated in the online forum (Numeric)
COLLECTION — The number of games the reviewer has played (Numeric)
TRACKING — The number of forum users who track the focal reviewer's posts (Numeric)
NUMUSERS — The number of users in millions in the month when the review was posted (Numeric)
GRAPHICS — The reviewer's evaluation of the game's graphics attribute (Categorical on a 1–10 scale)
SOUND — The reviewer's evaluation of the game's sound attribute (Categorical on a 1–10 scale)
DIFFICULT — The reviewer's evaluation of the game's overall difficulty level (Categorical: 1 = very easy, 2 = easy, 3 = just right, 4 = hard, 5 = very hard)
DIFFICULT_SQUARE — The square term of DIFFICULT, to capture the quadratic effect (Numeric)
MONTHFEE — The monthly subscription fee in USD (Numeric)
CDPRICE — The price of the installation CD in USD (Numeric)
TIMESPENT — The number of hours the reviewer spent playing the specific game until the review date (Categorical: 1 = [0, 10 hours], 2 = [10, 20 hours], 3 = [20, 40 hours], 4 = [40, 100 hours], 5 = above 100 hours)
RATINGTIME — The time lag in days between the release date of the game and the date when a specific review was posted on the forum (Numeric)


DIFFICULT_SQUARE to capture the potential non-linear relationship between the DIFFICULT level and the reviewer's RATING.

The reviewer's community involvement: we separate community involvement into forum involvement and established expertise. PLAYLEVEL measures a reviewer's forum involvement on the Gamespot.com forum. COLLECTION (the number of games the reviewer has played) and TRACKING (the number of other forum users who track the focal reviewer's posts) measure a reviewer's established expertise in the community.

Other control variables capture the reviewer's experience of the focal game: TIMESPENT (how much time the reviewer had devoted to the specific game up to the review date) and RATINGTIME (the time lag between the game release date and the date when the review was posted on the forum).

To test H1b and H2b on readers' feedback, besides all the variables described above, we further include the reviewer's overall rating (RATING) and an interaction term of NUMUSERS and RATING (NUMUSERRATING) to capture the moderating effect of direct network externalities. We use two variables to measure readers' feedback on a focal review. The first is HELPFUL, which captures the total number of users who rated the focal review as helpful. The second is RESPONDENT, the total number of users who responded to the focal review by rating it as either helpful or not helpful.
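The two derived regressors (the squared difficulty term and the installed-base × rating interaction) can be constructed directly from review-level data. The sketch below uses made-up numbers; only the column names follow Table 1.

```python
import pandas as pd

# Toy review-level data; the values are invented for illustration.
reviews = pd.DataFrame({
    "RATING":    [9.0, 4.0, 7.5],   # reviewer's overall evaluation (1-10)
    "DIFFICULT": [3, 5, 2],         # difficulty rating (1-5)
    "NUMUSERS":  [6.6, 0.5, 3.4],   # installed base in millions that month
})

# Quadratic term capturing the optimum-stimulation-level effect of difficulty
reviews["DIFFICULT_SQUARE"] = reviews["DIFFICULT"] ** 2

# Interaction of installed base and rating, capturing how network
# externalities moderate the effect of the review's overall rating
reviews["NUMUSERRATING"] = reviews["NUMUSERS"] * reviews["RATING"]
```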

Models
Because reviewers' PLAYLEVEL on the Gamespot forum also depends on the number of their forum posts (FORUMPOST), the number of reviews they wrote (NUMREVIEWS), the number of ratings they gave for different video games (NUMRATINGS), their REGISTERTIME and COLLECTION, we use two-stage least-squares (2SLS) regression in our analysis to obtain consistent estimates. In the first stage, FORUMPOST, REGISTERTIME, NUMREVIEWS, NUMRATINGS and COLLECTION serve as instrumental variables for the endogenous PLAYLEVEL. In the second stage, the estimated PLAYLEVEL is one of the explanatory variables in the main regression functions below. RATING_ijt is reviewer j's rating of game i, posted during month t after the game's release. The following model is used to test H1a and H2a:


RATING_ijt = β0 + β1 PLAYLEVEL_j + β2 TRACKING_j + β3 COLLECTION_j + β4 NUMUSERS_it + β5 DIFFICULT_ijt + β6 DIFFICULT_SQUARE_ijt + β7 GRAPHICS_ijt + β8 SOUND_ijt + β9 MONTHFEE_i + β10 CDPRICE_i + β11 TIMESPENT_ijt + β12 RATINGTIME_ijt + ε_ijt   (1)

A second empirical model to test the remaining hypotheses (H1b and H2b) is specified as follows. HELPFUL_ijt is the number of forum participants who rated reviewer j's review of game i, posted during month t after the game's release, as helpful:

HELPFUL_ijt = γ0 + γ1 RATING_ijt + γ2 NUMUSERS_it + γ3 NUMUSERRATING_ijt + γ4 PLAYLEVEL_j + γ5 TRACKING_j + γ6 COLLECTION_j + γ7 MONTHFEE_i + γ8 CDPRICE_i + γ9 GRAPHICS_ijt + γ10 SOUND_ijt + γ11 DIFFICULT_ijt + γ12 TIMESPENT_ijt + γ13 RATINGTIME_ijt + ε_ijt   (2)
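The two-stage procedure can be illustrated with a small synthetic simulation. This is a sketch, not the study's data or code: for brevity it keeps only PLAYLEVEL, two of the instruments and one exogenous attribute, and all coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# --- Synthetic data (illustrative only) ---
# Instruments: forum-activity measures that drive PLAYLEVEL
# but do not enter the rating equation directly.
forumpost  = rng.normal(size=n)
numreviews = rng.normal(size=n)
u          = rng.normal(size=n)   # unobserved confounder
playlevel  = 1.0 * forumpost + 0.5 * numreviews + u + rng.normal(size=n)
graphics   = rng.normal(size=n)   # exogenous product attribute
# True model: RATING does not depend on PLAYLEVEL (coefficient 0),
# as H2a predicts, but naive OLS would be biased by the confounder u.
rating = 2.0 + 0.0 * playlevel + 0.8 * graphics + 2.0 * u + rng.normal(size=n)

def ols(X, y):
    """Least-squares coefficients for y on X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: regress the endogenous PLAYLEVEL on the instruments
# (plus the exogenous regressor) and keep the fitted values.
Z = np.column_stack([np.ones(n), forumpost, numreviews, graphics])
playlevel_hat = Z @ ols(Z, playlevel)

# Stage 2: use the fitted PLAYLEVEL in the main rating equation.
X2 = np.column_stack([np.ones(n), playlevel_hat, graphics])
beta = ols(X2, rating)
print(beta)   # beta[1] (the PLAYLEVEL coefficient) should be close to 0
```

In practice one would use a dedicated IV routine that also produces correct standard errors, but the two explicit stages above mirror the estimation strategy described in the text.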

Empirical results
The descriptive statistics of the variables and the correlation matrix are given in Tables 2 and 3, respectively. The 2SLS results for Models (1) and (2) are summarised in Tables 4–6. The first-stage result is reported in Table 4, which confirms that FORUMPOST, REGISTERTIME, NUMREVIEWS, NUMRATINGS and COLLECTION are appropriate instruments for PLAYLEVEL (F = 165.76, p < 0.0001). The second-stage results for Model (1) are reported in Table 5. Our results indicate that all product attributes (e.g. SOUND, GRAPHICS, DIFFICULT), except for MONTHFEE, are significant predictors of users' ratings. Furthermore, the quadratic term DIFFICULT_SQUARE is negatively significant. This result indicates that a game that is too advanced might deter a player's interest; on the other hand, a very easy game is also less appealing. Players prefer a game with a certain level of challenge, which is consistent with the existing literature (Raju 1980; Rau et al. 2006).
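As a back-of-the-envelope check of this inverted-U pattern, the rating-maximising difficulty implied by the Table 5 point estimates (2.29 for DIFFICULT and 0.37 for its square, taken with the negative sign the text reports) is the vertex of the quadratic:

```python
# Vertex of the quadratic rating response to DIFFICULT, using the point
# estimates reported in Table 5 (2.29 for DIFFICULT, -0.37 for its square).
b_linear = 2.29
b_quadratic = -0.37

# Setting d(RATING)/d(DIFFICULT) = b_linear + 2 * b_quadratic * DIFFICULT to zero:
optimal_difficulty = -b_linear / (2 * b_quadratic)
print(round(optimal_difficulty, 2))  # about 3.09
```

The implied optimum of roughly 3.09 sits almost exactly at the "just right" level (3) on the five-point DIFFICULT scale, in line with the optimum-stimulation-level argument.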


Table 2 Descriptive statistics

Variable        Mean      Min       Max       Standard deviation
HELPFUL         1.42      0         62        3.25
RESPONDENT      3.87      0         149       9.78
RATING          8.16      1.00      10.00     2.25
PLAYLEVEL       13.45     0         64        8.31
FORUMPOST       524       0         32,438    2,039.79
REGISTERTIME*   -431      -1,488    397       341.49
NUMREVIEWS      9         0         243       17.18
NUMRATINGS      29.94     0.04      66        24.72
COLLECTION      49.58     0         1,243     101.43
TRACKING        14        0         729       56.48
NUMUSERS        3.40      0.02      6.60      2.66
GRAPHICS        8.5       1         10        1.83
SOUND           8.33      1         10        1.97
DIFFICULT       2.9       1         5         1.8
MONTHFEE        14.69     12.95     15.00     0.77
CDPRICE         24.00     19.95     29.99     4.33
TIMESPENT       4.13      1         5         1.21
RATINGTIME      409       0         1,332     290

* Since most of the reviewers became registered Gamespot users before they wrote their reviews, most REGISTERTIME values are negative

In addition, the results confirm that the installed base of the MMORPG has a significant positive impact on users' ratings. Thus H1a is supported. After controlling for product attributes, our results indicate that community influences (captured by PLAYLEVEL, COLLECTION and TRACKING) have no significant impact on a reviewer's rating. Though reviewers with more community experience (a higher PLAYLEVEL) or more established expertise (a higher TRACKING/COLLECTION) tend to give relatively higher ratings, none of these coefficients is significant at the 0.05 level. Overall, our results show that community influence has no significant impact on the reviewer's rating, and H2a is supported. The second-stage results for Model (2) are reported in Table 6. Our findings confirm that most product attributes (except for difficulty level) are important influencers for potential adopters. Among them, the


Table 3 Correlation matrix

[The full 18 × 18 matrix of pairwise correlations among the variables in Table 1 did not survive extraction legibly; see the original article. Among the clearly recoverable values, HELPFUL and RESPONDENT are strongly correlated (0.76***), as are RATING with GRAPHICS (0.74***) and with SOUND (0.75***).]

Notes: n = 1695; *p < 0.1, **p < 0.05, ***p < 0.01


Table 4 Regression results for the first stage
Dependent variable = individual PLAYLEVEL

Explanatory variable   Estimate(a)   Standard error
FORUMPOST              0.001***      0.0001
REGISTERTIME           0.004***      0.0005
NUMREVIEWS             0.088***      0.012
NUMRATINGS             0.006*        0.003
COLLECTION             0.012***      0.002

Model: R2 = 37.49%; Adj R2 = 37.26%; F = 165.76***
Notes: n = 1387; (a) *p < 0.1; **p < 0.05; ***p < 0.01

Table 5 Regression results for the second stage: Model (1)
Dependent variable = individual RATING

Explanatory variable   Estimate(a)   Standard error
PLAYLEVEL              0.01          0.01
COLLECTION             0.0002        0.0004
TRACKING               0.0002        0.0008
NUMUSERS               0.068***      0.02
GRAPHICS               0.40***       0.03
SOUND                  0.36***       0.02
DIFFICULT              2.29***       0.19
DIFFICULT_SQUARE       -0.37***      0.03
MONTHFEE               0.10          0.10
CDPRICE                0.06***       0.01
TIMESPENT              0.11***       0.03
RATINGTIME             0.0009***     0.0001
INTERCEPT              0.84          1.60

Model: R2 = 74.2%; Adj R2 = 73.98%; F = 329.62***
Notes: n = 1387; (a) *p < 0.05; **p < 0.01; ***p < 0.001

installed base of the corresponding MMORPG's network has the largest positive significant impact on readers' feedback (its coefficient is 1.19, with a p value less than 0.0001). In addition, this positive impact from a large installed base could further reduce the negative influence of a bad review (the coefficient of the interaction term, NUMUSERRATING, is negatively


Table 6 Regression results for the second stage: Model (2)
Dependent variable = the number of HELPFUL

Explanatory variable   Estimate      Standard error
NUMUSERS               1.19***       0.14
RATING                 0.27***       0.08
NUMUSERRATING          -0.15***      0.02
PLAYLEVEL              0.014         0.03
COLLECTION             0.003***      0.001
TRACKING               0.009***      0.002
GRAPHICS               0.174*        0.07
SOUND                  0.169*        0.07
DIFFICULT              0.13          0.11
MONTHFEE               0.279         0.277
CDPRICE                0.18**        0.037
TIMESPENT              0.02          0.07
RATINGTIME             0.002***      0.0004
INTERCEPT              0.444         4.299

Model: R2 = 19.99%; Adj R2 = 19.004%; F = 20.21***
Notes: n = 1387; (a) *p < 0.05; **p < 0.01; ***p < 0.001

significant). Thus H1b is supported overall. The non-significant finding on DIFFICULT is also consistent with the literature: the difficulty level is an experience attribute of a game, and a previous study shows that readers of hedonic product reviews tend to attribute negative opinions to the reviewer's personal reasons (Sen & Lerman 2007). Furthermore, our results indicate that the reviewer's PLAYLEVEL itself does not have a significant impact on readers' impressions, but TRACKING and COLLECTION do. This may be because TRACKING (the number of other forum members tracking the reviewer) and COLLECTION (the number of video games collected by the reviewer) are better indicators of the reviewer's expertise and ability in the field. Other factors, such as how long the reviewer has been in the community and how many posts the reviewer has written on the discussion board, are not direct indicators of his or her ability. Thus H2b is partially supported: a reviewer's tenure in the community is not sufficient to gain readers' trust, but his or her expertise is. To fully address H1b and H2b, we conduct a robustness check. For each review, instead of using the total number of readers who rated it helpful,


Table 7 Regression results for the robustness check of Model (2)
Dependent variable = the number of RESPONDENT

Explanatory variable   Estimate      Standard error
NUMUSERS               2.657***      0.416
RATING                 1.35**        0.22
NUMUSERRATING          0.356***      0.05
PLAYLEVEL              0.118         0.077
COLLECTION             0.014***      0.003
TRACKING               0.02***       0.006
GRAPHICS               0.48*         0.21
SOUND                  0.19          0.2
DIFFICULT              0.2           0.32
MONTHFEE               0.67          0.76
CDPRICE                0.62***       0.10
TIMESPENT              0.52**        0.20
RATINGTIME             0.002*        0.001
INTERCEPT              18.95         12.03

Model: R2 = 23.15%; Adj R2 = 22.39%; F = 30.61***
Notes: n = 1334; (a) *p < 0.05; **p < 0.01; ***p < 0.001

we use the total number of respondents (those who rated the review as helpful plus those who rated it as not helpful) as the dependent variable in Model (2). This approach helps us investigate the leading factors that could generate a big buzz. The results of this robustness check are reported in Table 7, and they further confirm our previous findings. Reviews from those who have demonstrated their expertise in the field (those with higher TRACKING and COLLECTION) generate more buzz among readers: the coefficients for both variables are positively significant (p < 0.0001). On the other hand, those who are highly involved in the forum community do not generate significant buzz; the coefficient for PLAYLEVEL is not significant. For product attributes, most of the findings are consistent with those in Table 6. Most importantly, the impact of the installed base becomes much bigger (its coefficient increases from 1.19 to 2.66). This finding further verifies the importance of a large installed base: a popular game with a large network size generates more buzz than others.


Contributions and limitations

Theoretical contributions

With the fast development of Web 2.0, the importance of consumer online networks has recently received enormous attention in academic research. However, how consumers interact with one another and utilise those communities remains an open question (Wasko & Faraj 2005; De Valck et al. 2009; Wasko et al. 2009; Kozinets et al. 2010). Our study contributes to this stream of literature by investigating the community influence on consumers' motives to write and evaluate online reviews. Our results show that, contrary to conventional wisdom, reviewers tend to be objective in a well-established online community: community involvement has little influence on their reviews. Furthermore, we find that reviewers with high community status may not be the opinion leaders, but community participants do pay more attention to reviewers with established expertise. Our results also contribute to the long-standing discussion on the role of opinion leaders in WOM theory. Researchers developing WOM theory have started to question the influence of opinion leaders. Watts and Dodds (2007) argued that the diffusion process could be driven by a critical mass of easily influenced adopters, even though those adopters may not themselves be influential (i.e. opinion leaders). Our findings indicate that the importance of opinion leaders in online communities should not be ignored.

Managerial implications
Buzz marketing is receiving increasing praise from practitioners (PQ Media 2009), and most existing eWOM studies conclude that generating WOM volume (i.e. getting more people to discuss the product/service) is most important (Liu 2006; Duan et al. 2008). Our results imply that certain marketing strategies, such as seeding, that are targeted towards opinion leaders may work better than a general buzz marketing strategy targeted towards a general audience. This study also provides some guidance on how to identify those opinion leaders. Those who are actively involved in forum discussions and write numerous posts might not be the true opinion leaders. Instead, managers should focus on those who have demonstrated their expertise in the online community. Furthermore, since readers pay more attention to negative reviews, firms should try to attract those opinion leaders who might be favourable towards the firm's product and encourage them to generate positive feedback.

Limitations and future studies


We conclude by addressing the limitations of our current study and discussing future research topics. In the current study, we used only secondary data to test our proposed hypotheses, and the dataset has its own limitations. First, we have only seven games in our dataset, because these are the only games that have both installed-base data and online review data. Since these seven games occupy 72.5% of the MMORPG market, our results are biased towards successful games, and the importance of the installed base might be understated. Because our results indicate that the installed base is still crucial for those successful games, we expect the same results to hold for a larger dataset containing more game titles. Second, because of the data limitation, we have only aggregated data on readers' feedback (HELPFUL and RESPONDENT), and therefore could not differentiate potential users from existing users in the readers' group. Third, at the current stage, Gamespot.com allows only those members with PLAYLEVELs higher than 3 to post reviews. This restriction helps to generate more objective reviews from forum participants because it discourages other motives, such as anxiety reduction and firms' promotion activities. Thus the Gamespot community might be quite different from some other online review communities, such as Amazon.com, where reviewers face no such restrictions.

An important future study to overcome the above limitations is to conduct questionnaire surveys that directly measure online users' attitudes towards community influences (the dashed lines in the framework), such as reciprocity and trust. Furthermore, it would be interesting to study how forum users' attitudes change as they become more experienced in the online community. For example, studies have shown that, compared to consumers who think the product/service just meets their expectations, both extremely happy and dissatisfied customers tend to express their opinions more often (Anderson 1998). This could be the driving force when a newbie forum user posts his or her first review in the online community. However, as the person gets more involved in the community, he or she becomes more committed to it and writes more generalised reviews (not only reviews about extreme satisfaction or dissatisfaction, but also more neutral reviews on so-so consumption experiences). Our current results reflect only the average observation across the entire time span. Finally, our study utilises a specific hedonic product, the online video game, which is dominated by experience attributes. A possible future study is to examine the online community and eWOM for search products.

Executive summary
The following is provided to managers and executives as a brief summary of the practical applications of this study. With the rapid development of Web 2.0, more and more firms have started to utilise social media and participate in consumers' online communities. Marketing strategies such as seeding campaigns (where companies give products to influential consumers and expect those opinion leaders to talk favourably to other consumers) and buzz marketing (where companies encourage consumers to discuss their products or services) are commonly adopted in practice. Understanding the interactions among community members thus becomes especially important for an effective marketing strategy.

The findings of this paper indicate that reviewers tend to be objective in a well-established online community: community involvement has little influence on their reviews. Furthermore, consumers do pay more attention to reviewers who have established their expertise. Our results imply that certain marketing strategies, such as seeding, that are targeted towards opinion leaders may work better than a general buzz marketing strategy targeted towards a general audience. In addition, active forum participants may not be the opinion leaders; the true opinion leaders are those forum users who have well-established expertise and whose posts are tracked most often. Finally, since negative reviews generate more buzz, managers should try to attract those opinion leaders who might be favourable towards the firm's product and encourage them to generate positive feedback.

References
Anderson, E. (1998) Customer satisfaction and word of mouth. Journal of Service Research, 1, 1, pp. 5–17.
Bearden, W.O. & Etzel, M.J. (1982) Reference group influence on product and brand purchase decisions. Journal of Consumer Research, 9, 2, pp. 183–194.
Blau, P.M. (1964) Exchange and Power in Social Life. New York, NY: Wiley.
Bronner, F. & de Hoog, R. (2010) Consumer-generated versus marketer-generated websites in consumer decision making. International Journal of Market Research, 52, 2, pp. 231–248.
Chevalier, J.A. & Mayzlin, D. (2006) The effect of word of mouth on sales: online book reviews. Journal of Marketing Research, 43, August, pp. 345–354.
Childers, T.L. & Rao, R. (1992) The influence of familial and peer-based reference groups. Journal of Consumer Research, 19, 2, pp. 198–212.
Coleman, J.S. (1988) Social capital in the creation of human capital. American Journal of Sociology, 94, pp. S95–S120.
Coleman, J.S. (1990) Foundations of Social Theory. Cambridge, MA: Belknap Press.
Dellarocas, C., Zhang, X.M. & Awad, N.F. (2007) Exploring the value of online product reviews in forecasting sales: the case of motion pictures. Journal of Interactive Marketing, 21, 4, pp. 23–45.
De Valck, K., Van Bruggen, G.H. & Wierenga, B. (2009) Virtual communities: a marketing perspective. Decision Support Systems, 47, pp. 185–203.
Donath, J.S. (1999) Identity and deception in the virtual community. In: Smith, M.A. & Kollock, P. (eds) Communities in Cyberspace. New York, NY: Routledge.
Duan, W., Gu, B. & Whinston, A. (2008) The dynamics of online word-of-mouth and product sales: an empirical investigation of the movie industry. Journal of Retailing, 84, 2, pp. 233–242.
Eccleston, D. & Griseri, L. (2008) How does Web 2.0 stretch traditional influencing patterns? International Journal of Market Research, 50, 5, pp. 591–616.
Ewing, T. (2008) Participation cycles and emergent cultures in an online community. International Journal of Market Research, 50, 5, pp. 575–590.
Hennig-Thurau, T., Gwinner, K.P., Walsh, G. & Gremler, D.D. (2004) Electronic word-of-mouth via consumer-opinion platforms: what motivates consumers to articulate themselves on the internet? Journal of Interactive Marketing, 18, 1, pp. 38–52.
Katz, M. & Shapiro, C. (1985) Network externalities, competition, and compatibility. American Economic Review, 75, pp. 424–440.
Klein, L.R. (1998) Evaluating the potential of interactive media through a new lens: search versus experience goods. Journal of Business Research, 41, pp. 195–203.
Kozinets, R.V., De Valck, K., Wojnicki, A.C. & Wilner, S.J.S. (2010) Networked narratives: understanding word-of-mouth marketing in online communities. Journal of Marketing, 74, March, pp. 71–89.
Lin, N. (2001) Social Capital: A Theory of Social Structure and Action. New York, NY: Cambridge University Press.
Liu, Y. (2006) Word of mouth for movies: its dynamics and impact on box office revenue. Journal of Marketing, 70, July, pp. 74–89.
Mathwick, C., Wiertz, C. & De Ruyter, K. (2008) Social capital production in a virtual P3 community. Journal of Consumer Research, 34, April, pp. 832–849.
Mayer, R.C., Davis, J.H. & Schoorman, F.D. (1995) An integrative model of organizational trust. Academy of Management Review, 20, 3, pp. 709–734.
McKnight, D.H., Choudhury, V. & Kacmar, C. (2002) Developing and validating trust measures for e-commerce: an integrative typology. Information Systems Research, 13, 3, pp. 334–359.
Miller, K.D., Fabian, F. & Lin, S. (2009) Strategies for online communities. Strategic Management Journal, 30, pp. 305–322.
PQ Media (2009) Exclusive PQ media research: despite worst recession in decades, brands increased spending on word-of-mouth marketing 14.2% to $1.54 billion in 2008. Available online at: http://www.pqmedia.com/about-press-20090729-wommf.html (accessed 7 July 2010).
Putnam, R. (1995) Bowling alone: America's declining social capital. Journal of Democracy, 6, pp. 65–78.
Raju, P.S. (1980) Optimum stimulation level: its relationship to personality, demographics, and exploratory behavior. Journal of Consumer Research, 7, pp. 272–282.
Rau, P., Peng, S. & Yang, C. (2006) Time distortion for expert and novice online game players. CyberPsychology & Behavior, 9, 4, pp. 396–403.
Rogers, E. (1995) Diffusion of Innovations (4th edn). New York, NY: The Free Press.
Samson, A. (2010) Product usage and firm-generated word of mouth: some results from fmcg product trials. International Journal of Market Research, 52, 2, pp. 459–481.
Sen, S. & Lerman, D. (2007) Why are you telling me this? An examination into negative consumer reviews on the web. Journal of Interactive Marketing, 21, 4, pp. 76–94.
Senecal, S. & Nantel, J. (2004) The influence of online product recommendations on consumers' online choices. Journal of Retailing, 80, pp. 159–169.
Sundaram, D.S., Mitra, K. & Webster, C. (1998) Word-of-mouth communications: a motivational analysis. Advances in Consumer Research, 25, pp. 527–531.
Tirole, J. (1988) The Theory of Industrial Organization. Cambridge, MA: MIT Press.
Trusov, M., Bodapati, A.V. & Bucklin, R.E. (2010) Determining influential users in internet social networks. Journal of Marketing Research, 47, August, pp. 643–658.
Wasko, M. & Faraj, S. (2005) Why should I share? Examining social capital and knowledge contribution in electronic networks of practice. MIS Quarterly, 29, 1, pp. 35–57.
Wasko, M.M., Teigland, R. & Faraj, S. (2009) The provision of online public goods: examining social structure in an electronic network of practice. Decision Support Systems, 47, pp. 254–265.
Watts, D.J. & Dodds, P.S. (2007) Influentials, networks, and public opinion formation. Journal of Consumer Research, 34, December, pp. 441–458.
Weathers, D., Sharma, S. & Wood, S.L. (2007) Effects of online communication practices on consumer perceptions of performance uncertainty for search and experience goods. Journal of Retailing, 83, 4, pp. 393–401.
Wiertz, C. & De Ruyter, K. (2007) Beyond the call of duty: why customers contribute to firm-hosted commercial online communities. Organization Studies, 28, 3, pp. 347–376.

About the authors


Dr Jun Yang is an Assistant Professor of Marketing at the University of Houston-Victoria. Dr Yang's current research interests include e-marketing, network effects, word of mouth, online communities and pharmaceutical marketing. Her research articles have been published in several academic journals, including Journal of Business Research, Journal of Retailing and Consumer Services, Journal of the National Medical Association and others.

Dr Enping (Shirley) Mai is an Assistant Professor in Marketing and Supply Chain Management, College of Business, East Carolina University. Her research interests cover customer base analysis, e-marketing, word of mouth and online communities. Her recent research articles have been published in several academic journals, including Journal of Statistical Planning and Inference, Journal of Business Research and others.


Dr Joseph Ben-Ur is a Professor of Marketing at the University of Houston-Victoria. Among his journal publications are articles published in Management Science, Energy Economics, Psychology and Marketing and the European Journal of Marketing. His work in political marketing includes publications on polling and political marketing strategies. Dr Ben-Ur is senior editor of the Journal of Political Marketing.

Address correspondence to: Dr Jun Yang, School of Business Administration, University of Houston-Victoria, 14000 University Blvd, Sugar Land, TX 77479, United States. Email: yangj@uhv.edu


