The Impact of Online Ratings on Patients and Physicians

Wednesday, June 13, 2018: 12:00 PM
Dogwood - Garden Level (Emory Conference Center Hotel)

Presenter: Yiwei Chen

Discussant: Michael E. Chernew


Background: Health report cards have been widely studied for their impacts on consumers' health-seeking decisions. More recently, online ratings across many industries have become part of everyday life. How do online ratings change consumers' doctor-seeking behavior, and what are the welfare implications? Many in the healthcare community hold a dismissive view of online ratings, either because they believe the effect is trivial or because they believe the ratings do not reflect the true "quality" of physicians. This paper aims to understand the role and impact of online doctor ratings.

Data and Methods: I collect a unique dataset that combines U.S. nationwide online doctor ratings from Yelp through 2017, 100% Medicare physician payment data for 2012-2015, 20% Medicare claims data for 2008-2013, and measures of physician credentials and quality. Matching Yelp-rated doctors in the U.S. to Medicare data on doctor names and ZIP codes, I am able to analyze the performance and behavior of 37,000 rated doctors in Medicare.
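The name-and-ZIP match between the two sources can be sketched as an exact-key merge. This is a minimal illustration with hypothetical records and field names (the paper's actual matching procedure is not specified beyond the two keys); only doctors appearing in both sources survive.

```python
# Hypothetical Yelp and Medicare records; "name" and "zip" are the match keys.
yelp = [
    {"name": "jane doe", "zip": "30322", "avg_rating": 4.5},
    {"name": "john roe", "zip": "94305", "avg_rating": 3.0},
]
medicare = [
    {"name": "jane doe", "zip": "30322", "payments_2015": 120_000},
    {"name": "amy poe", "zip": "10001", "payments_2015": 95_000},
]

# Index Medicare rows by (name, ZIP), then keep Yelp rows with a counterpart.
med_index = {(r["name"], r["zip"]): r for r in medicare}
matched = [
    {**y, **med_index[(y["name"], y["zip"])]}
    for y in yelp
    if (y["name"], y["zip"]) in med_index
]
# Here only "jane doe" matches, so `matched` has one combined record.
```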

I first analyze the effect of online ratings on patient volume and revenue using Yelp ratings and Medicare payment data for 2012-2015. The basic framework is a panel-regression design that examines how doctors' annual patient flow responds to changes in their ratings. However, a simple OLS estimate can be confounded by unobserved doctor effort and by measurement error due to aggregation. I use an instrumental-variable approach to tackle this endogeneity problem. The instrument exploits individual Yelp reviewers' "harshness," measured by their ratings of other businesses, typically in non-medical industries. The identifying assumption is that reviewers' outside ratings are uncorrelated with changes in doctors' unobserved effort and with the measurement error.
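The core of the IV logic can be sketched with the simple-IV (Wald) estimator for a single endogenous regressor: the effect of the rating on patient flow is cov(z, y) / cov(z, x), where z is reviewer harshness, x the rating, and y the outcome. This is a stripped-down illustration of the identification idea, not the paper's full panel specification (which includes fixed effects and weighting); variable names are hypothetical.

```python
def two_sls(y, x, z):
    """Simple IV (Wald) estimator for one endogenous regressor:
    beta_IV = cov(z, y) / cov(z, x).
    y: outcome (e.g., log patient flow), x: endogenous rating,
    z: instrument (e.g., reviewers' average harshness elsewhere)."""
    n = len(y)
    my, mx, mz = sum(y) / n, sum(x) / n, sum(z) / n
    cov_zy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y)) / n
    cov_zx = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x)) / n
    return cov_zy / cov_zx
```

Because only the instrument-driven variation in the rating is used, variation coming from unobserved effort or aggregation noise drops out of the estimate, which is exactly the role the reviewer-harshness instrument plays in the paper.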

Second, to understand the implications of online physician ratings, I correlate the ratings with clinical measures of quality, either constructed from Medicare claims for 2008-2013 or obtained from third-party websites other than Yelp. I also perform an LDA textual analysis to identify the topics of Yelp review content.
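LDA recovers topics by repeatedly reassigning each review word to a topic in proportion to how common that topic is in the review and how common the word is in the topic. The following is a minimal collapsed Gibbs sampler on toy review tokens, purely to illustrate the mechanism; the paper's actual implementation, preprocessing, and topic count are not specified, and all inputs here are hypothetical.

```python
import random

def lda_gibbs(docs, n_topics, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampler for LDA on tokenized docs.
    Returns the top 3 words for each topic."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)
    widx = {w: i for i, w in enumerate(vocab)}
    ndk = [[0] * n_topics for _ in docs]        # doc-topic counts
    nkw = [[0] * V for _ in range(n_topics)]    # topic-word counts
    nk = [0] * n_topics                         # topic totals
    z = []                                      # topic of each token
    for d, doc in enumerate(docs):              # random initialization
        zd = []
        for w in doc:
            k = rng.randrange(n_topics)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][widx[w]] += 1; nk[k] += 1
        z.append(zd)
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k, wi = z[d][i], widx[w]
                ndk[d][k] -= 1; nkw[k][wi] -= 1; nk[k] -= 1
                # P(topic t) ∝ (doc-topic count + alpha) * (topic-word + beta) / (topic total + V*beta)
                weights = [(ndk[d][t] + alpha) * (nkw[t][wi] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                r, cum = rng.random() * sum(weights), 0.0
                for t, wgt in enumerate(weights):
                    cum += wgt
                    if r <= cum:
                        k = t
                        break
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][wi] += 1; nk[k] += 1
    return [[vocab[i] for i in sorted(range(V), key=lambda i: -nkw[t][i])[:3]]
            for t in range(n_topics)]
```

On real review text, the top words per topic are what reveal the dominant themes (interpersonal skills, office amenities, etc.) reported in the Results.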

As a next step, I am analyzing how physicians change their practice behavior in response to being reviewed online, using detailed Medicare claims for 2008-2013.

Results: The instrumented panel regression finds that a one-star increase in a doctor's Yelp rating raises the doctor's patient flow and revenue by approximately 2% annually. The point estimate weighted by the number of reviews is larger than the unweighted version.

I find robust positive correlations between Yelp ratings and clinical measures of quality (e.g., education credentials, board certifications, adherence to HEDIS procedure guidelines). The LDA textual analysis shows that Yelp also offers substantial non-clinical information: the most common review topics concern interpersonal skills, office amenities, etc.

Conclusion: This paper shows that online ratings such as Yelp have a significant impact on consumers' doctor-seeking behavior, even among the elderly. Suggestive evidence indicates that sorting consumers toward better-rated doctors increases consumer welfare.