Brandes and Dover evaluated how weather conditions influence user reviews; their findings may not surprise anyone who has collected user feedback. Their results can help explain unexpected sets of reviews and can inform the scheduling of studies, where possible. Brandes and Dover report that their study “uses a unique dataset that combines 12 years of data on hotel bookings and reviews with weather condition information at a consumer’s home and hotel address. The results show that bad weather increases review provision and reduces rating scores for past consumption experiences. Moreover, 6.5% more reviews are written on rainy days, and these reviews are 0.1 points lower, accounting for 59% of the difference in average rating scores between four- and five-star hotels in our data. These results are consistent with a scenario in which bad weather (i) induces negative consumer mood, lowering rating scores, and (ii) makes consumers less time-constrained, which increases review provision. Additional analyses with various automated sentiment measures for almost 300,000 review texts support this scenario: reviews on rainy days show a significant reduction in reviewer positivity and happiness, yet are longer and more detailed.” Their findings support asking people who are submitting reviews or other feedback what the weather is like as they write, and interpreting the collected data accordingly.
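As a minimal sketch of that practice, the snippet below groups ratings by a self-reported weather field captured at submission time and compares per-condition averages. The data, field names, and function are entirely hypothetical illustrations, not part of Brandes and Dover's method (their study geocoded actual weather records rather than asking respondents):

```python
from statistics import mean

# Hypothetical feedback records: each review logs the reviewer's
# self-reported weather at submission time. Field names are illustrative.
reviews = [
    {"rating": 5, "weather": "clear"},
    {"rating": 4, "weather": "clear"},
    {"rating": 5, "weather": "clear"},
    {"rating": 4, "weather": "rain"},
    {"rating": 3, "weather": "rain"},
    {"rating": 4, "weather": "rain"},
]

def mean_rating_by_weather(records):
    """Group ratings by reported weather and return per-condition means."""
    groups = {}
    for r in records:
        groups.setdefault(r["weather"], []).append(r["rating"])
    return {weather: mean(vals) for weather, vals in groups.items()}

print(mean_rating_by_weather(reviews))
```

A gap between the per-condition means (here, rainy-day ratings averaging lower) would suggest weighting or at least annotating the weather context before drawing conclusions from the raw scores.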
Leif Brandes and Yaniv Dover. “Offline Context Affects Online Reviews: The Effect of Post-Consumption Weather.” Journal of Consumer Research, in press, https://doi.org/10.1093/jcr/ucac003