Proceedings: Social Information Processing
Issue: Spring
Track: Social Information Processing
Abstract:
We propose augmenting collaborative reviewing systems with an automatic annotation capability that helps users interpret reviews. Given an item and its review by a certain author, our approach is to find a reference set of similar items that is both easy to describe and meaningful to users. Depending on the number of available same-author reviews of items in the reference set, an annotation produced by our system may consist of similar items that the author has reviewed, the rank of the reviewed item among items in this set, a comparison of the author's scores to averages, and other similar information that indicates the biases and competencies of the reviewer. We validate our approach in the context of movie reviews and describe an algorithm that, for example, when presented with a review of a Woody Allen comedy, is able to derive annotations of the form: "This reviewer rates this movie better than 4 out of 6 other Woody Allen comedies that he rated" or "This is the only Woody Allen comedy among the 29 movies rated by this reviewer" or "This reviewer rated 85 comedies. He likes this movie more than 60% of them. He likes comedies less than the average reviewer."
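The abstract does not give implementation details, so the following is only a minimal Python sketch of the kind of annotation logic it describes, assuming reviews are stored as (author, item, score) records and that a reference set of similar items (e.g., Woody Allen comedies) has already been identified; the function and field names are illustrative, not the authors' actual system.

```python
from statistics import mean

def annotate_review(author, item, reviews, reference_set):
    """Produce annotation sentences for `author`'s review of `item`,
    relative to a reference set of similar items.

    `reviews` is a list of dicts with keys "author", "item", "score";
    `reference_set` is a collection of item identifiers similar to `item`.
    """
    # Scores this author gave to items in the reference set.
    same_author = {r["item"]: r["score"] for r in reviews
                   if r["author"] == author and r["item"] in reference_set}
    target_score = next(r["score"] for r in reviews
                        if r["author"] == author and r["item"] == item)

    annotations = []
    others = {i: s for i, s in same_author.items() if i != item}

    if others:
        # Rank of the reviewed item among same-author reviews of similar items.
        beaten = sum(1 for s in others.values() if target_score > s)
        annotations.append(
            f"This reviewer rates this movie better than {beaten} out of "
            f"{len(others)} other similar movies that they rated.")
    else:
        # No other same-author reviews in the reference set: report its uniqueness.
        total = sum(1 for r in reviews if r["author"] == author)
        annotations.append(
            f"This is the only such movie among the {total} movies "
            f"rated by this reviewer.")

    # Compare the author's average on the reference set to the overall average,
    # to surface the reviewer's bias toward this kind of item.
    all_scores = [r["score"] for r in reviews if r["item"] in reference_set]
    if others and all_scores:
        diff = mean(same_author.values()) - mean(all_scores)
        tendency = "more" if diff > 0 else "less"
        annotations.append(
            f"This reviewer likes such movies {tendency} than the average reviewer.")
    return annotations
```

In this sketch the choice of `reference_set` (e.g., all comedies, or all films by the same director) determines which of the example annotations is produced, mirroring the abstract's point that the annotation depends on how many same-author reviews fall in the reference set.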