Sunday, March 15, 2009

Social Computing Session 4: Social knowledge production and services


Online social recommendation systems vs. real world advice seeking

For a project in another class I have been locating materials about Georgiana, the Duchess of Devonshire. Before I complete the project I wanted to know what others thought about some of the items I have located. To that end, and as exemplars for this posting, I visited Amazon.com and BN.com to check out the customer reviews of The Duchess by Amanda Foreman. See the screenshots above. Amazon.com had more reviews available than BN.com, though many of them were more like plot synopses or vague "liked it," "loved it," or "hated it" commentary. There were a few that I found helpful, with well-written, thoughtful discussions and accurate spelling. I found the Detailed Ratings on BN.com very helpful because they really fleshed out which elements the reviewer rated highly or poorly. Unfortunately, there was only one rating/review to work with there. Amazon.com does include a tagging feature that may prove helpful as I continue to look for additional materials.

I found the article Social Annotations in Digital Library Collections particularly relevant in this case, as the ratings and reviews function as a history of others' responses to and interests in the text. While they are not page- or even chapter-specific, these recommendations can help a user decide whether an item is relevant and/or worthwhile for a given purpose. Also, both Amazon and BN offer "people who bought this also bought" features that may prove helpful in finding additional related materials. Kristina Lerman's article about filtering also applies to some degree...BN.com provides both collaborative and social filtering elements for users of MyB&N. Unfortunately, my research in this area is very recent, and since the system is designed to filter based on my overall use, it has not yet provided any additional recommendations for me. We'll see how long it takes the system to catch up with my new interest...if it does at all.
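The "people who bought this also bought" feature described above is, at its simplest, a co-purchase count. Purely as a sketch (the titles and purchase histories here are made up, and real systems like Amazon's use far more sophisticated item-to-item methods), the basic idea looks like this:

```python
from collections import Counter

# Hypothetical purchase histories: each set is one customer's purchases.
purchases = [
    {"The Duchess", "Georgiana: Duchess of Devonshire", "Marie Antoinette"},
    {"The Duchess", "Georgiana: Duchess of Devonshire"},
    {"The Duchess", "Marie Antoinette"},
    {"Georgiana: Duchess of Devonshire", "Marie Antoinette"},
]

def also_bought(item, histories, top_n=2):
    """Count items co-purchased with `item` and return the most frequent companions."""
    counts = Counter()
    for basket in histories:
        if item in basket:
            counts.update(basket - {item})
    return [title for title, _ in counts.most_common(top_n)]

print(also_bought("The Duchess", purchases))
```

With this toy data, both other titles are co-purchased twice with The Duchess, so both come back as recommendations; a real system would normalize by each title's overall popularity so that bestsellers don't dominate every list.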

The trust issues discussed in previous sessions came to the fore while I was looking for reviews...how does one determine whether or not a review is trustworthy? This is especially true when one is looking for research materials.

So, in comparison, and also in search of a review I can trust, I spoke with a writer friend who recently began researching Georgiana for her book. I've known her for years and trust her ability to critically analyze material. Her comments about this book were in line with some of the reviews available online, but since I know her, her background, and her abilities I am more inclined to view her review and reviews that are similar more favorably. I also emailed an old history professor whose research included the era of Georgiana, though I have not received any details back yet.

Overall, I found this week's readings varied and interesting. I am particularly excited about the potential future of libraries as they continue to adapt to mobile communications and changing social needs and ideas. If our library database incorporated "annotations," or reviews by registered users, tagging, or other social computing and knowledge-sharing elements, I would be ecstatic! I personally find research rather lonely and sometimes do not trust my own analysis...having a social history in this environment would make for more interesting research. At the moment, one can make some connections through bibliographic references, tracing who has used what, in what context, and when...but the speed and variety of the elements discussed in this week's readings allow for many more connections and discussions. As always, quality is a concern when allowing ratings, reviews, tagging, and other "unregulated" social elements, but if the system incorporated elements of social capital, allowing these contributions and the users themselves to be rated and reviewed, then perhaps their quality and trustworthiness could be analyzed with some degree of reliability.
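The social-capital idea above (rating the raters) can be sketched very simply: weight each review's star rating by the reviewer's own track record of helpfulness votes. This is only an illustration under assumed data (the vote counts and the fallback `prior` weight are invented for the example), not a description of how any real site computes its scores:

```python
# Hypothetical reviews: (reviewer_helpful_votes, reviewer_total_votes, star_rating)
reviews = [
    (45, 50, 4),   # well-regarded reviewer
    (2, 10, 1),    # mostly-unhelpful reviewer
    (0, 0, 5),     # no track record yet
]

def weighted_rating(reviews, prior=0.5):
    """Average star ratings, weighting each by the reviewer's helpfulness ratio.
    Reviewers with no votes yet fall back to a neutral `prior` weight."""
    total = weight_sum = 0.0
    for helpful, votes, stars in reviews:
        w = helpful / votes if votes else prior
        total += w * stars
        weight_sum += w
    return total / weight_sum if weight_sum else None

print(weighted_rating(reviews))
```

Here the trusted reviewer's 4-star rating dominates, pulling the weighted average close to 4 even though the raw average of the three ratings is about 3.3.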

4 comments:

  1. I love going to Amazon.com to check out reviews on books and other goods, but as you mentioned, trustworthiness is always something to consider. I always wonder, what if I were an author or had something to sell, and had 500+ Facebook friends. Could I round them up to post favorable reviews for me, and unfavorable reviews for my competitors?

    I agree with you that it would be awesome if the UHM library database incorporated "annotations." I tend to browse books through Amazon, look at reviews, and if there is something I like, I turn to the library and borrow it. Now if it were combined, I'd save a lot of time! I love the Seattle Public Library's one. Let me know if you know of any other cool ones to check out.

    ReplyDelete
  2. It would be great to have a way to combine your "offline" help via your writer friend and old history professor into an online system like Amazon. There could be a button to show only reviews by people within 3 degrees of separation from me, in any of my social networks. Once systems online start merging, especially the Web 2.0 ones, then we can really begin leveraging offline benefits online.

    The library annotations are such a creative idea, and I wonder if emphasizing the collaborative nature of these ideas could really change how people felt about research. Will libraries in the future end up more like Wikipedia? Where mass annotations (or wiki page contributions, if we go higher up a level) become the actual description of a given topic? Will collaboration be the norm in the near future?

    ReplyDelete
  3. I liked the idea of using another class assignment to illustrate your points for this class.

    I thought it was interesting that you trust your friend because you like them. Funny how trust often boils down to liking. I think this is why the Bernie Madoffs get by in the world. Logically, of course, trust has nothing to do with liking. I trust my bank and I don't even know anyone there, let alone like them.

    This is the core of the problem with a tagging/rating system. Does it match your "Like"? If you like the way someone writes you will be more inclined to trust them, even if their facts are clearly wrong.

    Further, how does someone's "like" affect their objectivity and accurate understanding of the facts? Especially with issues that even the "experts" disagree on.

    I would like to see an annotation system that somehow objectively, independently measures someone's "expertise," and allows me to see only the level of expertise that I want to view. I really don't want to wade through Joe the Plumber's comments on brain surgery or physics. And I am not sure why anyone would.

    Regards, Tom

    ReplyDelete
  4. My first thought on the issue of quality was how to get people to write well-written, thoughtful reviews. Do people on Amazon get incentives to write reviews? (For example, they might get discount rates after x many quality reviews.)

    When it comes to reviews, one of the most important aspects to me is knowing that a reviewer has similar tastes to me. It would also be nice to craft or cater reviews to a particular user's interest.

    Due to the inherent subjectivity of rating systems, I keep falling into the trap of believing an expert, only to be disappointed.

    I went to a certain restaurant because a food critic in the TGIF section of the Advertiser said it had the best katsu on Oahu. While the food was decent, it was far from the best I've ever had. Later, I looked up some online reviews and found that many people thought the pork tonkatsu was "too greasy."

    ReplyDelete