The Internet changed the way we shop. Today we can order anything online, ship it straight to our door, and never set foot in the mall again. The Internet not only transformed how we make purchases, it also revolutionized how we choose what to buy. With the Internet, the cost of acquiring information about various products is significantly reduced. It is easy to go to several different sites and compare prices and reviews of various products.
For example, before you choose a camera to buy, you want to be sure that you are getting your money’s worth, so you read other customers’ reviews. Some reviews are text only, some combine pictures and text, and some are videos. Since these reviews can become powerful sales tools for companies, Dr. Radhika Santhanam decided to research the effect different types of reviews have on a consumer’s perception of a product.
To research this, Dr. Santhanam’s graduate students, Pei Xu, Lijuan Wu, and Liang Chen, set up an experimental study of three products. Each product had a review in three formats – text, text with pictures, and video – though the information conveyed was identical across formats. They examined “the extent to which visual media in online product reviews that are given by customers persuade other prospective customers to purchase the product,” Dr. Santhanam explained.
Results show that video reviews are the most persuasive format in online shopping, while there is little difference between the influence of text reviews and that of image-based reviews. Knowing how best to present customer reviews can thus become a sales tool in itself, resulting in improved sales of a product. The researchers hope to eventually be able to tell a company which review format will work best to promote sales of a particular product.
Dr. Santhanam hopes to continue research in online product reviews with the University of Kentucky’s Vis Center to improve the visualization used in reviews. With the development of haptic interfaces, or systems that allow you to touch and manipulate a virtual object, haptic reviews may prove even more powerful than video reviews. As more and more companies close brick-and-mortar stores and sell exclusively online, customer reviews will drive sales even more.
Learning how people lived during ancient times is like piecing together a jigsaw puzzle. One good source of clues is the bits and pieces of papyri that have been preserved across centuries. These fragments of papyrus may contain a shopping list, a land contract, or other information that tells us how ancient people lived their day-to-day lives.
However, studying these papyri has been a great challenge given their fragility and the difficulty of accessing them. Recently, Vis Center researchers collaborated with a team from Duke University to create a new online system for papyrological research. Dr. Joshua Sosin from Duke University and Ryan Baumann from the University of Kentucky were part of the team that worked on the project, called Integrating Digital Papyrology (IDP). The final product is an online system for collaborative editing.
The greatest challenge of this project was making the system user-friendly. In order to create the editing tools, the team developed a new markup language called Leiden+, which bridges XML and traditional papyrological notation. The system also allows translation edits and other notes to be made for each papyrus. A user submits changes to an editorial board, which then authorizes them.
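The core idea behind a plain-text shorthand like Leiden+ is that familiar editorial conventions can be compiled into richer XML for storage and search. The toy converter below is only an illustration of that idea, not the actual Leiden+ grammar: the two bracket conventions shown do follow long-standing Leiden editorial practice (square brackets for restored lost text, parentheses for expanded abbreviations), but real Leiden+ is far richer.

```python
import re

def toy_leiden_to_xml(text):
    """Illustrative sketch only -- not the real Leiden+ specification.

    Converts two simplified Leiden-style conventions into
    EpiDoc-like XML tags:
      [abc]  -> <supplied>abc</supplied>  (editor-restored lost text)
      (abc)  -> <ex>abc</ex>              (expanded abbreviation)
    """
    # Restore-lost-text convention: [ ... ]
    text = re.sub(r"\[([^\]]*)\]", r"<supplied>\1</supplied>", text)
    # Expanded-abbreviation convention: ( ... )
    text = re.sub(r"\(([^)]*)\)", r"<ex>\1</ex>", text)
    return text

# A papyrologist types the shorthand; the system stores the XML.
print(toy_leiden_to_xml("Aur(elius) [son of] Dioskoros"))
```

A one-way sketch like this also hints at why the real system is hard: the production tools must round-trip between shorthand and XML without losing the editor’s intent.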
Giving researchers an easy way to communicate about revisions to a text accelerates the pace of research. The team hopes that the online system will replace the slow print mechanisms traditionally used to publish these papyri. Dr. Sosin points out that, given the rarity of these papyri, “every bit of data is deadly precious,” which means the online system presents a real opportunity for deepened research in the e-papyrological community.
Homer’s Iliad is back at the publishing house, but turning these pages involves only a light tap on an iPad screen. With each digital page turn, the Imaging the Iliad iPad app transports the revered, but fragile, Venetus A Iliad manuscript from an inaccessible Venetian library into the hands of students, researchers, and classical enthusiasts around the world.
During the summer of 2007, researchers from the University of Kentucky Center for Visualization, the University of Houston, College of the Holy Cross, Furman University, and Brandeis University gathered at the Marciana Library in Venice, Italy, to digitally preserve the Venetus A. Considered by some to be the most important manuscript of the Homeric stories, the Venetus A also contains layers of commentary and annotations, usually attributed to scholars at the Royal Library of Alexandria.
The only previous images had been made in 1901 by Domenico Comparetti, but that process was highly destructive: the manuscript was sliced apart, placed on glass and photographed, and then rebound. In contrast, the modern process allowed the intact manuscript to be gently placed in a Meyer Conservation Copystand. Page by page, the team carefully scanned the ancient manuscript, capturing both high-quality digital photos and structured-light data to create a 3D model of each page’s surface, which can then be used to digitally “flatten” the manuscript and remove distortions from the text. (See the 2008 Odyssey article about the project.)
The photos were then made publicly available through the University of Houston’s Homer Multitext data archive. But the Vis Center team had plans to use an undergraduate research team to make the Iliad accessible to a much broader audience.
Undergraduate students Zach Whelchel and Carla Lopez Narvaez did research during the summer of 2010 at the UK Center for Visualization. Their assignment was to create an iPad app that would allow the reader to interact with the Venetus A Iliad as well as an English translation. “The project was an ambitious one that was just concrete enough to be possible,” said Whelchel. “Our team was given a lot of space to envision how to best display the folio images.”
The team was given the 3D Iliad images, the corresponding Greek text, and the English text of the Iliad. “The images had already been matched up with corresponding Greek text, but making that correspond with the English transcription was quite difficult, conceptually,” said Ryan Baumann, a Vis Center staff member who oversaw the student work. Over the course of the summer they worked to create an iPad app that would allow the reader to view the English text side by side with the corresponding folio of the Venetus A. Whelchel said that “to do this we compared two XML documents. The first had the line found on each folio (Ex: Book 1, Lines 32-56) and the second had the entire Iliad (in English) tagged by books and lines.”
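The matching step Whelchel describes can be sketched in a few lines. The XML snippets and element names below are hypothetical stand-ins (the article does not give the actual schemas): one document maps each folio to a book and line range, the other holds the English Iliad tagged by book and line, and a lookup joins them.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified stand-ins for the two XML documents the
# students describe; the real files' element names may differ.
FOLIOS_XML = """
<folios>
  <folio id="12r" book="1" firstLine="1" lastLine="25"/>
  <folio id="12v" book="1" firstLine="26" lastLine="56"/>
</folios>
"""

ILIAD_XML = """
<iliad>
  <book n="1">
    <l n="1">Sing, O goddess, the anger of Achilles</l>
    <l n="26">and to lord it over the Argives</l>
  </book>
</iliad>
"""

def lines_for_folio(folio_id):
    """Return the English lines that belong on a given folio."""
    folios = ET.fromstring(FOLIOS_XML)
    iliad = ET.fromstring(ILIAD_XML)
    # Find the folio's book and line range.
    folio = next(f for f in folios if f.get("id") == folio_id)
    book = folio.get("book")
    first, last = int(folio.get("firstLine")), int(folio.get("lastLine"))
    # Pull every tagged line in that range from the full text.
    book_el = next(b for b in iliad if b.get("n") == book)
    return [l.text for l in book_el if first <= int(l.get("n")) <= last]

print(lines_for_folio("12v"))
```

With a mapping like this in hand, the app only needs to look up the current folio to know which English lines to render beside its image.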
“We wanted to build the app as a template that could eventually encompass other texts. Because of this, we took the long route on parsing through the folios to match the lines properly,” said Whelchel, a sophomore Media Communications and Math double-major at Asbury University in Wilmore, KY. “We had to build an intuitive way to ‘page through’ the book. We wanted it to feel like you were actually turning a page so the user could better interact.” Most surprising was “the level of complexity that goes into every page turn.”
Narvaez, a Computer Science student at the University of Puerto Rico – Rio Piedras, said their problem was “how to bring ‘The Iliad’ from the oldest form of print to the newest form of print on the iPad.” Narvaez interned at the Vis Center through the Vis U program, which brings Computer Science undergraduates from the University of Puerto Rico for summer research opportunities in visualization and virtual environments. “This new experience helped me…to work with new people and combine all our ideas…to manage and resolve the problems we found each day during the process of our research and…to keep learning new things,” said Narvaez.
Dr. Chris Blackwell, Classics professor at Furman University, was part of the Venetus A imaging team in 2007. As a member of the Homer Multitext project through the Center for Hellenic Studies at Harvard, Dr. Blackwell has worked for over a decade bringing the words of Homer to new life in electronic media. He has found the Imaging the Iliad app to be an exciting means to do just that. “This iPad app is a beautiful example of where all such projects are going, and the pleasant surprises that lie in store. When we started thinking about giving these manuscripts life electronically, no one dreamed of a touch-based, lightweight, vastly capable and delightfully simple device like the iPad. To see images and text brought together–so quickly!–by the researchers in Kentucky is truly inspiring. The current application is all the proof anyone needs that the work of digitization will serve not only high-end scientific research, but will invite a very wide audience to share in these cultural treasures. As a Classicist, I find this thrilling!”
Few people have the privilege of traveling to the Marciana Library in Venice and studying the actual Iliad folios. But only a month after its March release, the Imaging the Iliad app has already been downloaded more than 800 times. It is available for free in the Apple iTunes App Store.
“The Iliad app brings one of the oldest mediums of communication to one of the newest. This readily accessible preservation of history and culture will hopefully set the standard of how scholarly research should be published,” said Whelchel. Next, the team is “currently working on a 3D viewer that shows off the models we have of each folio. It really brings the ancient book to life when you can spin it around and see the fine creases.”