Managing Discovery Problems with User Experience in Mind


Reference

Shriver, E. (2019). Managing Discovery Problems with User Experience in Mind. Code4Lib Journal, (44). Retrieved from: https://journal.code4lib.org/articles/14481

Original Summary

Williams Libraries recently developed a system for users to report problems they encountered while using the library catalog/discovery layer (Primo). Building on a method created by the Orbis Cascade Alliance, we built a Google form that allows users to report problems connecting to full text (or any other issue) and automatically includes the permalink in their response. We soon realized that we could improve the user experience by automatically forwarding these reports into our Ask a Librarian email service (LibAnswers) so we could offer alternative solutions while we worked on fixing the initial issue. The article will include an explanation of the process, reactions from public service staff, methods for managing the problems once submitted, and code shared on GitHub for those interested in implementing the tool at their own library.

Detailed Summary

Williams Libraries launched a new ILS and discovery layer. Users weren't used to receiving prompt, personalized responses when they used an online form to report a problem. The new system, simply called “the library catalog,” was the library's first web-scale discovery tool. Feedback came trickling in, and most of the submissions were reports of problems connecting to full-text electronic resources. Although many users were experiencing streamlined discovery and seamless connection, some were getting dead links, paywalls, or error messages.

Version 1.0: Tell Us What You Think

Before the discovery layer, a third-party link resolver was used to connect citations from databases to their full-text holdings. When there were problems connecting to the full text, library staff reported them to an email address monitored by the systems librarian and the head of collections. However, the email address was not publicized to users, and only a few problems were reported each month.

After the launch of the new catalog, traffic coming through the “tell us what you think” feedback form, linked in the footer, continued at the same pace. In the first year, it received 70 entries, and 90% of them were from library staff. Even though users were having trouble connecting to full-text electronic resources, they weren't using the feedback form to report it. The feedback link wasn't noticeable, and users didn't realize it could be used to report a problem. Moreover, once they landed on the feedback form, there were too many questions, and they needed to go back to the catalog record to get the URL of the problematic item. The reporting process needed to be quicker and easier.

Version 2.0: Report a Problem

In August 2017, the Williams Libraries started to use a new user interface (UI) offered by their discovery vendor. The out-of-the-box (OTB) version of this new UI did not include a footer, so the feedback form link had to be moved elsewhere. This forced change was an opportunity: to improve the feedback experience on the user side and, at the same time, to update other custom add-ons to be compatible with the new UI, which was built using the AngularJS framework.

Emery Shriver discovered the online Primo Toolkit provided by the Orbis Cascade Alliance, where a post called “Add a ‘Report a Problem’ Link in Primo Actions Menu” offered the solution to the feedback problem. The post explained how to prepopulate a form with the record's permalink. Although the Orbis Cascade feedback form was Drupal-based, a quick look at Google's documentation showed how to prefill fields in a Google Form.
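As an illustration of that prefill mechanism, the sketch below builds a prefilled Google Form URL from a record's permalink. The form ID and the entry ID are placeholders, not values from the article; the real ones come from the form's “Get pre-filled link” tool.

<syntaxhighlight lang="javascript">
// Minimal sketch: build a "Report a Problem" link that prefills the
// permalink question of a Google Form. FORM_ID_HERE and entry.123456789
// are placeholders; real values come from "Get pre-filled link".
var FORM_BASE = 'https://docs.google.com/forms/d/e/FORM_ID_HERE/viewform';
var PERMALINK_ENTRY = 'entry.123456789'; // ID of the permalink question

function buildReportProblemUrl(permalink) {
  return FORM_BASE + '?usp=pp_url&' + PERMALINK_ENTRY + '=' +
         encodeURIComponent(permalink);
}
</syntaxhighlight>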

Changes made

Based on the feedback, version 2.0 needed to be shorter. With input from the web advisory group, the feedback-gathering process was split into two forms: one to gather general feedback about the catalog, and another to report access problems with specific records. Part of the original “tell us what you think” form was reused for the general feedback form, which is linked from the catalog landing page, while the new “report a problem” form was based on the Orbis Cascade post.

The new form had four questions:

  • Please tell us what’s happening so we can fix it:
  • Permalink (pre-filled from catalog record)
  • Your email
  • Are you library staff?

This new form was added to each catalog record, under what our catalog vendor calls the “actions menu.”
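The article's actual customization code is shared on GitHub; the sketch below is only a rough approximation of how such a link can be injected through the New UI's customization package, which auto-generates an “after” hook for each Angular component. The hook name, the record-id path, and the permalink pattern are all assumptions that vary by Primo version and institution.

<syntaxhighlight lang="javascript">
// Rough, hedged sketch of adding a "Report a Problem" link near the actions
// menu via Primo's New UI customization package (AngularJS). The record-id
// path and permalink pattern below are assumptions, not the article's code.
var app = angular.module('viewCustom', []);

app.component('prmActionListAfter', {
  bindings: { parentCtrl: '<' },
  controller: function () {
    var self = this;
    this.$onInit = function () {
      // Assumed location of the record id on the parent controller.
      var recordId = self.parentCtrl.item.pnx.control.recordid[0];
      var permalink =
          'https://search.library.example.edu/permalink/f/view/' + recordId;
      // buildReportProblemUrl() is the prefill helper sketched earlier.
      self.reportUrl = buildReportProblemUrl(permalink);
    };
  },
  template:
      '<a target="_blank" href="{{ $ctrl.reportUrl }}">Report a Problem</a>'
});
</syntaxhighlight>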

Two custom scripts were added to the Google Sheet. One forwarded the reported problem to the aforementioned email list, a holdover from the old ILS, now monitored by the systems librarian, the head of collections, and the catalog librarian responsible for electronic resources. The second script constructed an email to the problem reporter to let them know when the problem was resolved. Both scripts were triggered manually as part of the triage process. In addition to these semi-automated emails, problem reporters outside of the library received emails at other points: when the problem report was received, when what was reported turned out to be user error, and to suggest ways to fill their initial need while the issue was being troubleshot.
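The article's real scripts are in its GitHub repo; as a hedged approximation, the two manually run functions might look something like the Apps Script below, bound to the responses spreadsheet. The email address and column order are illustrative assumptions.

<syntaxhighlight lang="javascript">
// Hedged sketch of the two manually triggered scripts (the actual code is
// in the article's GitHub repo). Address and column layout are illustrative.
var EPROBLEMS_LIST = 'e-problems@example.edu'; // placeholder address

// Assumed column order: timestamp, description, permalink, email, staff?
function getActiveReport() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var row = sheet.getActiveRange().getRow();
  return sheet.getRange(row, 1, 1, 5).getValues()[0];
}

// Script 1: forward the selected report to the troubleshooting email list.
function forwardProblemToList() {
  var r = getActiveReport();
  MailApp.sendEmail({
    to: EPROBLEMS_LIST,
    subject: 'Catalog Problem Reported',
    body: r[1] + '\n\nPermalink: ' + r[2] + '\nReported by: ' + r[3]
  });
}

// Script 2: let the reporter know their problem has been resolved.
function notifyReporterResolved() {
  var r = getActiveReport();
  MailApp.sendEmail({
    to: r[3],
    subject: 'Your catalog problem report',
    body: 'The problem you reported (' + r[2] + ') has been resolved. ' +
          'Thank you for letting us know!'
  });
}
</syntaxhighlight>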

Results

A soft rollout of this service resulted in a dramatic increase in use. This was a great problem to have (users were seeing the link and using it), but it was still a problem: the workload increased for those involved in troubleshooting.

Users' definition of a “problem” was much broader than before. Students used the form when they couldn't find a book on the shelf. Faculty used the form to request changes to how their own books were represented in the catalog.

Version 2.1: Forwarding to LibAnswers

The new workflow was analyzed, and the conclusion was that parts of the process didn't need the systems librarian's involvement. Furthermore, a system for communicating with users was already in place: the Ask a Librarian service, staffed whenever the research help desk is open. Better yet, this system, LibAnswers, had its own email address, and an email-building script had already been written. Only slight modifications were needed to automate it.

Changes made

One of the manually triggered email scripts was modified to forward problems to the Ask a Librarian service. Every 15 minutes the script checks whether any reports have come in from users outside the library since the last time it ran. If a report has come in, it builds an email and sends it to the LibAnswers queue. The subject of the email is “Catalog Problem Reported,” and the sender is the user who submitted the problem. The email includes a description of the issue and a permalink to the problem record. The reference librarian staffing the email queue when the report comes in does a little exploration, such as checking whether the problem can be recreated or finding alternative access. They reply to the user with suggested next steps and log the transaction in Reference Analytics.
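A hedged sketch of how that automated check might look in Apps Script, using a 15-minute time-driven trigger and a script property to remember the last processed row. The queue address and column layout are again placeholders, and since Apps Script cannot literally send mail as the reporter, replyTo is used here to approximate it.

<syntaxhighlight lang="javascript">
// Hedged sketch of the automated forwarding, assuming a time-driven trigger:
// ScriptApp.newTrigger('forwardNewReports').timeBased().everyMinutes(15).create();
var LIBANSWERS_QUEUE = 'askalibrarian@example.libanswers.com'; // placeholder

function forwardNewReports() {
  var props = PropertiesService.getScriptProperties();
  var lastProcessed = Number(props.getProperty('lastRowProcessed')) || 1;
  var sheet = SpreadsheetApp.getActiveSpreadsheet()
      .getSheetByName('Form Responses 1');
  var lastRow = sheet.getLastRow();

  for (var row = lastProcessed + 1; row <= lastRow; row++) {
    // Assumed columns: timestamp, description, permalink, email, staff?
    var v = sheet.getRange(row, 1, 1, 5).getValues()[0];
    if (v[4] !== 'Yes') { // only forward reports from non-staff users
      MailApp.sendEmail({
        to: LIBANSWERS_QUEUE,
        replyTo: v[3], // approximates "sender is the reporter" in LibAnswers
        subject: 'Catalog Problem Reported',
        body: v[1] + '\n\nPermalink: ' + v[2]
      });
    }
  }
  props.setProperty('lastRowProcessed', String(lastRow));
}
</syntaxhighlight>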

Results

Automating and delegating the user communication part of the triage process was a significant improvement. Users received help accomplishing what they set out to do before encountering the problem. Problems not related to full-text access could be easily forwarded to other library departments. Finally, because transcripts of the transactions are logged in Reference Analytics (which is shared with all reference librarians and student employees), everyone had a better idea of the types of problems that were being encountered and their possible solutions. Two additional questions were added to the form to improve the user experience:

  • Are you on campus or off campus?
  • Have you requested this article through interlibrary loan?

The first question was added to help reference librarians in their initial troubleshooting, when trying to recreate the reported problem. The purpose of the second question was twofold: to remind users that the interlibrary loan service exists and can be used to request articles they are having trouble accessing, and to prevent librarians from offering redundant advice if the user had already used it. Seeing the trends in librarians' answers to reported problems made it possible to make iterative changes.

Another problem to tackle was the mountain of emails sent to the troubleshooting team, known as the “e-problems group.” Every time a problem was forwarded from the spreadsheet to the e-problems email list, it went to three people. It wasn't clear which of the three was working on the problem, and often more than one person would start troubleshooting the same issue at the same time. It was also hard to share notes within the group about fixes that had already been tried. A more formal ticketing system was needed to assign problems, add notes, and track trends.

Version 2.2: Further Forwarding to LibAnswers

A reporting system that was quick and easy to use, and that provided a user experience fitting the prompt and personalized service model, already existed. The focus now needed to shift to improving the experience of the staff troubleshooters, who were struggling to manage the mountain of emails generated by the problem reporting system. LibAnswers was the solution.

The same email queue system that reference librarians were using to answer email reference queries could be used as a ticketing system to organize the reported problems. It would allow staff to assign and transfer tickets to individuals based on expertise, make notes about the issue and what had been tried so far, and add tags and gather data for analysis of trends.

Implementing this solution was simple. A Springshare administrator added another email queue (at a cost of $100). Each LibAnswers queue has its own email address, so the only change needed was the email address in the version 2.0 script.
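In code terms the change is tiny. Assuming the forwarding sketch above, only the destination constant moves from the old list to the new queue's address (addresses illustrative):

<syntaxhighlight lang="javascript">
// The only change from version 2.0: point the forwarding script at the new
// e-problems queue's own address instead of the old email list (illustrative).
var EPROBLEMS_QUEUE = 'e-problems@example.libanswers.com';
</syntaxhighlight>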

Results

Moving the management of troubleshooting out of email and into LibAnswers has improved the team's efficiency. Some small adjustments were also made to the email forwarding script so that it is easier to tell one problem from another in the queue. Since this is an internal workflow with only three people involved, it's possible to continue making small, iterative changes when necessary.
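The article doesn't spell out those adjustments; one plausible tweak, sketched below, replaces the fixed subject line with one that includes a fragment of the user's description, so tickets are distinguishable at a glance.

<syntaxhighlight lang="javascript">
// Purely illustrative guess at the "tell one problem from another" tweak:
// derive the subject from the user's description instead of a fixed string.
function buildSubject(description) {
  var snippet = description.length > 60
      ? description.slice(0, 57) + '...'
      : description;
  return 'Catalog Problem: ' + snippet;
}
</syntaxhighlight>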

Future

One suggestion is to create canned responses in LibAnswers so that quick, consistent solutions can be provided for frequent problems. Syncing the Google Sheet (used to track problems and generate statistics) with the e-problems queue could also be improved. Finally, it would be great for the troubleshooting team to become proactive and set up a testing schedule, so problems are identified and fixed before users encounter them.

Personal Comment

User experience has a big impact on problem solving, and it includes emotional and perceptual components across time. The user experience consists of perceptions that shape emotions, thoughts, and attitudes. It involves a constant feedback loop repeated throughout the usage lifecycle, from initial discovery through purchase, out-of-box, usage, maintenance, upgrades, and disposal.<ref>Nenonen, S., Rasila, H., Junnonen, J. M., & Kärnä, S. (2008, June). Customer Journey: a method to investigate user experience. In Proceedings of the EuroFM Conference, Manchester, pp. 54-63. Retrieved from: https://pdfs.semanticscholar.org/bb0a/a5d373c8011eeca2b8b07638ab25c2baec31.pdf</ref> Studies show that perceptions of recommendation quality and variety are important mediators in predicting the effects of objective system aspects on the three components of user experience: process (e.g. perceived effort, difficulty), system (e.g. perceived system effectiveness) and outcome (e.g. choice satisfaction).<ref>Knijnenburg, B. P., Willemsen, M. C., Gantner, Z., Soncu, H., & Newell, C. (2012). Explaining the user experience of recommender systems. User Modeling and User-Adapted Interaction, 22(4-5), 441-504. DOI: http://dx.doi.org/10.1007/s11257-011-9118-4 Retrieved from: https://dl.acm.org/citation.cfm?id=2339919</ref>

Using methods like this in digital library design can not only improve the integrated systems in use but also increase user satisfaction. The tool developed here, the “report a problem” feedback system, enables prompt responses to users, efficient tracking of problems, and monitoring of problem trends and their solutions.<ref>Shriver, E. (2019). Managing Discovery Problems with User Experience in Mind. Code4Lib Journal, (44). Retrieved from: https://journal.code4lib.org/articles/14481</ref>

When online catalogs first became available in the 1980s, a very high percentage of library patrons used the catalog, and everyone using a library computer was searching the catalog. Today, the catalog continues to play an important role, but it competes with many other information resources in high demand by users (e-mail, web, databases, e-books, etc.). In the future, the catalog will probably be searched less and less often as a stand-alone database, and instead will be searched more often in conjunction with other information resources.<ref>Chow, A. S., & Bucknall, T. (2012). Emerging technology trends in libraries. In Library Technology and User Services, pp. 105-130. Retrieved from: https://www.sciencedirect.com/topics/computer-science/integrated-library-system</ref>

References

<references/>