Research Award: Interactive Information Retrieval

Special Section
Bulletin of the Association for Information Science and Technology – February/March 2015 – Volume 41, Number 3
by Diane Kelly
2014 Annual Meeting Coverage
Diane Kelly progressed from having no idea of information science as a field of inquiry to
receiving the 2014 ASIS&T Research Award for her outstanding contributions to the field.
Through an early library science course, Kelly met information science scholars and soon
started her journey researching interactive information retrieval, search behavior, search
interfaces and research methods. She became one of the “user study people” when few in
the information retrieval community thought about the search process. Kelly appreciates
starting her studies before Google’s search box and blinking cursor became pervasive and
realizing the wealth of ideas predating Google that are worthy of renewed investigation. She
expressed concern that information seeking may become passive receipt of preformed
information. In her further research, Kelly hopes to shed more light on the process of search
and success metrics.
Keywords: information science, information seeking, interactive systems, human-computer interaction
Diane Kelly is the 2014 recipient of the ASIS&T Research Award for outstanding
contributions to research in information science. She is professor at the School of
Information and Library Science at the University of North Carolina at Chapel Hill. Her
research and teaching interests are interactive information search and retrieval, information
search behavior and research methods. She is the recipient of the 2013 British Computer
Society’s IRSG Karen Spärck Jones Award, the 2009 ASIS&T/Thomson Reuters
Outstanding Information Science Teacher Award and the 2007 SILS Outstanding
Teacher of the Year Award. She can be reached at dianek<at>
When I started the master’s degree program
at what was then the School of
Communication, Information and Library
Studies at Rutgers University in 1997, unbeknownst
to me, I was joining a school that housed some of the
most distinguished scholars in information science:
Nick Belkin, Paul Kantor, Carol Kuhlthau and Tefko
Saracevic, each of whom has received at least one ASIS&T research award.
Like many students, I had no idea information science existed as a field of
inquiry and practice. I was there for the library science part, which I had also
only recently learned was something one could study. After a semester packed
with interesting courses, including human information behavior taught by
Carol Kuhlthau and online searching taught by Tefko Saracevic, I soon
learned that information (and library) science was an area of inquiry with
deep intellectual roots, vibrant research traditions and provocative scholars.
Following my initial semester of school, I did what any student interested
in information search and human behavior would do next: I volunteered to
join Nick Belkin’s research team. I spent the next six years working on his
team earning a master’s degree and a Ph.D. and learning about things I
never knew existed, including search behavior, the information search
process, interface design, information retrieval, TREC (Text REtrieval
Conference) and, of course, ASIS&T. I took two courses about information
retrieval: one taught by Paul Kantor and the other by Nick Belkin, which
solidified my interests in this area, particularly in interactive information
retrieval. Through these courses and others, I gained a foundation in the
history and evolution of information science. By apprenticing myself to
both Nick and Paul, I gained a foundation in how to conduct research. I will
always be grateful to both of them for their generosity, guidance and support.
Since graduating from Rutgers, I have spent the last 11 years at the School
of Information and Library Science at the University of North Carolina,
where I have received strong support from two other ASIS&T standouts:
Gary Marchionini and Barbara Wildemuth, along with a cadre of excellent
students, many of whom are already blazing their own ASIS&T paths. I have
had an active research agenda focused on interactive information retrieval,
search behavior, search interfaces and research methods. This agenda has
been greatly supported by many students who have worked alongside me.
Most of the studies we have conducted have been controlled, laboratory
experiments and have involved a variety of data collection methods including
logging, questionnaires, psychometric scales, observation, stimulated recall,
structured and semi-structured interviews and most recently, physiological
signals. We have studied hundreds of people, including intelligence analysts,
undergraduate and graduate students, faculty and staff and members of the
community at local public libraries. For those interested in a list of my
publications, please visit
Our studies include a 14-week naturalistic, longitudinal study of the
validity and reliability of using implicit feedback as relevance indicators and
of how contextual factors, such as search task, impact this relationship [1] [2];
a monograph about methods for evaluating interactive information systems
[3]; several studies describing method variance in interactive systems research
[4] [5]; a systematic review of interactive information retrieval evaluation
studies documenting 40 years of research [6]; studies of query suggestions
[7]; an examination of the impact of threshold priming on relevance
assessments [8]; a study of the effects of cognitive ability on search [9]; and
most recently, a study investigating stress and workload during search [10].
“User study people,” as we are called, at least in the information
retrieval community, are in the minority, but our numbers continue to grow.
This perspective is critical, and it has been exciting to watch its importance
increase during the past 18 years, in part because of all the hard work of
information and library scientists, who have been paying attention to users
all along.
When I first started conducting research about interactive search systems
in 1998, information search was a foreign concept to most people. When we
tested our search systems, we either recruited librarians or library science
students so that we could assume our research participants understood
something about search, or we developed extensive tutorials to train people
to use our systems. Collecting data about participants’ search and computer
experiences and majors was also necessary and usually provided some
insight about any differences we observed in use of the systems. And it took
ages to get a stable, workable system up and running! The most exciting
things about the last 18 years are how much the world has changed with
respect to information search and how much easier it is to do information
search research.
I am grateful that when I started studying information search it was not a
common activity. Contemporary search engines like Google did not anchor
my thinking about what was possible. My thinking was anchored by what
were, at the time, radical ways to conceptualize information searchers (from
Belkin [11]), the information search process (from Oddy [12]) and user
interfaces (from Hearst [13]). The perspective I gained by watching this area
grow and change has been invaluable. It allows me to see beyond Google
because I saw before it. It taught me to look to the literature for inspiration
instead of staring at a search box and blinking cursor. So much research
today lacks spark because it is often heavily anchored by contemporary
practice and trends. It lacks depth because it is disconnected from past work.
I will (almost) spare the cliché that those who do not know history are
doomed to repeat it, in part because I believe some of our history is worth
repeating, especially when it comes to research. Papers from the pre-Google
era contain many amazing and provocative ideas, some of which were never
fully investigated because of technological constraints and some of which
form the basis of modern search engines. For example, Maron and Kuhns
[14] proposed the idea that searchers’ queries could be used as sources of
index terms for documents, such that documents that had been retrieved in
response to a particular searcher’s query and found relevant by the searcher
could then be associated more strongly with that query (sound familiar?).
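The mechanism described above can be sketched in a few lines of code. This is a toy illustration of the general idea, not an implementation from the 1960 paper; all names and weighting choices here are illustrative assumptions.

```python
from collections import defaultdict

class FeedbackIndex:
    """Toy sketch of the Maron & Kuhns idea: queries whose results a
    searcher judges relevant become index terms for those documents,
    with association weights strengthened by each positive judgment."""

    def __init__(self):
        # query term -> {doc_id: accumulated association weight}
        self.index = defaultdict(lambda: defaultdict(float))

    def record_relevance(self, query, doc_id):
        """Strengthen the query-document association after a searcher
        judges doc_id relevant to this query."""
        for term in query.lower().split():
            self.index[term][doc_id] += 1.0

    def rank(self, query):
        """Score documents by their accumulated associations with the
        query's terms, highest first."""
        scores = defaultdict(float)
        for term in query.lower().split():
            for doc_id, weight in self.index[term].items():
                scores[doc_id] += weight
        return sorted(scores.items(), key=lambda kv: -kv[1])

idx = FeedbackIndex()
idx.record_relevance("jaguar car", "doc_auto")
idx.record_relevance("jaguar car", "doc_auto")
idx.record_relevance("jaguar habitat", "doc_animal")
print(idx.rank("jaguar"))  # doc_auto (2.0) outranks doc_animal (1.0)
```

Modern search engines realize the same intuition at scale through click and relevance-feedback signals, which is why the idea "sounds familiar."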
But the real purpose of knowing the history of a field is that it engenders a
certain amount of humility, which is necessary to become a true scholar of the field.
One only has to open a book such as Walker’s 1971 edited volume [15]
documenting one of the first workshops about interactive information
retrieval, Interactive Bibliographic Search: The User/Computer Interface,
which contains citations to hundreds of studies, to appreciate the depth of
our field and one’s place in it. Many people come to this field with the ill-formed notion that information science is somehow related to the
information technology boom of the 1990s and that search interfaces and
retrieval systems are contemporary inventions. Today, information science
means different things to different people and does many different things for
many different people. As educators, we have a responsibility to make sure
students at all levels, and people more generally, understand the history of
information science and importantly, the central role libraries and librarians
have played in its development.
When I look back on some of the earlier search interfaces I developed
and tested as a student, I cringe. They were so complicated and dense
compared to today’s standards, but they expected more from searchers and
enabled searchers to go further, to use different search tactics, to interact
with the information in different ways. In the past, the research literature
contained an abundance of novel and innovative search user interfaces, but
one has to look hard to find examples today as we have converged on one
standard model, which has been optimized for a small number of search
tasks. Other types of search tasks and other aspects of the information-seeking process have been neglected. How might we design tools that
support information seeking and use, rather than just information search?
How might we design tools that support interaction and engagement with
information across a range of tasks and sessions? How might we design
tools to help people dive deeper into the search results, discover underused
information and create more diverse solutions to their information problems?
Through teaching students and studying the behaviors of research
participants, I have noticed that people often have an inflated sense of their
own search skills and the quality and completeness of the information they
find (and what they can find), and overestimate what they have learned
during the search episode. Have contemporary search interfaces transformed
searchers into passive information receivers rather than active information
seekers? For example, searchers do not have to create their own queries
anymore, and soon they may not even have to think of their own information
needs. Are search systems nudging us towards a homogenization of
information needs? Are we given adequate control over search systems?
Does an imbalance of control foster an illusion of understanding? Does this
imbalance have negative consequences for the sense-making process? Many
of these questions are actually not new, as many information scientists in
the 1990s, including Belkin [16], raised them when contemplating the
possibilities of artificial intelligence.
Researchers document success by showing reductions in time and
amount of interaction and increased user satisfaction, but do these measures
really allow researchers to understand the impact of search? Can people be
satisfied with things that are not necessarily good for them? How can we
measure the success of an entire search session, or a search that takes place
over multiple points in time? These are questions I look forward to seeing
addressed during the next 20 years of information search research. ■
Resources Mentioned in the Article
[1] Kelly, D., & Belkin, N.J. (2004). Display time as implicit feedback: Understanding task effects. Proceedings of the 27th Annual ACM International Conference on Research and
Development in Information Retrieval (SIGIR ‘04), Sheffield, UK, 377-384.
[2] Kelly, D. (2006). Measuring online information-seeking context, part 2: Findings and discussion. Journal of the American Society for Information Science & Technology, 57(14), 1862-1874.
[3] Kelly, D. (2009). Methods for evaluating interactive information retrieval systems with users. Foundations and Trends in Information Retrieval, 3(1-2). doi:10.1561/1500000012.
[4] Kelly, D., Shah, C., Sugimoto, C. R., Bailey, E. W., Clemens, R. A., Irvine, A. K., . . . Zhang, Y. (2008). Effects of performance feedback on users’ evaluations of an interactive IR system.
Proceedings of the 2nd Symposium on Information Interaction in Context (IIiX), London, UK, 75-82.
[5] Kelly, D., Harper, D. J., & Landau, B. (2008). Questionnaire mode effects in interactive information retrieval experiments. Information Processing & Management, 44(1), 122-141.
[6] Kelly, D., & Sugimoto, C. R. (2013). A systematic review of interactive information retrieval evaluation studies, 1967-2006. Journal of the American Society for Information Science and
Technology, 64 (4), 745-770.
[7] Kelly, D., Cushing, A., Dostert, M., Niu, X., & Gyllstrom, K. (2010). Effects of popularity and quality on the usage of query suggestions during information search. Proceedings of the ACM
Conference on Human Factors in Computing Systems (CHI), Atlanta, GA, 45-54.
[8] Scholer, F., Kelly, D., Wu, W.-C., Lee, H., & Webber, W. (2013). The effects of threshold priming and need for cognition on relevance calibration and assessment. Proceedings of the 36th
Annual ACM International Conference on Research and Development in Information Retrieval (SIGIR '13), Dublin, Ireland, 623-632.
[9] Brennan, K., Kelly, D., & Arguello, J. (2014). The effect of cognitive abilities on information search for tasks of varying levels of complexity. Proceedings of the Fifth Information Interaction
in Context Conference (IIiX), Regensburg, Germany.
[10] Edwards, A., Kelly, D., & Azzopardi, L. (to appear). The impact of query interface design on stress, workload and performance. Proceedings of the 37th European Conference on
Information Retrieval (ECIR ‘15), Vienna, Austria.
[11] Belkin, N. J. (1980). Anomalous states of knowledge as a basis for information retrieval. The Canadian Journal of Information Science, 5, 133-143.
[12] Oddy, R. N. (1977). Information retrieval through man-machine dialogue. Journal of Documentation 33(1), 1-14.
[13] Hearst, M. A. (1995). TileBars: Visualization of term distribution in full text information access. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’95),
Denver, CO, 59-66.
[14] Maron, M. E., & Kuhns, J. L. (1960). On relevance, probabilistic indexing and information retrieval. Journal of the ACM, 7 (3), 216-244.
[15] Walker, D.E. (1971). Interactive bibliographic search: The user/computer interface. Montvale, NJ: AFIPS Press.
[16] Belkin, N. J. (1996). Intelligent information retrieval: Whose intelligence? ISI '96: Proceedings of the Fifth International Symposium for Information Science. Konstanz: Universitaetsverlag
Konstanz, 25-31.