
Talks and Poster Presentations (with Proceedings-Entry):

N. Rekabsaz, M. Lupu, A. Hanbury, H. Zamani:
"Word Embedding Causes Topic Shifting; Exploit Global Context!";
Talk: ACM SIGIR Conference on Research and Development in Information Retrieval, Shinjuku, Tokyo, Japan; 2017-08-07 - 2017-08-11; in: "40th International ACM SIGIR Conference on Research and Development in Information Retrieval", ACM, (2017), ISBN: 978-1-4503-5022-8; 1105 - 1108.



English abstract:
Exploitation of term relatedness provided by word embedding has gained considerable attention in recent IR literature. However, an emerging question is whether this sort of relatedness fits the needs of IR with respect to retrieval effectiveness. While we observe a high potential of word embedding as a resource for related terms, the occurrence of topic shifting in several cases deteriorates the final performance of the applied retrieval models. To address this issue, we revisit the use of global context (i.e., term co-occurrence in documents) to measure term relatedness. We hypothesize that, to avoid topic shifting, terms with high word embedding similarity should also share similar global contexts. We therefore study the effectiveness of post-filtering the related terms by various global context relatedness measures. Experimental results show significant improvements in two out of three test collections, and support our initial hypothesis regarding the importance of considering global context in retrieval.
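
Illustration: the following is a minimal sketch of the post-filtering idea described in the abstract, not the authors' implementation. All names (embedding_neighbours, global_context_sim, expand_with_filter) are hypothetical, and PMI over document co-occurrence is only one plausible choice among the "various global context relatedness measures" the abstract mentions.

    import numpy as np

    def cosine(u, v):
        # Cosine similarity between two embedding vectors.
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

    def embedding_neighbours(term, embeddings, k=10):
        # Candidate related terms: top-k by word-embedding cosine similarity.
        # embeddings: dict mapping term -> numpy vector (assumed input format).
        q = embeddings[term]
        scored = [(t, cosine(q, v)) for t, v in embeddings.items() if t != term]
        return sorted(scored, key=lambda x: -x[1])[:k]

    def global_context_sim(t1, t2, doc_freq, co_doc_freq, n_docs):
        # One possible global-context measure: PMI over document co-occurrence.
        # doc_freq: term -> number of documents containing it;
        # co_doc_freq: (term, term) -> number of documents containing both.
        p1 = doc_freq[t1] / n_docs
        p2 = doc_freq[t2] / n_docs
        p12 = co_doc_freq.get((t1, t2), co_doc_freq.get((t2, t1), 0)) / n_docs
        if p12 == 0.0:
            return float("-inf")  # never co-occur: likely topic shift
        return float(np.log(p12 / (p1 * p2)))

    def expand_with_filter(term, embeddings, doc_freq, co_doc_freq, n_docs,
                           k=10, threshold=0.0):
        # Keep only embedding neighbours whose global-context relatedness
        # clears the threshold, discarding likely topic-shifting terms.
        return [(t, s) for t, s in embedding_neighbours(term, embeddings, k)
                if global_context_sim(term, t, doc_freq, co_doc_freq,
                                      n_docs) >= threshold]

The threshold and k are tuning parameters; the paper evaluates such filtering within retrieval models rather than as a standalone expansion step.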


"Official" electronic version of the publication (accessed through its Digital Object Identifier - DOI)
http://dx.doi.org/10.1145/3077136.3080733

Electronic version of the publication:
https://publik.tuwien.ac.at/files/publik_264669.pdf


Created from the Publication Database of the Vienna University of Technology.