Docear’s Online Services Are Down (Recommendation; User Registration; Backup)

Currently, all of Docear’s online services are down. This means you cannot register, log in to download backups, or receive recommendations. As we currently have no time to work on Docear’s development, we are afraid we won’t be able to fix this problem anytime soon. However, we adjusted the current version of Docear (v1.2) […]

New Pre-print: The Architecture and Datasets of Docear’s Research Paper Recommender System

Our paper “The Architecture and Datasets of Docear’s Research Paper Recommender System” was accepted at the 3rd International Workshop on Mining Scientific Publications (WOSP 2014), which is held in conjunction with the ACM/IEEE Joint Conference on Digital Libraries (JCDL 2014). This means we will be in London from September 9 to September 13 to present […]

New paper: “A Comparative Analysis of Offline and Online Evaluations and Discussion of Research Paper Recommender System Evaluation”

Yesterday, we published a pre-print on the shortcomings of current research paper recommender system evaluations. One of the findings was that the results of offline and online experiments sometimes contradict each other. We did a more detailed analysis of this issue and wrote a new paper about it. More specifically, we conducted a comprehensive evaluation of […]

Evaluations in Information Retrieval: Click Through Rate (CTR) vs. Mean Absolute Error (MAE) vs. (Root) Mean Squared Error (MSE / RMSE) vs. Precision

As you may know, Docear offers literature recommendations, and as you may also know, it is part of my PhD to find out how to make these recommendations as good as possible. To accomplish this, I need to know what a ‘good’ recommendation is. So far we have been using click through rates (CTR) to evaluate […]
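For readers unfamiliar with the measures named in the title, here is a minimal sketch (with made-up numbers, not Docear’s actual evaluation code) of how each one is typically computed: CTR is an online measure based on displayed recommendations and user clicks, while MAE, RMSE, and precision are typical offline measures based on predicted ratings or ranked result lists.

```python
# Illustrative sketch of the evaluation measures discussed in the post.
# All numbers below are hypothetical and only show what each metric measures.
from math import sqrt


def click_through_rate(clicks, impressions):
    """CTR: fraction of displayed recommendations that were clicked."""
    return clicks / impressions


def mean_absolute_error(predicted, actual):
    """MAE: average absolute difference between predicted and actual ratings."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)


def root_mean_squared_error(predicted, actual):
    """RMSE: like MAE, but squaring penalizes large errors more heavily."""
    return sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))


def precision_at_k(recommended, relevant, k):
    """Precision@k: share of the top-k recommended items that are relevant."""
    top_k = recommended[:k]
    return sum(1 for item in top_k if item in relevant) / k


if __name__ == "__main__":
    print(click_through_rate(clicks=42, impressions=1000))               # 0.042
    print(mean_absolute_error([4.0, 3.5, 2.0], [5.0, 3.0, 2.0]))         # 0.5
    print(root_mean_squared_error([4.0, 3.5, 2.0], [5.0, 3.0, 2.0]))     # ~0.65
    print(precision_at_k(["p1", "p2", "p3", "p4"], {"p1", "p4"}, k=3))   # ~0.33
```

The contrast is that CTR can only be measured online with real users, whereas the error- and precision-based measures can be computed offline against a held-out dataset, which is exactly why the two kinds of evaluation can disagree.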