Which methods are the most effective to enable novice users to participate in FAIR ontology creation? A usability study
Dataset posted on 13.07.2020 by Hong Cui, Limin Zhang, Xingyi Yang
A usability experiment was conducted to evaluate the efficiency, effectiveness, and user satisfaction of a set of four “add2ontology” user interfaces (UIs) that allow an end user to add terms and their relations to an ontology.
The experiment consisted of a pre-experiment session and two activity sessions. In the pre-experiment session, 33 participants remotely completed a survey of four questions about their experience with controlled-vocabulary editors and wikis. Participants were then scheduled to watch a 3-6 minute video tutorial for each method.
After watching each video, participants completed a web-based questionnaire consisting of five questions (session 1). In the second session, participants watched the videos again and completed a hands-on task using each of the four methods to add new terms and properties to the CAREX Ontology. After finishing the task, participants responded to the same questionnaire as in the first session.
Questionnaire responses were collected from the three survey rounds, along with log data from the four UIs recorded during the hands-on task. The Friedman rank sum test, Wilcoxon signed-rank test, Cochran's Q test, and Spearman correlation coefficient analysis were performed to compare usability across the four tools and to assess changes from before to after the hands-on task.
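As a minimal sketch of how the named nonparametric tests might be applied to data of this shape, the snippet below uses `scipy.stats` with invented Likert-style ratings; the variable names and values are illustrative assumptions, not the study's actual data or code.

```python
# Hypothetical illustration of the analyses named above.
# All data here are invented: ratings (1-5) from six participants
# who rated each of the four UIs.
from scipy.stats import friedmanchisquare, wilcoxon, spearmanr

ui_a = [4, 5, 3, 4, 4, 5]
ui_b = [3, 4, 3, 3, 4, 4]
ui_c = [2, 3, 2, 3, 2, 3]
ui_d = [4, 4, 5, 4, 5, 4]

# Friedman rank sum test: do ratings differ across the four UIs overall?
f_stat, f_p = friedmanchisquare(ui_a, ui_b, ui_c, ui_d)

# Wilcoxon signed-rank test: paired comparison of the same participants'
# ratings before vs. after the hands-on task (invented values).
pre  = [2, 3, 3, 2, 4, 3]
post = [4, 5, 4, 5, 5, 5]
w_stat, w_p = wilcoxon(pre, post)

# Spearman correlation: e.g., prior experience score vs. satisfaction rating.
experience = [1, 2, 1, 3, 2, 3]
rho, rho_p = spearmanr(experience, ui_a)

# Cochran's Q (for binary outcomes, e.g., task success per UI) is available
# in statsmodels as statsmodels.stats.contingency_tables.cochrans_q.
```

Each call returns a test statistic and a p-value; a small Friedman p-value would indicate that at least one UI's ratings differ, which pairwise Wilcoxon tests could then localize.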
This project has been reviewed and approved by an IRB Chair or designee at the University of Arizona; IRB#1902366508. Parties interested in collaborating on use of the full dataset may contact the authors at firstname.lastname@example.org.
For inquiries regarding the contents of this dataset, please contact the Corresponding Author listed in the README.txt file. Administrative inquiries (e.g., removal requests, trouble downloading, etc.) can be directed to email@example.com