

Journal article:

D. Piacun, T. Ionescu, S. Schlund:
"Crowdsourced Evaluation of Robot Programming Environments: Methodology and Application";
Applied Sciences, 11 (2021), 22.



English abstract:
Industrial robot programming tools increasingly rely on graphical interfaces, which aim to make the programming task accessible to a wide variety of users. The usability of such tools is currently evaluated in controlled environments, such as laboratories or companies, in which a group of participants is asked to carry out several tasks with the tool and then fill out a standardized questionnaire. In this context, this paper proposes and evaluates an alternative evaluation methodology, which leverages online crowdsourcing platforms to produce the same results as face-to-face evaluations. We applied the proposed framework to the evaluation of a web-based industrial robot programming tool called Assembly. Our results suggest that crowdsourcing facilitates a cost-effective, result-oriented, and reusable methodology for performing user studies anonymously and online.

Keywords:
robot programming; user interface evaluation; crowdsourcing


"Offizielle" elektronische Version der Publikation (entsprechend ihrem Digital Object Identifier - DOI)
http://dx.doi.org/10.3390/app112210903

Electronic version of the publication:
https://publik.tuwien.ac.at/files/publik_300699.pdf


Generated from the publication database of the Technische Universität Wien.