{"id":1607,"date":"2018-12-12T10:20:33","date_gmt":"2018-12-12T09:20:33","guid":{"rendered":"http:\/\/www.pewe.sk\/uxi\/?page_id=1607"},"modified":"2018-12-12T10:20:33","modified_gmt":"2018-12-12T09:20:33","slug":"autumn-2017-2018","status":"publish","type":"page","link":"http:\/\/www.pewe.sk\/uxi\/autumn-2017-2018\/","title":{"rendered":"Autumn 2017\/2018"},"content":{"rendered":"<ul>\n<li>Mat\u00fa\u0161 Tund\u00e9r: <a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#tunder2018-02\">Analysis of source code reading<\/a><\/li>\n<li>Peter Ga\u0161par:\u00a0<a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#gaspar2018-01\">Analysis of Influence of Visual Stimuli on Movies Selection<\/a><\/li>\n<li>J\u00e1n Hanko:\u00a0<a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#hanko2017-11\">The relation of gaze fixations and user&#8217;s skill in the digital space<\/a><\/li>\n<li>Martin Mokr\u00fd: <a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#mokry2017-11\">Identification of the user familiarity with Web domain, based on patterns in eyetracking data<\/a><\/li>\n<li>Martina Redajov\u00e1: <a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#redajova2017-11\">Automatic recognition of user characteristics<\/a><\/li>\n<li>Jakub Kubanyi: <a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#kubanyi2017-11\">The relation of gaze fixations and user\u2019s skill in the digital space<\/a><\/li>\n<li>D\u00e1niel Papp: <a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#papp2017-11\">Visual attention and saliency mapping on web page elements<\/a><\/li>\n<li>Filip \u0160andor: <a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#sandor2017-11\">Personalized search by using eye tracking to better identify the user query<\/a><\/li>\n<li>Viktor Ko\u0161\u0165an: <a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#kostan2017-10\">Facial engagement recognition using sequential 
analysis<\/a><\/li>\n<li>Mat\u00fa\u0161 G\u00e1sp\u00e1r: <a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#gaspar2017-10\">Eye-tracking of user while reading source codes<\/a><\/li>\n<li>Bronislava Strn\u00e1delov\u00e1 (FSaEV UK):\u00a0<a href=\"http:\/\/www.pewe.sk\/uxi\/studies-and-experiments\/#strnadelova2017-10\">Identification of emotions by eyetracking in relation to self-criticism<\/a><\/li>\n<\/ul>\n<h3 id=\"tunder2018-02\"><em>Analysis of source code reading<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>February 2018<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Mat\u00fa\u0161 Tund\u00e9r<\/em><br \/>\n<strong>Supervisor:<\/strong> Mgr. Jozef Tvaro\u017eek, PhD.<br \/>\n<strong>Short description: <\/strong><br \/>\nThe main aim of this thesis is to determine how the order of functions in source code affects program comprehension, as well as the way of reading itself. In the experiments, programmers will read two types of source code: one with the main function at the top, and one with the main function at the bottom. The experiments take place in the UX lab at the faculty. The effect of the functions&#8217; layout is determined by recording gaze data while the source code is being read. To make programmers study the whole source code, they were given the task of finding errors in it.<br \/>\n<em><strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1G-Sz788VoONQ2sd2x1x_6Uzx3yxed9Sj8bcDks3G8wM\/edit?usp=sharing\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"gaspar2018-01\"><em>Analysis of Influence of Visual Stimuli on Movies Selection<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>January 2018<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Peter Ga\u0161par<\/em><br \/>\n<strong>Supervisor:<\/strong> prof. 
M\u00e1ria Bielikov\u00e1, Dr. Michal Kompan<br \/>\n<strong>Short description: <\/strong><br \/>\nIn several domains, items can be represented not only by text but also by multimedia, specifically images. Images and the corresponding visual stimuli may affect the user during web navigational tasks, such as clicking activity in a list of items. In such a list, user feedback is currently most commonly interpreted from the clicking activity, such that when a user clicks the n-th item in a list, all the previous items are considered not to be relevant to them.<br \/>\nIn our work, we study how visual stimuli can be used in the recommendation process. We analyze how visual stimuli should be considered during the analysis of user behavior in recommendation results, and how the user\u2019s behavior should be properly interpreted when images are present.<br \/>\n<em><strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1VNxka7L9Rc7PrfNs14zvpv6_yMv3u48ctFUpI-hKp_Y\/edit\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"hanko2017-11\"><em>The relation of gaze fixations and user&#8217;s skill in the digital space<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>November 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>J\u00e1n Hanko<br \/>\n<strong>Supervisor:<\/strong> Ing. Patrik Hlav\u00e1\u010d<br \/>\n<strong>Short description: <\/strong><\/em><br \/>\nToday, the web is a regular part of almost every household. Since not everyone has the same experience with its use, it is important to adapt it and facilitate its usability. However, usage depends not only on experience, but also on the general characteristics of the user. In our case, we focus on the user\u2019s spatial capabilities, that is, how they think about, perceive and remember information related to objects and space, from which we derive their spatial cognitive style. 
This is based on the Spatial Cognitive Style Test (SCST), because we can compare real-space orientation to web navigation, as the web is also considered a type of space.<br \/>\nOur goal is to explore the ways users with different cognitive styles navigate. Therefore, we want to identify the differences between navigation styles and the problems associated with them, and to determine to what extent web navigation is influenced by the spatial cognitive style of the user.<br \/>\n<strong>Link to a formal description of the UX experiment:\u00a0<a href=\"https:\/\/docs.google.com\/document\/d\/1O9RmdZpLlJZ8fdjbg7aSzFWMxXXjy-NwvrdqWjqmj7k\/edit?usp=sharing\">experiment description (In Slovak)<\/a><\/strong><\/p>\n<hr \/>\n<h3 id=\"mokry2017-11\"><em>Identification of the user familiarity with Web domain, based on patterns in eyetracking data<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>November 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Martin Mokr\u00fd<br \/>\n<strong>Supervisor:<\/strong> Ing. R\u00f3bert M\u00f3ro, PhD.<br \/>\n<strong>Short description: <\/strong><\/em><br \/>\nIdentification of repeating patterns in eye tracking data (patterns such as sequences of fixations, saccades or areas of interest) is considered an important step in eye tracking analysis. Its main uses are explaining the recorded interaction, comparing the ways in which different users interact, and clustering users based on their similarity. This can be applied in the evaluation of recorded interaction (UX testing) or in the customization of a graphical interface based on an identified situation (pattern). Such a situation, e.g. systematic scanning of a web page, can imply the user\u2019s unfamiliarity with that page. On the other hand, recurrences of a similar situation can imply the activities of a skilled user.<br \/>\nIn our work, we focus on the automatic identification of patterns in scanpaths in eye tracking data related to the level of the user\u2019s familiarity. 
The first phase consists of identifying suitable features of fixations, saccades, pupils and head distance (features that are not task-related and occur with high frequency), and of analyzing methods for creating common scanpaths and quantifying similarities between scanpaths.<br \/>\nThe goal of the second phase is to implement a machine learning model for the automatic identification of user familiarity with an e-shop. The model will take basic eye-tracking features as input and will be extended to also use recurrence and reoccurrence metrics, as well as metrics calculated from the similarity of scanpaths to a common scanpath.<br \/>\n<em><strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1bkOMBa6B7gG1rl9R8kk8GbewgA65mXNvrI3Pae26zoE\/edit#\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"redajova2017-11\"><em>Automatic recognition of user characteristics<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>November 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Martina Redajov\u00e1<br \/>\n<strong>Supervisor:<\/strong> Ing. Jakub \u0160imko, PhD.<br \/>\n<strong>Short description: <\/strong><\/em><br \/>\nThis work uses eye movement behavioral scores as a metric for fatigue recognition. We believe that tracking the user\u2019s eye gaze together with behavioral scores has great potential as a method for detecting fatigue.<br \/>\nDuring the verification phase, we plan to focus on refining the results of previous research and on performing experiments to verify the selected methods, as we discovered several shortcomings in this approach. 
In case of positive results, we plan to design and apply our method in a real-life framework for automated recognition of the user&#8217;s fatigue.<br \/>\n<em><strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/19UUzJpFI0f6ASOjhdSck46akGeUsPjlX10AnqHiD3aY\/edit\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"kubanyi2017-11\"><em>The relation of gaze fixations and user\u2019s skill in the digital space<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>November 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Jakub Kubanyi<br \/>\n<strong>Supervisor:<\/strong> Ing. Patrik Hlav\u00e1\u010d<br \/>\n<strong>Short description: <\/strong><\/em><br \/>\nMany information systems use very detailed information about their users for improvement or innovation purposes. Nowadays, statistics of user access and user preferences are used to fulfill these purposes. Another useful piece of information can be the user\u2019s skill in the digital space. This experiment is focused on analyzing the possibilities<br \/>\nof determining the user&#8217;s skill based on eye-tracker data. When it comes to the user&#8217;s skill, we have focused on web literacy: the search skill to find, access and evaluate specific information. The research also includes relations in current systems and the possibilities of classifying users based on their search skills. 
In this experiment, we have prepared several search-based tasks and we will be examining how participants search for the specific information during the process.<em><br \/>\n<strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1e2E2YFaiK90l-Hwkg9dZW-o2xucnwDUuOuCKYcDkGsI\/edit#\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"papp2017-11\"><em>Visual attention and saliency mapping on web page elements<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>November 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>D\u00e1niel Papp<br \/>\n<strong>Supervisor:<\/strong> Ing. Jakub \u0160imko, PhD.<br \/>\n<strong>Short description: <\/strong><\/em><br \/>\nIn this experiment, we will use eye tracking to identify and capture the positions of significant elements on a website.<br \/>\nKnowledge of the salient parts of a web page could offer several benefits for web page designers and users alike. 
If we know what parts of a web page people use to recognize previously seen pages, we could create compact visual representations of web pages that contain only these most relevant areas.<br \/>\nWeb page designers could also benefit from a model of visual attention to improve page layout and design, e.g., arranging page elements in such a way that users\u2019 attention is focused on the aspects that the author considers most important.<br \/>\n<em><br \/>\n<strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1WhHJlRWtsTm90gT2IyfobPYNsX2A0tBnEkj019zR9Y0\/edit#\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"sandor2017-11\"><em>Personalized search by using eye tracking to better identify the user query<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>November 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Filip \u0160andor<br \/>\n<strong>Supervisor:<\/strong> Ing. Eduard Kuric, PhD.<br \/>\n<strong>Short description: <\/strong><\/em><br \/>\nOur goal is to improve a method of calculating user interest.<br \/>\nThe existing method calculates the user&#8217;s interests based on the number of views per word. This calculation should be expanded with viewing metrics. We believe we will achieve a more accurate representation of user interest by correctly incorporating multiple viewing metrics.<br \/>\nIn the second part of the experiment, we will monitor user behavior when viewing offers and then compute their interest in the individual offers. 
Finally, we will provide them with offers according to their expectations, compare their responses to our calculations, and see how accurate our algorithm is.<em><br \/>\n<strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1CXiN1iDX01_VaG6YdeHrSdXjHa0RHbJi89S2n1Psvgg\/edit#\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"kostan2017-10\"><em>Facial engagement recognition using sequential analysis<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>October 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Viktor Ko\u0161\u0165an<br \/>\n<strong>Supervisor:<\/strong> doc. Ing. Vanda Bene\u0161ov\u00e1, PhD.<br \/>\n<strong>Short description: <\/strong><\/em>This experiment aims to create a dataset for an engagement recognition model. The dataset will consist of videos of participants in three different states &#8211; engagement, mind wandering, and disengagement &#8211; during text reading and video viewing activities. The project also aims to explore the use of sequential analysis to distinguish between engagement and mind wandering states.<em><br \/>\n<strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1AKOXrZJuPKge1uj0GJrAw2lKwVQxcDt8QciXLu5hwTU\/edit#\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"gaspar2017-10\"><em>Eye-tracking of user while reading source codes<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>October 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Mat\u00fa\u0161 G\u00e1sp\u00e1r<\/em><em><br \/>\n<strong>Supervisor:<\/strong> Martin Kon\u00f4pka<br \/>\n<strong>Short description: <\/strong><\/em>The goal of this experiment is to set up everything we need for further eye-tracker testing with other users during the whole semester. 
We are going to examine their eye movements while they read several source codes that do or do not follow common conventions.<em><br \/>\n<strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1DUInCKGzXK7erxvE0MuEMr598S2zIlCz3M73Y8ZHhzU\/edit#\">experiment description (In Slovak)<\/a><\/em><\/p>\n<hr \/>\n<h3 id=\"strnadelova2017-10\"><em>Identification of emotions by eyetracking in relation to self-criticism<\/em><\/h3>\n<p><strong>Date:<\/strong> <em>October 2017<\/em><br \/>\n<strong>Experiment conductor:<\/strong> <em>Bronislava Strn\u00e1delov\u00e1 (FSaEV UK)<br \/>\n<strong>Supervisor: <\/strong> doc. Mgr. J\u00falia Kanovsk\u00e1 Halamov\u00e1, PhD (FSaEV UK)<br \/>\n<strong>Short description: <\/strong><\/em>The current pilot study will explore the relationship between self-criticism and face scanning patterns while recognizing photos of primary emotions. Participants will complete the Forms of Self-Criticising\/Attacking &amp; Self-Reassuring Scale and a face recognition task while their eye movements are recorded by a Tobii X2 60 eye tracker. Participants\u2019 eye movements and fixations on faces will be tracked while viewing static images (photos from The Ume\u00e5 University Database of Facial Expressions) of people representing primary-universal emotions (anger, fear, sadness, surprise, joy, disgust, and neutral). There will be 42 photos presented randomly on the screen, as the set includes both men and women in three age groups (about 25, 45 and 65 years). Apart from watching the pictures, participants will be asked to identify the emotion in each picture. The results are important for understanding the role of self-criticism in relation to emotions and their scanning patterns. 
They can be used for diagnostic purposes and for developing and evaluating interventions for highly self-critical people.<em><br \/>\n<strong>Link to a formal description of the UX experiment:<\/strong> <a href=\"https:\/\/docs.google.com\/document\/d\/1xUEOkGW-V6iMT1xMxqsc1LHxppERSu3kITlxB8_0o7k\/edit#\">experiment description (In Slovak)<\/a><\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Mat\u00fa\u0161 Tund\u00e9r: Analysis of source code reading Peter Ga\u0161par:\u00a0Analysis of Influence of Visual Stimuli on Movies Selection J\u00e1n Hanko:\u00a0The relation of gaze fixations [&hellip;]<\/p>\n","protected":false},"author":10,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":[],"_links":{"self":[{"href":"http:\/\/www.pewe.sk\/uxi\/wp-json\/wp\/v2\/pages\/1607"}],"collection":[{"href":"http:\/\/www.pewe.sk\/uxi\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"http:\/\/www.pewe.sk\/uxi\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"http:\/\/www.pewe.sk\/uxi\/wp-json\/wp\/v2\/users\/10"}],"replies":[{"embeddable":true,"href":"http:\/\/www.pewe.sk\/uxi\/wp-json\/wp\/v2\/comments?post=1607"}],"version-history":[{"count":1,"href":"http:\/\/www.pewe.sk\/uxi\/wp-json\/wp\/v2\/pages\/1607\/revisions"}],"predecessor-version":[{"id":1608,"href":"http:\/\/www.pewe.sk\/uxi\/wp-json\/wp\/v2\/pages\/1607\/revisions\/1608"}],"wp:attachment":[{"href":"http:\/\/www.pewe.sk\/uxi\/wp-json\/wp\/v2\/media?parent=1607"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}