The first and most important goal for us this week is to get a smoothly moving spotlight. This moving light source is indispensable for the pilot we want to perform at the end of this week.
On Monday (16-10-2017) we experimented with different heights for the spotlight to move at. After we determined the ideal height for our experiments, we built a frame on which the spotlight rails could be placed. The angle at which the spotlight points at the painting can still be changed should this turn out to be necessary later on in the process.
The wheels of the Arduino train needed more grip on the surface it rides on, so we experimented with different bottom surfaces. Fabric created too much friction, and wood did not provide enough friction for the stepper motor to move the train smoothly, so we created a duct tape tablecloth to serve as our bottom surface.
We also altered the top part of the Arduino train to better fit the spotlight it is supposed to carry.
Last week we decided our Arduino train should move on only one rail instead of two, so we made some alterations to the rails as well.
On Tuesday (17-10-2017) we prepared to give a presentation for our fellow students and their experts. We think everything went okay; the presentation was roughly based on the summary (weeks 1-3) that has also been posted on this blog.
After the presentation we still could not get the Arduino stepper motor to work. Since a moving light source is crucial for our setup, we decided we needed to buy another motor. A small trip to an electronics shop in the Voorstraat was to no avail: they did not sell anything we could use.
The stepper motor we ordered online arrived on Thursday (19-10-2017). It turned out our old stepper motor was indeed broken, but the new one did not fix our problem.
We suspect the motor driver of our Arduino setup is broken as well.
We know our code is correct because several experts reviewed it (Dr. Y. Song (Wolf) and Willemijn).
Since we wanted to start performing a pilot on Friday, we had to think of other ways to keep a spotlight moving. Eventually we came up with the idea of using a rope, two pulleys and a counterweight to get the job done.
Sadly, the Arduino was not the only part of our setup that broke down.
The clamps holding up the passe-partout foam board were not aligned in parallel. We added an extra beam to our easel frame to get rid of the offset this created in the foam board. With the foam board, and therefore the painting, standing straight, the calibration of our test subjects will hopefully be even more precise.
As can be seen in the examples of the different visualization methods Tobii Studio provides for eye tracking results (given in the summary of weeks 1-3), our results flicker and do not look good at all.
This problem occurs after changing the screen resolution in Tobii Studio. We changed this setting because it gave us a better quality image. Since flickering black bars do not exactly make our results look good, we ran some tests, and it turns out that a screen resolution of 1024×768 provides optimal results: no flickering and proper video quality.
On Friday (20-10-2017) it was time for the real thing! We are very happy to announce that our first pilot was a success. We were not able to test different frame colours, and Willemijn could not yet provide us with paintings printed with different types of gloss, but she had earlier provided us with a real painting and a 3D-printed version of said painting, so in theory we could start the pilot properly.
Before starting our pilot we wrote general instructions for our test subjects, so we could give them a clear overview of the experiment. It is important that we do not tell our test subjects to specifically pay attention to gloss, or to whether they are looking at a printed or a real painting. We will only tell them that they are going to look at a painting and that their eyes will be tracked in the process.
We also prepared a survey to support our eye tracking results. We want to compare the actual experience of the test subjects to the results provided by the eye tracking software.
With this pilot we want to find out whether the precision of the Tobii eye tracker's results can be improved, whether our test setup is suitable for all different kinds of test subjects, and whether our setup can be made even more user friendly (for both test subjects and researchers).
The survey, which will hopefully supply us with answers to these questions, can be viewed via the link below.
Because both the TU Delft open day and a Renault master class took place on the same day as our pilot, our working area was, to put it mildly, a little busy. The beamer we usually use, and had reserved for the duration of the project, had already been given away, so we had to adapt our setup to a different beamer. Some of our main calibration parameters changed because of this.
The camera battery ran out, test subjects wearing glasses sometimes required slightly different handling of the software, and a completely dark room turned out to be far from ideal for the Tobii device's functionality.
But in the end we managed to test 22 people: 10 of them saw the real painting, 10 saw the printed version, and for 2 persons we could not use the results because the camera failed.
Our main goal for next week will be to analyse the results from our pilot. We will analyse both the survey (probably using SPSS) and the eye tracking results, so we can further improve our test setup by integrating what we learn.
The visualisation and processing of the eye tracking results will proceed differently than initially planned. Dr. Y. Song (Wolf), the expert on the Tobii eye tracking device, advised us on Tuesday (17-10-2017) not only to look into the heat maps, (moving) areas of interest and gaze plots Tobii Studio lets you create to visualize your results, but also to look into the raw data. Tobii lets the user export this data to an Excel file; using this feature we will be able to take a deeper look into the actual precision of the eye tracking device.
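Such an export can also be read programmatically instead of by hand. The sketch below is a minimal Python example, assuming a tab-separated export file; the column names `GazePointX` and `GazePointY` are placeholders and should be checked against the header row of the actual Tobii Studio export.

```python
import csv

def load_gaze_points(path):
    """Read gaze coordinates from a tab-separated Tobii Studio export.

    The column names below are assumptions; verify them against the
    header row of your own export file.
    """
    xs, ys = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            x, y = row.get("GazePointX"), row.get("GazePointY")
            if x and y:  # skip samples where the tracker lost the eyes
                xs.append(float(x))
                ys.append(float(y))
    return xs, ys
```

Once the samples are in plain lists they can be plotted, averaged, or compared against known target positions on screen.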
To compare the eye tracking data and test its limitations, we conducted an experiment using squares placed in an axis system, so that each square's real-world location is known. This lets us compare the real location of a square with the location Tobii reports as being looked at. From this comparison we will be able to draw some important conclusions about the precision involved in performing eye tracking experiments with this specific device (Tobii x60).
The raw data Tobii provides is a bit difficult to understand. To help us make sense of it, we conducted an additional experiment, in which the subject first looks at the origin of an axis system and then at a dot.
Next week we will focus on really understanding the data Tobii provides, and on calculating the precision of our eye tracking experiments from coordinates and their deviations, using tests such as the one described above.
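The planned precision calculation boils down to measuring how far each reported gaze point lies from the known target position. A small sketch of one way to do this, using plain Euclidean distance (all coordinates below are made up for illustration):

```python
from math import hypot

def mean_deviation(targets, gaze_points):
    """Mean Euclidean distance between known target positions and the
    positions the eye tracker reports, as a rough precision measure."""
    dists = [hypot(gx - tx, gy - ty)
             for (tx, ty), (gx, gy) in zip(targets, gaze_points)]
    return sum(dists) / len(dists)

# Hypothetical example: three squares at known screen coordinates
# (pixels on the 1024x768 screen) and the gaze points Tobii reported.
targets = [(100, 100), (512, 384), (900, 700)]
gaze    = [(108, 94), (505, 390), (912, 707)]
print(round(mean_deviation(targets, gaze), 1))  # → 11.0
```

The same idea extends to per-target deviations, which would show whether precision varies across different regions of the screen.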
We also thought about how to properly present our project at the science fair, and we decided we need a live stream, so that we can eye track people live at the fair in addition to showing the results from our actual project.
After some research we ran a successful live stream test. We will keep the detailed version of our plans for the science fair a surprise for now, but we are working on it.
In short: next week we will be busy processing results, integrating results and making steps in understanding and using the raw Tobii data to its full potential.