Over the past few weeks I have read several papers on interactive design for children with ASD (Autism Spectrum Disorder). This summary reviews the technologies used in therapy for autistic children and looks for inspiration for our project Lilypad.

Generally, current technologies can be divided into three categories: a) robots; b) multi-touch-screen-based software; c) touchless applications based on Kinect. For each category, this summary discusses the state of the research, including results, limitations, and directions for further work, and analyzes a recent example from conference proceedings.

1. Robot

A growing body of literature suggests using robots for clinical purposes, although research in this area is still in its infancy. One paper (Diehl, J. et al., 2011) summarized 15 peer-reviewed studies published before 2011 and provided a detailed analysis of the field.

Robots have several advantages in clinical settings for individuals with ASD. Some studies (Klin et al., 2009) showed that individuals with ASD orient more to object motion than to biological motion, which makes robots acceptable to both therapists and parents. As the technology matures, many functions can be implemented in a short period. Robots can produce simple, isolated social behaviors tailored to an individual. In addition, they are programmed and controlled by therapists, which makes them easier to direct than human or animal peers.

1.1 Four Study Categories

Research in this area falls into four categories: a) the response of children with ASD to robots or robot-like characters; b) the use of robots to elicit behaviors; c) the use of robots to teach, model, and practice skills; d) the use of robots to provide feedback or encouragement.

The review (Diehl, J. et al., 2011) noted that most studies focus on individuals or case studies rather than group effects, and on robot development and methodology rather than clinical effects.

1.2 Auti: A Socially Assistive Robotic Toy

Figure 1. Auti, a socially assistive robotic toy (Helen E. Andreae et al., 2014)

Auti is a socially assistive robot designed to encourage physical and verbal interaction in children with autism, rewarding positive behaviors such as speaking and gentle touching rather than negative behaviors such as screaming or rough hitting (Helen E. Andreae et al., 2014).

1.2.1 Research Purpose

The purpose of this study was to explore the effectiveness of behavior-contingent feedback delivered through Auti. Two versions of Auti were built, based on ABA (Applied Behavior Analysis). The interactive Auti vibrated but also gave feedback according to the children's behaviors: if a child showed a negative behavior, the robot froze for five seconds. The active Auti gave no feedback on the children's behavior at all.
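
To make the ABA-style contingency concrete, here is a minimal Python sketch of the two feedback policies as described above. The behavior labels and the sense/actuate stubs (sense_behaviour, vibrate, freeze) are illustrative assumptions, not the authors' implementation.

```python
import random
import time

FREEZE_SECONDS = 5  # negative behavior pauses the toy, withdrawing the "reward"

def sense_behaviour():
    """Stub standing in for Auti's sensors; returns a hypothetical behavior label."""
    return random.choice(["gentle_touch", "speech", "hit", "scream", "none"])

def vibrate():
    print("Auti vibrates / moves")

def freeze(seconds):
    print(f"Auti freezes for {seconds} s")
    time.sleep(seconds)

def interactive_auti(steps=10):
    """Interactive condition: feedback is contingent on the child's behavior."""
    for _ in range(steps):
        behaviour = sense_behaviour()
        if behaviour in ("hit", "scream"):   # negative behavior -> withdraw feedback
            freeze(FREEZE_SECONDS)
        else:                                # positive or neutral -> keep the toy lively
            vibrate()
        time.sleep(1)

def active_auti(steps=10):
    """Active (control) condition: the toy moves regardless of the child's behavior."""
    for _ in range(steps):
        vibrate()
        time.sleep(1)

if __name__ == "__main__":
    interactive_auti()
```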

1.2.2 Hypothesis

The first hypothesis was that children playing with the interactive Auti would show fewer negative and more positive behaviors than those in the control condition. The second was that they would use more anthropomorphic language and display more social interaction.

1.2.3 Method

Participants were 18 children with ASD, divided into two groups. First, all of them were given a puppy toy to play with for 10 minutes. Then Auti was introduced and the children's attention was directed to it. The children were allowed to play with Auti or do other activities, and the redirection to Auti was repeated twice, so each child had three opportunities to play with Auti. After the last session, the children who had played with the interactive Auti were given the puppy toy again, to see whether any behaviors generalized, while the children who had played with the active Auti were given the interactive Auti.

During the experiment, videos were collected and children’s behaviors were recorded.

1.2.4 Result & Discussion

The analysis of the children's behaviors showed that children playing with the interactive Auti displayed more positive behaviors than those playing with the active Auti. Children who played with the interactive Auti after the active Auti also displayed more positive interactions.

There were some unexpected findings. Children playing with the interactive Auti tended to treat it as an animal rather than a robot, although it was clear that some children did not consider it an animal.

However, the children's negative behaviors did not decrease. Possible explanations include the following: reducing a negative behavior is much more complex than eliciting a new one, the children may not have understood why the interactive Auti stopped, and perhaps they did not perceive the vibration as a reward.

1.3 Future Research Direction

The review (Diehl, J. et al., 2011) identified several limitations of current research on robots for autism therapy. First, sample sizes are usually too small, and more detailed characterization of participants is needed. Second, most research focuses on the development and methodology of robots rather than on clinical effects, and the studies lack comparisons with human or non-robotic alternatives. Third, because the research is still at an early stage, what kind of robot to use and how to use it remain open questions; robots should be easier for more than one therapist to control and manipulate, and should provide richer interactions with children.

2. Multi-touch screen based software

Some literature (Silton, Nava R. et al., 2014) distinguishes more categories of technology for autistic children. In this summary, virtual environments, games, and some iPad apps are treated as one category because they are all based on touch screens and focus on practicing autistic children's social skills, social cognition, and social functioning (Kandalaft M. et al., 2012).

Activities designed for a multi-touch screen include collaborative games, collaborative story-telling, and other interactive activities (Arpita Bhattacharya, et al., 2015).

2.1 Echo (VE)

Echo is a virtual environment (VE) in which children with ASD detect discrepancies. Two papers (Alyssa M. Alcorn et al., 2013, 2014) described how the VE works and gave some initial design recommendations.

A highly structured and predictable environment usually limits the motivation of children with ASD to communicate, yet studies unrelated to autism (McClenny, C. S. et al., 1992) suggest that unexpected events can initiate children's communication behaviors. The overall strategy of project Echo is therefore to create a need to initiate communication. "Discrepancy" here means an expectation-violating event or surprise: for example, a virtual character (VC) makes a mistake or disappears from the screen.
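
As a rough illustration of this strategy, the Python sketch below shows how a VE loop might occasionally replace an expected event with a surprising but resolvable one. The event names, injection rate, and function names are assumptions for illustration only; the Echo papers do not describe the implementation at this level.

```python
import random

# Hypothetical discrepancy events of the kind Echo uses (a VC making a mistake
# or disappearing); the labels are illustrative, not taken from Echo's code.
DDO_EVENTS = [
    "character_names_wrong_object",
    "character_disappears",
    "object_appears_in_wrong_place",
]

def maybe_inject_ddo(task_step, rate=0.2):
    """With a small probability per task step, swap the expected event for a surprise."""
    if random.random() < rate:
        event = random.choice(DDO_EVENTS)
        print(f"step {task_step}: discrepancy -> {event} (wait for the child to react)")
        return event
    print(f"step {task_step}: expected behavior")
    return None

def run_session(steps=15):
    for step in range(steps):
        if maybe_inject_ddo(step):
            # The environment must stay resolvable: once the child (or the
            # researcher) points out the mistake, the task continues normally.
            print(f"step {step}: discrepancy resolved, task continues")

if __name__ == "__main__":
    run_session()
```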

2.1.1 Research questions

In the project Echo, researchers wanted to find out:

  a) Whether virtual environments might be systematically modified to extrinsically motivate social communication;
  b) Whether children's reactions to discrepancies were isolated occurrences or part of a wider pattern;
  c) Whether these reactions were limited to specific children or common across the participant group.

Figure 2. The Echo VE (Alyssa M. Alcorn et al., 2014)

2.1.2 Method

Participants in project Echo were 28 children with ASD from two UK school sites. Site 1 was a small UK primary school for intellectual disabilities with an autism-specific class; Site 2 was a medium-to-large UK primary school with an autism resource base. The children used Echo for six to eight weeks, completing several 10-20 minute sessions per week. The hardware was a multi-touch screen running the Magic Garden virtual environment.

At Site 1, children communicated with the VC Andy under the guidance of one researcher, who controlled the environment through a second monitor not visible to the child, while other researchers operated the camera and provided support from other rooms. At Site 2, children operated the VE on their own while several researchers controlled the environment from another room. During the experiment, children's behaviors were recorded both in spreadsheets and on video.

2.1.3 Result & Discussion

The analysis showed that participants formed specific expectations about the VE and displayed socially directed reactions to discrepancies. Discrepancies may therefore be a promising design element for technological interventions targeting autistic children.

2.1.4 Recommendations

Based on project Echo, the researchers summarized the strategy above as "discrepancy-detected opportunities" (DDOs) and provided some initial recommendations for designers who aim to motivate autistic children via DDOs. These recommendations are as follows:

  1. The environment should maintain high integrity, including the art style, the VC, and the core goals of the tasks. Although the strategy is to create surprise, discrepancies should not be so large that they scare the children.
  2. Researchers should guide children in operating the VE rather than force them to interact with DDOs; otherwise children may find the VE uninteresting, stressful, and incomprehensible.
  3. Whether children pick out the errors by themselves or with the assistance of instructors, every discrepancy should be resolvable; the environment should avoid creating unachievable tasks.
  4. Because individual preferences differ, the VE should provide a variety of DDOs to accommodate children's interests.
  5. Because no child can take up every opportunity, designers should create a large number of DDOs to ensure that children encounter discrepancies frequently.
  6. Communication between children and the VE should be open-ended and interesting, making the environment closer to daily life.

2.2 Games and Apps

Other touch-screen technologies for autism mostly take the form of apps or games. Some focus on specific skills, such as Find Me, which helps children with ASD practice joint attention (Fletcher-Watson, S. et al., 2012), and the Transporters videos, which focus on facial emotion recognition (Baron-Cohen et al., 2007); others target daily-life practice (Parsons, S. et al., 2006).

Figure 3. Find Me! (Fletcher-Watson, S. et al., 2012)

3. Touchless Application based on Kinect

Touchless applications are mostly built on Kinect, which can provide a naturally shared experience in which individuals with varied intellectual and communication abilities can participate (Munson, J. et al., 2012). Current research focuses on specific skills such as eye contact, pointing, and imitation (Casas, X. et al., 2012).

One recent study explored the use of motion-based activities to engage children with ASD in classrooms (Arpita Bhattacharya, et al., 2015).

3.1 Research question

The goal of this project was to explore whether such activities could be used in a classroom setting and to examine their impact on students' engagement and social behaviors.

3.2 Method

The study was conducted in collaboration with a school for children with autism and involved 18 students aged 8 to 19. The students were divided into two classrooms, each equipped with a Kinect mounted on a touch-enabled whiteboard and a camera.

The study had three phases: formative research from months 1-2, iterative prototyping from months 3-7, and evaluation in months 8-9.

Three prototypes were designed and developed during the first two phases.

  1. Object catching. Children, represented on screen by their live image or an avatar, caught objects to score points across different themes.
  2. Free-form interaction. The players' live images or avatars were reflected on the screen; up to two players could participate when avatars were selected, and up to six when live images were selected.
  3. Interactive story-telling. Images accompanied by an audio track were displayed to encourage children to enact parts of the story through movement and sound. At certain points the story paused and waited for the children's response; once the Kinect recognized their gesture, the animation continued (a minimal sketch of this flow follows the list).
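
The following Python sketch illustrates the gesture-gated story flow of the third prototype: the animation pauses until a gesture is recognized or a timeout passes. The scene names, gesture labels, and the polling loop are assumptions for illustration, not the study's code.

```python
import time

# Hypothetical story segments paired with the gesture that resumes each one.
STORY = [
    ("forest scene", "wave_arms"),
    ("river crossing", "jump"),
    ("campfire song", "clap"),
]

def gesture_detected(expected):
    """Stub for the Kinect skeleton-tracking check; always succeeds here."""
    return True

def wait_for_gesture(expected, timeout_s=30, poll_s=0.5):
    """Pause the story until the expected gesture is seen or the timeout passes."""
    waited = 0.0
    while waited < timeout_s:
        if gesture_detected(expected):
            return True
        time.sleep(poll_s)
        waited += poll_s
    return False  # a teacher could prompt or skip if no gesture arrives

def play_story():
    for segment, gesture in STORY:
        print(f"playing: {segment}")
        print(f"paused: waiting for the children to {gesture}")
        if wait_for_gesture(gesture):
            print("gesture recognized, animation continues")

if __name__ == "__main__":
    play_story()
```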

3.3 Result & Discussion

During Phase 2, teachers reported that the students were engaged with the Kinect activities, and video records showed a high rate of positive facial expressions, body language, and comments.

Three kinds of social behavior emerged in the second phase: peer interaction, an "audience" effect, and a "spillover" effect.

Teachers reported that the children liked to give comments and gestural or physical feedback to their peers, and helped each other when completing tasks together. Because only two players could participate when avatars were selected, the other children played the role of an "audience" while waiting for their turn; these non-playing peers often cheered, danced, or commented on the players, turning the two-player game into a classroom-level activity. Teachers also reported that children were more interactive and chatty after these activities, so they kept the children in the classroom and let them continue playing to take advantage of the engagement.

In some prototypes, such as free-form interaction, children showed better motor coordination; even children with specific coordination issues took part in dancing and moving actively.

There were also collateral effects on behavior: problems with regulation and attendance decreased, and children waited patiently for the computer to restart.

3.4 Future Direction

Sheer novelty may contribute a great deal to students' engagement with the technology, but social initiation and responses among peers may be promoted by something other than novelty. The researchers proposed that it may be the combination of a fun activity, the embodied nature of the interaction, and the teachers' facilitative work. Disentangling these relative contributions remains an important area for future research.

Figure 4. Motion-based activity in a classroom setting

4. Conclusion and Inspiration

This report summarizes three primary categories of technology used in autism therapy and examines a recent study as an example of each.

Technologies for autism are developing quickly and there are many studies in this field. Future work should focus more on human-machine interaction and on clinical effects, and individual studies should report enough design detail to allow adaptation to specific scenarios.

As for Lilypad, a functional app that differs from the categories above, it could work well alongside these technologies by collecting and analyzing data.

References

Alyssa M. Alcorn, Helen Pain, and Judith Good. 2013. Discrepancies in a virtual learning environment: something “worth communicating about” for young children with ASC?. In Proceedings of the 12th International Conference on Interaction Design and Children (IDC ’13). ACM, New York, NY, USA, 56-65.

Alyssa M. Alcorn, Helen Pain, and Judith Good. 2014. Motivating children’s initiations with novelty and surprise: initial design recommendations for autism. In Proceedings of the 2014 conference on Interaction design and children (IDC ’14). ACM, New York, NY, USA, 225-228.

Arpita Bhattacharya, Mirko Gelsomini, Patricia Pérez-Fuster, Gregory D. Abowd, and Agata Rozga. 2015. Designing motion-based activities to engage students with autism in classroom settings. In Proceedings of the 14th International Conference on Interaction Design and Children (IDC ’15). ACM, New York, NY, USA, 69-78.

Baron-Cohen, S., Golan, O., Chapman, E., & Grander, Y. (2007). Transported to a world of emotion. Psychologist, 20(2):76.

Casas, X., Herrera, G., Coma, I., and Fernández, M. 2012. A kinect-based augmented reality system for individuals with autism spectrum disorders. In Proceedings of the International Conference on Computer Graphics Theory and Applications and International Conference on Information Visualization Theory and Applications (GRAPP/IVAPP ‘12). SciTePress, 440–446.

Diehl, J. et al. 2011. The clinical use of robots for individuals with Autism Spectrum Disorders: A critical review. Res. Autism Spect. Dis. 6, (2011), 249–262.

Helen E. Andreae, Peter M. Andreae, Jason Low, and Deidre Brown. 2014. A study of auti: a socially assistive robotic toy. In Proceedings of the 2014 conference on Interaction design and children (IDC ’14). ACM, New York, NY, USA, 245-248.

Kandalaft M., Didehbani N., Krawczyk D., Allen T., Chapman S. (2012). Virtual reality social cognition training for young adults with high-functioning autism. Journal of Autism and Developmental Disorders, 1–11.

Klin, A., Lin, D. J., Gorrindo, P., Ramsay, G., & Jones, W. (2009). Two-year-olds with autism orient to non-social contingencies rather than biological motion. Nature, 459, 257–261.

Laura Bartoli, Franca Garzotto, Mirko Gelsomini, Luigi Oliveto, and Matteo Valoriani. 2014. Designing and evaluating touchless playful interaction for ASD children. In Proceedings of the 2014 conference on Interaction design and children (IDC ’14). ACM, New York, NY, USA, 17-26.

McClenny, C. S., Roberts, J. E., & Layton, T. L. (1992). Unexpected events and their effect on children’s language. Child Language Teaching and Therapy, 8(3):229-245.

Munson, J. and Pasqual, P. 2012. Using Technology in Autism Research: The Promise and the Perils. Computer. 45(6), 89-91. DOI=http://dx.doi.org/10.1109/MC.2012.22

Parsons, S., Leonard, A., & Mitchell, P. (2006). Virtual environments for social skills training: comments from two adolescents with autistic spectrum disorder. Computers & Education, 47(2):186-206. DOI=10.1016/j.compedu.2004.10.003

Silton, Nava R. (Ed.). (2014). Innovative Technologies to Benefit Children on the Autism Spectrum. Hershey, PA: IGI Global.