Master's Dissertation
DOI
https://doi.org/10.11606/D.5.2019.tde-11122019-112343
Author
Mirian de Cesaro Revers Biasão
Published
São Paulo, 2019
Supervisor
Brentani, Helena Paula
Committee
Brentani, Helena Paula (President)
Boggio, Paulo Sérgio
Martins Junior, David Corrêa
Valente, Kette Dualibi Ramos
Title in Portuguese
Classificação da gravidade do transtorno do espectro autista baseada no padrão de rastreamento do olhar
Keywords in Portuguese
Aprendizado de máquina supervisionado
Classificação
Fixação ocular
Índice de gravidade de doença
Rastreamento do olhar
Transtorno do espectro autista
Abstract in Portuguese
O transtorno do espectro autista (TEA) cursa com alterações precoces na percepção visual, culminando com déficits na comunicação social e comportamentos restritos e estereotipados. Dados objetivos sobre o padrão visual dos indivíduos são obtidos através da técnica de rastreamento do olhar. Sabe-se que a técnica é eficaz para identificar indivíduos com TEA quando comparados a controles, mas ainda não há trabalhos que utilizem esses dados a fim de classificar subtipos do transtorno. O objetivo deste estudo é preencher essa lacuna e utilizar os dados de rastreamento do olhar associados a métodos de aprendizado de máquina a fim de classificar subgrupos de TEA quanto à gravidade. Para tanto, foi utilizado um modelo baseado em mapas de atenção visual, que utiliza os dados da captação sem filtros. O classificador foi testado pelo método de validação cruzada. Os resultados mostraram que foi possível classificar em TEA grave e não grave com média de 85% de precisão, atingindo o máximo de 88% de precisão, 87% de sensibilidade e 60% de especificidade. Espera-se que novos estudos, envolvendo número maior de indivíduos e outras características fenotípicas, possam ser desenvolvidos utilizando esta técnica, a fim de identificar biomarcadores para o transtorno.
Title in English
Classification of the severity of autistic spectrum disorder based on the eye tracking pattern
Keywords in English
Autism spectrum disorder
Classification
Eye tracking
Fixation, ocular
Severity of illness index
Supervised machine learning
Abstract in English
Autism spectrum disorder (ASD) presents with early alterations in visual perception, culminating in deficits in social communication and restricted, stereotyped behaviors. Objective data on individuals' visual patterns can be obtained through the eye-tracking technique. The technique is known to be effective at distinguishing individuals with ASD from controls, but no studies have yet used these data to classify subtypes of the disorder. The purpose of this study is to fill that gap by combining eye-tracking data with supervised machine learning methods in order to classify ASD subgroups by severity. For this, a model based on visual attention maps was used, which takes the captured data without filtering. The classifier was evaluated by cross-validation. The results showed that it is possible to classify cases as severe or non-severe ASD with an average accuracy of 85%, reaching a maximum of 88% accuracy, 87% sensitivity, and 60% specificity. It is hoped that new studies, involving larger numbers of individuals and other phenotypic characteristics, can build on this technique to identify biomarkers for the disorder.
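
As a rough illustration of the pipeline the abstract describes, the Python sketch below renders raw gaze samples into a smoothed visual attention map, uses the flattened map as the feature vector, and scores a classifier by cross-validation, printing the same three metrics the abstract reports. This is a minimal sketch under stated assumptions: the grid size, screen resolution, choice of a linear SVM, and the synthetic gaze data are all illustrative stand-ins, not the dissertation's actual model, parameters, or cohort.

# Hypothetical sketch of the pipeline the abstract outlines: gaze samples are
# binned into a visual attention (heat)map, the flattened map is the feature
# vector, and a supervised classifier is scored by cross-validation.
# All parameters and the synthetic data below are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

GRID = 32               # heatmap resolution (assumed)
SCREEN = (1280, 1024)   # stimulus screen size in pixels (assumed)

def attention_map(gaze_xy: np.ndarray) -> np.ndarray:
    """Bin raw gaze points into a 2-D histogram and smooth it into a heatmap."""
    hist, _, _ = np.histogram2d(
        gaze_xy[:, 0], gaze_xy[:, 1],
        bins=GRID, range=[[0, SCREEN[0]], [0, SCREEN[1]]],
    )
    hist = gaussian_filter(hist, sigma=1.5)   # smooth the fixation density
    total = hist.sum()
    return (hist / total if total else hist).ravel()  # normalized feature vector

# Synthetic stand-in for the cohort: one gaze recording per participant,
# labeled severe (1) or non-severe (0) ASD.
rng = np.random.default_rng(0)
n_subjects = 60
X = np.array([
    attention_map(rng.uniform([0, 0], SCREEN, size=(500, 2)))
    for _ in range(n_subjects)
])
y = rng.integers(0, 2, size=n_subjects)

# Cross-validated predictions, then the metrics the abstract reports.
clf = SVC(kernel="linear")
y_pred = cross_val_predict(clf, X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
print(f"accuracy    {(tp + tn) / len(y):.2f}")
print(f"sensitivity {tp / (tp + fn):.2f}")
print(f"specificity {tn / (tn + fp):.2f}")

Normalizing each heatmap so it sums to one keeps the feature vectors comparable across participants regardless of how many gaze samples each recording contains.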
 
WARNING - Viewing this document is conditioned on your acceptance of the following terms of use:
This document is for private use only, in research and teaching activities. Reproduction for commercial use is forbidden. These rights cover all data about this document as well as its contents. Any use or copy of this document, in whole or in part, must include the author's name.
Publishing Date
2019-12-11
 
All rights to the thesis/dissertation belong to the authors.