Displaying all 3 publications

  1. Buhari AM, Ooi CP, Baskaran VM, Phan RCW, Wong K, Tan WH
    J Imaging, 2020 Nov 30;6(12).
    PMID: 34460527 DOI: 10.3390/jimaging6120130
    Several studies on micro-expression recognition have contributed mainly to accuracy improvement. However, computational complexity receives comparatively less attention, which increases the cost of micro-expression recognition in real-time applications. In addition, the majority of existing approaches require at least two frames (i.e., the onset and apex frames) to compute the features of every sample. This paper puts forward new facial graph features based on 68-point landmarks using the Facial Action Coding System (FACS). The proposed feature extraction technique (FACS-based graph features) utilizes facial landmark points to compute a graph for different Action Units (AUs), where the measured distance and gradient of every segment within an AU graph are presented as features. Moreover, the proposed technique performs micro-expression (ME) recognition from a single input frame. Results indicate that the proposed FACS-based graph features achieve up to 87.33% recognition accuracy with an F1-score of 0.87 using leave-one-subject-out cross-validation on the SAMM dataset. In addition, the proposed technique computes features at a speed of 2 ms per sample on a Xeon Processor E5-2650 machine. (A minimal sketch of the segment distance/gradient computation appears after this list.)
  2. Oh YH, See J, Le Ngo AC, Phan RC, Baskaran VM
    Front Psychol, 2018;9:1128.
    PMID: 30042706 DOI: 10.3389/fpsyg.2018.01128
    Over the last few years, automatic facial micro-expression analysis has garnered increasing attention from experts across different disciplines because of its potential applications in various fields such as clinical diagnosis, forensic investigation, and security systems. Advances in computer algorithms and video acquisition technology have rendered machine analysis of facial micro-expressions possible today, in contrast to decades ago, when it was primarily the domain of psychiatrists and analysis was largely manual. Indeed, although the study of facial micro-expressions is a well-established field in psychology, it is still relatively new from the computational perspective, with many interesting open problems. In this survey, we present a comprehensive review of state-of-the-art databases and methods for micro-expression spotting and recognition. Individual stages involved in the automation of these tasks are also described and reviewed at length. In addition, we deliberate on the challenges and future directions in this growing field of automatic facial micro-expression analysis.
  3. Sapai S, Loo JY, Ding ZY, Tan CP, Baskaran VM, Nurzaman SG
    Soft Robot, 2023 Dec;10(6):1224-1240.
    PMID: 37590485 DOI: 10.1089/soro.2022.0188
    Data-driven methods with deep neural networks demonstrate promising results for accurate modeling of soft robots. However, deep neural network models rely on voluminous data to discover the complex and nonlinear representations inherent in soft robots. Consequently, a substantial amount of effort is required for data acquisition, labeling, and annotation, and such collection is not always possible. This article introduces a data-driven learning framework based on synthetic data to circumvent the exhaustive data collection process. More specifically, we propose a novel time series generative adversarial network with a self-attention mechanism, Transformer TimeGAN (TTGAN), to precisely learn the complex dynamics of a soft robot. On top of that, TTGAN is combined with a conditioning network that enables it to produce synthetic data for specific soft robot behaviors. The proposed framework is verified on a widely used pneumatic-based soft gripper as an exemplary experimental setup. Experimental results demonstrate that TTGAN generates synthetic time series data with realistic soft robot dynamics. Critically, combining the synthetic data with only partially available original data produces a data-driven model with estimation accuracy comparable to models obtained from the complete original data. (A hedged sketch of a conditional self-attention generator appears after this list.)
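
The FACS-based graph features of publication 1 lend themselves to a short illustration. The following Python sketch is a minimal interpretation, not the authors' implementation: the Action Unit landmark pairs are invented for illustration, and a segment's "gradient" is assumed here to mean its orientation angle.

import numpy as np

# Hypothetical AU graph: pairs of 68-point landmark indices forming the
# segments of one Action Unit (indices chosen for illustration only).
AU4_SEGMENTS = [(17, 21), (21, 22), (22, 26)]  # assumed brow region edges

def graph_features(landmarks, segments):
    """Distance and gradient of every segment in an AU graph.

    landmarks: (68, 2) array of (x, y) facial landmark coordinates
    segments:  list of (i, j) landmark-index pairs forming the graph edges
    Returns a flat feature vector [d1, g1, d2, g2, ...].
    """
    feats = []
    for i, j in segments:
        dx, dy = landmarks[j] - landmarks[i]
        dist = np.hypot(dx, dy)    # Euclidean length of the segment
        grad = np.arctan2(dy, dx)  # segment orientation (assumed "gradient")
        feats.extend([dist, grad])
    return np.asarray(feats)

# Usage on a single frame (no onset/apex pair needed):
frame_landmarks = np.random.rand(68, 2)  # stand-in for a landmark detector
print(graph_features(frame_landmarks, AU4_SEGMENTS))

Because each feature vector comes from one frame, no temporal alignment between onset and apex frames is needed, which is what keeps the per-sample cost low.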
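Publication 3's generator can likewise be sketched. The code below is a minimal, assumed skeleton of a TimeGAN-style conditional generator with self-attention in PyTorch; the dimensions, layer counts, and one-hot conditioning scheme are illustrative guesses, not the paper's exact TTGAN architecture, and the discriminator and training loop are omitted.

import torch
import torch.nn as nn

class ConditionalTransformerGenerator(nn.Module):
    """Maps a noise sequence plus a behavior-condition vector to a
    synthetic soft-robot sensor sequence (illustrative sketch)."""
    def __init__(self, noise_dim=16, cond_dim=4, feat_dim=8, d_model=64):
        super().__init__()
        self.embed = nn.Linear(noise_dim + cond_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, feat_dim)  # per-step sensor readings

    def forward(self, z, cond):
        # z: (batch, seq_len, noise_dim); cond: (batch, cond_dim)
        cond_seq = cond.unsqueeze(1).expand(-1, z.size(1), -1)
        h = self.embed(torch.cat([z, cond_seq], dim=-1))
        return self.head(self.encoder(h))  # (batch, seq_len, feat_dim)

g = ConditionalTransformerGenerator()
z = torch.randn(2, 50, 16)                  # two noise sequences, 50 steps
cond = torch.zeros(2, 4); cond[:, 1] = 1.0  # one-hot behavior label (assumed)
fake = g(z, cond)                           # (2, 50, 8) synthetic trajectories

The conditioning vector is what lets a single trained generator emit synthetic data for a chosen robot behavior, which is the property the abstract highlights.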