This research introduces a new GUI-based method for presenting cognitive stimuli in human-robot interaction, aimed at assessing user workload and state and replacing older tools such as the Dual N-Back task. The study involved 30 participants who interacted with SMARTmBOT robots while wearing biosensors; the resulting data were integrated into a multi-modal perception model to analyze cognitive load. The GUI, developed with PyQt for ROS2 compatibility and designed for this specific experiment, is being released as open source for broader use in human-robot interaction research.
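To illustrate the kind of PyQt-ROS2 integration described above, the sketch below shows one common pattern: a Qt window wrapping an rclpy node, with the ROS2 executor serviced from a Qt timer so both event loops coexist. All names here (node name, topic `participant_response`, widget layout) are hypothetical illustrations, not the released GUI's actual design.

```python
# Minimal sketch (hypothetical names): a PyQt5 window wrapped around an rclpy
# node, with ROS 2 callbacks spun from a Qt timer so both event loops coexist.
import sys

import rclpy
from rclpy.node import Node
from std_msgs.msg import String
from PyQt5.QtCore import QTimer
from PyQt5.QtWidgets import QApplication, QLabel, QPushButton, QVBoxLayout, QWidget


class StimulusNode(Node):
    """ROS 2 node that publishes GUI events (e.g., participant responses)."""

    def __init__(self):
        super().__init__('gui_stimulus_node')
        self.response_pub = self.create_publisher(String, 'participant_response', 10)

    def publish_response(self, text: str):
        msg = String()
        msg.data = text
        self.response_pub.publish(msg)


class StimulusWindow(QWidget):
    """Window that displays a stimulus and forwards button presses to ROS 2."""

    def __init__(self, node: StimulusNode):
        super().__init__()
        self.node = node
        self.setWindowTitle('Cognitive Stimulus GUI (sketch)')

        layout = QVBoxLayout(self)
        self.stimulus_label = QLabel('Stimulus goes here')
        match_button = QPushButton('Match')
        match_button.clicked.connect(lambda: self.node.publish_response('match'))
        layout.addWidget(self.stimulus_label)
        layout.addWidget(match_button)

        # Service the ROS 2 executor from Qt's event loop (roughly every 10 ms).
        self.ros_timer = QTimer(self)
        self.ros_timer.timeout.connect(
            lambda: rclpy.spin_once(self.node, timeout_sec=0.0))
        self.ros_timer.start(10)


def main():
    rclpy.init()
    node = StimulusNode()
    app = QApplication(sys.argv)
    window = StimulusWindow(node)
    window.show()
    exit_code = app.exec_()
    node.destroy_node()
    rclpy.shutdown()
    sys.exit(exit_code)


if __name__ == '__main__':
    main()
```

Publishing GUI events on a ROS2 topic in this way would let downstream nodes (e.g., the multi-modal perception pipeline or biosensor loggers) timestamp participant responses in the same middleware as the robot data; the actual open-source GUI may structure this differently.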