A Decision Framework to Select Robotics Simulators for Automation and Control Tasks: Criteria and Validation
Published:
07 May 2026
by MDPI
in The 3rd International Electronic Conference on Machines and Applications
session Automation and Control Systems
Abstract:
Robotics simulation is a key enabler for automation and control development, allowing safer experimentation, faster design iterations, and reduced development cost. However, the current simulator ecosystem is highly fragmented, spanning open-source and commercial tools with widely varying levels of physical fidelity, performance, and ecosystem integration. As a result, simulator selection is frequently driven by familiarity or availability rather than by explicit task requirements, often leading to suboptimal engineering workflows. This paper proposes a task-oriented decision framework to support reproducible and transparent selection of robotics simulators based on a fixed, structured set of evaluation criteria. These criteria cover: (i) physical fidelity and contact modeling; (ii) sensor modeling and visual realism; (iii) performance and scalability, including headless execution, parallelism, and GPU acceleration; (iv) ecosystem integration with automation, control, and learning pipelines, including ROS/ROS 2 compatibility; (v) extensibility and programmability; and (vi) practical constraints such as hardware requirements, licensing models, and learning curve. The framework is operationalized through a checklist and scoring matrix guided by four key questions addressing the target task, fidelity-versus-speed priorities, the target software stack, and sim-to-real transfer requirements. To validate feasibility in a representative engineering workflow, a URDF-based modeling and simulation pipeline is implemented and used to compare Gazebo, an open-source physics-based simulator, against MATLAB/Simulink, a commercial model-based simulation environment. The comparison reports practical indicators, including setup effort, integration complexity, computational requirements, and runtime behavior, for representative motion and sensing scenarios.
The results highlight consistent trade-offs across different user profiles and application needs, while also revealing open gaps in the field, notably the lack of unified multi-task benchmarks and of joint metrics capable of simultaneously capturing simulation fidelity, computational performance, and sim-to-real transfer effectiveness.
Keywords: robotics simulation; simulator selection; MATLAB/Simulink; Gazebo; physics engines; automation and control; URDF workflow; sim-to-real
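To make the checklist-and-scoring-matrix idea concrete, the sketch below shows one plausible way such a matrix could be computed. This is an illustrative assumption, not the paper's actual implementation: the criterion names follow the six categories listed in the abstract, while the weights and per-simulator scores are hypothetical placeholder values that a practitioner would derive from the four guiding questions.

```python
# Illustrative sketch only: a weighted scoring matrix over the six
# evaluation criteria named in the abstract. All weights and scores
# here are hypothetical placeholders, not values from the paper.

CRITERIA = [
    "physical_fidelity",
    "sensor_modeling",
    "performance_scalability",
    "ecosystem_integration",
    "extensibility",
    "practical_constraints",
]

def weighted_score(scores, weights):
    """Combine per-criterion scores (0-5) with task-derived weights.

    Weights are normalized so the result stays on the 0-5 scale.
    """
    total_weight = sum(weights[c] for c in CRITERIA)
    return sum(scores[c] * weights[c] for c in CRITERIA) / total_weight

# Hypothetical task profile: a learning-oriented pipeline that weights
# performance/scalability and ecosystem integration above visual realism.
weights = {
    "physical_fidelity": 2,
    "sensor_modeling": 1,
    "performance_scalability": 3,
    "ecosystem_integration": 3,
    "extensibility": 2,
    "practical_constraints": 1,
}

# Hypothetical ratings for one candidate simulator (placeholder values).
candidate = {
    "physical_fidelity": 4,
    "sensor_modeling": 4,
    "performance_scalability": 3,
    "ecosystem_integration": 5,
    "extensibility": 4,
    "practical_constraints": 4,
}

print(weighted_score(candidate, weights))  # prints 4.0
```

Ranking candidates then reduces to computing this score per simulator under the same task-derived weights, which is what makes the selection reproducible and transparent rather than familiarity-driven.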
