In mobile robotics, Simultaneous Localization and Mapping (SLAM) is an algorithmic approach to the computational problem of building and updating a map of an environment while simultaneously tracking the robot's location within it. SLAM algorithms enable autonomous mobile systems to traverse an environment while avoiding obstacles and reliably reaching designated goal destinations. This paper presents the design of a SLAM-driven controller for a semi-autonomous omnidirectional mobile robot. Input to the system comes from a Brain–Computer Interface (BCI) in the form of simple driving commands or a user-selected goal destination. Because BCI input arrives with too much latency for real-time reaction and response, the system must navigate safely under the last given command until it runs out of free space, reaches a goal destination, or receives a new command. The robotic system is built on a three-wheeled robot kit with an upgraded sensor suite: an Intel RealSense Depth Camera D435 and two lidar sensors together provide a full 360° field of view. The SLAM algorithm and system controllers are developed using the Robot Operating System (ROS) and tested in Gazebo, a physics simulation engine used for rapid prototyping. Testing validated controller performance under varying commands as well as long-distance path planning and obstacle avoidance. The system typically reached its goal destinations with an error of roughly 3% or less, though the error grew with the number of commands the system processed.
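The latched-command behavior described above can be sketched as a simple control tick: the most recent BCI command is latched and followed until the robot is blocked, reaches a goal, or is preempted by a new command. This is a minimal illustration only; all names (`pending_cmd`, `free_space`, `at_goal`) are hypothetical and do not reflect the paper's actual ROS implementation.

```python
from enum import Enum, auto

class Status(Enum):
    DRIVING = auto()   # still executing the latched command
    BLOCKED = auto()   # ran out of free space; stop safely
    AT_GOAL = auto()   # reached the designated goal destination

def step(latched_cmd, pending_cmd, free_space, at_goal):
    """One control tick for a latched-command navigator (hypothetical sketch).

    latched_cmd -- the last command the robot is currently following
    pending_cmd -- a newly received BCI command, or None
    free_space  -- True while the path ahead is clear
    at_goal     -- True once the goal destination is reached
    """
    if pending_cmd is not None:      # a new BCI command preempts the old one
        latched_cmd = pending_cmd
    if at_goal:
        return latched_cmd, Status.AT_GOAL
    if not free_space:               # stop rather than collide
        return latched_cmd, Status.BLOCKED
    return latched_cmd, Status.DRIVING
```

In a real ROS system this logic would run inside a controller node subscribed to the BCI command topic and the sensor-derived occupancy information; the sketch only captures the decision order, with preemption checked before the terminal conditions.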