Unmanned Ground Vehicles (UGVs) are essential tools in modern warfare, supporting counter-IED, surveillance, and reconnaissance operations. Currently, most UGVs are tele-operated and require the operator's continuous attention and assistance. Video streaming between the UGV and the Operator Control Unit (OCU) helps the operator visualize the UGV's environment, but this is challenging under the degraded communications typical of Military Operations on Urban Terrain (MOUT). A semi-autonomous navigation system that lets the operator command the UGV using only intermittent still images would therefore be valuable.

IAI and its collaborators, QinetiQ and Rutgers University, have been awarded a follow-on contract entitled “Bio-Inspired Visual Navigation: From Landmarks via Bearing to Controls.” This research team has demonstrated the feasibility of a novel approach with three main innovations. First, a novel bio-inspired fixation-based segmentation method extracts closed contours around visually salient objects in the image, and these closed contours serve as landmarks; salient landmarks are detected automatically and tracked robustly under viewpoint and scale changes, even in texture-less environments. Second, the landmarks are used as visual guides for bearing-based control of the robot toward its destination. Third, the OCU provides enhanced situational awareness and intuitive interaction between the operator and the UGV.

The landmark detection and tracking algorithms were evaluated on more than 20 videos of indoor scenarios. Next, an integrated prototype will be developed, and system capabilities will be demonstrated in a wide range of environments, including inside buildings and tunnels. The system will be field tested on QinetiQ North America’s Tactical Robot Controller (TRC) and a TALON or Dragon Runner UGV.
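The abstract does not disclose the actual segmentation algorithm, but the idea of growing a salient region outward from a fixation point and taking its boundary as a closed contour can be illustrated with a toy sketch. The sketch below operates on a plain 2D grid of saliency values rather than a real image; the function name, threshold, and 4-connected region growing are all assumptions for illustration, not the team's method.

```python
from collections import deque

def segment_from_fixation(saliency, fixation, threshold):
    """Toy fixation-based segmentation: grow a region of salient cells
    outward from a fixation point, then return (region, contour), where
    the contour is the set of region cells bordering a non-region cell.
    Hypothetical illustration only -- not the contract team's algorithm."""
    rows, cols = len(saliency), len(saliency[0])
    r0, c0 = fixation
    if saliency[r0][c0] < threshold:
        return set(), set()          # fixation landed on a non-salient cell
    region = {(r0, c0)}
    queue = deque([(r0, c0)])
    while queue:                     # breadth-first region growing
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and saliency[nr][nc] >= threshold):
                region.add((nr, nc))
                queue.append((nr, nc))
    # Boundary cells of the region form the closed contour (the landmark).
    contour = set()
    for r, c in region:
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            if (r + dr, c + dc) not in region:
                contour.add((r, c))
                break
    return region, contour
```

On a 5x5 grid with a 3x3 salient block and the fixation at its center, the region covers the 9 salient cells and the contour is the 8-cell ring around the center.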
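The bearing-based control mentioned above can be sketched in its simplest form: steer so that the robot's heading turns toward the bearing of a landmark while advancing at constant speed. The following is a minimal proportional-control sketch for a unicycle-model robot; the function names, gain `k`, speed `v`, and time step `dt` are illustrative assumptions, not values from the contract work.

```python
import math

def bearing_to(robot_pos, landmark_pos):
    """Bearing from the robot to a landmark, in radians, world frame."""
    dx = landmark_pos[0] - robot_pos[0]
    dy = landmark_pos[1] - robot_pos[1]
    return math.atan2(dy, dx)

def wrap_angle(a):
    """Wrap an angle into (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def bearing_control_step(pose, landmark, v=0.5, k=1.5, dt=0.1):
    """One step of proportional bearing control (hypothetical sketch):
    turn rate is proportional to the bearing error, forward speed is
    constant, and the unicycle pose (x, y, theta) is integrated by dt."""
    x, y, theta = pose
    error = wrap_angle(bearing_to((x, y), landmark) - theta)
    omega = k * error                        # steer toward the landmark
    theta = wrap_angle(theta + omega * dt)
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    return (x, y, theta)
```

Iterating `bearing_control_step` drives the simulated robot to align its heading with the landmark bearing and close the distance, which is the behavior the landmark-as-visual-guide scheme relies on.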