Husqvarna Group

Master Thesis: Robust Localization for Autonomous Robots in Feature-Sparse Indoor Environments

Jonsered | Full time

We are now looking for master's students in Automation & Mechatronics, Computer Science, or Physics who are interested in doing their thesis project at our Heavy Equipment R&D Department at Husqvarna Group!
  

Accurate localization is a fundamental capability for autonomous robots operating indoors. It enables navigation, mapping, and interaction with the environment. In typical settings, robots rely on sensors such as LiDAR, cameras, and, in some cases, GPS to estimate their position. These sensors work well in environments rich in geometric or visual features, such as corridors, furniture, or textured walls. However, in feature-sparse environments, such as empty warehouses, long tunnels, or large industrial halls, traditional localization methods often fail or degrade significantly.

LiDAR-based localization depends on detecting and matching geometric features in the environment. In sparse settings, the lack of distinct structures leads to ambiguous or noisy scan matching, reducing accuracy and increasing drift. Similarly, vision-based systems require visual landmarks or texture to perform feature extraction and matching. In environments with uniform surfaces, poor lighting, or minimal contrast, camera-based localization becomes unreliable. GPS, while effective outdoors, is typically unavailable or severely degraded indoors due to signal attenuation and multipath effects.
 

These limitations pose a significant challenge for mobile robots that must operate autonomously in such environments. Without reliable localization, tasks like path planning, obstacle avoidance, and map building become error-prone or infeasible. Moreover, installing artificial landmarks or dense infrastructure to aid localization is often impractical, especially in dynamic or temporary deployments. The problem of localization in feature-sparse environments remains an open challenge. Addressing it requires innovative solutions that can operate reliably with minimal infrastructure, adapt to changing conditions, and generalize across diverse indoor settings. This thesis aims to investigate and develop methods that push the boundaries of positioning under constrained conditions, contributing to more scalable and resilient localization systems. 

The objective of this thesis is to improve the accuracy of robot localization in feature-sparse environments, where conventional methods struggle. The focus is on identifying and evaluating alternative sensing strategies and developing novel localization techniques that can operate reliably with limited environmental information. The ultimate goal is to create a robust and adaptable solution suitable for challenging scenarios. 

The thesis work will include, but not be limited to, the following tasks:

  • Conduct a literature review on current localization methods and sensor technologies, with a focus on challenges in feature-sparse environments

  • Investigate alternative sensors and localization methods

  • Design, implement, and test a prototype localization system using selected sensors and algorithms

Relevant area of education

Master's in Automation & Mechatronics, Computer Science, Physics, or another relevant field.

How to apply   
Please submit your application, including a CV and cover letter, no later than 27 November 2025. Due to GDPR, we do not accept applications by email.
 

Read about Husqvarna Group here:   
https://www.husqvarnagroup.com/    

 

Husqvarna Group is a world-leading producer of outdoor power products for garden, park and forest care. Products include chainsaws, trimmers, robotic lawn mowers and ride-on lawn mowers.  

The Group is also the European leader in garden watering products and a world leader in cutting equipment and diamond tools for the construction and stone industries. The Group's products and solutions are sold under the main brands Husqvarna and Gardena, which serve professionals in more than 140 countries.

Last date to apply:

27 November 2025