Abstract:

This paper presents an AR videoconferencing approach merging two remote rooms
into a shared workspace. Such bilateral AR telepresence inherently suffers
from breaks in immersion stemming from the different physical layouts of  
participating spaces. As a remedy, we develop an automatic alignment scheme
that ensures participants share as many common features of their physical
surroundings as possible. The system optimizes the alignment with regard to initial
user position, free shared floor space, camera positioning, and other factors.
Thus we can reduce discrepancies between different room and furniture layouts  
without actually modifying the rooms themselves. A description and discussion
of our alignment scheme are given, along with an example implementation on
real-world datasets.
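
To illustrate the kind of multi-criteria objective such an alignment could use, the following is a minimal sketch, not the paper's actual formulation: a candidate alignment (the remote room already transformed into the local room's frame) is scored as a weighted combination of shared free floor space, distance between initial user positions, and distance between camera positions. The Room structure, the grid-based overlap measure, and all weights are assumptions introduced here for illustration.

```python
# Minimal, hypothetical sketch of scoring one candidate room alignment.
# All data structures, weights, and terms are assumptions, not the paper's method.
from dataclasses import dataclass
import math

import numpy as np


@dataclass
class Room:
    free_floor: np.ndarray   # boolean occupancy grid, True = free floor cell
    user_pos: tuple          # (x, y) initial user position, grid coordinates
    camera_pos: tuple        # (x, y) camera position, grid coordinates


def alignment_score(local: Room, remote_aligned: Room,
                    w_floor: float = 1.0,
                    w_user: float = 0.5,
                    w_cam: float = 0.25) -> float:
    """Score a candidate alignment; higher is better.

    `remote_aligned` is the remote room already rotated/translated into the
    local room's coordinate frame by the candidate alignment.
    """
    # Shared free floor space: fraction of local free cells that are also
    # free in the aligned remote room.
    shared_floor = np.logical_and(local.free_floor, remote_aligned.free_floor)
    floor_term = shared_floor.sum() / max(int(local.free_floor.sum()), 1)

    # Penalize alignments that place the two initial user positions far apart.
    user_dist = math.dist(local.user_pos, remote_aligned.user_pos)

    # Penalize alignments that move the remote camera away from the local one,
    # as a crude stand-in for keeping both users within camera coverage.
    cam_dist = math.dist(local.camera_pos, remote_aligned.camera_pos)

    return w_floor * floor_term - w_user * user_dist - w_cam * cam_dist
```

In such a scheme, candidate transforms (e.g. discretized rotations and translations of the remote room) would be enumerated and the highest-scoring one selected, which is one plausible way to trade off the factors named above without modifying either physical room.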