Artificial Intelligence (AI) has been integrated into interior design over the past few years, opening a vast field of ideas, visualization, customization, and creative support. Many AI tools can generate room designs using trained models, but they lack precision in open-ended customization and in fine-tuning structural aspects such as walls, windows, ceilings, and floor layouts. This project presents a user-friendly AI-powered interior design system with features such as transforming furnished rooms into empty ones and customizing spaces and furniture, setting the foundation for realistic virtual redesigning. The system uses the Segment Anything Model (SAM) for object segmentation before room emptying, together with Stable Diffusion v2 inpainting. To further improve the room's geometry and structure, ControlNet MLSD was integrated. A two-pass pipeline, intensive inpainting followed by ControlNet-guided refinement, ensures that all furniture is removed while the room's natural layout and dimensions are maintained. The system's outputs show visually plausible and spatially accurate empty-room renders, making this project a valuable asset for architects, designers, and homeowners by providing efficient design and renovation previews. The work demonstrates a step towards an intelligent, interactive AI-driven design system with controllable and realistic outcomes.
Introduction
This study presents an AI-driven system designed to improve interior design by automatically removing furniture from real room images and generating accurate, structurally consistent empty spaces ready for redesign. Traditional interior design is limited by the inability to virtually test layouts in real environments, and current AI diffusion models—such as Stable Diffusion—often fail to produce geometrically correct or context-aware designs. Existing approaches generate attractive but impractical scenes that overlook real-world constraints like doors, windows, and user preferences. This work addresses these gaps by integrating segmentation, diffusion modeling, and structural control into a unified space-planning pipeline.
Core Contributions
The project introduces a multi-stage intelligent assistant that:
Defurnishes rooms using SAM segmentation and Stable Diffusion inpainting to create realistic empty spaces.
Preserves architectural geometry using ControlNet with MLSD line-detection to ensure straight edges, proper room proportions, and accurate layouts.
Reduces artifacts by refining masks through dilation, smoothing, and negative prompting (see the mask-refinement sketch after this list).
Supports user customization, allowing control over style, colors, and design preferences.
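As an illustration of the mask-refinement step above, the sketch below expands and smooths a SAM furniture mask with OpenCV before it is handed to the inpainting model. The kernel size, blur radius, and threshold are illustrative assumptions rather than the project's tuned settings.

```python
import cv2
import numpy as np

def refine_mask(mask: np.ndarray, dilate_px: int = 15, blur_px: int = 21) -> np.ndarray:
    """Expand and smooth a binary furniture mask before inpainting.

    mask: uint8 array with 255 where furniture was segmented, 0 elsewhere.
    Dilation makes the mask cover contact shadows and object edges;
    Gaussian blur plus re-thresholding softens jagged SAM boundaries.
    """
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (dilate_px, dilate_px))
    dilated = cv2.dilate(mask, kernel, iterations=1)
    blurred = cv2.GaussianBlur(dilated, (blur_px, blur_px), 0)
    # Re-binarize so the diffusion pipeline receives a clean 0/255 mask.
    _, refined = cv2.threshold(blurred, 127, 255, cv2.THRESH_BINARY)
    return refined
```

The refined mask is then paired with a negative prompt (for example, "furniture, sofa, chair, clutter") so the inpainting model is discouraged from re-introducing objects into the cleared region.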
Objectives
The key goals include building a zero-shot defurnishing pipeline using SAM + diffusion inpainting, enforcing structural consistency with ControlNet-MLSD, minimizing generative artifacts using refined masks and prompts, and creating a framework that enables precise, interactive room layout adjustments.
Literature Review Insights
Existing generative models like latent diffusion, RoomDiffusion, and various interior-specific tools produce visually impressive interiors but fail in structural accuracy, context awareness, or interactivity. ControlNet improves structural control, while prior defurnishing studies focus only on cleaning rooms without supporting further redesign. The current system bridges these gaps by combining defurnishing, geometry control, and user-guided customization into one workflow.
Methodology
The system follows a six-phase pipeline:
Dataset preparation and image resizing
SAM-based object segmentation and mask refinement (illustrated in the segmentation sketch after this list)
Diffusion-based inpainting to remove furniture
MLSD-guided ControlNet to preserve room structure
Integration through PyTorch, Diffusers, ControlNet_Aux, and OpenCV
Evaluation using visual fidelity and structural consistency metrics
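The segmentation phase can be sketched with the official segment-anything package referenced above. The checkpoint filename and the area-based heuristic for deciding which masks count as furniture are assumptions for illustration; the project's actual selection rule may differ.

```python
import numpy as np
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

def furniture_mask(image_rgb: np.ndarray,
                   checkpoint: str = "sam_vit_h_4b8939.pth") -> np.ndarray:
    """Return a combined uint8 mask (255 = candidate furniture) for a room photo."""
    sam = sam_model_registry["vit_h"](checkpoint=checkpoint)
    generator = SamAutomaticMaskGenerator(sam)
    masks = generator.generate(image_rgb)  # list of dicts with 'segmentation' and 'area'

    h, w = image_rgb.shape[:2]
    combined = np.zeros((h, w), dtype=np.uint8)
    for m in masks:
        area_ratio = m["area"] / float(h * w)
        # Heuristic (assumption): mid-sized regions are likely furniture, while
        # very large regions (walls, floor) and tiny specks are left untouched.
        if 0.01 < area_ratio < 0.35:
            combined[m["segmentation"]] = 255
    return combined
```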
Users upload a room image, SAM extracts furniture masks, the masks are refined, diffusion-based inpainting removes the objects, and ControlNet ensures the reconstructed space maintains the original geometry.
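A condensed sketch of this two-pass flow, using the Diffusers and controlnet_aux libraries named in the pipeline, is shown below. The model identifiers, prompts, and working resolution are illustrative assumptions; pairing an MLSD-conditioned ControlNet with an inpainting pipeline is one plausible way to realize the ControlNet-guided refinement pass, not necessarily the project's exact configuration.

```python
import torch
from PIL import Image
from diffusers import (
    ControlNetModel,
    StableDiffusionControlNetInpaintPipeline,
    StableDiffusionInpaintPipeline,
)
from controlnet_aux import MLSDdetector

device = "cuda"  # the pipeline is GPU-bound, as noted under limitations
prompt = "empty room, bare walls, clean floor, natural lighting"
negative = "furniture, sofa, table, chair, decoration, clutter, warped walls"

room = Image.open("room.jpg").convert("RGB").resize((512, 512))
mask = Image.open("furniture_mask.png").convert("L").resize((512, 512))

# Pass 1: aggressive inpainting removes the masked furniture (model ID assumed).
inpaint = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting", torch_dtype=torch.float16
).to(device)
draft = inpaint(prompt=prompt, negative_prompt=negative,
                image=room, mask_image=mask).images[0]

# Pass 2: an MLSD line map extracted from the original photo conditions a
# ControlNet inpainting pipeline so wall edges, floor lines, and corners stay straight.
mlsd = MLSDdetector.from_pretrained("lllyasviel/Annotators")
line_map = mlsd(room)

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-mlsd", torch_dtype=torch.float16
)
refine = StableDiffusionControlNetInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", controlnet=controlnet,
    torch_dtype=torch.float16
).to(device)
final = refine(prompt=prompt, negative_prompt=negative,
               image=draft, mask_image=mask, control_image=line_map).images[0]
final.save("empty_room.png")
```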
Results and Discussion
The integrated approach effectively removes furniture while preserving architectural details, producing realistic empty rooms with coherent textures and lighting. Stable Diffusion inpainting ensures natural restoration of floors and walls, while ControlNet prevents distortions such as warped edges or misaligned corners. Qualitative results show sharp structure and high realism, and quantitative measures like FID and CLIP similarity confirm improved accuracy.
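For reference, the snippet below is one plausible way to compute such FID and CLIP-similarity numbers using torchmetrics and a CLIP image encoder; the model choices, feature dimension, and image resolution are assumptions rather than the paper's exact evaluation protocol.

```python
from typing import List
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor
from torchmetrics.image.fid import FrechetInceptionDistance
from torchvision import transforms

_clip = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
_proc = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def clip_similarity(img_a: Image.Image, img_b: Image.Image) -> float:
    """Cosine similarity of CLIP image embeddings (input photo vs. generated render)."""
    inputs = _proc(images=[img_a, img_b], return_tensors="pt")
    with torch.no_grad():
        feats = _clip.get_image_features(**inputs)
    feats = feats / feats.norm(dim=-1, keepdim=True)
    return float((feats[0] @ feats[1]).item())

def fid_score(real_rooms: List[Image.Image], generated_rooms: List[Image.Image]) -> float:
    """FID between a set of reference empty-room photos and generated renders."""
    to_uint8 = transforms.Compose([transforms.Resize((299, 299)), transforms.PILToTensor()])
    fid = FrechetInceptionDistance(feature=2048)
    fid.update(torch.stack([to_uint8(im) for im in real_rooms]), real=True)
    fid.update(torch.stack([to_uint8(im) for im in generated_rooms]), real=False)
    return float(fid.compute())
```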
Limitations and Future Work
Residual artifacts, lighting inconsistencies, and GPU-based speed limitations remain challenges. Future enhancements include SAM-2 for better segmentation, depth-aware ControlNet for stronger spatial understanding, and style-transfer diffusion models for personalized interior redesign.
Conclusion
This work introduces an AI-driven tool for efficient interior design that automatically removes furniture from room images and generates precise virtual layouts of the emptied space.
The devised workflow combines SAM for segmentation, Stable Diffusion inpainting for object removal, and ControlNet with MLSD line detection for maintaining structural integrity, balancing artistic freedom with precise geometry in its output.
This approach overcomes a common shortcoming of conventional AI design tools, which tend to overlook spatial coherence and offer little direct control over the result.
By employing a two-pass approach, initial removal with strong inpainting followed by ControlNet-guided refinement, the system thoroughly eliminates furnishings and decorations without compromising the structure of the room layout. Empirical findings indicate that the model generates consistent, largely artifact-free outputs with intact walls, floors, and lighting.
This tool significantly speeds up the early stages of interior design and renovation while laying the groundwork for future improvements such as interactive customization, virtual-reality visualization, and automatic design recommendations. Further research will focus on enhancing mask accuracy, tuning diffusion parameters for the interior-design domain, and incorporating customizable elements such as room dimensions into the creative workflow to increase interactivity and contextual relevance.
In summary, this work aims to advance AI-driven design by creating a system capable of both creative generation and precise control, closing the gap between automated generation and human design expertise.