AI-driven room design has rapidly shifted from niche experiment to core infrastructure within the interiors sector, and three strands are converging: generative image models that can restyle spaces in seconds; AR (augmented reality) and 3D scanning systems that convert real rooms into editable digital twins; and intelligent layout engines designed for commercial planning long before a designer puts pen to paper.

This is being embraced in the consumer-home sphere, with the most visible examples coming from major retailers such as IKEA.

Although its brand, engineering and retail footprint are enormous, IKEA is treating AI not as a bolt-on, but as a means of re-engineering the shopping journey. Its Kreativ platform enables customers to scan any room, remove existing items digitally and drop in curated IKEA products at true scale, creating a design-to-checkout pipeline that transforms inspiration into purchase.

Similarly, Wayfair, the Boston-headquartered online home furnishings marketplace, has introduced Decorify, a generative-AI room restyler that takes a photo of any room and produces multiple shoppable redesigns. More recently it has embedded these experiences into spatial computing via platforms such as Apple Vision Pro, positioning itself as both a retail channel and a design engine.

Home Improvement
This shift is also being embraced by home improvement retail chains. In the US, Lowe’s uses AI visualisers in conjunction with assistant tools built with OpenAI to guide kitchen and bathroom planning, while The Home Depot applies AI-powered ‘see in your space’ tools across flooring, paint and finishes.

In these cases, the fundamental goal is consistent: de-risking big consumer decisions through immersive visualisation, then reducing friction from concept to conversion.

In the retail context, AI design is functioning as a conversion engine by reassuring customers, promoting coordinated product schemes, and shortening the path from idea to checkout.

Guest Experience
In hospitality, meanwhile, hotel chain Marriott is deploying AI to streamline operations and personalise guest experiences, and is currently piloting an AI-driven room-assignment tool that processes complex criteria in seconds.

The hotel chain has introduced an AI-powered ‘virtual concierge’ called Renai that combines human curator insight with generative AI to recommend local dining, activities and services. Deeper personalisation comes from AI analysis of patterns in behaviour and preferences, so that Marriott can anticipate needs, tailor offers and deliver a more seamless, relevant stay.

Hyperindividuality
“Personalisation is nothing new, but AI is taking things to the next level and ushering in an era of hyperindividuality,” says Ivonne Seifert, Messe Frankfurt’s marketing director. “AI learns and adapts, making everything faster and more scalable and enabling consumers to be much better informed in terms of what they really want.”

She cites the example of Germany’s Style3D/Assyst which, at the very start of the supply chain, is turning AI into a practical engine for how textiles are conceived, visualised and specified, work that is increasingly as relevant for home textiles as for apparel.

At the heart of the Style3D/Assyst platform is a suite of AI tools embedded in Style3D Studio that automate stitching, generate garments and trims, and enhance visual details, all from digital fabrics that behave like real materials in simulation. When these fabrics are upholstery cloths, curtain qualities or bed linens, designers can experiment with repeat sizes, constructions and finishes while seeing instantly how a fabric will drape over furniture or hang in a window, long before a physical sample is woven.

Style3D/Assyst’s workflow begins with fabric digitisation and AI-supported textile design, moves through 3D design environments, then connects back into 2D CAD, nesting and costing so that creative choices are always grounded in industrial reality. AI models help forecast colour and pattern directions, speed up fabric simulations and validate behaviour such as fullness and pleating, which are critical for curtains and decorative textiles. For brands and mills, this means faster collection development with fewer strike-offs and a clearer view of margin impact.

On top of this, Style3D is experimenting with fully automated design generation through iCreate and AI-driven visual content through iWish, which can create photorealistic room set imagery without traditional photography. Taken together, these capabilities allow Style3D/Assyst to reposition home textile design as a digital, data-informed process where AI augments the designer’s eye, compresses lead times and supports more sustainable decision-making.

Convergence
An interesting shift now is that the home and contract design worlds are converging. Consumer tools are acquiring richer product data, accurate geometry and cost metadata, moving closer to early-stage specification workflows. Contract tools are gaining more generative and visual capability so that presentation and client alignment become easier. What emerges is a pipeline where scanning, zoning, first-pass layout and material palettes are handled by AI, with human designers then applying their expertise to narrative, brand, texture, tactility and long-term performance.

Visibility
For businesses supplying products for interiors, whether they are furniture manufacturers, flooring specialists, lighting companies or textile suppliers, visibility in these AI environments will rapidly become as important as showroom visibility.

If a textile manufacturer, for example, can supply accurate 3D-asset material files, performance data and sustainability metadata, its products are more likely to be included in an early AI-supported scheme.
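To make that concrete, the sketch below shows one hypothetical shape such a machine-readable product record could take, bundling geometry, performance and sustainability data for an AI specification tool. The field names, values and URL are invented for illustration and do not correspond to any published platform schema.

```python
from dataclasses import dataclass, field

@dataclass
class TextileAssetRecord:
    """Hypothetical machine-readable product record for an AI specification tool.

    Field names are illustrative only; real platforms define their own schemas.
    """
    sku: str
    name: str
    model_3d_url: str            # e.g. a glTF/USD file with material maps
    martindale_rubs: int         # abrasion resistance (upholstery durability)
    fire_rating: str             # e.g. "EN 1021-1/2" or "BS 5852 Crib 5"
    acoustic_absorption: float   # weighted absorption coefficient, 0.0-1.0
    recycled_content_pct: float  # share of recycled fibre by weight
    certifications: list[str] = field(default_factory=list)

# A sample entry as it might be exposed to a layout or specification engine
record = TextileAssetRecord(
    sku="HX-1234",
    name="Wool-blend upholstery, slate",
    model_3d_url="https://example.com/assets/hx-1234.gltf",
    martindale_rubs=60000,
    fire_rating="EN 1021-1/2",
    acoustic_absorption=0.35,
    recycled_content_pct=42.0,
    certifications=["OEKO-TEX Standard 100"],
)
print(record.name, record.fire_rating)
```

In practice each platform will define its own schema; the point is simply that richer, structured data makes a product easier for an AI engine to find and place in a scheme.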

The contract home textiles industry will find itself at the heart of this transformation. The fabrics, finishes and backings of the soft-furnishing world will feed into AI workflows and digital twin libraries rather than only analogue sample sets.

The suppliers who adapt quickly will find themselves embedded into the new digital fabric of the interiors market. Those who are slower may find that by the time the human specification meeting happens, many decisions have already been generated by AI tools.

Haptics
There are, however, still limitations to what AI can do, not least in conveying what textiles actually feel like, whether on the body or around the home, and much work continues in the field of so-called ‘haptics’.

At Berlin Science Week in Germany in November 2025, for example, the Fraunhofer Institute for Reliability and Microintegration IZM and the WINT Design Lab unveiled textiles that can respond to touch and interact intelligently with their environment, as part of the Soft Interfaces project.

A special textile lampshade was unveiled in Berlin, based on a newly developed and fully printable liquid metal ink made with Galinstan, an alloy of gallium, indium and tin.

The electrically conductive ink is covered in highly elastic thermoplastic polyurethane (TPU) and can be laminated onto knitted textiles to create surfaces that are not just functionally usable, but also flexible, stretchable and pleasing to both the touch and eye.

The 3D-printed lamp, with its clean lines and LEDs, has no intrusive switches or buttons; a light touch of the fingertip is enough to dim it, change its colour or switch it on and off, with the different functions triggered by subtle differences in the knitted pattern.

As a result, the fabric itself becomes the user interface, opening up a whole new dimension of interactivity.

The viscous LMI (liquid metal ink) can be printed onto elastic substrates to create structures that work like resistive strain sensors. Gentle pressure is enough to change the material’s resistance, and that change is registered as an input.
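As a rough illustration of the principle, and only that, the sketch below shows how a change in a printed strain sensor’s resistance might be turned into a lamp command: the resistance is inferred from a voltage divider, and a deviation from the baseline is treated as a touch. The supply voltage, resistor values, threshold and read_sensor_voltage() placeholder are all assumptions, not details of the Fraunhofer IZM prototype.

```python
# Minimal sketch: turning a printed resistive strain sensor reading into a lamp
# command. read_sensor_voltage() is a placeholder for an ADC read on real
# hardware, and all constants below are assumed values, not measured ones.

V_SUPPLY = 3.3          # supply voltage across the divider (volts)
R_FIXED = 10_000.0      # fixed resistor, wired from supply to midpoint (ohms)
R_BASELINE = 12_000.0   # sensor resistance with no pressure (ohms, assumed)
TOUCH_THRESHOLD = 0.15  # fractional resistance change treated as a touch

def read_sensor_voltage() -> float:
    """Placeholder for an analogue read of the divider midpoint."""
    return 2.1  # volts; a real implementation would sample an ADC channel

def sensor_resistance(v_mid: float) -> float:
    """Infer sensor resistance (midpoint to ground) from the divider voltage."""
    return R_FIXED * v_mid / (V_SUPPLY - v_mid)

def detect_touch() -> bool:
    """Report a touch when resistance deviates enough from its baseline."""
    r = sensor_resistance(read_sensor_voltage())
    return abs(r - R_BASELINE) / R_BASELINE > TOUCH_THRESHOLD

if detect_touch():
    print("Touch detected: dim lamp")  # stand-in for a command to the LED driver
```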

Fabric Aesthetics
One of the biggest challenges faced by designers and manufacturers is describing and sharing information about fabric aesthetics before manufacturing, or without the costly and time-consuming process of transporting physical samples.

The elusive concept of ‘fabric handle’, the tactile sensation experienced when touching and manipulating fabric, is a critical aspect of textile evaluation, but to date its assessment has been highly subjective.

“No two people will describe how a fabric feels in the same way and the lack of a common language to describe fabric tactility poses communication challenges across the complex global fashion, home furnishings and textile supply chains,” says Sean O’Neill, managing director of UK-based textile technology company Roaches International. “How do you objectively measure qualities like softness, smoothness, drape and stiffness?”

In a bid to address this, Roaches has developed the Sentire fabric handle tester, working with specialists at the University of Leeds. The finished-fabric evaluation system defines the tactile properties of fabrics via a haptic spatial system, in much the same way that colour charts digitally define colour palettes or Tog values rate warmth.

Fabric samples placed into the Sentire undergo a series of tests that generate quantitative tactile property data, akin to a fingerprint for the fabric, which can then be compared against other samples and communicated digitally to partners in different locations.
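In effect, the workflow compares multidimensional tactile ‘fingerprints’. The minimal sketch below illustrates how two such fingerprints might be compared numerically; the attribute names, values and tolerance are invented and do not reflect Sentire’s actual measurement set or data format.

```python
import math

# Hypothetical tactile fingerprints: each attribute normalised to a 0-1 scale.
# The attribute names and values are illustrative, not Sentire's actual outputs.
reference = {"softness": 0.82, "smoothness": 0.67, "drape": 0.74, "stiffness": 0.21}
candidate = {"softness": 0.79, "smoothness": 0.70, "drape": 0.69, "stiffness": 0.25}

def tactile_distance(a: dict, b: dict) -> float:
    """Euclidean distance between two tactile fingerprints (lower = more alike)."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

distance = tactile_distance(reference, candidate)
print(f"Tactile distance: {distance:.3f}")
print("Acceptable match" if distance < 0.1 else "Outside tolerance")
```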

“This technology has the potential to impact the supply chain in a similar way to the spectrophotometer for the communication of colour,” says O’Neill. “We have had a fantastic initial response to the Sentire and its possibilities are huge. Not only can it be used to compare textile tactility globally, but we are also seeing interest from online retailers who want to accurately display the way a particular fabric drapes.”

Human Resolution
In the US, Northwestern University engineers have meanwhile developed the first haptic device said to achieve ‘human resolution’, accurately matching the sensing abilities of the human fingertip.

Called VoxeLite, the ultra-thin, lightweight and flexible wearable device is said to recreate touch sensations with the same clarity, detail and speed as skin. The device gently wraps around a fingertip and, by combining high spatial resolution with a comfortable, wearable form factor, could transform how people interact with digital environments, including more immersive virtual reality systems.

“Touch is the last major sense without a true digital interface,” says Northwestern researcher Sylvia Tan. “We have technologies that make things look and sound real and now we need to make textures and tactile sensations feel real. Our device is moving the field towards that goal. We have also designed it to be comfortable, so people can wear it for long periods of time without needing to remove it to perform other tasks. It is like how people wear glasses all day and do not even think about them.”

Unsolved Problems
Despite decades of progress in high-definition video and true-to-life audio, digital touch has stubbornly lagged behind. Today’s haptic feedback, mostly simple smartphone vibrations, cannot convey the rich, detailed information the fingertips naturally perceive. This is partially because the skin’s spatial and temporal resolution is notoriously difficult to simulate.

“Think of very old motion pictures when the number of frames per second was really low, so movements looked jerky, which was due to low temporal resolution,” explains Northwestern professor of mechanical engineering J. Edward Colgate. “Or think of early computer displays where images were pixelated due to low spatial resolution. Nowadays, both problems are solved for graphical displays but not for tactile displays. VoxeLite has the ability to present haptic information to the skin with both the spatial and temporal resolution of the sensory system.”

Pixels of Touch
The device features an array of tiny, individually controlled nodes embedded into a paper-thin, stretchable sheet of latex. These soft nodes function like pixels of touch, each capable of pressing into the skin at high speeds and in precise patterns.

Each node comprises a soft rubber dome, a conductive outer layer and a hidden inner electrode. When a slight voltage is applied, it generates electroadhesion, the same principle that causes a balloon to stick to a wall after being rubbed.

VoxeLite applies electrostatic forces in a precise, controlled way to make each tiny node ‘grip’ a surface and tilt to press into skin. This generates a highly localised mechanical force, so each ‘pixel’ of touch pushes the skin on a fingertip. Higher voltages increase friction during movement, producing more pronounced tactile cues to simulate the feeling of a rough surface. On the other hand, lower voltages create less friction and, therefore, the sensation of a slipperier surface.
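That voltage-to-friction relationship lends itself to a simple rendering loop: each node is driven harder where the virtual surface should feel rougher. The sketch below illustrates the principle only; the grid size, texture map and voltage range are invented and do not describe VoxeLite’s actual drive electronics.

```python
import numpy as np

# Hypothetical 4x4 grid of electroadhesive nodes; values are target "roughness"
# in the range 0 (slippery) to 1 (rough). Map and voltage range are invented.
texture_map = np.array([
    [0.1, 0.1, 0.8, 0.8],
    [0.1, 0.1, 0.8, 0.8],
    [0.5, 0.5, 0.2, 0.2],
    [0.5, 0.5, 0.2, 0.2],
])

V_MIN, V_MAX = 0.0, 300.0  # assumed drive-voltage range (volts), illustrative only

def node_voltages(roughness: np.ndarray) -> np.ndarray:
    """Map target roughness linearly onto drive voltage: more voltage, more friction."""
    return V_MIN + roughness * (V_MAX - V_MIN)

voltages = node_voltages(texture_map)
print(voltages)  # per-node voltages commanded for one rendering frame
```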

“When swiped across an electrically grounded surface, the device controls the friction on each node, leading to controllable indentation on the skin,” says Colgate. “Past attempts to generate haptic effects have been big, unwieldy, complex devices, but VoxeLite weighs less than a gram.”

Virtual Textures
In a series of experiments, study participants wearing the device accurately and reliably recognised virtual textures, patterns and directional cues. People wearing VoxeLite identified direction patterns like up, down, left and right with up to 87 per cent accuracy and also identified real fabrics, including leather, corduroy and terry cloth, with 81 per cent accuracy.

Because it is extremely thin, soft and conforms to the skin, VoxeLite does not interfere with real-world tasks or block the natural sense of touch, enabling wearers to move seamlessly between real and digital experiences.

Such devices suggest that more fruitful online shopping experiences, where shoppers can feel textiles and fabrics before making purchases, are surely just around the corner.

Robotics
Parallel to such developments is work in robotics: while robots have long excelled at precision, consistency and endurance, they have remained remarkably limited in their ability to feel.

Touch is subtle, continuous and deeply contextual, and for decades it has been one of the great missing pieces in machine capability. The emergence of smart textiles is now beginning to close that gap. By clothing robots in flexible, sensor-rich fabrics, engineers are giving machines something far closer to a haptic sense, allowing them not merely to detect contact but to interpret it with nuance and purpose.

An essential shift lies in moving from isolated, rigid sensors towards integrated textile systems that mirror the sensitivity and adaptability of human skin. These fabrics are woven, knitted or laminated with conductive threads and soft polymers that respond to pressure, stretch and temperature. When a robot arm or torso is enveloped in such a textile, its entire surface becomes an active interface, capable of registering the slightest brush of a fingertip or the more assertive weight of an object being grasped. Because the textile is continuous rather than fragmented into discrete pads, the resulting data offers far richer spatial resolution. A robot can sense not just that something has touched it, but where, how firmly and with what pattern of movement.
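A minimal sketch of how such a continuous sensing textile might be interpreted is shown below: the fabric is read as a 2D pressure grid, and contact force, location and movement are derived from successive frames. The grid size, threshold and values are assumptions for illustration, not any particular robot skin’s firmware.

```python
import numpy as np

THRESHOLD = 0.05  # pressure (arbitrary units) below which a cell counts as no contact

def contact_summary(pressure: np.ndarray):
    """Return total force and the pressure-weighted contact centroid (row, col)."""
    active = np.where(pressure > THRESHOLD, pressure, 0.0)
    total = active.sum()
    if total == 0:
        return 0.0, None
    rows, cols = np.indices(active.shape)
    centroid = (float((rows * active).sum() / total),
                float((cols * active).sum() / total))
    return float(total), centroid

# Two consecutive frames from a hypothetical 8x8 sensing textile
frame1 = np.zeros((8, 8)); frame1[2, 2] = 0.6
frame2 = np.zeros((8, 8)); frame2[2, 3] = 0.6

force1, c1 = contact_summary(frame1)
force2, c2 = contact_summary(frame2)
if c1 and c2:
    drift = (c2[0] - c1[0], c2[1] - c1[1])
    print(f"force={force2:.2f}, contact moved by {drift} cells")  # (0.0, 1.0): sliding sideways
```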

Point of Contact
All of this may seem remote from the products on display at Heimtextil, yet the smart fabrics enabling a robot to modulate its grip or interpret subtle pressure are now beginning to inform how softness, resilience and sensory response are perceived within domestic and hospitality spaces.

And while textiles have long shaped the atmosphere of the home, in the world of contract interiors their role becomes both more exacting and more expansive.

In these environments the textile is often the first point of contact, the element that creates the room’s emotional temperature.

What is decorative in a private residence becomes a material of architectural consequence in hotels, workplaces and public buildings. Here fabrics must not only enhance the look and feel of a space but also meet rigorous demands for safety, durability and acoustic control, while supporting large-scale installation and maintenance programmes.

This duality is one of the reasons Heimtextil in Frankfurt has become such an important annual gathering point for the industry.

Upholstery fittings in lounges or executive offices may appear effortless, yet are engineered to remain inviting even under heavy use. Drapery in hotel suites must fall with grace while meeting stringent fire codes, and decorative fabrics in healthcare or educational settings need to introduce warmth without compromising hygiene or cleanability.

Many of the collections at Heimtextil can be viewed as material solutions that behave reliably, in terms of flame-retardant properties, acoustic benefits and dimensional stability, across hundreds of rooms and over many years of service.

Much of the textile architecture of a building, however, remains hidden, performing its duties quietly behind the visible layer. Nonwoven backings sit beneath carpets to suppress sound, stabilise floor finishes and improve thermal behaviour. Acoustic panels wrapped in technical fabrics shape the auditory character of meeting rooms, restaurants and atrium spaces. Furnishings rely on internal webs, scrims and fire-barrier layers to ensure comfort and safety, while the building envelope itself often incorporates textile elements in shading systems, gaskets and façade membranes. These materials are rarely remarked upon by occupants, yet they are instrumental in making contract interiors feel composed, hospitable and acoustically balanced.

Heimtextil 2026 will underline the fact that contract textiles have grown into a distinct discipline, one in which design, engineering and sustainability intersect with increasing precision.

Human Factor
In the end, what is emerging from the rise of AI is not a contest between human sensibility and machine intelligence but a more intricate choreography between the two.

AI is reconfiguring how spaces are imagined, specified and experienced, and compressing the distance between inspiration and execution while drawing ever closer to the elusive realm of touch, tactility and emotional nuance.

From generative design and digital twins to human-resolution haptics and smart textiles, the industry stands at a juncture where the virtual and the physical no longer sit in opposition but in dialogue.

Yet even as algorithms accelerate decision-making and reshape supply chains, the essence of interiors remains profoundly human, with the instinct to create comfort, character and atmosphere through materials that speak to the senses.

Heimtextil 2026 will not simply showcase new fabrics and technologies but will illuminate this evolving synthesis, where data meets drape, simulation meets sensation and digitalisation supports spaces that feel instinctively right.

In this convergence lies the future of contract textiles, a discipline defined not only by performance and efficiency but by its ability to translate innovation into environments that resonate, endure and quietly enhance the way people live.