DigitalBridge: Early applied AI in retail

Role: Head of Product
Focus: Product strategy, UX, product delivery


Overview

DigitalBridge began as a computer-vision startup tackling a simple question: “Why can’t I see how new décor will look in my room?”

The first product used AI to interpret a photo of a customer’s room and let them apply wallpaper, paint, or furniture virtually, years before “AR try-on” was mainstream. It combined deep research in geometry detection, image segmentation, and photorealistic rendering with the ambition of bringing interior design into the browser.

The project was featured by NVIDIA and TechCrunch as an early example of applied AI in retail.

An early version of the iOS app.

Impressive technology, ugly interface, terrible interior taste!


Joining DigitalBridge

In 2016 I joined as Head of Product, effectively the company’s first product manager. At the time, the team consisted of around 12 engineers and a CEO, with no designers or other product roles. The culture was very technical, and part of my role was to bring design thinking and user empathy into the process.

At this point, the product was an iOS app prototype that let users photograph their room and visualise home décor changes using AI-based image analysis. Photos were sent to AWS for processing, where the system detected vanishing lines and segmented the image into walls, floors, ceilings, and objects.

Because the AI couldn’t perfectly detect room geometry, we designed a UI that allowed users to drag and fit a virtual cube overlay to match the shape of their room. This provided spatial understanding, letting users overlay paint, wallpaper, flooring, and 3D furniture in realistic lighting conditions.
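The final compositing step can be illustrated with a small sketch. This is not the production pipeline (which ran server-side on AWS and is not public); it simply assumes an upstream model has produced a binary wall mask for the photo, and blends a paint colour into the masked pixels while keeping some of the original shading so lighting still looks plausible:

```python
import numpy as np

def preview_paint(photo, wall_mask, paint_rgb, blend=0.85):
    """Recolour the masked wall pixels of a room photo with a paint colour.

    Illustrative only: assumes `wall_mask` comes from an upstream
    segmentation model. Keeping (1 - blend) of the original pixel
    preserves shadows and highlights on the repainted wall.
    """
    out = photo.astype(np.float32).copy()
    paint = np.asarray(paint_rgb, dtype=np.float32)
    m = wall_mask.astype(bool)
    out[m] = blend * paint + (1.0 - blend) * out[m]
    return out.astype(np.uint8)
```

The same masking idea extends to wallpaper and flooring, where a perspective warp of the texture replaces the flat colour before compositing.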

Target customers were home improvement retailers, but early sales calls made it clear that retailers did not believe shoppers were likely to install an app just to visualise their room. We made the difficult but correct decision to rewrite the product for web, using early compile-to-JavaScript technologies to bring the 3D experience directly into the browser.


From AI research to commercial product

As we transitioned to a web-based experience, my focus was on bridging research and product:

  • Translating AI and computer-vision outputs into interfaces that were intuitive for non-technical users
  • Working closely with the Head of AI and engineering teams to turn early research into features that could run in production
  • Leading conversations with retailers to validate use cases and integration approaches
  • Shaping the core user flow for photo upload, geometry adjustment, and virtual product overlay
  • Designing the user interface, and working closely with the dev team to implement it correctly
  • Conducting user research to define home décor jobs-to-be-done, and designing flows and features to match them
The web version of the app.

The result was one of the first browser-based virtual try-on tools for home décor, enabling users to upload a photo and see real products from partner retailers in their own rooms.

As The Next Web put it at the time, the experience was "impressively accurate and fast for what it does", highlighting how far ahead of its time the product was.

Strategically, the move from iOS to web was pivotal. It didn’t just improve usability, it unlocked retailer adoption, allowing integration into e-commerce websites without requiring shoppers to install an app.

The product launched on B&Q's website and proved that real-time visualisation could meaningfully increase buyer confidence for home décor purchases like paint and wallpaper.


From photo app to 3D bathroom planner

Through close collaboration with Kingfisher, B&Q's parent company, we discovered a bigger commercial opportunity: bathroom renovation. While décor items like paint or wallpaper had modest order values, complete bathroom projects were worth far more.

When renovating a bathroom, users would typically start from scratch rather than modify an existing room, so the photo app was a poor fit.

This led us to design and build a 3D bathroom planner that allowed online shoppers to:

  • Draw their own floorplan and add existing doors and windows to create a new 3D space
  • Browse and filter the retailer’s catalogue, searching for products such as showers, baths, and tiles
  • Add products into the 3D space and move them into the optimal place
  • Configure products with options like tap placement, size, and colour, and preview changes in real-time
  • Generate a 360° high-quality render for a realistic view of the finished bathroom
  • Add all the products in a scene directly to their online basket for purchase
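The flow above implies a simple scene model underneath: a floorplan, a set of placed and configured products, and an export to the retailer's basket. A rough sketch, with all names and fields being illustrative assumptions rather than the production schema:

```python
from dataclasses import dataclass, field

@dataclass
class PlacedProduct:
    """A catalogue product placed in the 3D scene (illustrative model)."""
    sku: str                # retailer catalogue identifier
    position: tuple         # (x, y) in metres on the floorplan
    rotation: float = 0.0   # degrees
    options: dict = field(default_factory=dict)  # e.g. {"tap": "left"}

@dataclass
class BathroomScene:
    """A user's bathroom design: floorplan plus placed products."""
    floorplan: list                       # wall corner points, (x, y) metres
    products: list = field(default_factory=list)

    def add(self, product: PlacedProduct):
        self.products.append(product)

    def basket(self):
        """Flatten the scene into SKUs for the retailer's online basket."""
        return [p.sku for p in self.products]
```

The one-step `basket()` export is the commercially important part: it is what let shoppers buy every product in a finished design without re-finding each item in the catalogue.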
The 3D bathroom app on B&Q's website.

At this point the company started scaling quickly, and my role shifted from hands-on product work to more leadership and management. We added more engineers, a designer and a couple of PMs, and I spent a lot of time in London working with Kingfisher, our design partner.

My focus was keeping everything moving: staying close to our teams, keeping the client informed, and making sure what we were building would really work for shoppers.

The bathroom design tool was deployed across multiple major retailers including B&Q, Castorama, Victorian Plumbing, and Victoria Plum (who were arch-rivals at the time).

Retailers saw longer engagement times, higher average order values, and more positive Trustpilot feedback from shoppers who used the planner.

This success led to DigitalBridge raising a £3m round.


My time at DigitalBridge taught me how to operate in a fast-moving, highly technical environment, building enough understanding of complex 3D and AI systems to make good product and business decisions.

It was also where I learned how to lead and align a rapidly growing, engineering-heavy team, introducing structure, communication channels, and processes without slowing momentum.