Lux Research recently spoke with Mike Festa, Director of Wayfair Next at Wayfair, about how the online retailer is innovating on its customers’ online shopping experience. Wayfair offers over 7 million furniture and home décor products through its website, and started the Wayfair Next R&D lab to bring part of the brick-and-mortar shopping experience to online shoppers through visualization and ease of interaction. The lab was created to digitize Wayfair’s extensive product catalog using 3D scanning and to build augmented reality (AR) and virtual reality (VR) customer experiences from the results. Doing so required 3D scanning hardware, the associated software and expertise, and developers to create the customer-facing AR and VR applications for desktop and mobile platforms. To redefine the online shopping experience and let people visualize products in their own homes, Wayfair needed to overcome the sizeable engineering hurdles of creating 3D models of millions of products.
Wayfair Next required a technology workflow that could capture photos or scans of complex objects, manipulate and post-process the images, and render them as photorealistic 3D models in Wayfair-created AR and VR environments. Mike said the need for photorealistic versions of Wayfair products ruled out many off-the-shelf scanners, which were prohibitively expensive and did not adapt easily to different scanning scenarios. Mike and his team turned instead to photogrammetry, a technique in which photos taken from many angles are stitched together to create a 3D model. Although generally less accurate than laser scanning, photogrammetry better captured the visual appeal needed to sell home goods and furniture online. 3D scanning startups using this technique as part of their offerings include Eora 3D and Fuel3D. In addition to producing photorealistic results, the overall workflow needed to be automated to the point that operators required no technical training. With these goals in mind, Wayfair Next built its own hardware solution from auto-programmed digital SLR cameras, lighting equipment, and turntables for less than the cost of a high-end handheld 3D scanner. For software, the team built a data workflow that combined Agisoft PhotoScan photogrammetry with custom processing steps to prepare models for the WayfairView AR smartphone app and the Patio Playground VR app for the Oculus Rift platform. Within a year of starting the project, Wayfair Next had a low-cost, scalable 3D scanning workflow producing about 100 photorealistic 3D product models every week.
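To make the capture-and-reconstruct workflow concrete, the sketch below shows how an automated turntable capture plan might be orchestrated. Wayfair Next's actual pipeline is not public, so every name here is hypothetical, and the photogrammetry and post-processing tools (such as Agisoft PhotoScan) are represented only as placeholder stages.

```python
"""Illustrative sketch of an automated photogrammetry capture workflow.

All names are hypothetical; external tools (camera control, Agisoft
PhotoScan reconstruction, mesh post-processing) are stubbed out.
"""
from dataclasses import dataclass


@dataclass
class CapturePlan:
    """Turntable positions and camera elevations for one product scan."""
    shots_per_revolution: int
    elevations_deg: tuple  # camera angles above the turntable plane

    def turntable_angles(self):
        # Evenly spaced rotations so adjacent photos overlap enough
        # for the photogrammetry software to stitch them together.
        step = 360.0 / self.shots_per_revolution
        return [round(i * step, 2) for i in range(self.shots_per_revolution)]

    def total_photos(self):
        # One full revolution is captured at each camera elevation.
        return self.shots_per_revolution * len(self.elevations_deg)


def run_pipeline(plan: CapturePlan) -> dict:
    """Run the (stubbed) capture -> reconstruct -> export stages."""
    photos = [f"shot_e{e}_a{a}.jpg"
              for e in plan.elevations_deg
              for a in plan.turntable_angles()]
    # In the real workflow these photos would feed a photogrammetry
    # package and custom mesh post-processing before AR/VR export.
    return {"photos": len(photos),
            "stages": ["capture", "reconstruct", "postprocess", "export"]}


plan = CapturePlan(shots_per_revolution=24, elevations_deg=(15, 35, 55))
result = run_pipeline(plan)
print(result["photos"])  # 72 photos per scan under this plan
```

The design point is that all scanning knowledge lives in the plan and the pipeline, not in the operator, which is what allows a workflow like this to run without technical training.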
There is a second strategic angle to Wayfair’s digitization effort beyond the customer-facing features of visualizing furniture placement in AR and VR. Wayfair has opened an API (application programming interface) that offers 3D models of over 10,000 home goods products to the developer community. As more AR and VR consumer devices and experiences become available, developers will look to free sources of 3D content, much as stock photo databases are used today for online and print media. This availability lowers the barriers to product placement in VR, a medium only now beginning to reach mainstream consumers. If the approach works at scale, giving away 3D models will increase product visibility in a new medium and become a future driver of sales. Wayfair has positioned itself as an innovative online retailer by creating a team that innovated at the hardware level and adopted emerging technologies at the software level to build a future sales channel. Although it is too early to judge the impact of this strategy, readers with long-tail business models for physical products should think critically about how to leverage product data to engage customers, and which platforms can enable that engagement.
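As a rough illustration of how a developer might consume such a 3D-model catalog, the sketch below parses a catalog payload and filters it by product category. The endpoint shape, field names, and SKUs are invented for illustration and do not describe Wayfair's real API; a stubbed JSON string stands in for a live HTTP response.

```python
"""Hypothetical sketch of consuming a 3D-model catalog API.

The response shape and field names below are invented; a real client
would GET this payload from the catalog endpoint instead.
"""
import json

# Stubbed JSON response standing in for a live HTTP call.
STUB_RESPONSE = json.dumps({
    "models": [
        {"sku": "CHAIR-001", "category": "seating", "format": "obj"},
        {"sku": "LAMP-314", "category": "lighting", "format": "obj"},
        {"sku": "SOFA-271", "category": "seating", "format": "fbx"},
    ]
})


def models_by_category(raw_json: str, category: str):
    """Parse the catalog payload and keep SKUs in one category."""
    payload = json.loads(raw_json)
    return [m["sku"] for m in payload["models"] if m["category"] == category]


print(models_by_category(STUB_RESPONSE, "seating"))  # ['CHAIR-001', 'SOFA-271']
```

The point of the sketch is the low barrier to entry: a developer building an AR or VR scene only needs to fetch, filter, and load assets, which is why free 3D content catalogs can spread product placement the way stock photo libraries did for 2D media.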
By: Dayton Horvath