How Ulta Beauty applies low-code, AI-powered dev approach


Know how you can go online, digitally paint and repaint a room in your home to see how it looks before you pull the trigger on the changeover? Well, now you can do something similar with your face. You can try on different kinds and shades of makeup virtually to see how it fits your personality before you buy it.

Nobody’s saying that your PC or the app you might be using – such as Ulta Beauty’s – already knows how you want to remake your facial appearance, but it can find out rather quickly, using input from users and a new low-code, AI-based back end that is getting a lot of traction for the company. It’s quicker and more efficient to do these tryouts on your PC in the privacy of your own home, rather than trudging downtown, waiting for an appointment, then trying on and wiping off lipsticks, foundations, and eyeliners. Fragrances, naturally, require a different approach.

Chicago-based, publicly held Ulta, which has steadily gained market share during the last five years, has put several low-code-powered innovations into place that it claims have changed the routine for retail customers. This development strategy, enacted by VP of digital innovation Michelle Pacynski, has enabled the company to bring customer-facing web apps to production far faster than conventional hardcoding. The technical backbone of the strategy has been Iterate.ai’s Interplay platform, a LEGO-like assembler of 475 pre-built modules for AI, voice, chatbots, headless ecommerce, and other components. Developers have a virtual grab bag of tools, components, and additional functionality that they can simply drag and drop into the application and try out ahead of time – similar to the way customers use the app to upgrade their faces.

Ulta’s turn to low-code to rapidly build virtual makeup try-on applications fueled by AI became a critical competitive differentiator in speed-to-market as the pandemic unfolded. Importantly, Pacynski told VentureBeat, Ulta has been able to do this by upskilling its existing dev and digital innovation teams versus spending on expensive outside dev talent.

“One of the biggest assets of this low-code strategy has been the ability to get data-driven testing/iterations of applications done 10 times faster than if we were using a traditional development process,” Pacynski said. “Low-code has been a key enabler to help us experiment with emerging technologies (AI, ML, AR, VR, IoT, etc.) to bring new experiences to our customers.”

Pacynski was an early adopter in this sector of low-code development, bringing it into the Ulta ecosystem about four years ago. Some of the specific low-code apps that Ulta has deployed include the following:

  • GLAMlab (virtual beauty tool that uses AR)
  • Guest services chatbot
  • Curbside pickup that became essential as the pandemic unfolded
  • Other applications that use low-code-enabled technologies, such as AI, data connectors, and AR

Ulta Beauty is a major player in the retail cosmetics industry, with about 9% of the overall market, according to Statista Research. It was founded in 1990, operates 974 stores across the United States, and had revenue of about $7.4 billion in 2019.

How the AI is implemented

To help technologists, data architects, and software developers learn more about how to use AI, VentureBeat asked Pacynski the following questions; she offered our readers these details:

VentureBeat: What AI and ML tools are you using specifically? 

Pacynski: We primarily use TensorFlow, Keras, scikit-learn, PyTorch, Rasa, ClearML, and Vertex AI (within our Data Science and Computer Vision teams).

VentureBeat: Are you using models and algorithms out of the box — for example, from DataRobot or other sources?

Pacynski: We are not using any out-of-the-box models. We primarily use standard ML algorithms and customize them (fine-tune the hyper-parameters, modify weights of various parameters in the ML model(s) based on our data, etc.) based on our needs.
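Customizing a standard algorithm by tuning its hyper-parameters, as Pacynski describes, can be sketched with scikit-learn (one of the libraries she names). This is a minimal, hypothetical illustration — the dataset, model choice, and parameter grid are stand-ins, not a description of Ulta's actual pipeline:

```python
# Hypothetical sketch: hyper-parameter tuning of a standard ML algorithm
# with scikit-learn. Data and grid values are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Stand-in data; a real pipeline would load domain-specific features.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Search a small grid of hyper-parameters with 3-fold cross-validation.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [4, 8]},
    cv=3,
)
grid.fit(X_train, y_train)

print(grid.best_params_)           # best combination found on the grid
print(grid.score(X_test, y_test))  # accuracy on held-out data
```

The same pattern — pick a standard estimator, then search over its knobs against your own data — applies whether the underlying library is scikit-learn, Keras, or PyTorch.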

VentureBeat: What cloud service are you using mainly? 

Pacynski: We use Google Cloud Platform.

VentureBeat: Are you using a lot of the AI workflow tools that come with that cloud? 

Pacynski: We use Vertex AI and TensorFlow as part of Google Cloud Platform.

VentureBeat: How much do you do yourselves? 

Pacynski: We pretty much do everything ourselves. Our Data Science and Computer Vision teams implement ETL pipelines, data mining, model building, and so on themselves.

VentureBeat: How are you labeling data for the ML and AI workflows? 

Pacynski: We have an internal team that helps us with data labeling, but we also use a couple of third parties to scale up, based on the amount of data we need labeled. We just signed an agreement with Cloud Factory, a vendor partner that specializes in this space and has a complete platform for data labeling.
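When labels come from both an internal team and outside vendors, the sets have to be reconciled before training. The sketch below shows one common approach — majority voting across sources; the item IDs, label names, and dictionary format are assumptions for illustration, not Ulta's or any vendor's actual schema:

```python
# Hypothetical sketch: merging labels from multiple sources by majority vote.
from collections import Counter

def merge_labels(*label_sets):
    """Combine per-item labels from several sources; majority vote on conflicts."""
    votes = {}
    for labels in label_sets:
        for item_id, label in labels.items():
            votes.setdefault(item_id, Counter())[label] += 1
    return {item_id: counter.most_common(1)[0][0]
            for item_id, counter in votes.items()}

# Illustrative label sets from an internal team and two vendors.
internal = {"img_001": "lipstick", "img_002": "foundation"}
vendor_a = {"img_001": "lipstick", "img_002": "eyeliner", "img_003": "lipstick"}
vendor_b = {"img_002": "foundation"}

print(merge_labels(internal, vendor_a, vendor_b))
# → {'img_001': 'lipstick', 'img_002': 'foundation', 'img_003': 'lipstick'}
```

Real labeling platforms typically add reviewer agreement metrics and escalation for ties, but the core reconciliation step looks like this.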

VentureBeat: Can you give us a ballpark estimate on how much data you are processing? 

Pacynski: Approximately 1–2TB.
