VDF @ Wear2Find

There is no easy, interactive, and fun way to find and buy clothes online. We are Wear2Find, and we are creating the next-generation 5G-enabled, AI-driven fashion marketplace.

  • 472 Raised
  • 17 Views
  • 6 Judges

Categories

  • 5G - The Network of the Future

Description

5G - The Network Of The Future

Wear2Find @ WeARTeam 


DESCRIPTION


5G is poised to power the next generation of internet connectivity, driving significant growth in mobile e-commerce; some researchers estimate that it will produce an additional $12 billion in revenue for retailers in just three years[1]. Faster and more reliable than any previous mobile network, 5G will enable people to communicate, consume entertainment, and shop at unprecedented speeds.


Beyond raw speed, 5G networks will enable groundbreaking online retail technologies that leverage augmented reality (AR), virtual reality (VR), artificial intelligence (AI), and machine learning (ML).


Imagine getting the best parts of the brick-and-mortar shopping experience without ever setting foot in a store. You're not simply scrolling through a website; you're "looking at the shelves" of a fashion retailer through your mobile phone, and you can even use a virtual fitting room to try on outfits from the comfort of your couch.


This is the future of shopping with Wear2Find.


Our idea is to build the fashion marketplace of the future. We will use AI-driven systems that allow customers to upload a picture of an item they want to buy, and our system will find similar ones. In addition, a discovery feature will let users find clothes they like by swiping right or left according to their taste. Our algorithm will then narrow the choices and show only the items that match the customer's taste. This feature will improve over time, since the system learns from each customer.


To make it fun and interactive, we will build an AI system that recognizes hand movements, so customers can swipe with a wave of the hand for a better experience. A virtual fitting room makes it possible for customers to try on their favorite clothes without leaving the couch, much like the fitting experience of a brick-and-mortar shop.


As a global marketplace, Wear2Find will let customers see themselves wearing items from different stores at the same time and buy them all at once. And if they don't like an item, they can swipe it away with their hands.


For this to be possible, we will establish commercial agreements with the brands, which will upload their items onto our platform. Every time a customer buys an item, we will collect a small transaction fee; this is our business model.

 

Regarding the state of the project, we are still in the development phase, with preliminary versions of the hand-swipe feature and the matching system already built.

 

[1] According to "A Mobile-First World", a report from Adobe Digital Insights


MOTIVATION


If you have shopped online before, chances are you have already had a negative experience. Typical online fashion shopping is tedious: cluttered with thousands of filters, boring, unintuitive, and very time-consuming. It also doesn't let you try on the items you want, which made us realize the shopping experience could be significantly improved.


With Wear2Find, we intend to dramatically improve the fashion e-commerce market with an innovative approach that takes advantage of the groundbreaking technologies enabled by 5G networks. We want our customers to have an exciting, unique, and fun experience. We chose TecStorm because it gives us greater visibility to help bring our project to life.


HOW YOU WILL MAKE IT HAPPEN


Our platform will be a progressive web app, with the front-end built in ReactJS and the back-end in Python. For some features, including the matching system, we will use segmentation models followed by autoencoder architectures that let us compare clothing items. For the gesture interface, we will use a CNN in conjunction with an RNN to track and classify the movement of the user's hand. The virtual try-on will reuse the same segmentation model with a different autoencoder architecture; augmented reality will then let you visualize yourself wearing your favorite clothing items. As a result, we don't need any physical materials to build a prototype.
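As an illustration of the matching step only: once the segmentation + autoencoder pipeline reduces each catalogue item to an embedding, matching an uploaded photo becomes a nearest-neighbor search over those embeddings. The 3-D vectors, item names, and `find_similar` helper below are stand-ins, not our production code:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def find_similar(query_embedding, catalogue, top_k=2):
    """Return the names of the top_k catalogue items closest to the query embedding."""
    scored = sorted(catalogue.items(),
                    key=lambda kv: cosine_similarity(query_embedding, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:top_k]]

# Hypothetical 3-D embeddings standing in for autoencoder outputs.
catalogue = {
    "red dress":  [0.9, 0.2, 0.1],
    "blue jeans": [0.1, 0.8, 0.6],
    "red skirt":  [0.8, 0.3, 0.1],
}
query = [0.85, 0.25, 0.1]   # embedding of the customer's uploaded photo
matches = find_similar(query, catalogue)
```

A query photo whose embedding sits near the two red items retrieves both of them and skips the jeans; in production the embeddings would be high-dimensional and the search would use an approximate nearest-neighbor index rather than a full sort.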


OPTIONAL VIDEO

