Visie is a system designed to help the blind community gain independence in grocery stores. It is the culmination of my thesis on blindness and the pain points associated with it. To view the full documentation of the research, head over to blind-world.co
To summarize, my research included exploration of what blindness is, its causes, how people cope with it, and how they learn. I realized that our society is built with a bias toward those with sight. Everything we interact with, whether it be our phone, a packaged item, or signage, has been designed with the presumption that the ‘interactee’ is able to see. With that conclusion in mind, by the end of the research phase of my thesis I had decided on repurposing everyday items to be user friendly for the blind community.
In the process of figuring out which objects would be best to repurpose, I stumbled upon a design for a braille-embossed milk carton. I started recreating the milk carton in Google SketchUp in order to get an understanding of how to set braille text and to practice working in a 3D environment.
Working on the milk carton got me wondering whether having braille on the packaging of the products was enough to help. It made me wonder how people would get to the products in the first place. This line of questioning led me down the path of looking at how supermarkets are laid out and how they could be bothersome for someone without sight.
Further research showed that supermarkets were hard to maneuver around because all the signage was visually based, could be very confusing because items would not always be on the same shelf, and were designed to keep people in — all characteristics that made it very hard for someone without sight to operate efficiently inside one. What I also realized from my research was that solving this would require resources and time that I did not have, so instead of pivoting to work on store layouts I decided to keep my focus on redesigning the item packaging.
From working on the 3D model of the milk carton, I had realized that recreating the entire packaging for the items was going to be a waste of time, so instead I decided to create label plates that would be applied to existing packaging.
After designing and 3D printing the label plate, I realized that 3D printing was not the way to go. After a crit with my thesis professor, we decided that a transparent plastic sheet, like acetate paper, would be better suited for this project.
I had also realized that there was a vast variety of packaging sizes, and considering the rules of setting braille text, not all of it would fit on the smaller to medium sized items. This meant that the information on the label had to be modular. Below is the final form of the label.
The label was designed with flexibility in mind and was divided into three sections.
The first section is divided into two parts: section 1a consists of the product name, the brand, the expiry date, and the quantity inside the packaging, while section 1b has a call to action for the partnered mobile application (more on that later). Section one would be present on all packaging regardless of size. Section two informs the reader of the nutritional facts and appears on medium to large packaging, while section three lists all of the ingredients and would be present on larger packaged items such as one-litre milk cartons. At the top of the label are arrows and instructions to help store staff apply the label right side up, and at the bottom is the name of the product the label belongs on.
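The size-based modularity described above boils down to a simple rule: pick which label sections to print based on the packaging size. Here is a minimal sketch of that rule; the section descriptions and size categories are my own illustrative assumptions, not part of the actual label specification.

```python
# Hypothetical sketch of the modular label logic: smaller packaging
# carries fewer sections, larger packaging carries them all.
LABEL_SECTIONS = {
    "small": [
        "1a: product name, brand, expiry date, quantity",
        "1b: call to action for the Visie app",
    ],
    "medium": [
        "1a: product name, brand, expiry date, quantity",
        "1b: call to action for the Visie app",
        "2: nutritional facts",
    ],
    "large": [
        "1a: product name, brand, expiry date, quantity",
        "1b: call to action for the Visie app",
        "2: nutritional facts",
        "3: ingredients",
    ],
}

def sections_for(package_size: str) -> list[str]:
    """Return the braille label sections that fit a given packaging size."""
    return LABEL_SECTIONS[package_size]
```

For example, `sections_for("small")` yields only section one (parts 1a and 1b), which every item carries, while `sections_for("large")` adds the nutritional facts and ingredients.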
Lastly, a QR code is printed onto the label and works in conjunction with section 1b, directing the reader to scan the label with the previously mentioned partner mobile application, Visie. Its purpose is to give the user the information that won’t fit on the smaller and medium sized packaging, and the app is designed to pick up from where the label ends.
If the label was the appetizer, then Visie is the main course; this system wouldn’t work without it. Once I had realized that the labels on their own wouldn’t be enough for the system to work, I turned to the accessibility mode on my phone to see how it made a visual interface work for a user who may not be able to see. As soon as I turned accessibility mode on, my phone started announcing every interaction I had with it; I lasted two minutes before turning it off because of the sensory overload. While it was too much information for me, for someone without sight the accessibility mode was perfect.
After figuring out what the interaction for Visie would look like, or in this case sound like, it was time to determine the actual interactions. I decided to keep it as purely gesture-based as I could, using swipes and taps to navigate through the application as it spoke aloud the effects of the user’s interactions. Each interaction (play, pause, scanning, skip) also has a sound associated with it for further confirmation for the user.
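The interaction model above can be sketched as a lookup from gesture to action, where each action triggers both a spoken announcement and a confirmation sound. This is a minimal illustration only; the specific gesture names, sound file names, and the stubbed speech and audio functions are my own assumptions, not the app's actual implementation.

```python
# Hypothetical sketch of Visie's gesture-based navigation: each gesture
# resolves to an action, and each action has an associated sound cue
# played alongside the spoken announcement for confirmation.
GESTURE_ACTIONS = {
    "swipe_right": ("skip", "skip.wav"),
    "swipe_left": ("previous", "previous.wav"),
    "single_tap": ("play_pause", "toggle.wav"),
    "double_tap": ("scan", "scan.wav"),
}

def speak(text: str) -> None:
    """Stand-in for the app's text-to-speech announcement."""
    print(f"speaking: {text}")

def play_sound(filename: str) -> None:
    """Stand-in for playing the action's short audio cue."""
    print(f"playing: {filename}")

def handle_gesture(gesture: str) -> str:
    """Resolve a gesture to its action, announce it, and play its cue."""
    action, sound = GESTURE_ACTIONS[gesture]
    speak(action)
    play_sound(sound)
    return action
```

So a double tap, for instance, would resolve to the scan action, announce it aloud, and play the scan cue — giving the user two independent channels of confirmation.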