Roads are not safe for cyclists. In this report, we discuss the extent of this social challenge and innovative means of counteracting cyclist risk. To that end, we consider the following question: How might we increase driver and cyclist awareness in order to improve cyclist safety? To answer it, we discuss user needs, pain points, and ways to enhance cyclist accessories. Through this exploration and human-centered design, we present design research and ideation for a fully integrated wearable device that senses approaching cars via sonar sensors, vibrates to warn the rider, and flashes LED lights to warn drivers.
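The sense-then-warn behavior described above can be sketched as a simple distance-to-actuator mapping. The threshold values and names below are illustrative assumptions, not the project's actual parameters:

```python
# Hypothetical alert logic for a sonar-based cyclist wearable.
# Thresholds (in cm) are illustrative, not taken from the project.

WARN_CM = 200   # within this range, flash LEDs to alert the driver
ALERT_CM = 100  # within this range, also vibrate to warn the rider

def alert_state(distance_cm: float) -> dict:
    """Map a sonar distance reading to actuator commands."""
    return {
        "led_flash": distance_cm <= WARN_CM,
        "vibrate": distance_cm <= ALERT_CM,
    }
```

In a deployed device this function would run inside the sensor's polling loop, with debouncing so brief sonar glitches do not trigger false alarms.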
EnergyVR is a virtual reality energy-analysis tool that uses EnergyPlus to help designers make informed decisions about low-energy buildings. Users can analyze their design in real time, based on spatial changes or changes to envelope properties, all in an immersive experience. The first prototype is based on a simplified model using the San Francisco weather file, in which the user can try alternative window-to-wall ratios (WWR), visualize the corresponding design alterations, and run the energy simulations. This tool represents a next step toward letting designers understand, in an immersive setting, the impact of their design decisions on a building's energy consumption.
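For readers unfamiliar with the WWR parameter the prototype varies, a minimal sketch of how it is computed is shown below; the real tool feeds the resulting geometry into EnergyPlus simulations, which are not reproduced here:

```python
# Window-to-wall ratio: glazed area divided by gross exterior wall area.
# Function name and units are illustrative, not from the EnergyVR codebase.

def window_to_wall_ratio(window_area_m2: float, gross_wall_area_m2: float) -> float:
    """Return the WWR as a fraction (0.0 to 1.0 for realistic facades)."""
    if gross_wall_area_m2 <= 0:
        raise ValueError("gross wall area must be positive")
    return window_area_m2 / gross_wall_area_m2
```

For example, 20 m² of glazing on a 100 m² facade gives a WWR of 0.2; the prototype lets users vary this fraction and compare the simulated energy outcomes.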
Communication is essential, whether it’s through spoken language, text, or body language. For this project, our design goal was to make a wearable that bridges the gap between gestures and meaning, whether that’s for beneficial or sinister purposes.
Jester is a wearable that allows the wearer to communicate with other wearers through gestures such as crossed arms or arms held down. Jesters are equipped with IMU sensors that output orientation data and gravitational vectors.
Users can define mappings between gestures and meanings by repeatedly performing a gesture, allowing the Jester to learn it using machine learning. Our vision for this product includes a feature that lets users track the frequency of habitual movements, such as leg shaking, and receive immediate reminders to stop, or escape uncomfortable situations by contacting their friends.
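The train-by-repetition idea above can be sketched with a nearest-centroid classifier: each repeated performance contributes a feature vector, and new readings are matched to the closest learned gesture. The feature representation and class below are illustrative assumptions; the actual Jester model may differ:

```python
# Minimal sketch of gesture training by example, assuming each IMU
# reading is reduced to a fixed-length feature vector (e.g. mean
# orientation angles and gravity components). Names are hypothetical.
import math
from collections import defaultdict

class GestureTrainer:
    def __init__(self):
        # gesture label -> list of recorded feature vectors
        self._samples = defaultdict(list)

    def add_example(self, label: str, features: list) -> None:
        """Record one performance of a gesture under the given label."""
        self._samples[label].append(features)

    @staticmethod
    def _centroid(vectors: list) -> list:
        n = len(vectors)
        return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

    def classify(self, features: list) -> str:
        """Return the label whose centroid is nearest to the reading."""
        best, best_dist = None, math.inf
        for label, vecs in self._samples.items():
            d = math.dist(self._centroid(vecs), features)
            if d < best_dist:
                best, best_dist = label, d
        return best
```

A production device would likely use a richer model (e.g. a small neural network or dynamic time warping over the full IMU stream), but the training workflow the abstract describes maps directly onto this add-examples-then-classify loop.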
Our process uses a high-precision 3D scanner (the Creaform Go!SCAN 3D) to capture and store geometry information. An automated method then detects prominent 3D edges and feature lines, which are used to create the blending geometries that form the micro-component of the parametric joint.
Although 3D scanning and 3D printing are major technological advances that have found widespread use in different fields of design and engineering, bridging the two methods is less explored. We used this crossover as a novel process to explore and inform design. As a case study, we developed a workflow that begins with a 3D scanner and uses the resulting data in Rhinoceros to inform the design of an interactive 3D-printed object custom fit to its existing context. Our goal was to provide users with healthy environments by designing a responsive enclosure, and to create a tool that informs, educates, and empowers citizens and activists to take action for a greener future with less air pollution.
Air pollution is an environmental challenge that impacts a large number of people worldwide. In major metropolitan cities such as Beijing, people wear dust masks daily to preserve their health due to the high levels of PM 2.5 particles year-round. Air pollution is called 'wumai' (雾霾) in Chinese: wu (雾) means fog and mai (霾) means haze, and together they mean smog. Wumai is an important environmental challenge facing China, and it served as the inspiration for our project.
We developed an interactive mask for the extreme smog conditions of Beijing that can influence mobility choices by encouraging interaction between the user and their community. A screen display allows users to express their feelings about the situation, while a movable part that opens and closes acts as a controlled air-filtration mechanism. This system was inspired by naturally ventilated buildings in challenging environmental conditions.
This project explores the concept of personal transportation that can perform beyond an inanimate object. By equipping a skateboard with pressure sensors and a projector, we created a meaningful personal connection between skater and skateboard. Consider the skateboard your new companion — with needs and a life of its own.
Through these interactions, we were interested in themes of waste reduction, navigation, and what it means to have a meaningful experience with your personal belongings.
This research project explores the possibilities of 3D scanning in the realm of architectural design and the use of the resulting data to inform the design of 3D-printed objects that custom fit existing architectural spaces and interact with their users. Users have the opportunity to modify and 3D print these objects using makerspaces on the UC Berkeley campus.
We explored different methods of 3D scanning, such as structured light and photogrammetry, and chose photogrammetry for the final 3D-printed design. Use of this method brings new opportunities and design challenges, which will be discussed.
Designed with zero-energy aspirations, this visitors' center is sited at the historic Abu Simbel temples, in the challenging climate of Aswan, Egypt. Passive strategies, including evaporative cooling, natural ventilation, and natural daylighting, play an important role in reducing the building's cooling load. Cooling is achieved with evaporative cooling screens (mashrabiya) that use Nile river water to cool the building while also providing shading.
The aim of this project is to enhance designers' ability to visualize and compute structural analysis within a short period of time. As a real-time interactive tool, it is intended to familiarize people with everyday shell structures and to assist in early-stage form finding. Based on loading conditions input by the user, the shell structure is analyzed within the modeling interface, and the user can mold the outcome to reach an optimal result. The loading conditions are input at the beginning of the analysis using color detection. The Kinect captures this data along with the modeling process and maps the analysis onto the object in real time.
In this project, our goal was to create a physical barrier between cars and bikes that can be installed on a street temporarily (for weeks or months), then packed up, transported, and re-installed on other streets.
This transformable element can rise vertically to create a safe pathway for bikes throughout the day, protecting cyclists from passing cars. At times and in locations where it is not needed, it can fold back to its horizontal position, allowing cars to drive over it and making the street more flexible at different times of day.