The 'LightCycle' is a lighting interface driven by natural inputs from outside sources, such as the light and temperature of a particular location, that also gives the user a way to directly interact with their personal lighting environment.
The 'Cycle' mode is a reaction to the static nature of most lighting systems. It is a manual mode in which the user sets the range of change in brightness and hue, as well as the amount of time the change takes to occur. For example, the user could set the cycle to change brightness over several hours if they wanted it to be more of a subliminal presence, or over a few minutes if they wanted to be more aware of the light changing over time.
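As a rough illustration (not the project's actual code), the cycle could be a simple ping-pong interpolation between the two endpoints the user sets; the function name, units, and shape of the curve here are all assumptions:

```javascript
// Hypothetical sketch of the Cycle mode: ease a value (brightness or hue)
// back and forth between user-chosen endpoints over a user-chosen duration.
function cycleValue(startVal, endVal, elapsedMs, durationMs) {
  // Ping-pong: 0 -> 1 over one duration, then 1 -> 0 over the next.
  const phase = (elapsedMs / durationMs) % 2;
  const t = phase <= 1 ? phase : 2 - phase;
  return startVal + (endVal - startVal) * t;
}
```

Setting `durationMs` to a few hours gives the subliminal drift described above; a few minutes makes the change plainly visible.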
The 'Floyd' mode creates a psychedelic light show based on input from an amplitude sensor. The light show reacts to the user's choice of music to create a playful and hypnotic visual experience.
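A minimal sketch of how an amplitude reading might drive the show, assuming a 10-bit analog sensor; the names, ranges, and the hue-drift constants are illustrative, not taken from the project:

```javascript
// Hypothetical Floyd-mode frame: map a raw amplitude reading (0-1023)
// to an 8-bit brightness plus a slowly drifting hue that "kicks" with volume.
function floydFrame(amplitude, frameCount) {
  const level = Math.min(Math.max(amplitude / 1023, 0), 1); // normalise to 0-1
  return {
    brightness: Math.round(level * 255),        // louder music, brighter light
    hue: (frameCount * 2 + level * 120) % 360   // slow drift plus an audio kick
  };
}
```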
The above shows a rough styrofoam prototype made to get a feel for the scale of the buttons. We used it during the play-testing week to have people tell us how easy the buttons were to use and understand. The predominant feedback was that people related more to sliders than to rotary potentiometers when reading the increase or decrease of each parameter. We also faced the additional challenge of how to map out the values, since they change in real time with the sensor readings. The controller here merely adds to or subtracts from the original sensor value. An analogous control in the real world is the exposure compensation on a camera, which we set to offset the metered exposure; that is the terminology we went ahead with. This is also when we discussed which modes would have which controls active.
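The add-or-subtract behaviour can be sketched as a simple offset with clamping; the 0–255 range is an assumption for an 8-bit brightness value, not a constant from the project:

```javascript
// Exposure-compensation-style control: the user's offset is applied on top
// of the live sensor value, clamped so the result stays in a valid range.
function compensate(sensorValue, offset, min = 0, max = 255) {
  return Math.min(max, Math.max(min, sensorValue + offset));
}
```

Because the offset is applied to whatever the sensor currently reads, the light keeps reacting to the environment while still honouring the user's adjustment.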
The idea here was to create a hierarchy in the buttons, so that importance is given to the buttons we think will be used most often. This is still an ongoing process. More to follow...
The first thing that drew us to this idea was that a light could be intelligent enough to slowly turn itself on as the sun goes down. It all started from there. What we wanted to make was a complete home/indoor lighting solution that reads values off the internet and approximates them to feed the lights. Though the idea felt exciting initially, we later found out that the Hue lights and their app did something very similar. That is when we decided to base everything on actual sensor data for temperature and lighting, because those parameters change with the environment and lighting conditions at a particular place. In the initial project discussion, Jingwen planted the idea of seeing the light based on the time in her home country, China. After the discussion, Koen and I wondered how cool it would be to set up a sensor in India to feed the lighting conditions live, so we could simulate them here locally instead of approximating them based on time. There was something really romantic about that idea!
We deliberated a lot on the modes and how they should work. We wanted to take the temperature reading and map it to the color temperature of the light. We thought that the relation between outdoor temperature and color should be complementary (cooler outside, warmer colors inside) in Auto mode and should mirror the outdoors in Simulation mode. Some felt differently about our conclusion while others strongly agreed; to us it felt obvious it had to be that way. Anyway, we are still working on it.
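Under those assumptions, the mapping might look like this sketch. The outdoor range (-10 to 35 °C) and Kelvin range (2700–6500 K) are placeholders, not the project's real constants; recall that a lower color temperature in Kelvin means warmer-looking light:

```javascript
// Map outdoor temperature to the light's color temperature.
// Auto mode is complementary: cold outside -> warm (low-K) light inside.
// Simulation mode mirrors the outdoors: cold outside -> cool (high-K) light.
function colorTemperature(outdoorC, mode) {
  const t = Math.min(Math.max((outdoorC + 10) / 45, 0), 1); // 0 at -10C, 1 at 35C
  const span = 6500 - 2700;
  return mode === 'auto'
    ? Math.round(2700 + t * span)   // hot outside -> cool light (complementary)
    : Math.round(6500 - t * span);  // hot outside -> warm light (mirrored)
}
```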
I felt this was a very good exercise in making a lot of things talk to each other: remote sensors, server-side programming, serial communication through Node, figuring out API documentation and choosing the API that worked best; and, most important of all, working as a team. I felt that we really clicked because we complemented each other well. Koen was brilliant at ideation, and what was really fun was bouncing ideas off each other.
Although we did not have much fabrication to do, we felt that we bit off more than we could chew. We wanted to finish all the modes and have them working for the presentation; all we achieved was getting the Auto mode and the Simulation mode to work.