Introduction: Green City - Interactive Wall

The Green City project set out to explore the topic of renewable energy, which matters both for energy production and for slowing the depletion of natural resources, and to raise awareness of the issue in some way. We also wanted to experiment with video mapping, work out how users could interact with the wall, and turn that interaction into a narrative in the form of an interactive infographic.

Interactivity is achieved through two sensors. The first is a microphone, which detects the wind and its intensity; this turns the wind turbines, which produce energy and feed a battery. The second is a photoresistor (LDR), which detects light intensity: as soon as the user points a light source at the solar panel, the power-generation animation starts and the battery charges. As the battery fills up, the houses' lights come on as well.

Hope you like it :)

Step 1: Material Used

  • Arduino UNO
  • Microphone CZN-15E
  • LDR
  • 330 Ω resistor
  • Breadboard
  • Jump wires
  • Soldering iron
  • Solder

Step 2: Idea Definition

Initially, the idea was simply to build an interactive wall with a wind turbine and a battery that would charge as the wind blew. After a brief analysis this solution seemed a bit poor, so we chose to add a photovoltaic panel for energy production as well. The goal was to animate a tree growing from the battery once it was charged, symbolising the savings to nature compared with producing that energy from non-renewable resources.

Since this solution still seemed insufficient, and after discussing the proposal, we decided to build on the idea developed so far and create a dynamic infographic, giving the interactive wall a purpose, context and content.

Step 3: Solutions Test

To make the wind power interactive, we needed some way to detect the wind. Among the solutions considered, which included pressure sensors, we also thought of using a microphone. The risk with a microphone was that the ordinary noise of the room would set the wind blades turning, which of course was not the goal. But when we experimented with the microphone, it only picked up sounds made very close to it (we even tested a very loud, high-pitched music track, and it was not detected), which proved it to be the ideal solution.
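
As a rough illustration, blowing can be detected by measuring the peak-to-peak amplitude of the microphone's analog signal over a short window and comparing it against a threshold. The pin (A1), window length and threshold below are assumptions for this sketch (not values from the project) and would need tuning on the real setup, e.g. with the CZN-15E capsule biased through a resistor or mounted on a breakout module.

    // Illustrative Arduino sketch -- pin, window and threshold are assumptions.
    const int MIC_PIN = A1;          // microphone signal (assumed on A1)
    const int WINDOW_MS = 50;        // sampling window per reading
    const int BLOW_THRESHOLD = 60;   // peak-to-peak amplitude counted as "wind"

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // Track the minimum and maximum sample over the window.
      unsigned long start = millis();
      int signalMin = 1023;
      int signalMax = 0;
      while (millis() - start < WINDOW_MS) {
        int sample = analogRead(MIC_PIN);
        if (sample < signalMin) signalMin = sample;
        if (sample > signalMax) signalMax = sample;
      }
      int amplitude = signalMax - signalMin;

      // Only sounds made very close to the capsule (blowing) exceed the threshold.
      Serial.print(amplitude);
      Serial.println(amplitude > BLOW_THRESHOLD ? "  -> wind" : "");
      delay(10);
    }

With the Serial Monitor open you can see how far the room's background noise stays below the threshold, while blowing directly on the capsule pushes the amplitude well above it.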

For detecting the light aimed at the photovoltaic panels there was no need for much discussion: an LDR was the obvious choice. It only had to be calibrated so that, even behind the screen, it would not register the room's ambient light, even at its normal maximum brightness.
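
A minimal sketch of that calibration, assuming the LDR sits in a voltage divider feeding analog pin A0 (the pin, divider orientation and threshold are illustrative, not the project's actual values): watch the Serial Monitor with the projector on and pick a threshold just above the highest ambient reading.

    // Illustrative calibration sketch -- pin, divider orientation and
    // threshold are assumptions, not the project's actual values.
    const int LDR_PIN = A0;           // LDR voltage divider output
    const int LIGHT_THRESHOLD = 800;  // just above the brightest ambient reading

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int light = analogRead(LDR_PIN);   // 0-1023; here higher = brighter
      bool lightOnPanel = light > LIGHT_THRESHOLD;
      Serial.print(light);
      Serial.println(lightOnPanel ? "  -> charging" : "  -> idle");
      delay(100);
    }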

Step 4: Circuit Assembly

With the solutions settled, we started assembling the circuit. Since the screen is large and the jump wires were short, we had to solder wire extensions so that the sensors (both the LDR and the microphone) could reach the Arduino, which sits in the lower right corner of the screen.

Step 5: Integration With Unity

Besides building the circuit, we needed to send the readings generated by the sensors to the computer and translate them into some kind of action in the projection. Unity was used to build the projectable scene, read the values coming from the Arduino and drive the animations based on them.
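
A hedged sketch of what the Arduino side of that link could look like, assuming the microphone on A1, the LDR on A0 and one simple comma-separated line per reading; the actual pins and message format used in the project may differ. On the Unity side, a script can open the same serial port (e.g. with System.IO.Ports.SerialPort), read one line at a time, split it on the comma and feed the two values to the turbine and solar-panel animations.

    // Illustrative Arduino side of the serial link -- pins, baud rate and
    // message format ("wind,light\n") are assumptions for this sketch.
    const int LDR_PIN = A0;
    const int MIC_PIN = A1;

    void setup() {
      Serial.begin(9600);   // must match the port settings opened in Unity
    }

    void loop() {
      // Wind intensity: peak-to-peak mic amplitude over a short window.
      unsigned long start = millis();
      int lo = 1023, hi = 0;
      while (millis() - start < 50) {
        int s = analogRead(MIC_PIN);
        if (s < lo) lo = s;
        if (s > hi) hi = s;
      }
      int wind = hi - lo;

      // Light intensity on the "solar panel".
      int light = analogRead(LDR_PIN);

      // One CSV line per reading, e.g. "43,912"
      Serial.print(wind);
      Serial.print(',');
      Serial.println(light);

      delay(50);
    }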

Step 6: Building the Unity Scenario

We used a Canvas to display all the elements and used the original image to align the elements that would move. To make it possible to project and highlight only the moving parts, the background has to be black and the rest preferably white, as you can see in the images below.

Participated in the Arduino Contest 2019