Open Mobile is a collaborative artwork being developed at AltLab, open to all members.
The idea is to build a kinetic and/or electric structure (a "mobile") from pieces developed individually or collaboratively; the final supporting skeleton is then built to hold all the pieces, taking the characteristics of each one into account.
The individual pieces, OMPs (Open Mobile Pieces), can range from 3D printed sculptures to electric and/or motorized works… stop by AltLab on any Tuesday (from 8.30pm on) and check it out.
Development period: April — July 2013
AUTHORS: Rui de Carvalho; Maurício Martins; Pedro Ângelo; Kyriakos Koursaris
Concept by Rui de Carvalho;
Engineering Development by Maurício Martins;
Software Computing Orientation by Pedro Ângelo;
Sound and Music by Kyriakos Koursaris;
Collaboration: Ricardo Lobo; João Gonçalves;
This project’s name is meant to open a new view of the human condition.
We must think of a box that is directed to the inside, to the place where we find our meaning and where, therefore, we part in search of our significance.
I want to relive this awareness because it implies a demand and this demand will take us, no doubt, to some kind of finding.
We can describe “Pedra d’Água” as a Technology and Digital Art Temple, one that tries to evoke our need for one another and the ritual ways we choose to celebrate the emptiness left in us when someone departs.
I wonder if, in the near future, this kind of technological chimera will be the only way to extend the presence of our deceased and lost into the living world.
I also ask myself whether the machine becomes equal to its human creator, a post-maker without the burden of a body, or whether the creator becomes dominated by technology. We must stop and consider the possibility that this path leads to the machine overpowering the human, taking us further into alienation.
Nowadays a growing number of manifestations by web users can be seen, meant as a kind of memorial or tribute to deceased loved ones.
Today, friends and family keep websites updated, as if refusing to accept the imposed absence. They keep their routines, extending an emotional connection similar to the one they experienced with the loved one.
“Pedra d’Água” will be like a sanctuary that registers these manifestations from web users: when someone writes to or updates one of these “tribute” sites, Pedra d’Água will light up and release a sound. When no updated information is found, Pedra d’Água will cry.
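The sanctuary's trigger could be prototyped as a simple polling loop that fingerprints each tribute page and reacts when its content changes. This is only an illustrative sketch; the URL, function names, and return values below are invented for the example, not part of the actual piece.

```python
import hashlib

def fingerprint(html: str) -> str:
    # Reduce the page content to a short hash so updates are easy to detect.
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def poll(seen: dict, url: str, html: str) -> str:
    # Returns "light" when the tribute page changed since the last poll,
    # "cry" when it is unchanged; the first sighting just records a baseline.
    fp = fingerprint(html)
    if url not in seen:
        seen[url] = fp
        return "cry"
    if seen[url] != fp:
        seen[url] = fp
        return "light"   # someone updated the page: light up and play a sound
    return "cry"         # no fresh manifestation: the stone "cries"

seen = {}
poll(seen, "http://example.org/tribute", "<p>rest in peace</p>")  # baseline
print(poll(seen, "http://example.org/tribute", "<p>we miss you</p>"))  # changed
```

A real installation would fetch the pages over HTTP on a timer; the hashing-and-compare step would stay the same.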
I wish to create an artifact that claims our society as one of the living but also one of the dead, and that allows the perpetuation of individual and collective memory through a new medium: the web.
Nuclear Taco Sensor Helmet Gameshow is the name of our project entry for the 48h hack project of Sapo Codebits 2011. The aim of the competition was to develop a project during 48 hours and present it in 90 seconds to a live audience. Out of over 80 proposed projects, 65 were presented live.
We won 1st place in the public voting.
The 48h project consisted of building a helmet device with humidity, temperature and fluid-intake sensors, used to record and measure the reaction of nuclear taco victims of the Codebits 2011 Nuclear Taco Challenge. The sensors and servos are driven by an Arduino. Six timelapse videos were recorded documenting the user experience. The 1:30 project presentation, built with OpenFrameworks, was in the style of a Japanese gameshow, with the host displayed using face substitution technology in real time.
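As a rough illustration of the measuring side, raw Arduino readings could be converted to human-readable values along these lines. The sensor models and calibration constants here are assumptions for the example, not the values used in the actual project.

```python
def celsius_from_raw(raw: int, vref: float = 5.0) -> float:
    # Assumes a TMP36-style analog sensor on a 10-bit Arduino ADC:
    # 10 mV per degree Celsius with a 500 mV offset.
    volts = raw * vref / 1023.0
    return (volts - 0.5) * 100.0

def litres_from_pulses(pulses: int, pulses_per_litre: float = 450.0) -> float:
    # Hall-effect flow sensors emit a pulse train; dividing the pulse
    # count by a per-sensor calibration factor gives the volume drunk.
    return pulses / pulses_per_litre

print(celsius_from_raw(154))      # a raw reading of 154 is roughly 25 C
print(litres_from_pulses(450))    # 450 pulses at this calibration is 1 litre
```

On the Arduino itself the same arithmetic would run over `analogRead()` values and an interrupt-driven pulse counter.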
Our motivation to develop this project was the following:
- Do something fun with sensors and Arduino, that would show people how easy it is to use these things.
- Showcase applications of recent Face Tracking and Face Substitution technology.
- Do a presentation format that would not leave anyone indifferent to our project.
- Bring attention to the creative community we have in the Audiência Zero hacker spaces in Portugal (LCD in Porto / Guimarães, xDA in Coimbra, altLab in Lisbon), in hopes of getting new members.
- Take home some new hardware.
Video of Presentation
At Sapo Codebits 2010 the event organizers held a nuclear taco challenge during one of the nights of the event. Many brave attendees spent their last day of the event in severe discomfort, cursing their idealized bravery. No members of our team were brave enough to take on the nuclear taco challenge, but the memories of everyone else suffering lingered on with us. Then one day a lightbulb turned on inside Mauricio Martins’s head when he saw a TV commercial for MEO featuring Ricardo Araujo and an “all American” beer helmet.
The idea that had awoken in Mauricio’s head was to use his Arduino and sensor expertise to pimp that beer helmet into a nuclear taco sensor device of some sort. He began looking for the pieces required.
By the way, if you want to learn how to use Arduinos for random projects, there are some workshops at altLab on a regular basis.
The helmet itself was quite hard to find for sale in Portugal. After many searches on the internet, we ended up buying it at epia.com for 10 euros.
The Arduino, LEDs, and temperature and humidity sensor were easy to acquire online. The flow measurement sensor was a lot harder to find; we ended up buying it second hand on eBay.
The webcam used for the head-mounted view was a Microsoft LifeCam VX-2000, bought for 20 euros.
Overall the hardware cost was around 60 euros.
While Mauricio was searching for the helmet he recruited two new members for our team: to assist with the hardware, the Luso-New Zealander Tiago Rorke, a semi-regular altLab attendee; and to handle the presentation format, Filipe Cruz, a Portuguese demoscener living in Helsinki, Finland, who had already collaborated with Mauricio on a Codebits project in 2010 (the Blind Pong project).
A couple of weeks before the event, Mauricio and Tiago Rorke got together to write a first abstract of the project, did some sketches of the idealized helmet and sent the text to Filipe. A few days later the three of them had a Skype call to define the presentation format and hear Filipe explain his concept of presenting the project to the public in the style of a Japanese gameshow.
A couple of days before the event the three members of the team finally managed to get together in person to discuss the project, taking the opportunity to test some components (the sensors, and the FaceTracking library by Arturo Castro, Kyle McDonald and Jason Saragih) and, more importantly, to decide on a final name for the project. Nuclear Taco Sensor Helmet Gameshow was the decision.
Mauricio and Tiago Rorke spent the day working on the helmet, mostly building and testing the sensors with the Arduino and deciding on how they would be placed on the helmet. Ferdinand Meier, a resident member of altLab, was recruited to help print small pieces for the helmet on the Makerbot.
Filipe arrived late and immediately started working on the framework for the presentation using OpenFrameworks, mostly testing background effects in a Japanese swish-swash style and trying to close the presentation storyboard. Ferdinand, already a new member of the project at this point, offered his Blender skills to create a 3D model of the helmet to be used in the presentation.
While the hardware guys were struggling with the sensors, Filipe was testing ofx3DModelLoader with Ferdinand’s 3D model exports of the helmet. Several 2D renders of the Japanese virtual idol Hatsune Miku modelling our helmet were also made. The open source 3D model of Miku came from BlenderNation. We had to rush this process since Ferd had to leave the Codebits event that night to attend a conference in Porto.
We did not attend the Elevator Pitch talk.
Tiago Farto was recruited to help with the graphic effects of the presentation. The background effects you see all run as pixel shaders in real time under openFrameworks. It was not trivial to get the shaders’ setUniform calls to handle textures properly under openFrameworks; we spent quite a few hours debugging and wild-guessing their framework, since neither Filipe nor Tiago had experience running shaders on openFrameworks.
During the night we were one of the few teams still hard at work at the partyplace at 3 am, Mauricio and Tiago Rorke finishing the helmet: testing the liquid flow sensor, building the servos, gluing the LED structures, painting the helmet.
We didn’t manage to sleep much on the first night of the event; some of us were falling asleep at our computers while still trying to get some work done. We sadly had to start turning down folks who came to ask us to print random things on the Makerbot, because we were so busy finishing the project for the competition. The helmet needed to be finished and ready for the Codebits nuclear taco challenge, which was happening at 19:00.
Mauricio and Tiago finished the helmet, attached the head camera and went to the Taco Challenge area to record some footage. Tiago worked on the title screen flame effect while Filipe restructured the framework and tested the video playback right before having to head out to give his speaker talk “Crash course on Phonegap + Sencha Touch”.
Mauricio and both Tiagos went to the taco lounge and managed to record footage from 6 volunteers wearing our helmet while eating their nuclear tacos. Big thanks to Pedro Umbelino, Daniel Freitas, Pedro Silva, Tomé Duarte, Joana Ferreira and Artur Goulão for their assistance! We ended up only using 4 of the 6 videos.
Photo by Nuno Dantas
Meanwhile, back at the altLab table, Filipe had finished his speaker talk and was back at work on the presentation code, with some interruptions to try to find the confessionary room where we were supposed to have presented our project an hour earlier. He failed, notified Mauricio, and decided to attend the speakers’ dinner instead.
Upon returning, Filipe managed to find the confessionary room while the rest of the project folks attended the Scorpions concert. We finally managed to get Skype-interviewed by Chewbacca and Darth Vader. It went rather well and we were hopeful that our project would be selected for group A of the projects presenting live on stage.
The rest of the night was spent editing video and finding the perfect Japanese face to use for the FaceTracking part of the presentation. Shido Nakamura was the final selection. Filipe had some nightmares about forgetting what to say live on stage and screwing up the Japanese accent. Tiago Rorke ended up pulling another all-nighter doing video editing and drawing a 2D taco for the presentation.
By the way, the music we used for the final part is ParagonX9 — Chaoz Airflow, available under a Creative Commons by-nc-sa license. And the short clip of a Japanese crowd cheering was snipped from a random YouTube video of a random Japanese gameshow which we can’t find anymore.
We all woke up later than planned, feeling somewhat sick and tired of working on the project. But one final effort was still needed: the presentation had to be perfect!
We did a few iterations of the final challenge video, adding sound effects and testing the length. The storyboard still went through a few small changes to build a bigger crescendo. Last-minute overlay graphics of the sensors were designed by Tiago Farto and quickly inserted.
A test on the stage proved the face tracking could work without additional lighting. Everything seemed more or less ready; just one more render of the final video, with a few small but important changes, was required.
The presentation had some glitches but went rather well. The crowd got into it, and that was reflected heavily in the voting: great positive reactions both in person and on the twitter feed. We were very pleased and looking forward to the prize giving. Tiago Farto had to leave early and Ferd never managed to come back to Codebits after Thursday, so only three of us, Mauricio Martins, Filipe Cruz and Tiago Rorke, were left to collect the prizes!
We won the 1st place public award and offered the sensor helmet device to the Codebits organizers, informing them that all the people involved with organizing the Nuclear Taco Challenge had to take pictures of themselves wearing the helmet and upload them to the internet.
Domo arigato to everyone for your feedback and support. We are very happy you liked our project. Please come and join altLab or another Audiência Zero hacklab near you. We need more people sharing knowledge and doing things with technology.
Source code github repo.
If you liked our project, please flattr it to support our hacker space labs.
Leonardo has been working on a laser-head RepRap for some time. His latest experiment was etching a PCB with it. Above is the video of the first tests, but go on over to his blog to check out all the details.
Feeding 15 sleep-deprived hackers is not an easy task, and during the AZ Residency only 3 people were brave enough to put their culinary talents to the test: Joel, Vitor and Mariana. Everyone agreed that their homemade meals were awesome and no bug reports were filed. Since we believe in sharing, here are Joel’s delicious open source recipes (translated from the original French):
Salad Dressing [ VO ]
4 tablespoons of olive oil
3 tablespoons of balsamic vinegar
2 teaspoons of honey
Taste and adjust: if too sweet, add vinegar; if too bitter, add honey.
Basque Chicken [ VO ]
Allow a good piece of chicken for each guest
Joel Belouet has been working on an art piece involving microorganisms and needed a support structure for his microscope camera. It turns out the MakerBot sitting on our table was the solution.
At first Joel attached the camera to the z axis and the slide rested on the build platform, but it soon became clear that it would be much better to have the sample remain still and the camera move instead. Inverting the positions meant attaching the slide to the bottom of the z axis platform in order to prevent the camera lens from bumping against it. This setup also allowed him to use the z crank as a focus mechanism.
Original point cloud data from Aaron Koblin’s House of Cards (GeoVideo’s 3D scanning system)
Meshed with Point Cloud Skinner script for Blender
Cleanup in MeshLab
Made during the Makerbot workshop with Zach Hoeken at Lisbon Tech University
Download the file at Thingiverse
Awesome Creative Commons post:
Four drum pads ready to go.
All made from old material found around the AltLab space, and a very special big thanks to Mónica, who brought the casings (we'll get back to those in a moment) for the drum pads.
So the idea was to make drum pads that we could hook up to a sound card (or whatever) and then make sweet music. Each pad is a simple combination of a piezoelectric element, a few layers of some sound-absorbent material like rubber or cork foil (that's what we used because there was nothing else around), and a piece of aluminum for a greater drum area.
We used an old 20 l paint can, four piezoelectric elements found in electronic junk like old modems and telephones, wire to connect the piezos, cork foil to insulate the drum pad area, the casings supplied by Mónica (square rubber CD stands), and glue to hold everything nice and tight.
First we cut a circular piece of the can (about 10 cm radius) and glued the piezo onto it; then we drilled a hole in the rubber casing for the wires to come out; then we cut two square pieces of cork foil (one for the bottom of the casing, the other for the top). A bit of glue and that's it: drum pads ready to rock.
Now we had some audio coming out of the pads, but that alone is boring because it always sounds the same, and we wanted to go further and transform the audio into MIDI messages. We found the right tool for it: it's called “KTDrumTrigger” and it transforms the audio signal into MIDI notes. We can use these MIDI notes inside a sequencer program to control any kind of instrument (a VSTi or something else); in our case we used the drum pads to control “Battery”, and that's it: instant fun.
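The core trick of an audio-to-MIDI drum trigger can be approximated with a threshold detector over the incoming samples. This sketch is a simplified illustration of the idea, not KTDrumTrigger's actual algorithm; the default note 38 is the General MIDI acoustic snare.

```python
def detect_hits(samples, threshold=0.3, note=38, retrigger_gap=4):
    # Emit a (note, velocity) pair whenever the signal's absolute level
    # crosses the threshold, then ignore samples for a short gap so a
    # single drum hit does not fire several notes.
    events = []
    cooldown = 0
    for s in samples:
        if cooldown > 0:
            cooldown -= 1
            continue
        level = abs(s)
        if level >= threshold:
            velocity = min(127, int(level * 127))  # map peak level to MIDI velocity
            events.append((note, velocity))
            cooldown = retrigger_gap
    return events

# A quiet passage, one hard hit, then one softer hit.
signal = [0.0, 0.05, 0.9, 0.8, 0.6, 0.1, 0.0, 0.02, 0.5, 0.2, 0.0]
print(detect_hits(signal))
```

Real trigger software works on much longer windows and adds per-band filtering, but the threshold-plus-retrigger-gap structure is the same.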
There are other links and other ideas for drum pads out there. This one uses an Arduino as the source for the input signal.
Helicam is an AltLab project that emerged from the wish to capture images from the sky with a WiFi-enabled camera, so that one can see what is being shot from a different perspective and in real time. This approach may lead to new visual paradigms by enabling aerial shots at a considerably low cost, and from distances so close that they cannot easily be achieved from a helicopter.
As a real example, one of the projects to accomplish with Helicam is testing local forest surveillance and its use in related research projects, such as forest fire prevention for sustainability. Other possible fields of operation are architecture, building surveillance, or even artistic performance environments where multimedia has a strong presence.
The main idea is to build an inexpensive and flexible platform using, as a starting point, specifications made available by several open source projects online (like Mikrokopter or UAVP-NG). After doing some initial research and cost evaluation we realized that we could not build this prototype with the resources currently available among the group members or at AltLab, so we started looking for some kind of sponsorship. I have to say that we were lucky: with only two contacts made, we were able to raise enough money to build a flying prototype according to our initial cost predictions. Thank you Mobbit for accepting our proposal!!
If you want to join us in this project, stop by AltLab at one of our regular Tuesday night meetings.
…Don’t let others tell you how many buttons you can press at once, these are times of freedom :)
This is a homemade game-pad with just one accelerometer (a breakout board from Sparkfun) and one push-button.
Using a simple, custom protocol and a C# listener we could play Acceleroids, an Asteroids lost sibling on steroids, which uses the accelerometer to control the ship and the button to shoot.
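The custom protocol itself isn't documented here, but a line-based format such as `ax,ay,az,button` is a typical choice for an Arduino-to-PC link. The sketch below (in Python for illustration; the real listener was written in C#) shows how such packets might be parsed and mapped to ship controls. The wire format, names, and thresholds are all assumptions for the example.

```python
def parse_packet(line: str):
    # Assumed wire format: "ax,ay,az,button\n" with raw 10-bit ADC
    # values for the three accelerometer axes and 0/1 for the button.
    ax, ay, az, button = line.strip().split(",")
    return {
        "accel": (int(ax), int(ay), int(az)),
        "fire": button == "1",
    }

def steering(packet, center=512, deadzone=40):
    # Map the x axis to a left/right command, with a dead zone so the
    # ship holds still while the pad is roughly level.
    x = packet["accel"][0]
    if x < center - deadzone:
        return "left"
    if x > center + deadzone:
        return "right"
    return "none"

p = parse_packet("430,512,600,1\n")
print(steering(p), p["fire"])  # tilted left, button pressed
```

On the PC side the listener would read these lines from the serial port in a loop and feed the resulting commands into the game.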
Next steps will be to turn this into a generic game-pad by simulating keyboard key presses for any sensor you throw at the Arduino, allowing you to create truly funky pads to control your favourite game. Stay tuned! ;)
Edit: We were alerted by friendly comments that this project resembles another project/tutorial that can be found here: http://lusorobotica.com/index.php/topic,902.0.html