Week 8 (8 - 12.03)
I really liked the coaching this week. We had some time alone with our teachers and could focus 100% on our project and their feedback. It seems like we have everything sorted out and we are definitely on the right track with our project.
The next step is to find a way to connect handtrack.js, Johnny Five and Arduino. Sanna is figuring it out as one of her API examples, since she is the best at programming. She also wrote down some scenarios, situations and questions for our user testing session, which Carlo, Patrycja and I will be conducting on Wednesday.
Wednesday, 10.03
Today in the school workshop, we managed to successfully launch our prototype using the handtrack.js API. We connected it through Johnny Five and Arduino, so if you wave at the laptop and the camera detects your hand, the plant starts moving. We had three optional movements, and if we wanted to use one of them, we had to comment out the other two in the code. We were very proud of ourselves; however, the API didn’t work as well as we expected it to. It couldn’t detect hands reliably and sometimes it “detected” hands in other objects, so it launched movements randomly. We asked three people to try out our prototype and asked them about their feelings and experience, but we felt like it was not going in the right direction. We figured out that we have a lot to think through before the final user testing, and we also started wondering whether we really should use the API to conduct the tests.
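Roughly, the detection-to-movement step boils down to a check like the one below. This is only a sketch, not our exact code: the function name, the “face” filter and the 0.7 score threshold are illustrative, assuming handtrack.js-style predictions, and the Johnny Five wiring is only indicated in the comments.

```javascript
// handtrack.js hands back an array of predictions shaped like
// { label, score, bbox: [x, y, width, height] }.
// We trigger a movement only if something that is not a face was
// detected with reasonable confidence (threshold is illustrative).
function shouldTriggerMovement(predictions, minScore = 0.7) {
  return predictions.some((p) => p.label !== "face" && p.score >= minScore);
}

// In the browser loop this would plug in roughly like so (pseudo-wiring,
// since the servo side needs a board attached):
//
//   handTrack.load().then((model) => {
//     setInterval(() => {
//       model.detect(video).then((predictions) => {
//         if (shouldTriggerMovement(predictions)) movePlant(); // Johnny Five servo call
//       });
//     }, 200);
//   });
```

Filtering out low-score and face detections is exactly where our prototype struggled: with the threshold too low, random objects “waved” at the plant.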
Video from user testing

API examples
This week, I also finished my API examples. My first example, which I extended into the prototype, is based on Victor Dibia's code: https://codepen.io/victordibia/pen/RdWbEY. I misunderstood the assignment's directives and didn't copy it before extending it into the Prototype, so I added the Prototype folder afterwards and made some slight changes. I added a video of a shaking plant and made it play when a hand is detected by the API.
I had a lot of doubts about it. I didn’t know if the thing I would do could even be called a prototype. But after reading “The Anatomy of Prototypes” by Lim, Stolterman and Tenenberg, I learned a lot about what we can actually call a prototype, and that prototypes differ when it comes to material, resolution and scope. And since I am not aiming to actually test anything with it, I can definitely call it a prototype and... it is my call ;)
My second example detects the hand position and logs it in the right column. I made some changes to the original code: https://codepen.io/md-azeem/pen/xxEZbrz.
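The coordinate maths behind that logging can be sketched like this. The helper names are mine, not from the CodePen; only the bbox shape is what handtrack.js actually returns.

```javascript
// handtrack.js predictions carry a bounding box: bbox = [x, y, width, height].
// To log a readable hand position, take the centre of that box.
function handCentre(prediction) {
  const [x, y, w, h] = prediction.bbox;
  return { x: x + w / 2, y: y + h / 2 };
}

// Format one prediction as the kind of line the example logs per frame.
function formatHandPosition(prediction) {
  const { x, y } = handCentre(prediction);
  return `hand at (${Math.round(x)}, ${Math.round(y)}), score ${prediction.score.toFixed(2)}`;
}
```

Inside the detect loop, each prediction would go through `formatHandPosition` before being appended to the log column.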
I had fun with the API assignment and I didn’t find it very hard. We had the time and energy to focus on our project too, and I have the impression that other groups kind of ditched the main project for the sake of API Lab.
Link to our GitHub
The rest of the week went quick and it was all about our API presentation. During coaching, we showed Peter what we had and he was very excited about the hand tracking we used and the physical prototype we have built. We know that we are on the right track with the API assignment, but still, we doubt we are going to use it in our final prototype and user testing.
API Presentation
Week 6 (22 - 26.02)
This week we had our exam and were introduced to the API assignment.
We presented our final concept to the teachers and got pretty good feedback. The only thing I remember seeming doubtful was why we presented a slide with a description of how our concept works. We put the slide right before our video prototype, and it explained all the features of our concept. The teachers couldn’t understand why we did that, and we responded that this was a requirement in our Project Guide. We were asked if we doubted that our video explained the concept and its features, which we didn’t. Johannes drilled the topic for a really long time, and that was the moment I thought: hopefully, this is the only actual criticism he has in mind right now ;)
Our presentation can be found here.
API Lab
Later this week we were introduced to APIs and the API assignment. The term API rang a bell for me, but I didn’t really know what an API actually was. Two weeks before that, we were introduced to GitHub and the terminal, but it seemed surreal to me - I didn’t understand it even after watching Coding Train tutorials. Eloquent JavaScript, Ch. 11, Asynchronous Programming, seemed like black magic to me. I even had thoughts that maybe I don’t really know English, because I was reading the sentences one by one, but didn’t understand a thing.
I had fun reading two other papers. “The Computer for the 21st Century” by Mark Weiser was a little bit like a look into the past: in this article, written in 1991, the author tries to predict how people will coexist with computers in the future, and to a large extent his predictions are pretty accurate.
In “Radical Atoms” by Hiroshi Ishii et al., the authors take a look into the future? Or do they? They prove to the readers that the future is now, and that things we cannot even dream of are happening, evolving and being developed right now.
Finally, we were introduced to our API assignment.
Since then, our work started to revolve around which API we should choose and how to use GitHub. We felt that no API really adds anything to our project. We really wanted to go with Johnny Five, as we can and should use sensors in our project, but it turned out we couldn’t use it, because it will be used during the upcoming workshops. We needed a lot of guidance on that, and finally we decided to use the API that is most relevant to our project: handtrack.js. We thought that maybe instead of a proximity sensor, we could use hand detection - though this might remain just a branch of our project, as we don’t know if we’re going to end up using the API instead of the sensor. So, our API is chosen, mainly for the sake of the API assignment.