#Bluefruit LE Micro
tsubakicraft · 7 years ago
Text
Trying out the keypad for the navigation controller
I connected the keypad for the navigation controller that will go on my bike to the Bluefruit LE Micro and tried it out. According to Adafruit's product page, of the keypad's eight pins, the first few from the left are for sensing the columns and the following ones are for sensing the rows. On the microcontroller side I wired these pins to digital I/O pins 5, 6, 9, 10, 11, 12 and 13; the other digital I/O pins are already in use, so I avoided them. For now the pressed key is simply transmitted as-is, and I tweaked the receiving side of the app slightly to confirm that it works. The '*' and '#' keys will be used to expand the set of key codes that can be sent, for example by treating a '1' pressed right after '*' as 'A'.
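To illustrate the scanning and the '*'-prefix scheme, here is a minimal sketch using the widely used Arduino Keypad library (not necessarily what runs on the actual controller). Which of the wired pins serve as rows versus columns is an assumption, and the result is printed to the Serial Monitor instead of being sent over BLE, purely for simplicity.

```cpp
#include <Keypad.h>

const byte ROWS = 4, COLS = 3;
char keys[ROWS][COLS] = {
  {'1', '2', '3'},
  {'4', '5', '6'},
  {'7', '8', '9'},
  {'*', '0', '#'}
};
byte colPins[COLS] = {5, 6, 9};          // column-sense pins (assumed order)
byte rowPins[ROWS] = {10, 11, 12, 13};   // row-sense pins (assumed order)

Keypad keypad(makeKeymap(keys), rowPins, colPins, ROWS, COLS);

bool starPrefix = false;   // true right after '*' was pressed

void setup() {
  Serial.begin(9600);
}

void loop() {
  char key = keypad.getKey();
  if (!key) return;                      // nothing pressed this pass

  if (key == '*') {                      // remember the prefix instead of sending it
    starPrefix = true;
    return;
  }

  if (starPrefix && key >= '1' && key <= '9') {
    // '*' followed by a digit becomes an extra key code, e.g. "*1" -> 'A'.
    Serial.println((char)('A' + (key - '1')));
  } else {
    Serial.println(key);                 // plain keys are sent unchanged
  }
  starPrefix = false;
}
```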
panjapop · 7 years ago
Photo
The Metaplasm
The Metaplasm is a mixed reality piece that consists of a computational leather prosthetic and an augmented computer vision app. The shoulder garment streams the wearer's movement data to the app, which mixes a phantom augmentation of floating images into their live camera feed.
I was inspired by a section in Wendy Chun’s Updating to Remain the Same that talked about phone users feeling their mobile phones vibrate without a phone call being attempted. This phenomenon was equated with phantom limb syndrome where amputees often have strong sensations of their missing limb in random circumstances and without explanation. I was curious about how we could use technology to illuminate subconscious internal thought patterns, an altered state of some sort with the potential to unlock areas within our brain.
Since the success of the smartphone and its suite of social apps, wearable tech gadgets have started to bleed into our daily lives in their drive to help us perform better. But they are also subtly changing our behaviours, turning us into performance junkies and driving us to lose touch with our previous internal selves. I was interested in exploring what our internal selves might look like. So I began to research different artists who worked with prosthetics to hack their bodies in different ways and create different sensations.
The Australian-Cypriot performance artist Stelarc uses his body as a rich interaction platform, visibly hacking it with electronics to blur the boundaries between man and machine. In his Third Hand project, a cyborg-like robotic arm attachment is controlled by EMG signals from other parts of the performer's body. Rebecca Horn's body sculptures and extensions are shaped by imagining the body as a machine. In her Extensions series, Horn's armatures are reminiscent of 19th-century orthopaedic prosthetics supporting an injured body. Her armatures redirect the body's purpose, turning it into an instrument and often inducing a feeling of torture.
Ling Tan's Reality Mediators are wearable prosthetics that cause unpleasant sensations when the wearer's brain activity or muscle movement is low. Her project is a direct comment on the effect current wearables have on our perception of the world around us and the way these products alter our daily behaviours.
I decided on designing a shoulder prosthetic that communicates with a computer vision app to allow its wearer to play with an augmented trail of images. It was important to me to present the experience as a clash of two worlds. The old world was represented by the prosthetic, which took great inspiration from Victorian prosthetics; I researched the look of these at the Wellcome Collection. But I also wanted the prosthetic to look really attractive and contemporary, so here I took visual inspiration from Olga Noronha's Ortho Prosthetics series. Rather than being a tool to help the physical body function better, the prosthetic's purpose was to unlock and help navigate an internal monologue of thought images represented by the projection. The images featured the stuff in our heads, ranging from online shopping activity to things we see out and about. The camera feed holds the wearer as the centre point, surrounded by a crude collection of stuff.
Reading Donna Haraway's Companion Species, I came across a great section where she talks about metaplasms. She explains: "I use metaplasm to mean the remodelling of dog and human flesh, remolding the codes of life, in the history of companion-species relating." Whilst Haraway explores the blurred boundaries in relationships between dogs and humans, this felt like a very fitting title for an experience that aims to illuminate internal phantoms that accumulate inside us and that we have no idea how to control.
I began dividing my project into two parts: the physical prosthetic and the augmented computer vision app. Having played with the latter in my last project, I decided to concentrate on the physical aspects first. I chose the Arduino-compatible Adafruit Flora board with the LSM303 accelerometer/compass sensor. I also initially thought I would make the whole experience wireless, and ordered the Bluefruit LE Bluetooth module and a rechargeable LiPo battery pack. It was important to get the components onto my physical body asap to start testing where they should sit for maximum efficiency. I laid them out on some cheap felt and sewed them on with conductive thread, following the great help section on the Adafruit website. Despite getting the Bluetooth communication to work immediately, I struggled to add the accelerometer on top of it. The connections seemed temperamental, so I decided to remove the Bluetooth module and simplify the cross-component communication. Later I got some advice that made me rethink the use of Bluetooth in the church, and decided that a 3 m micro USB cable would actually suit my aesthetic whilst making the experience more reliable. At this point I also decided to solder silicone wires onto the board instead of using the less conductive thread. Now my connections were very stable and I got clear differences when moving my arm.

I used the accelsensor example that came with the LSM303 library and tested my movements in the Serial Monitor. I kept both the last and the current readings so the app could derive a change in acceleration by subtracting the last value from the current one. I found it hard, though, to separate the x, y and z values from each other and to link them to understandable directions with the sensor sitting on the lower arm. To save time I decided to add all the difference values together using the abs function, creating a single agitation value that was then communicated to the app. I tested the communication between the electronics and openFrameworks with ofSerial to make sure the app would receive the values OK, before parking the tech development and concentrating on the design of the prosthetic.
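A minimal sketch of that agitation value, assuming the older Adafruit_LSM303 library whose accelsensor example exposes lsm.accelData.x/y/z after lsm.read(); the baud rate and the 50 ms interval are placeholders rather than values taken from the piece.

```cpp
#include <Wire.h>
#include <Adafruit_LSM303.h>

Adafruit_LSM303 lsm;

float lastX = 0, lastY = 0, lastZ = 0;

void setup() {
  Serial.begin(9600);
  lsm.begin();               // start talking to the LSM303 over I2C
}

void loop() {
  lsm.read();                // refresh accelData
  float x = lsm.accelData.x;
  float y = lsm.accelData.y;
  float z = lsm.accelData.z;

  // Collapse the three axes into one "agitation" value:
  // the sum of absolute frame-to-frame differences.
  float agitation = abs(x - lastX) + abs(y - lastY) + abs(z - lastZ);
  lastX = x; lastY = y; lastZ = z;

  Serial.println(agitation); // read on the openFrameworks side with ofSerial
  delay(50);
}
```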
From doing my initial tests with the electronics I knew that I needed the Flora board on the shoulder and the accelerometer lower down on the arm. I wanted to make the prosthetic as adjustable as possible, accommodating as many different body sizes as possible. I started by using thick wallpaper lining paper to cut my first pattern, fitting it to the size 8 dressmaker's dummy that I have at home. I created loads of adjustable straps to adapt the distances between neck, shoulder and lower arm. From there I moved on to cutting it out from dressmaker's calico and sewing it together to make my next prototype. In the meantime I tracked down a leather workshop that sold amazing fittings, such as brass Sam Browne studs and brass eyelets, to pin the leather to the bottom layer of calico and make adjustable fittings. There I also found cheap leather offcuts for my next prototype. Being fairly happy with my paper pattern, it was time to digitise it by scanning it on a flatbed scanner and then tracing and tidying the lines in Illustrator. At this point I also created the exact openings for the electronics. I wanted little pockets that would let me whip the components out quickly and replace them in case they got broken. I placed the Flora board and the accelerometer on little calico pieces, sewing the connections down so they became more protected. I also designed little leather flaps to cover and further protect the areas where the wires were soldered onto the board. Now that I had a digital pattern I was able to laser cut the leather as a last test before my final prototype. The laser cutter charred the leather at the edges, producing an overwhelming BBQ-charcoal smell that I didn't want in my final piece. I assembled the prototype to see whether I needed to make any final changes to the digital pattern. Now I was able to instruct a leather worker to cut out my final piece from the leather, following my printed-out pattern and referencing the assembled prototype whilst doing so.
It was time to move on with creating the augmented phantom. For this I used Vanderlin's ofxBox2d addon to add real physics behaviour to the floating augmented images. I used the 'joint' example, which consisted of several joints, lines and an anchor. I combined this file with the OpenCV haarfinder example so that, once the app detects a face, it could hang this joint chain onto the user's shoulder. I created a separate class for the box2d trail and fed in my movement data through a variable that affects the trail. I created black outline squares and little white circles to mirror the aesthetic of the computer vision app and tested my movements. At this point I brought in some random images, fed them into a vector and drew them into the outline squares at the same x/y positions. Playing around with it, I decided it would be nice if the images randomly reloaded, adding an element of surprise and control for the user. I built an algorithm that reloads the vector whenever the user's agitation data accelerates the trail to the top of the screen. This worked pretty well, and by this point it was time to test it on location.
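A stripped-down openFrameworks sketch of that reload logic, leaving out the ofxBox2d joints; trailY, the serial port name and the image file names are hypothetical stand-ins rather than the piece's actual values.

```cpp
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofSerial serial;
    string lineBuffer;
    float agitation = 0;
    float trailY = 0;          // vertical position of the trail (stand-in for the box2d chain)
    vector<ofImage> images;

    void setup() {
        serial.setup("/dev/tty.usbmodem1411", 9600); // port name is an assumption
        trailY = ofGetHeight() * 0.5;
        loadRandomImages();
    }

    void loadRandomImages() {
        images.clear();
        for (int i = 0; i < 8; i++) {
            ofImage img;
            // files named images/img_0.png ... img_49.png are assumed for illustration
            if (img.load("images/img_" + ofToString((int)ofRandom(50)) + ".png")) {
                images.push_back(img);
            }
        }
    }

    void update() {
        // Accumulate serial bytes until a newline, then parse the agitation value.
        while (serial.available() > 0) {
            int c = serial.readByte();
            if (c == OF_SERIAL_NO_DATA || c == OF_SERIAL_ERROR) break;
            if (c == '\n') {
                agitation = ofToFloat(lineBuffer);
                lineBuffer = "";
            } else {
                lineBuffer += (char)c;
            }
        }

        trailY -= agitation * 0.5;  // agitation pushes the trail upwards
        trailY += 2.0;              // stand-in for gravity pulling it back down
        trailY  = ofClamp(trailY, 0, ofGetHeight());

        if (trailY <= 0) {          // trail reached the top: reload the images
            loadRandomImages();
            trailY = ofGetHeight() * 0.5;
        }
    }

    void draw() {
        for (size_t i = 0; i < images.size(); i++) {
            images[i].draw(40 + i * 110, trailY, 100, 100);
        }
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```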
I knew my installation set-up was going to be challenging, as I had to project onto a full-height wall whilst meeting the lighting needs of a computer vision app. I was positioned at the edge of the big space of the church, at the beginning of an area that was meant to be semi-dark whilst receiving all the light from the big space. The BenQ short-throw projector was surprisingly powerful, giving a pretty crisp, high-lumen projection in a semi-dark space. I fitted a PS3 Eye camera on the projection wall and a spotlight further up, shining at the user and towards the projector. I hung a sheet over the wall, hiding the camera except for a little hole for the lens to fit through. I hung my shoulder prosthetic from the ceiling on a thin bungee elastic, allowing it to be pulled down to different heights without any accidents.
Now I was ready to test the interaction space and the positions of the lamp by bringing in players of different heights throughout the day, for several days. Despite everyone's lighting requirements in the big open space being nailed down and fitted around my own, I found the experience quite frustrating, as the natural daylight would change throughout the day and I couldn't do anything about that. I settled for the shortcomings and carried on refining the app. For this I produced 50 images representing daily life and modern-world memes, and loaded them into the app to create a good level of variation to delight the user. Then I created a screenshot facility to record interactions, i.e. take a screenshot every time the app sees a face and then every 20 seconds of interaction, so I wouldn't end up with too many. I also designed a fixing hook for my bungee so the user could take the prosthetic off the elastic and wouldn't get tangled up in it whilst moving about.
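The screenshot throttle might look roughly like this openFrameworks fragment (it would live inside the ofApp); faceCount is assumed to come from the haar finder each frame, and the 20-second interval matches the description above.

```cpp
#include "ofMain.h"

float lastShotTime = -20;   // seconds; allows an immediate first shot
bool faceWasVisible = false;

void maybeSaveScreenshot(int faceCount) {
    float now = ofGetElapsedTimef();
    bool faceVisible = faceCount > 0;
    bool newFace = faceVisible && !faceWasVisible;             // a face just appeared
    bool intervalElapsed = faceVisible && (now - lastShotTime >= 20.0f);

    if (newFace || intervalElapsed) {
        // Save the current frame with a timestamped file name.
        ofSaveScreen("screenshots/shot_" + ofGetTimestampString() + ".png");
        lastShotTime = now;
    }
    faceWasVisible = faceVisible;
}
```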
Testing the prosthesis on well over 100 users over the course of three days, and observing each person interact with it, convinced me to build the experience wireless next time. It was a real joy to witness the different interaction behaviours users came up with. Some wanted to dance, some were play-acting, some wanted to use Leap Motion-style gestures to grab the images, and some were just terrified. Overseeing this process and closely engaging with over 40 users on opening night alone was really fascinating, despite also being very tiring. I briefly considered hiring someone, but wouldn't have had the time to train them to say the right things or ask the right questions. It was great to engage so closely with people, dressing the different body shapes and constantly learning about what could be done differently. A lot of users, predictably, suggested a more personalised experience, i.e. their own images. Talking through with them what this would mean, though, in terms of a privacy opt-in and an open projection of their online activity, they quickly understood that player take-up would be low and that the experience would most likely also look very different. I hope my conceptual approach to the matter gave them enough food for thought to imagine what images might be floating around in their heads. I also came across some interesting potential use cases from my audience. One person worked with paraplegic kids and suggested the experience had potential as a training app to reinvigorate a lazy arm. I had a similar comment from an Alexander Technique teacher. I'm interested in exploring this further and looking into integrating sensory feedback or linking the prosthetic to another device.
Overall I'm really happy with the prosthetic. It fitted most body sizes, with a few exceptions; I would like to make a few adaptations but am otherwise very happy with its look and feel. If I had had more time I would have integrated a little vibration motor within the shoulder to add sensory feedback when the user reloaded the vector or pulled in a particular image. Next time I would definitely go for a more controlled lighting set-up with a bigger interaction space, potentially even a huge monitor and a wireless prosthetic. Despite these constraints I was surprised by how many users enjoyed playing with the images, ignoring the obvious glitchiness of the facial tracking.
tsubakicraft · 7 years ago
Text
Swapping parts for the Bluetooth controller for my homemade navigation app
The parts for building the Bluetooth controller that will remotely operate my homemade navigation app have all arrived. They are all Adafruit products: a 3×4 keypad, a Bluefruit LE Micro, a lithium-polymer battery and a battery charging board. The finished controller should fit within the footprint of the keypad, and the thickness is probably just about acceptable too; removing the battery connector and soldering the leads directly to the board would make it a little thinner. As for mounting it on the bike, it looks like I won't be able to reach every key without lifting my hand slightly off the grip, but I should be able to attach it in a usable position. A custom-made keypad would give a much cleaner result, but designing and fabricating a case is a lot of work and I have neither the equipment nor the enthusiasm for it, so the first version will be kept as simple as possible.
tsubakicraft · 7 years ago
Text
Thinking about a motorcycle controller for the simple navigation app while prototyping it
A prototype BLE remote control using a Bluefruit LE Micro and tact switches. The circuit is simple: one terminal of each tact switch goes to ground and the other to a digital I/O pin on the microcontroller; the pin mode is set to INPUT_PULLUP, and when a digitalRead on a pin returns LOW, the corresponding character is sent over BLE. The video shows it running.
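A rough sketch of that circuit and logic, assuming the Adafruit BluefruitLE nRF51 library and its BLE UART AT+BLEUARTTX command; the SPI pins, button pins and the characters sent are placeholders for the actual wiring.

```cpp
#include <SPI.h>
#include <Adafruit_BLE.h>
#include <Adafruit_BluefruitLE_SPI.h>

Adafruit_BluefruitLE_SPI ble(8, 7, 4);   // CS, IRQ, RST (assumed defaults)

const uint8_t buttonPins[]  = {5, 6, 9, 10};        // illustrative switch pins
const char    buttonChars[] = {'U', 'D', 'L', 'R'}; // characters to send per switch
const uint8_t numButtons = sizeof(buttonPins);

void setup() {
  for (uint8_t i = 0; i < numButtons; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);  // the switch pulls the pin LOW when pressed
  }
  ble.begin();
}

void loop() {
  for (uint8_t i = 0; i < numButtons; i++) {
    if (digitalRead(buttonPins[i]) == LOW) {   // pressed
      ble.print("AT+BLEUARTTX=");
      ble.println(buttonChars[i]);             // send the mapped character over BLE UART
      delay(200);                              // crude debounce
    }
  }
}
```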
tsubakicraft · 7 years ago
Text
Thinking about a motorcycle controller for the simple navigation app
My homemade navigation app can already be driven by key input: starting and ending navigation, switching the map scale between radius presets (in metres and kilometres), and displaying the route-selection screen for rerouting and picking one of the candidate routes. I'm thinking of turning the Adafruit Bluefruit LE Micro I have on hand, an AVR microcontroller with a BLE module on board, into an HID keyboard to operate the app. I also have a lithium-polymer battery and a charging circuit, so I'd like to connect a keypad or some switches and mount the whole thing near the left handlebar grip. The photo shows Adafruit's keypad (currently out of stock); its dimensions are roughly 51 mm × 64 mm × 11 mm, which seems small enough not to get in the way. Daytona sells a product called a smart controller for motorcycles.
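A minimal sketch of the HID keyboard idea, assuming the Adafruit BluefruitLE nRF51 library and its HID AT commands; the advertised device name and the keystroke sent every two seconds are purely illustrative, and in practice the keystroke would be triggered by the keypad.

```cpp
#include <SPI.h>
#include <Adafruit_BLE.h>
#include <Adafruit_BluefruitLE_SPI.h>

Adafruit_BluefruitLE_SPI ble(8, 7, 4);   // CS, IRQ, RST (assumed wiring)

void setup() {
  ble.begin();
  ble.factoryReset();                                  // optional: start from a clean state
  ble.sendCommandCheckOK("AT+GAPDEVNAME=NaviRemote");  // advertised name (illustrative)
  ble.sendCommandCheckOK("AT+BLEHIDEN=On");            // enable the HID keyboard service
  ble.reset();                                         // restart so HID is advertised
}

void loop() {
  // Whenever a key on the pad is detected (not shown), send it as a keystroke:
  ble.print("AT+BLEKEYBOARD=");
  ble.println("1");              // the paired phone sees a '1' key press
  delay(2000);                   // demo only: one keystroke every two seconds
}
```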
tsubakicraft · 7 years ago
Text
A round-up of the build records for the dashcam and the speed warning light
I've posted records of building the dashcam ("DoraCame") and the speed warning light, but since fishing the articles out of the blog is a pain, I've collected them here. I hope this conveys the whole picture of the DoraCame project, including all the trial and error and the wrong turns along the way. There are still plenty of issues: the footage isn't clean because the camera picks up the motorcycle's vibration, and unlike commercial drive recorders I can't yet overlay a timestamp or the speed onto the recorded video. Even so, in the hope that it might be of some use to other makers, I've published everything, embarrassing parts included. Here is the speed warning light mounted on the bike, and here is the dashcam mounted on the bike. Below is a list of the related articles. "Shall I build a microcontroller speedometer?": at first, the Adafruit Bluefruit LE that I had on hand to play with 

tsubakicraft · 7 years ago
Text
More testing of the GPS speedometer & simple drive recorder today
I tested it today, including the GPS logging function. The mount is too weak to support the combined weight of the case and the iPhone, so the footage shakes badly. The mount position is also poor, so the view ahead isn't great; this is the footage I recorded. Here are the application's current screens. Home screen: a record button, G-sensor sensitivity setting, auto-record setting (start recording when the bike starts moving), picture-quality setting, audio-recording setting, GPS logging setting, and a button to show the video list. Video list screen: each video is named with the timestamp of when recording started. GPS log list screen: each file name includes the timestamp of when recording started, so videos and logs can be matched up.
tsubakicraft · 7 years ago
Text
Microcontroller speedometer, continued
Continuing with the speedometer; having started, I might as well see it through. I've got as far as searching for a route, calculating the distance and drawing the route on the map. The map control is still rough and not really usable yet. Getting it to behave like the commercial or published navigation apps will take a lot of work, but I'd like to keep making progress bit by bit. Yesterday I ran the app while out shopping, and perhaps because the GPS accuracy was good, the displayed speed looked pretty reasonable. If I mount a microcontroller speedometer on the bike, showing the speed plus some of the navigation information would probably make it easier to use, but finding somewhere to mount it and building a waterproof, dustproof case could be a struggle. It might be enough to just build an LED speed warning light; in that case I could simply tap the bike's speed-sensor signal and display that, and this prototype might end up being scrapped.
tsubakicraft · 7 years ago
Text
Microcontroller speedometer, continued
This is a continuation of this post. I have a feeling the system I actually end up using will be different from this one, but since I've come this far I'll keep going for a while. To try out the behaviour of the receiving-side Bluefruit LE Micro, I wrote an iPhone app, a simple one using CoreBluetooth, CoreLocation and MapKit. Here it is running. If you actually move around carrying the iPhone, the travel speed is obtained from GPS (with some error) and that value is sent to the microcontroller side, but in the video below I'm transmitting simulated speed values.
tsubakicraft · 7 years ago
Text
Microcontroller speedometer, continued
This is a continuation of this post. This version uses Adafruit's Adafruit_GFX and Adafruit_ST7735 libraries to show the readout on a TFT LCD. The drawing is too slow to be usable, so I need to look into either using 7-segment LEDs or switching to something that can draw much faster. The source code is up on GitHub.
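This is not the code from the GitHub repository, but a small sketch of an ST7735 speed readout that also shows one common way to fight slow redraws: clear the screen once, then overwrite only the changed digits using a text background colour. Pin numbers and the hard-coded speed are assumptions.

```cpp
#include <Adafruit_GFX.h>
#include <Adafruit_ST7735.h>
#include <SPI.h>

#define TFT_CS   10
#define TFT_DC    9
#define TFT_RST   8

Adafruit_ST7735 tft(TFT_CS, TFT_DC, TFT_RST);

int lastSpeed = -1;

void setup() {
  tft.initR(INITR_BLACKTAB);        // init a 1.8" ST7735 panel
  tft.fillScreen(ST7735_BLACK);     // clear once, not every frame
  tft.setTextSize(4);
  tft.setTextColor(ST7735_WHITE, ST7735_BLACK); // background colour erases old digits
}

void drawSpeed(int kmh) {
  if (kmh == lastSpeed) return;     // redraw only when the value changes
  tft.setCursor(10, 40);
  tft.print(kmh);
  tft.print("  ");                  // pad so shorter numbers wipe leftover digits
  lastSpeed = kmh;
}

void loop() {
  int kmh = 42;                     // placeholder: the real value would arrive over BLE
  drawSpeed(kmh);
  delay(100);
}
```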
tsubakicraft · 7 years ago
Text
Microcontroller speedometer, continued
I bought a battery and a battery charger so the Bluefruit LE Micro can run off battery power. The charger is a sort of daughterboard, which I soldered onto the Bluefruit LE Micro. Now, continuing from before: I eventually plan to use a TFT LCD, but for the time being, so I can see it working, I connected an OLED character display over I2C. The display is an SO1602A-type OLED that I bought from Akizuki Denshi. The source code is on GitHub and includes a library for driving the SO1602A. Here it is running: on startup it waits for a connection and the display changes from INITIALIZING to WAITING, and when a central device connects it changes to CONNECTED.
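A rough sketch of that state display logic; so1602Print() is a hypothetical stand-in for the SO1602A library bundled with the GitHub source, while ble.isConnected() comes from the Adafruit BluefruitLE nRF51 library.

```cpp
#include <SPI.h>
#include <Wire.h>
#include <Adafruit_BLE.h>
#include <Adafruit_BluefruitLE_SPI.h>

Adafruit_BluefruitLE_SPI ble(8, 7, 4);   // CS, IRQ, RST (assumed wiring)

void so1602Print(const char* msg) {
  // Placeholder: clear the OLED and print msg via the SO1602A driver.
  // The real project uses the library included with the GitHub source.
  (void)msg;
}

bool wasConnected = false;

void setup() {
  Wire.begin();
  so1602Print("INITIALIZING");
  ble.begin();
  so1602Print("WAITING");               // advertising, waiting for a central
}

void loop() {
  bool connected = ble.isConnected();
  if (connected != wasConnected) {      // update the display only on state changes
    so1602Print(connected ? "CONNECTED" : "WAITING");
    wasConnected = connected;
  }
  delay(200);
}
```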
tsubakicraft · 7 years ago
Text
Microcontroller speedometer, continued
Continuing from yesterday. There was an unused Genuino 101 in the workshop, so I tried using that board as a BLE central and connecting it to the Bluefruit LE Micro. It can scan for the device and connect, but it fails to retrieve the services and characteristics, and as of today I don't know why. The code is based on the CurieBLE -> Central -> PeripheralExplorer example sketch, but I have a feeling I'm misunderstanding something fundamental. I'm out of time today, so I'll investigate later. Using a smartphone app called LightBlue, I can connect to the Bluefruit and write to its characteristics. I've also tweaked the Bluefruit side a little since yesterday's version. Yesterday's BLE 

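For reference, a condensed central-side sketch with the same structure as the PeripheralExplorer example: scan, connect, then attribute discovery, which is the step that was failing. This is illustrative rather than the exact code in question.

```cpp
#include <CurieBLE.h>

void setup() {
  Serial.begin(9600);
  while (!Serial);

  BLE.begin();          // start the BLE stack on the Curie
  BLE.scan();           // scan for any advertising peripheral
}

void loop() {
  BLEDevice peripheral = BLE.available();
  if (!peripheral) return;

  Serial.print("Found ");
  Serial.println(peripheral.address());

  BLE.stopScan();                       // scanning must stop before connecting

  if (peripheral.connect() && peripheral.discoverAttributes()) {
    // Attribute discovery succeeded: list the services found.
    for (int i = 0; i < peripheral.serviceCount(); i++) {
      BLEService service = peripheral.service(i);
      Serial.print("Service: ");
      Serial.println(service.uuid());
    }
  } else {
    Serial.println("Connect or attribute discovery failed");
  }

  peripheral.disconnect();
  BLE.scan();                           // resume scanning
}
```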
tsubakicraft · 7 years ago
Text
Shall I build a microcontroller speedometer?
I'm thinking about replacing my motorcycle soon, moving from the XJR1300 to something a little better suited to long-distance, high-speed riding two-up. My only traffic violations so far have been for speeding, plus one for a passenger not wearing a seatbelt. Even when I don't feel like I'm going that fast, a big bike picks up speed all too easily. The next bike I'm considering is even faster than the XJR1300, so a moment of carelessness could land me in real trouble.
tsubakicraft · 7 years ago
Video
Operating the GPS speedometer / simple navigation app with the keypad. In the end I made the keypad able to send 30 different key codes. At present the app only uses eight of them, so there is still plenty of room to add functions.
tsubakicraft · 7 years ago
Text
Koshian breakout board
I have a breakout board for the Koshian that I bought quite a while ago; I think I stumbled across it on Amazon. I had been working on a speed warning light for the motorcycle on the assumption that I would use Adafruit's Bluefruit LE Micro, a board that combines an AVR microcontroller with a Bluetooth module. The Bluefruit is a small board, not much bigger than an Arduino Mini, but once the LED circuit and a battery are connected it doesn't stay all that compact. For just lighting some LEDs there had to be a smaller, simpler way, and when I picked up the box where I keep my microcontrollers I found an unused Bluetooth module, a Koshian. So I soldered the Koshian onto the breakout board mentioned above. Soldering at this pitch is extremely hard for me; I can barely see it.
tsubakicraft · 7 years ago
Text
Using rubber to suppress vibration in the GPS speedometer & simple drive recorder mount, but

Yesterday evening I sandwiched rubber into the joints of the home-made aluminium mount as an anti-vibration measure, so today I went out riding for a good stretch of time with the dashcam recording to try it out. Yesterday I also modified the program so that recording stops automatically when power is cut, so as long as the app is running and the USB cable is left plugged into the bike's USB socket, recording starts by itself when I ride off and stops by itself when I park and shut off the engine; there is almost no need to operate the app by hand any more. Feature-wise, the only real disappointment is that the timestamp and speed still aren't overlaid on the video; GPS logging and the speed display seem fine. It may just need some finer error checking and a little tightening of the BLE connection logic. The file size shown is for recording at 1280×720 / 30 fps with audio recording turned off.