yeseul-itp-blog
Yeseul @ ITP
106 posts
yeseul? http://yeseul.com itp? http://itp.nyu.edu
yeseul-itp-blog · 7 years ago
Text
Indigo
Belated documentation of our performance is here.
vimeo
"Indigo" by Michael Simpson and Yeseul Song is an audio-visual performance that uses synchronized graphics and sound to create an abstract reality where time and space have been reimagined. Viewers are invited to escape the logic of daily life and immerse themselves in a world defined by rhythmic patterns of light and sound. The real-time visual system is written in openFrameworks to visualize the sound composed by the two performers.
yeseul.com/indigo
ITP Big Screens Show 2017
Presented at the IAC Building, New York, NY
Advisor: Mimi Yin
Class Residents: Aaron Parsekian, Lisa Jamhoury
Huge thanks to friends and family for supporting us over the course of the last semester. We love the ITP community that made this happen 
yeseul-itp-blog · 7 years ago
Text
Harvesting Energy from Radio Waves (no success... yet)
What if we could capture and accumulate energy that exists in the air? Inspired by crystal radios, Jenny Lim and I tried to build a radio-frequency energy receiver, an analog circuit using crystal diodes. We want to see how much energy it can harvest and then decide what to use it for.
We were happy to find this Instructables post with a circuit diagram which looked promising. The circuit uses a copper wire antenna, two ceramic capacitors, two electrolytic capacitors, and two crystal diodes. 
Tumblr media
We were disappointed that the amount of energy we were able to harvest from the circuit was extremely small. Even after we added more capacitors to the circuit, the peak voltage was only 0.5 mV DC. The voltage varied depending on the position of the copper wire antenna and on whether we touched the end of the antenna with our (sweaty) fingers.
Tumblr media Tumblr media
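To put 0.5 mV in perspective, the energy stored in a capacitor is E = ½CV². A quick back-of-the-envelope sketch (the 100 µF capacitance and the LED figures are assumptions for illustration, not our exact parts):

```javascript
// Energy stored in a capacitor: E = 1/2 * C * V^2
const C = 100e-6;  // farads -- assumed 100 µF electrolytic
const V = 0.5e-3;  // volts -- the 0.5 mV DC peak we measured

const energyJoules = 0.5 * C * V * V; // ≈ 1.25e-11 J
console.log(energyJoules);

// For comparison, blinking one LED (~20 mA at ~2 V) for just 1 ms costs:
const ledBlinkJoules = 0.02 * 2 * 0.001; // 4e-5 J
console.log(ledBlinkJoules / energyJoules); // millions of times what we harvested
```

So even a 1 ms LED blink needs millions of times more energy than the capacitor holds at that voltage, which matches how disappointing the measurement felt.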
We looked at some other circuit diagrams for the same purpose, but we didn’t have the right parts for them and couldn’t tell whether they would work. There seem to be some products (such as this one that Jenny found) and research projects, but we decided to set this project aside for now considering the amount of time we have until the last class. The idea itself is still exciting to us, so the project might resume sometime later :)
References / Related Readings
Human Generated Power for Mobile Electronics by Thad Starner and Joseph A. Paradiso > 3.1 Catching the Ambience
RF-based Wireless Charging and Energy Harvesting Enables New Applications and Improves Product Design
yeseul-itp-blog · 7 years ago
Text
Solar Chime Video
youtube
yeseul-itp-blog · 7 years ago
Text
Solar Chime
I worked with Jenny Lim for the solar project. The documentation of the making process is here:  http://jlimetc.tumblr.com/post/171667818297/solar-chime
yeseul-itp-blog · 7 years ago
Text
Wireless Power Transmission
Link to the Slide
yeseul-itp-blog · 7 years ago
Text
Human Powered LEDs
Use human kinetic energy to create as much light as possible. 
vimeo
Jason Yung and I designed a simple mechanism that turns people’s everyday behavior into a small amount of electricity that can light LEDs. When people open or close the sliding door, the linear movement of the door mechanically spins the shaft of a stepper motor, which generates a small amount of AC electricity. We used a bridge rectifier to convert the AC to DC to turn on several LEDs.
Here’s the circuit diagram:
(will update the diagram soon)
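In the meantime, here is a rough way to reason about how many LEDs the rectified output can drive. All numbers below are assumptions for illustration, not measurements from our actual motor:

```javascript
// A full bridge rectifier drops two diode forward voltages on every half-cycle.
const vPeakAC = 6.0;  // assumed peak AC from spinning the stepper by hand
const vDiode = 0.7;   // typical silicon diode forward drop
const vLed = 2.0;     // typical red LED forward voltage

const vDC = vPeakAC - 2 * vDiode;          // DC available after the bridge
const seriesLeds = Math.floor(vDC / vLed); // LEDs we can stack in series

console.log(vDC, seriesLeds); // ~4.6 V, enough for 2 LEDs in series
```

With these assumed numbers, stacking more than two LEDs in series would starve them of voltage, which bears on the parallel-vs-series question below.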
Questions:
Which is the better design for this project: a parallel circuit or a series circuit?
Can we charge a battery instead of using up the generated electricity instantly by lighting LEDs?
What if we design an analog circuit to detect which direction the motor turns, so we can light up different colors of LEDs according to the direction?
yeseul-itp-blog · 7 years ago
Text
ADS1015+TMP36 on Pi3
https://learn.adafruit.com/raspberry-pi-analog-to-digital-converters/ads1015-slash-ads1115
Tumblr media Tumblr media
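The Adafruit guide above covers the wiring; converting a raw reading to a temperature is just arithmetic. A sketch, assuming the ADS1015 is at gain 1 (±4.096 V full scale, i.e., 2 mV per count on its 12-bit output):

```javascript
// ADS1015 at gain 1: ±4.096 V full scale => 2 mV per count (assumed gain setting)
const VOLTS_PER_COUNT = 0.002;

// TMP36: 500 mV offset at 0 °C, 10 mV per °C => temp = (V - 0.5) * 100
function tmp36CelsiusFromCounts(rawCounts) {
  const volts = rawCounts * VOLTS_PER_COUNT;
  return (volts - 0.5) * 100;
}

console.log(tmp36CelsiusFromCounts(375)); // a 375-count reading is 0.75 V -> 25 °C
```

The same formula works regardless of how the counts arrive (Python on the Pi, in our case); only VOLTS_PER_COUNT changes with the gain setting.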
yeseul-itp-blog · 7 years ago
Text
Kinetic Exercise - Idea
After reading The Power of Unwitting Workers, I wanted to use people’s everyday behavior to create electrical energy. And... I really liked the simplicity and cleverness of GravityLight. Still, GravityLight needs a human to ‘work’ to create the energy that lights the light--
What if we use the work people already do to open doors? We each open doors more than 10 times a day, and there are about 200 people on the floor! We could build a simple pulley system around a door to redirect the human force used to pull/push the door and convert it into electrical energy to turn on LEDs!
Tumblr media
Jason Yung and I are working together on this assignment, and today we made a simple prototype to prove the concept. We were glad that the mechanism itself worked :)
(click to see the video)
A list of things to do:
- make a gear mechanism following our plan
- make a circuit (movements from a stepper motor -> bridge rectifier -> capacitor -> light up LEDs)
- test over-the-door hanger to see if it secures our installation at the place
Questions:
- we’re planning to use a NEMA 17 stepper motor--what would be the most efficient gear ratio?
- how many LEDs can we light up? Can we light up an LED strip?
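A back-of-the-envelope estimate for the questions above; every number below is an assumption, not a measurement:

```javascript
// How much energy could the floor's doors supply per day?
const peoplePerDay = 200;   // people on the floor (our estimate)
const opensPerPerson = 10;  // door openings per person per day (our estimate)
const forceN = 20;          // assumed force to slide the door (newtons)
const slideM = 0.8;         // assumed door travel (meters)
const efficiency = 0.2;     // assumed motor + rectifier conversion efficiency

// Work = force * distance, scaled by how much of it we can actually capture
const joulesPerDay = peoplePerDay * opensPerPerson * forceN * slideM * efficiency;
console.log(joulesPerDay); // joules per day

// One 20 mA, 2 V LED burns 0.04 W => seconds of LED light per day:
console.log(joulesPerDay / 0.04);
```

Even with these generous assumptions the output is modest, which is why the bridge-rectifier-to-capacitor chain in the to-do list matters: we need to bank the energy between door openings.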
yeseul-itp-blog · 7 years ago
Text
Lifx: the first trial
Koji and I were both curious to try out Lifx, so we decided to sit together and get our hands on it.
I tried the Arduino Lifx library that Tom just wrote and shared on his GitHub.
The first thing I tried was running the example code (blinking) on an Arduino 101. It gave me errors when uploading the code. I tried to solve this by following some posts on the Arduino community forum, but haven’t gotten it working yet.
BLE firmware version is not in sync with CurieBLE library !!
* Set Programmer to "Arduino/Genuino 101 Firmware Updater"
* Update it using "Burn Bootloader" menu.
The library worked like a charm with ESP8266.
A problem that Koji (who was trying the Node library) and I both encountered is that the light bulb loses its connection to the network after a certain amount of time, for an unknown reason. We used the Lifx mobile app to connect the bulb to the itpsandbox network. When the bulb loses its connection, the arp and ping commands return these errors in the terminal (copied from Koji’s computer):
koji@devpi:~/lifx $ arp -a
128-122-6-172.DYNAPOOL.NYU.EDU (128.122.6.172) at <incomplete> on wlan0
koji@devpi:~/lifx $ ping 128.122.6.172
PING 128.122.6.172 (128.122.6.172) 56(84) bytes of data.
From 128.122.6.178 icmp_seq=1 Destination Host Unreachable
From 128.122.6.178 icmp_seq=2 Destination Host Unreachable
From 128.122.6.178 icmp_seq=3 Destination Host Unreachable
From 128.122.6.178 icmp_seq=4 Destination Host Unreachable
From 128.122.6.178 icmp_seq=5 Destination Host Unreachable
From 128.122.6.178 icmp_seq=6 Destination Host Unreachable
Reference
https://lan.developer.lifx.com/docs
yeseul-itp-blog · 7 years ago
Text
RESTful API + Node.js
Hayeon Hwang and I worked together to create a simple webpage using a RESTful API and Express.js. The website runs on a computer, and users can access it from any device connected to the internet by typing the IP address with the port number. When users click the yellow blank buttons, they arrive at a page that shows the answer that fills in the sentence.
Tumblr media
We used GET requests to trigger actions from buttons on the HTML page.
index.html
<form method="GET" action="/numguess">
  I know the number you picked is
  <input type="submit" value="  ...  " />
</form>
app.js
app.get('/numguess', function(req, res){
  const randNum = Math.floor(Math.random()*10);
  res.send(String(randNum));
});
References
https://expressjs.com/en/guide/routing.html
yeseul-itp-blog · 7 years ago
Text
Final + Thoughts
While researching Optical Communications, I stumbled upon a topic called hyperspectral imaging. I was fascinated by the potential to expand human vision, in this case to see more than we normally can. I have learned about the topic by reading articles online, talking to a physicist in the UK whose research is in optics, and getting involved in some online communities trying to build a DIY device that can take hyperspectral images. Some of the resources I found are loosely listed here.
Hyperspectral imaging uses a special camera to divide the visible and infrared spectrum into hundreds of narrow bands and record spectral data from each band for each pixel. The camera output is a stack of per-band images, often described as a data cube. While each pixel of an image taken with a digital camera has three values (RGB), a pixel of a hyperspectral image carries far more information, and the analyzed data can reveal chemical composition, material, moisture level, and much more.
Tumblr media
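To make the data cube concrete, here is a toy sketch: a small cube where each pixel holds a full spectrum instead of three RGB values. The dimensions and memory layout are assumptions for illustration:

```javascript
// A hyperspectral "data cube" is a 3D array: width x height x spectral bands.
// Toy dimensions: 4x4 pixels, 100 bands (real cameras vary).
const WIDTH = 4, HEIGHT = 4, BANDS = 100;

// One flat Float32Array, with all bands of a pixel stored together (assumed layout)
const cube = new Float32Array(WIDTH * HEIGHT * BANDS);

// The "image" of one pixel is a BANDS-long spectrum,
// versus the 3 values (RGB) a normal digital photo stores per pixel.
function pixelSpectrum(x, y) {
  const offset = (y * WIDTH + x) * BANDS;
  return cube.subarray(offset, offset + BANDS);
}

console.log(pixelSpectrum(1, 2).length); // => 100
```

Analysis software like ImageJ essentially works over vectors like this one per pixel, which is where material and moisture signatures come from.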
The technology and the devices are not affordable for most people, and taking images with the device and analyzing them in software requires considerable knowledge and effort. Hyperspectral imaging has mainly been used for industrial, military, medical, and research purposes, not for consumers. Although there are some companies working on consumer applications such as a phone app for food scanning (1, 2), the move is very recent and not many people even know about this technology yet.
I see an opportunity in hyperspectral imaging to answer questions I’ve had: What if humans could see more than the visible spectrum of light? What new things might we be able to see? What would it even look like?
What if there were a glasses-type device that people could wear to experience different visions? I propose a concept for a mixed reality app called Hyperspectral Lens. Using this app, users can select different spectra they want to see. The world seen in a different spectrum might not make sense to people right away, but they might learn how to see it over time. What range of the spectrum would people like to be in? What would people want to do with it? What do I want to see and do with it? What do I imagine about a world where people can switch their vision across different spectra?
I need more research to figure out the technical problems to be solved for implementation, but I don’t think this is impossible at all, considering that smaller hyperspectral cameras have appeared recently and GPUs are getting faster and more powerful.
Tumblr media
The slides from the presentation I delivered in class are here.
I love making things, and being in the process of making a project is exciting, but at the same time I’ve been thirsty for designated time to think and research topics as much as I want. Throughout the class, I touched many different topics stemming from Optical Communication. I started my research journey by studying the history of optical communication and its technologies, which led me to think about semiotics and light as a medium for delivering information. While I was sketching ideas for my light art installation Light Allusion, I encountered hyperspectral imaging, which reminded me of questions I’ve had about seeing the invisible spectrum. I only presented my last destination to the class, but every topic I explored and every question and idea I picked up along the way is equally precious to me. I’ll share some more ideas and thoughts later.
yeseul-itp-blog · 7 years ago
Text
Prototype
I just realized that a blog post I wrote about my project idea has disappeared... It’s frustrating. I don’t remember everything, but I do remember that these two sketches were attached to the post.
Tumblr media Tumblr media
yeseul-itp-blog · 7 years ago
Text
unit converter - chrome extension
One of the inconvenient moments while using the web is when I encounter measurements in the imperial system: inches, feet, pounds, etc. I’m familiar with the metric system, but where I live now uses the imperial system. I use the built-in calculator on my Mac or Google’s unit converter to convert units manually, but I find the process cumbersome and time consuming. I run into this almost every day when I buy clothing/shoes/hardware/furniture, see distances on a map, read recipes, check the outside temperature, etc. The same problem happens to people who use the imperial system when they’re in a country that uses the metric system.
Tumblr media Tumblr media
What if my web browser remembered which units I use and automatically converted measurements into the system I’m familiar with? I made a simple Chrome extension that does this using regular expressions (e.g., “/\s?inches\b/gi”). Here’s a demo of myself using the extension on an Amazon page.
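The core of the extension is a regex replace over the page text. A minimal sketch covering just inches → centimeters (the real extension handles more units):

```javascript
// Replace inch measurements in a text with centimeter equivalents.
// The pattern captures a number followed by an inch unit: "12.5 inches", "3 in".
function inchesToCm(text) {
  return text.replace(/(\d+(?:\.\d+)?)\s?(?:inches|inch|in)\b/gi, function (_, num) {
    const cm = (parseFloat(num) * 2.54).toFixed(1); // 1 inch = 2.54 cm
    return cm + ' cm';
  });
}

console.log(inchesToCm('The desk is 47.2 inches wide and 30 in deep.'));
// => "The desk is 119.9 cm wide and 76.2 cm deep."
```

In the extension this function runs as a content script over the page’s text nodes; requiring a digit before the unit keeps words like “information” from matching.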
Some of my classmates tried the app during the playtesting session in class. Here are notes from the user testing:
People found the app very useful, especially people who, like me, come from countries that use the metric system.
Some people doubted whether the converted numbers were correct. They wanted to see the original measurements on the webpage alongside the converted numbers.
Some ideas:
Another interface could show a converted unit when users click on a measurement; users could switch between the two units by clicking.
An extension that lets users define their own units (e.g., a pinch of powder, mom’s container, ancient units from different cultures...) could be another fun extension.
yeseul-itp-blog · 7 years ago
Text
Testing DMX Lighting
It’s been more than a week now, but Michael and I once considered that separate lighting, beyond the screen itself, might be effective for our project. Our plan was to light up the columns in the space to highlight certain moments during our performance. We’ve since changed our plan and decided to give all the power to the screen itself, but ..
Here are some photos/video from making things work.
I was excited to control a ColorBlast 12 and ColorBlast 8 from openFrameworks using ofxDmx.
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
yeseul-itp-blog · 7 years ago
Text
Progress
I’ve been keeping notes in my Dropbox Paper doc (here). I have two different projects in mind: light as an expressive medium, and hyperspectral imaging.
1. Light as an expressive medium: experiment with light as a communication tool. There are things that can’t be communicated in traditional ways, and light could work as a channel for implicit information.
Following Marina’s suggestion, I made a Pinterest board (here) to collect images that show light used as an expressive medium.
Light as a trigger of shared experience: I talked to Tom, who taught my Light and Interactivity class. We talked about whom I want to speak to and what I want to convey. He suggested using a specific quality of light to share memories with someone I’m close to. I’ve experienced how a specific song or smell strongly recalls specific memories; when I listen to a song repeatedly during a certain part of my life, I feel like I’m recording those moments onto the song. How about light? It does this too, but not as light itself. There are qualities of light that ring past moments in my memory, but the light is part of a scene that includes other objects, people, etc. Is it still meaningful to separate the light itself from the scene?
We recognize alphabets because we’ve been trained. How have we been trained to read the languages of light?
Projects Ideas
light allusion: a film that composes a story purely with light
light allusion: a room-size installation where visitors experience a series of situations. visitors walk into the room and the changing light in the room tells a story
take a film or a play, remove all objects/people/environment, leave only the lighting, and watch how it changes over time
2. More than we see: hyperspectral imaging
I encountered this technology while doing research and was fascinated that it enables humans to see a tremendous amount of information that is not normally visible.
Whereas the human eye sees color of visible light in mostly three bands (long wavelengths - perceived as red, medium wavelengths - perceived as green, and short wavelengths - perceived as blue), spectral imaging divides the spectrum into many more bands. This technique of dividing images into bands can be extended beyond the visible. In hyperspectral imaging, the recorded spectra have fine wavelength resolution and cover a wide range of wavelengths. (wikipedia: hyperspectral imaging)
My curiosity didn’t allow me to stop learning more about this technology and I started thinking about what I could do with hyperspectral imaging.
There are hyperspectral cameras used in research and industry, but they’re still very expensive. I talked to Eric Rosenthal, and he suggested an exercise to explore the concept of hyperspectral imaging: take black-and-white pictures of the same scene with different color filters in front of the camera, then recombine the images using multiple projectors fitted with the same filters. Alternatively, I can use software called ImageJ to combine and analyze the pictures.
I found a community trying to build a low-cost single-pixel camera for hyperspectral imaging and reached out to the founders. One of them (Richard Bowman) replied, and I sent him interview questions.
Project Ideas
Create images that are visible only in a certain range of wavelengths? The image might not reflect any visible light, which means it would appear all white or all black to human eyes but be visible through a special camera. Just like music that only dogs can hear.
yeseul-itp-blog · 7 years ago
Text
It was last night that all the ginkgo trees in my neighborhood decided to drop their leaves :) They all fell before they even turned yellow.
Tumblr media Tumblr media Tumblr media Tumblr media
yeseul-itp-blog · 7 years ago
Text
final idea: explore the relationship between caption and artwork
For a few years I’ve had an idea for an interactive installation that poses questions about the veracity of curatorial captions. I made a quick prototype (the video below), but haven’t had a chance to fully implement the project.
Here’s a part of the description I wrote for this work in the past: 
The audience is invited to adjust the length of the description to explore the intersection of the true nature of the work and its caption. The length of the caption can be adjusted using a physical controller equipped with a single knob. The original caption initially explains itself in hundreds of words; turning the knob to the left or right of center decrements or increments the length. The modified description is produced by removing or adding words at random positions within the original text.
vimeo
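The knob-to-caption mapping in the prototype can be sketched as a word-removal function; targetWords here stands in for the knob position (the names and the sample caption are hypothetical):

```javascript
// Shorten a caption to a target word count by deleting words at random positions,
// mirroring what turning the knob left does in the installation.
function shortenCaption(caption, targetWords) {
  const words = caption.split(/\s+/);
  while (words.length > targetWords) {
    const i = Math.floor(Math.random() * words.length); // pick a random position
    words.splice(i, 1);                                 // remove one word there
  }
  return words.join(' ');
}

const original = 'This painting interrogates the boundary between object and viewer';
console.log(shortenCaption(original, 4)); // four randomly surviving words
```

Because deletions happen at random positions, every turn of the knob yields a different fragment of the same caption, which is the point of the piece.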
Based on this idea, I’d like to extend the concept and refine the implementation by:
- using a Markov chain / grammatical analysis to generate sentences that make more sense
- training an LSTM model on curatorial texts and generating new captions from it
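The Markov-chain direction might start as small as this word-level sketch, trained here on a tiny made-up caption corpus (the corpus and function names are hypothetical):

```javascript
// Build a word-level Markov chain: for each word, record which words follow it.
function buildChain(text) {
  const words = text.split(/\s+/);
  const chain = {};
  for (let i = 0; i < words.length - 1; i++) {
    (chain[words[i]] = chain[words[i]] || []).push(words[i + 1]);
  }
  return chain;
}

// Walk the chain from a start word, picking a random successor each step.
function generate(chain, start, maxWords) {
  const out = [start];
  let w = start;
  while (out.length < maxWords && chain[w]) {
    w = chain[w][Math.floor(Math.random() * chain[w].length)];
    out.push(w);
  }
  return out.join(' ');
}

const chain = buildChain('the work explores the body the work questions the frame');
console.log(generate(chain, 'the', 6));
```

Trained on real curatorial text instead of this toy string, the same walk would produce caption-flavored sentences, with the LSTM as the more ambitious follow-up.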