thenashesizer-blog · 6 years
Text
Nashesizer Usability Questionnaire for Gemma
This questionnaire is a way for Gemma to log her experience of using the Nashesizer, offer feedback on how well it works and note glitches in the functionality that we can address later by refining the design and debugging the code.
The Nashesizer, like any new interface, will take time to learn and get used to. While initial impressions are important, we also want Gemma to spend time working with the device to get more familiar with its various modules, functionality and layout and, once she's got a good measure of it, offer additional insight.
The following general questions should help focus feedback for each of the modules:
Is it comfortable to use? If not, how might it be improved?
Is the module in the right place within the layout? If not, where should it move to?
Does it behave reliably? If not, how does it fail to behave as expected?
Do you notice any glitches? Does it seem temperamental or behave in odd ways?
What else could it do that it currently doesn't?
Joystick module
Navigates through tracks (horizontal) + clips (vertical)
Make sure to select focus on the Live track that matches the current Hub track #
Initial impressions of using it:
Later impressions of using it:
Buttons module
Transport mode (default - white) - play, stop + record
Track mode (press leftmost button for 1 second - turns yellow) - rightmost button mute, second rightmost button solo
Initial impressions of using it:
Later impressions of using it:
Rotary Encoders module
Left to right adjusts panning, send D, send E, send F
Initial impressions of using it:
Later impressions of using it:
Motorised slider module
Left to right adjusts volume, send A, send B, send C
Initial impressions of using it:
Later impressions of using it:
Nashesizer v2 iteration 1 - Demo Video 22-05-18
Nashesizer v2 iteration 1 - demo video 22-05-18 HB from Lewis Sykes on Vimeo.
A demonstration video - lightly edited to tighten up my numerous lengthy pauses and 'ermms' - of a first iteration of The Nashesizer v2. Details 2-way communication with Ableton Live 10 using Open Sound Control (OSC) - via the LiveGrabber v3.4 Max for Live devices and a custom-coded OSC to Serial bridge - the Nashesizer 'Hub' - written in Processing. Also demonstrates the functionality of the currently integrated modules - joystick, buttons, rotary encoders and motorised sliders.
The Nashesizer v2 Iteration 1
(Could be better) photos of the Nashesizer v2.
This is the first iteration of the Nashesizer - a bespoke music controller for Gemma Nash.
It's a basic test layout that integrates the master Teensy 3.6 base control module and the standalone joystick, buttons, rotary encoders and motorised sliders modules.
Having designed the modules to be as compact as the components will reasonably allow, the overall dimensions are a fair indicator of how big the final device will be (~40 [w] x 25 [d] x 10 [h] cm) - although it should shrink down a little further still.
This is a first step in settling on a final layout and arrangement of the various modules that's optimised for Gemma's comfort and dexterity level - and in response to her feedback from actually using it.
Next stage iterations will add a 7” touch screen display (top right) and a Time of Flight distance sensor with ‘on/off’ button as a ‘gestural controller’ module (top left).
Demo Video - mid March 2018
A short demo video demonstrating the collective functionality of the various standalone modules fabricated to date - the joystick, buttons and rotary encoders - once they’d been integrated with the master Teensy 3.6 ‘base’ module and Ableton Live via the LiveGrabber Max for Live devices.
Mike Cook was still working on the motorised sliders module so it’s not yet included.
Functionality is still a bit ‘temperamental’ - but it does give a feel for how the various modules within Nashesizer work together and link to Live.
Fabricating Standalone Modules
A first stage in designing the Nashesizer and in integrating its various modules into its code has been to fabricate standalone versions of the joystick, buttons and rotary encoders modules.
The designs are based on simple top and bottom plates laser cut from 3mm acrylic and separated by nylon standoffs of appropriate height - though there are additional design refinements within each module.
The joystick module required a mounting solution for its 4 x NeoPixel Mini LED indicators, laser cut from 2mm acrylic.
The buttons module required additional right-angle brackets to mount the PCB vertically on its back.
The rotary encoders module demanded the most in terms of design - and required custom mounts to house the four rotary encoders turned through 90 degrees to act as ‘thumbwheels’ and slots in the top plate to accommodate the wiring.
Apart from the rotary encoders module - which is currently adjustable so we can test and settle on the optimum spacing between the encoders once Gemma has the device to play with - the modules were designed to be as compact as the components allowed. This was so we could test different module layouts and configurations while keeping the whole unit as compact as possible (it's still quite big compared to many commercial MIDI controllers).
Once we've settled on final designs we'll upload the Illustrator files to the Nashesizer GitHub account.
joystick module
buttons module
rotary encoders module
PCBs, PCBs...
This Tumblog is overdue some posts on Nashesizer progress…
First off some documentation on development of the PCBs for the joystick, buttons and rotary encoder modules.
These follow the schematics developed by Lewis and Mike and available via the Nashesizer GitHub account - https://github.com/nashesizer/schematics
The joystick and buttons modules each use an Arduino Nano clone as the on-board I2C controller. The rotary encoders module uses a Teensy 3.2 (there weren't enough pins available on the Nano to accommodate the 4 encoders and their built-in RGB LEDs) mounted on a Breakout (rev 3) from Tall Dog via Tindie - https://www.tindie.com/products/loglow/teensy-32-breakout-revision-d/. This module also incorporates an I2C-safe Bi-Directional Logic Level convertor to shift the I2C lines from 5 to 3.3V.
The PCBs are made up on either an Adafruit Perma-Proto 1/2-sized breadboard PCB - https://www.adafruit.com/product/571 - (it features a track layout that mirrors a breadboard) or on a Protostack prototyping board - https://protostack.com.au/product-category/boards/prototyping/ - ("based on a standard breadboard with a few of our own improvements").
joystick module PCB
buttons module PCB
rotary encoders module PCB
New GitHub Repository
All our working code and schematics are now being uploaded to a project GitHub repository - https://github.com/nashesizer.
The Nashesizer - initial specification document - 30-01-18
In an email from Lewis to the team on 30-01-18:
I’ve worked up an initial spec document…  to collect together thinking and discussion to date and as a useful summary for the end of the ‘Scoping & Planning’ phase. It’s fairly long and a bit technical - sorry - but I think it describes what we’re trying to do and how we’re going to implement it reasonably thoroughly which is what is needed. Feedback welcome as ever.
The Nashesizer - initial specification document 30-01-18
Introduction
The Nashesizer is a custom-made music controller designed with and for sound artist Gemma Nash.
Gemma has Cerebral Palsy which affects her movement and co-ordination. This makes many commercially available music controllers difficult for her to use.
The Nashesizer responds by combining a range of input devices and an optimised layout that best suits Gemma’s comfort and level of dexterity. These inputs - ranging from a joystick to a gestural controller - have been selected and refined through an iterative design process that includes cycles of user testing and feedback.
The Nashesizer is closely integrated with the Digital Audio Workstation (DAW) Ableton Live - Gemma’s preferred choice of commercial music production software. Live is powerful but also complex - with a dense Graphical User Interface (GUI) that frequently requires both accuracy in positioning the mouse and also significant movement across the screen to access specific functionality. Through a considered mapping of device inputs to Live parameters (through OSC via the LiveGrabber Max for Live devices, MIDI and emulated keyboard presses), the Nashesizer provides a complementary mechanism for Gemma to navigate through and control Live more easily and so speed up her workflow.
Modules
Currently planned input 'modules' include: a joystick; a bank of four motorised faders; a bank of four rotary encoders (turned through 90 degrees to provide rotating vertical 'track-wheels' rather than horizontal knobs); a bank of multi-functional buttons; a 7" touchscreen display; a vertical distance sensor as a 'gestural controller' with an on-off foot-switch; and a DIY trackball using an adapted optical mouse sensor.
Each ‘slave’ module is controlled by an Arduino Nano - or in the case of the 7” touchscreen display a Raspberry Pi 3 Model B - that communicates with the ‘master’ Teensy 3.6 via I2C. The Teensy collects all the input data from the various modules, translates it into OSC, MIDI or keyboard presses and then sends this data into Live - either through the Nashesizer-Hub app (a custom-coded OSC via Serial bridge written in Processing) and the LiveGrabber Max for Live devices; via the Teensy’s USB MIDI functionality; or as presses on an emulated keyboard.
Using the I2C bus means that each module requires a standard 4-wire connection - VCC, Ground, SCL and SDA data lines - and can be easily wired together in series.
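As a sketch of what that master/slave polling looks like in practice - simulated in a few lines of Python here purely for illustration; the addresses and payloads are hypothetical placeholders, and on the Teensy itself this would use the Wire library:

```python
# Illustrative only: the master polling each slave module in turn over the
# shared I2C bus. The 7-bit addresses below are invented for this example.
SLAVE_ADDRESSES = {
    "joystick": 0x08,
    "buttons": 0x09,
    "encoders": 0x0A,
    "sliders": 0x0B,
}

def poll_all(read_fn):
    """Poll every slave in turn, collecting (module, payload) readings.

    read_fn stands in for an I2C read: it takes an address and returns the
    module's current state bytes, or None if the module has nothing new.
    """
    readings = []
    for module, address in SLAVE_ADDRESSES.items():
        payload = read_fn(address)
        if payload is not None:
            readings.append((module, payload))
    return readings

# A fake bus where only the joystick has new data:
fake_bus = {0x08: b"\x01\x00"}
print(poll_all(fake_bus.get))
```

The master then translates each reading into OSC, MIDI or a keyboard press as described above.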
The DIY trackball will either adapt an ‘off-the-shelf’ optical mouse or use an optical mouse sensor with an Arduino Pro Micro configured as a HID USB Mouse which will plug into the computer independently.
Joystick module
The joystick is limited to movement in the horizontal and vertical axes.
In default mode it sends keyboard presses equivalent to the left, right, up and down functionality of the arrow keys on an OSX computer keyboard.
In Live’s ‘Session’ view, moving the joystick left or right moves focus between horizontal tracks. Moving the joystick up and down moves focus between clips in a selected track.
In Live’s ‘Arrangement’ view, moving the joystick left or right shifts position along the timeline. Moving the joystick up and down moves focus between vertical tracks.
Holding the joystick in any position automatically repeats the movement, speeding up navigation.
Four NeoPixels arranged in a cross provide additional visual display.
In an alternative mode (selected via the multi-functional buttons module) the joystick could also be used to set Loop Start and End and Start and End Markers in Live’s Sample Display/Note Editor window.
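The hold-to-repeat behaviour above could be timed along these lines - a sketch under assumed values (the 400 ms start, 80 ms floor and 500 ms step are illustrative, not the actual firmware's):

```python
# Hypothetical auto-repeat timing: the longer the stick is held, the
# shorter the interval between repeated key presses, clamped to a floor
# so navigation never becomes uncontrollable.
def repeat_interval_ms(held_ms, start=400, floor=80, decay=0.8):
    """Interval before the next auto-repeat, given how long (ms) the
    stick has been held. Starts at `start` ms, decays geometrically per
    elapsed 500 ms step and never drops below `floor` ms."""
    steps = held_ms // 500
    interval = start * (decay ** steps)
    return max(int(interval), floor)

for held in (0, 1000, 3000, 10000):
    print(held, repeat_interval_ms(held))
```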
Motorised fader and rotary encoder modules
In combination with a module of four rotary encoders, four 60mm-throw motorised faders control set parameters within a selected track - nominally volume, panning and Sends A - F.
By default a specific rotary encoder or fader will always control the same parameter e.g. fader one will always control volume irrespective of which track is selected.
The rotary encoders have a common anode RGB LED built in and each fader has its own NeoPixel LED Side Light Strip to provide additional visual display.
Selecting a different track - either through the joystick or via the Live interface - will automatically update the rotary encoder and fader positions and their LEDs (after a suitable timeout that prevents them continually resetting while navigating through multiple tracks).
A possible MIDI mode - selectable via the touchscreen and/or multi-functional buttons - will 'release' the default OSC communication of one or more of the rotary encoders and faders so that they send MIDI Control Change data instead - which can then be assigned to a specific parameter in Live via its MIDI Mapping functionality.
Additionally, the rotary encoders and faders can be globally switched on-off using a similar mechanism.
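The settle timeout that stops the faders continually resetting while scrolling through tracks could work along these lines - a minimal sketch, assuming a 300 ms timeout (the names and value are illustrative, not from the actual code):

```python
# Hypothetical settle-timeout: track-selection changes are recorded, but
# the motorised faders are only told to move once the selection has been
# stable for `timeout_ms`.
class SettleTimeout:
    def __init__(self, timeout_ms=300):
        self.timeout_ms = timeout_ms
        self.pending = None
        self.changed_at = None

    def select(self, track, now_ms):
        """Record a new selection at time now_ms (restarts the timer)."""
        self.pending = track
        self.changed_at = now_ms

    def due(self, now_ms):
        """Return the track to apply if the selection has settled, else None."""
        if self.pending is not None and now_ms - self.changed_at >= self.timeout_ms:
            track, self.pending = self.pending, None
            return track
        return None

s = SettleTimeout()
s.select(3, now_ms=0)
s.select(4, now_ms=100)   # still scrolling: timer restarts
print(s.due(now_ms=250))  # too soon
print(s.due(now_ms=450))  # settled: apply track 4
```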
Multi-functional buttons module
A bank of four, large, tactile buttons each with an RGB LED provide various functions.
By default, the three rightmost buttons act as transport controls for Live - play, stop and record.
On hold, the leftmost button steps through the various 'modes', changing the functionality of the other three buttons accordingly - for example, in 'track' mode, the two rightmost buttons mute and solo the currently selected Live track.
The four buttons will also likely be used to navigate the on-screen elements of the 7” touchscreen display (left, right and scroll up and down).
A complete list of button modes is still to be finalised. The potential for additional functionality is built-in - what that functionality actually is depends on ongoing testing and feedback by Gemma.
7” touchscreen display
Through a custom-coded controlP5 GUI in Processing, the 7” touchscreen display provides secondary visual feedback on currently selected track parameters by default.
Via its 'single-touch' touchscreen functionality, it will also enable some customisation of the outputs from Nashesizer input modules - e.g. the MIDI mode for the motorised fader and rotary encoder modules described above. It could also be used to adjust/draw the Clip, TrackGrabber and Mixer envelopes in Live's Sample Display/Note Editor window.
Initial testing of the 7" touchscreen display for Gemma's ease of use indicates that any touch-controlled GUI elements such as buttons and faders need to be relatively large.
A complete list of 7" touchscreen display modes is still to be finalised. Like the multi-functional buttons, the potential for additional functionality is built-in - what that functionality actually is depends on ongoing testing and feedback by Gemma.
Gestural control sensors
We tested a selection of distance/capacitive sensor modules that might provide 'hands-free' control of Live parameters. The Hover and ZX Gesture sensors, which detect hand sweeps up and down and left and right - like a 'hands-free' XY pad - aren't suitable: Gemma finds the movement too difficult to reproduce consistently. The TOF distance sensor, which detects the height of a hand held above it - particularly when coupled with a foot-switch to activate/deactivate it and an LED matrix to display the distance from the sensor - proved far more suitable and will be integrated.
This is likely to send MIDI data only and will be assigned to a specific parameter in Live via its MIDI Mapping functionality.
It could be useful for live performance by controlling audio FX via track sends in real-time - or during production by drawing Clip, TrackGrabber and Mixer envelopes in Live’s Sample Display/Note Editor window.
Modified optical mouse sensor
Gemma's current trackball is old (>10 years), large and fairly clunky. However, she is very familiar with and adept at using it and she really likes some of its custom functionality - such as an easily accessible button to switch between coarse and fine control.
We did select and buy a modern commercial trackball for Gemma to try out in the earlier prototyping phase - but she didn’t like it. On reflection this is perfectly understandable - a mouse or trackball is absolutely essential for controlling a computer, but it’s a highly tactile, significantly embodied and deeply personal interface - and an ‘off-the-shelf’ commercial device is unlikely to suit.
Suitable trackball components for a custom design and build are also expensive - >£150.
Accordingly, realising a bespoke trackball for Gemma is an entire project in its own right - likely beyond the scope of the Nashesizer.
However, attempting a DIY trackball by fitting a cheap optical mouse sensor into a 3D printed mount below a free spinning marble or ball bearing is worth testing - though not a priority.
Gesture and User interface explorations - Mike Cook
The provisional block diagram of the Nashesizer includes two modules, one labelled 'TFT + touchscreen' and the other 'distance/capacitive sensor'. These notes detail some of the interfaces we considered and the demonstration code written as a proof of concept. This is being written in advance of the all-important testing Gemma has to do to see if these proposals are a help to her.
We considered three basic sensors for the initial work: 1) The Hover sensor from http://www.hoverlabs.co/products/hover/ - this is basically an electric field sensor that can detect the field disturbance from a hand. Various types of gesture recognition are built in, mainly swiping up, down, left and right, as well as distance detection above the sensor.
2) A TOF distance sensor using the chip from this product - https://www.adafruit.com/product/3316 - though this of course will only measure one parameter.
3) The ZX gesture sensor from SparkFun - https://www.sparkfun.com/products/retired/12780. While this is no longer available, there is an almost identical product using different components. This is like a combination of the other two and, while using optical sensors, it can be made to recognise gestures.
I made an Arduino shield that accommodates plugging in each of the three sensors one at a time. User feedback was either text output on a serial monitor or an 8 by 8 array of NeoPixel LEDs. In addition, a foot switch was used in some tests to gate the sensor output.
Touch screen - a 7 inch touch display was attached to a Raspberry Pi computer and used to present various forms of user touch interface, in the form of sliders and envelope drawing. The sliders' width and spacing can be easily changed to test what Gemma finds comfortable. Unfortunately this display, while being touch sensitive, is not multi-touch sensitive, so things like a dual XY control are not very useful.
Lewis - Notes from Skype with Mike Cook - 11th Jan
Ableton Live
We agreed the need for the team to each have a working version of Ableton Live that includes Max for Live (M4L) functionality. Lewis reiterated he’d email Ableton asking if they’d be prepared to support the project by providing licences.
Testing the ‘gestural control’ demos
Mike's prepared a couple of alternative 'gestural control' demos for Gemma to try out - but with her being ill we haven't yet managed to meet up. Lewis to confirm a session at Gemma's on the afternoon of the 29th January - before the Drake Music Lab North meet up at Eagle Labs.
Testing the Raspberry Pi 7” Touchscreen display
The Nashesizer needs its own display - but we're uncertain how Gemma might manage if it also had touchscreen functionality. Mike has an official Raspberry Pi 7" Touchscreen Display and a range of existing demos that we can use for Gemma to try out and help us decide whether this could be useful for her. The larger size and screen resolution (800x480px) means this option would certainly allow for more info to be displayed and likely make touchscreen interactions less 'fiddly' for Gemma.
We agreed that coding a GUI in Processing, which would allow it to be developed on a laptop - rather than having to work directly on the Raspberry Pi - is the way to go. Lewis to investigate and start testing the various Processing GUI libraries that are available.
OSC, MIDI and 2-way comms
We discussed 2-way comms between Live and the Nashesizer, why we’d use OSC in preference to MIDI and various approaches to realising it.
We’ve already agreed that it is pretty essential that the Nashesizer gets updated when a change is made directly in Live. Also that it gets updated when a new project is loaded.
While current testing confirms that the latest LiveGrabber set of M4L devices will do the former - we weren’t sure if they could do the latter (though a subsequent test by Lewis indicates that LiveGrabber does send all Parameter values out on project load). Lewis to work up a new default Nashesizer Ableton Live file to include LiveGrabber M4L devices.
These requirements aside we agreed there are other benefits to using OSC over MIDI…
While Ableton Live has a well implemented MIDI mapping system that makes assigning specific MIDI controller knobs, faders and buttons relatively painless, it still means that at the Nashesizer end we'll have to work with the fairly abstracted MIDI protocol - e.g. assigning a particular knob, fader, button, touchscreen interaction or other potential input to a specific MIDI channel, Note On/Off, Control Change Number/Value etc. Using OSC we can just address a specific parameter within Live - e.g. track3/volume/1.0 - which is much more literal to write and read back in the code.
Using OSC ‘future-proofs’ the Nashesizer for more flexible networked setups that Gemma may want to explore in the future.
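The "more literal" point can be shown in a few lines - a sketch where the address scheme loosely echoes the track3/volume/1.0 example above, and the MIDI mapping table is an arbitrary invention for the comparison:

```python
# An OSC-style address is self-describing; the MIDI equivalent is an
# opaque channel/CC pairing that only means something via a lookup table.
def osc_message(track, parameter, value):
    """Build a self-describing OSC-style address with its argument."""
    return f"/track{track}/{parameter}", value

def midi_equivalent(mapping, track, parameter, value_0_127):
    """The same intent via MIDI needs an external (track, parameter) ->
    (channel, cc) lookup before the raw bytes mean anything."""
    channel, cc = mapping[(track, parameter)]
    return bytes([0xB0 | (channel & 0x0F), cc & 0x7F, value_0_127 & 0x7F])

print(osc_message(3, "volume", 1.0))
mapping = {(3, "volume"): (0, 7)}  # arbitrary assignment for the example
print(midi_equivalent(mapping, 3, "volume", 127).hex())
```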
monome.org's serialosc acts as a bridge to send OSC via Serial over a USB cable. While monome.org devices use a specific set of OSC messages, serialosc is tried and tested - so we wondered whether it would be possible to use it to send OSC messages between Live and the Teensy 3.6 in the Nashesizer. Unfortunately, we've since realised that while it is possible to configure some Arduino boards to use serialosc, it requires setting a serial number on the FTDI chip so that the device can be seen once it's attached to the computer. Teensys don't have an FTDI chip - so this approach won't work.
We’ve since decided to programme our own bridge in Processing and Mike has already developed a test app and drawn up an initial specification…
The Nashesizer-Hub - Initial specification document
Scope: the Hub is a program written in Processing to handle OSC messages between the Teensy and Live.
In addition the Teensy can also supply MIDI messages directly into Live. This is done by setting the USB Type to "Serial + MIDI" when compiling the Teensy code. OSC messages can be sent from the Teensy into the Hub as serial string messages; the Hub will then send the message on to Live via OSC. This allows MIDI and OSC messages to be exchanged with Live through the one USB connector.
Any OSC message generated by Live is decoded and a string sent via serial to the Teensy.
The format of the string messages to and from the Hub is yet to be decided but should be designed so that they are easy to decode.
Current example code simply prints out any message.
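Purely as an illustration of "easy to decode", here is one candidate scheme - a newline-terminated line of space-separated fields, OSC address first, arguments after. This is a guess at a possible format, not the one the Hub ended up using:

```python
# Hypothetical serial string format for the Teensy <-> Hub link.
def encode_line(address, *args):
    """Serialise an OSC-style message as one newline-terminated serial line."""
    return " ".join([address, *map(str, args)]) + "\n"

def decode_line(line):
    """Split a serial line back into (address, [float args])."""
    address, *rest = line.strip().split(" ")
    return address, [float(a) for a in rest]

wire = encode_line("/track3/volume", 0.85)
print(repr(wire))
print(decode_line(wire))
```

A plain-text format like this is trivial to parse on both the Teensy (string splitting) and in Processing, at the cost of being slightly more verbose on the wire than a binary encoding.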
Powering the Nashesizer
Mike has already suggested that we use a 12V 4A PSU to power the Nashesizer (the prototype was powered via the USB cable which frequently didn’t supply enough current to drive both the motorised fader and OLED display) and he has several we can use. He also has the required step down switching regulators or buck convertors to convert the 12V supply to values suitable for other components.
More robust USB socket
The micro-USB socket on the Teensy 3.6 in the prototype wasn't robust enough - so we need a USB-A female to micro-USB male solution so that the Nashesizer gets connected via a standard printer-style USB cable. Mike suggested Pimoroni and/or ModMyPi might stock a convertor board with both connectors on. As it turns out, Pimoroni stocks suitable 50cm panel mount extension cables.
Optical Mouse module
We've previously suggested the idea of testing an optical mouse sensor as an alternative to a trackball, which is really expensive. James has flagged up a project that uses a couple of these sensors and a marble in a custom mount to create a DIY trackball. This ADNS-3050 Optical Sensor Board (Tindie, $20.95) may be a good option - but we'll start by buying and deconstructing a cheap (max £10) second-hand optical mouse and seeing if it's usable.
Craig- Notes from webinar with Lewis 01-12-2017
Lewis called me and we discussed the progress of the project so far. Lewis, as project lead, was keen to find out what my thoughts were. My role is as music technology consultant. We agreed that although the prototype was a valuable experience, it was a somewhat rushed process. Moving forward, I stated that I would prefer future sessions planned in advance with Gemma to enable real-world use of the controller in situ. The feedback from these sessions will help drive development. This will be a constant two-way process culminating in several implementation and testing phases during the time we have with the funding from Sound & Music.
These sessions will focus on using Ableton and establishing how the hardware and software can be customised to make Gemma's current workflow more efficient and/or creative. Gemma will give feedback during these sessions and we can establish in fine detail what she wants to achieve. The scope at this stage will be set by Gemma herself because we are intent on making an instrument WITH Gemma not FOR Gemma. I will create lesson plans, conduct end-of-session reviews and document flow diagrams detailing processes which need to be coded - derived from Studio One, essentially the I/O skeleton of the system, transposed into the Ableton equivalents to help Gemma understand the new software more easily. We will also begin to build a template (or templates) which will evolve over time and be used as the basis for the main physical functions of the Nashesizer.
We will use the Korg Nanopad and keyboard which were bought for Gemma at this stage to help her get used to using Ableton with a MIDI controller. Previously she had used only a mouse and keyboard, drawing in MIDI and editing sounds. We have all agreed, including Gemma, that some of these sessions may be recorded on video and also that they should be critical by nature, working within an open, respectful environment, because we feel this is the only way that improvements can be made - faults need to be identified before we can seek a solution.
Lewis - Notes from meeting with James Medd + Mike Cook - 14-12-17
Mostly we caught James up with discussions to date…
Key agreed points that came out…
Ableton Live - Lewis to contact Ableton, alert them to the project and request support via user licences for The Nashesizer team and ongoing advice from the Ableton technical team who are likely most knowledgeable about what we’re attempting.
We're agreed that two-way OSC over Serial via USB between Live and The Nashesizer is essential. We could do this via MIDI easily enough - but its protocol adds a layer of abstraction we'd like to avoid. We're each going to look at various combinations of Live, found Max4Live devices (James), monome.org's serialosc and the Arduinome firmware (Mike) and other OSC via Serial and Teensy 3.6 examples (Lewis), report back with findings and decide on the best approach from there. We'll likely develop a customised solution integrating various elements from this research and testing.
While we still have to test whether a touchscreen is actually that effective an input mechanism for Gemma - and Lewis still has to think more about what it could actually be used for and its various 'modes' - we agreed that using a Raspberry Pi 3 Model B or Zero would be the best solution for actually driving it. Mike confirmed the Pi could control the touchscreen directly, via its Linux OS or via a Processing sketch and associated GUI libraries.
Now that we’ve received the first tranche of money from Sound and Music Lewis will get on with compiling a bill of materials (BOM) to distribute for feedback. He’ll then order some of the components we’ve agreed are likely to be included in a testing and development version of The Nashesizer - the joystick, motorised sliders and rotary encoders.  We’ll then start working up test versions of these modules - extending and adapting Mike’s schematics from the prototype as well as addressing particular design and layout issues we’ve already identified and prioritised - optimising the spacing of the motorised sliders for Gemma’s use and designing and fabricating the mechanism for an alternative rotary encoder set up where the encoder is turned through 90 degrees.
Gemma still has to try out Mike's 'gestural control' module demos - though we liked the idea of a pixel matrix as a visual display for the Skywriter Hat as a possible 'hands-free' X-Y pad. We agreed the Adafruit VL53L0X Time of Flight Distance Sensor or Sparkfun ZX Distance and Gesture Sensor requires an 'on/off' switch - ideally a foot switch - to be at all useful as an input mechanism - but again this is something Gemma has to test.
We discussed the idea of developing a feedback questionnaire for Gemma - to encourage her to spend more time testing prototypes and not dismiss them too quickly.
Unfortunately Gemma’s not been well - and so we’ve not been able to meet up and discuss our planning and scoping with her nearly as much as we’d like - though this definitely has to happen before we start to firm up a clear outline for the next ‘development and testing’ phase. This will also likely have an impact on the proposed timeline - though we’re aware Gemma has a Metal residency opportunity in mid-Feb we’re keen to respond to.
Lewis - Notes from meeting with Mike Cook + Craig Howlett - 04-12-17
We discussed a few issues in a brief meeting before the DM Lab North meet up at Eagle Labs.
Mike showed the demos of the ‘gestural control’ module he’s made up for Gemma to test… using the Skywriter Hat and Sparkfun ZX Distance and Gesture Sensor. He’s going to make a post about these to the Tumblog shortly.
We discussed ideas for integrating an Adafruit 3.5" TFT 320x480 + Touchscreen (£29.94, Digi-Key) - agreeing that the larger display would be useful but questioning whether the touchscreen would actually be that effective for Gemma. Using it for the Ableton Live transport control seems obvious - but since these are so integral to driving Live they may be better as physical buttons. Lewis is going to think a bit more about the various ‘modes’ for the touchscreen and what it could be used for. Mike suggested switching to the larger Raspberry Pi 7” Touchscreen Display (£59.99 - Cool Components) - which could make touchscreen functionality central to driving The Nashesizer though it may take up too much space.
We spent the rest of the meeting discussing communication between Ableton Live and The Nashesizer.
The current prototype has only one-way communication - from The Nashesizer into Ableton Live via MIDI. Changes to one of the input modules are read by the base module's Teensy 3.6, which converts this data into relevant MIDI messages (Note On and Control Change) and then sends these via Serial over USB. The MIDI mappings in the default project assign these MIDI messages to various controls within Live.
The problem with this approach is that The Nashesizer won’t update if a change is made within Live itself. Also there’s currently no way to initialise The Nashesizer to reflect project settings when Gemma opens an existing project.
We did buy a Pro licence for Remotify - a service which allows the creation of "midi remote scripts to build advanced midi mapping controls, not possible with Ableton's own mapping engine alone" - for the previous prototype stage, but although it makes mappings more customisable it's still essentially one-way communication - from the controller into Live.
Ableton Link has an SDK for developers and a cross-platform source code library - but it’s intended as a technology that “synchronizes musical beat, tempo, and phase across multiple applications running on one or more devices”.
There are commercial controllers which do have this two-way communication built in - such as Ableton’s own Push and Novation’s Launchpad series - but they aren’t open sourced.
Robert Henke has created a series of Max4Live devices for Ableton Live - including ControlChange8 which “provides 8 automateable knobs that can send MIDI CC messages”.
Of definite interest is the Max4Live Connection Kit by Ableton - “These devices allow you to connect, control and monitor Live with a range of innovative technologies and communication protocols.” - such as Arduino, webcameras and the Leap Motion gestural controller and also includes a selection of OSC & MIDI utilities.
Most promising from current research is SHOWSYNC’s Livegrabber - “a set of free Max For Live plugins that send actions from Ableton Live to any device on the network that supports Open Sound Control (OSC)”. These are worth investigating further - the GrabberSender plugin “sends all [Live] data to OSC destination on the network” for example.
However, getting OSC into The Nashesizer’s Teensy 3.6 isn’t straightforward. It would generally entail a wired Ethernet connection or a WiFi module - both approaches adding additional hardware, software layers, cables, configuration etc. - making The Nashesizer more complicated to set up and work with -  something we’re keen to avoid.
If we could use the existing USB connection to send OSC via Serial (in a fashion similar to monome.org devices) that would be best - and on a cursory search there are existing code examples in gratefulfrog’s GitHub repository which demonstrate OSC over Serial using SLIPEncodedSerial.
Alternatively, we could attempt to adapt code from the Arduinome - a clone (based on the early monome 40h protocol) that mimics the monome using the Arduino physical computing platform. While it’s a relatively old project (circa 2008), the latest v3.3a firmware (~2012) should work with the latest monome.org serialosc according to this “Arduinome in 2016 - Can it be done?” post on the monome forum.
A combination of this and the Livegrabber M4L plugins may be the way to go - certainly worth testing. Mike, Lewis and James to discuss this further and attempt a ‘proof of concept’ demo.
thenashesizer-blog · 7 years
Text
Lewis - Notes from discussions with Mike Cook & Craig Howlett - 28-11-17
As a follow up to the Retrospective Notes from The Nashesizer meeting on 6th November 2017 Lewis has since had Skype/mobile calls with Mike Cook and Craig Howlett - below brief summaries of key points.
Lewis worked up a rough block diagram for a next version Nashesizer based on previous deliberations as a discussion starter…
Mike Cook
Seeing Gemma use her current set up
We agreed that it would be very useful to actually see Gemma using her current favoured hardware/software set up to work on projects at home and also perform. This would give the team valuable insight into how she currently works, what solutions she’s already implemented to make things easier for herself and what she still struggles with. Our design process should take this into account. So we’re going to suggest a session with Gemma where we film her in action - likely at one of her one-to-one sessions with Craig.
‘Gestural control’ module
We discussed the idea of a ‘gestural control’ module for Gemma - the distance/capacitive sensor module in the block diagram above - and while we’re not convinced it will actually be that usable by Gemma (it’s difficult even for the able-bodied to get reliable and repeatable control out of them) we thought it was worth working up some demos for Gemma to test - which Mike is going to action and then post details to the blog.
Mike suggested trying out the Skywriter Hat (£16 - Pimoroni) - which uses electrical near-field 3D sensing to generate positional data and detect common gestures like flicks and taps - as well as the Adafruit VL53L0X Time of Flight Distance Sensor (£15.50 - Pimoroni), which uses a tiny invisible laser source and a matching sensor, or the Sparkfun ZX Distance and Gesture Sensor (£21.38 - Cool Components), which bounces infrared (IR) beams of light from the two LEDs on either side off an object above the sensor. These sensors are far more precise than alternatives such as ultrasonic range finders.
A particular usability issue with these types of sensors is that they output data continuously - so we discussed the option of an additional ‘ON’ button - the module would only output data when this is pressed - and Mike suggested this might be easier for Gemma to control with a foot switch so we’re going to try this out.
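The gating logic itself is simple either way - a sketch in plain C++, where the class name and the momentary/toggle choice are our own illustration rather than tested firmware. A foot switch could run in momentary mode (readings pass only while held) or toggle mode (each press latches the gate on or off).

```cpp
// Gate a continuously-streaming gesture sensor behind an 'ON' switch.
class SensorGate {
public:
    explicit SensorGate(bool toggleMode)
        : toggle_(toggleMode), open_(false), lastPressed_(false) {}

    // Call once per loop with the current switch level.
    void updateSwitch(bool pressed) {
        if (toggle_) {
            if (pressed && !lastPressed_) open_ = !open_;  // rising edge flips latch
        } else {
            open_ = pressed;                               // follow the switch directly
        }
        lastPressed_ = pressed;
    }

    // Sensor readings are only forwarded while the gate is open.
    bool isOpen() const { return open_; }

private:
    bool toggle_, open_, lastPressed_;
};
```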
Track ball module
Mike had previously sent a link to these APEM R Series removable bezel trackballs - which look ideal but are expensive (>£150). We mooted the possibility of sourcing a second hand Atari track ball or deconstructing Gemma’s old and retired track ball for parts.
We discussed various ideas for refining its use - a couple of additional buttons to disable movement in the x or y-axis; a button to switch between fine and coarse movement (Gemma’s current track ball, while old and bulky, has this functionality and she likes it) - though this could also be a more variable rotary encoder.
Mike suggested we could use an Arduino Pro Micro configured just for the trackball with its own separate USB socket.
Alternatively, we could just use an optical mouse turned upside down. It’s an intriguing possibility - it certainly works, though how easy it would be for Gemma to actually use requires testing - and it’s definitely a much cheaper solution.
Motorised faders module
Mike confirmed that multiple faders and encoders would only need one micro-controller within a single module.
We agreed that we’d need to physically ‘mock up’ a fader bank to check that Gemma can use it effectively - perhaps by optimising the spacing required between faders.
We also agreed that an earlier idea of buying a second hand Behringer BCF2000 MIDI controller which has 8 motorised faders and hacking it probably wasn’t worthwhile in the short term.
Joystick module
Though quite compact (~40mm square) compared to the joystick in the prototype the APEM 100113 2-axis, 4-position joystick is reasonably priced at £13 and worth testing.
TFT + Touchscreen module
We agreed that an Adafruit 3.5" TFT 320x480 + Touchscreen would not only provide a larger on device display but also the potential for a flexible touch interface - although we’d need a Teensy 3.2 with its larger RAM to drive it. While this wouldn’t have the tactile quality of a bank of physical buttons we could easily test different ‘virtual button’ spacings and layouts - such as the Ableton Live transport buttons - and see which worked best for Gemma.
A Sparkfun 2x2 Button Pad with RGB LEDs set up as a radio group could sit next to Touchscreen and change the functionality of the screen.
Craig Howlett
While Craig is certainly interested in the development of the device overall, his main role is to work regularly with Gemma on a one-to-one basis and support her in developing a better understanding of and skills in working with Ableton Live, integrating her existing hardware into her workflow, and testing and providing feedback on the various iterations of The Nashesizer.
We agreed that Craig should contact Gemma and arrange a series of regular, min 2-weekly sessions in her diary - at least until the end of February 2018. Craig will then confirm those dates with the rest of the team so that we could join them occasionally.
As discussed with Mike, an early session would document Gemma’s practice - to give us a better sense of where she’s up to and the way she currently works when making a piece of work and/or performing live. We need to start with where Gemma is currently at - to understand what she’s comfortable and confident with; what she finds more difficult to realise; and what she’d like to do but currently can’t. The Nashesizer needs to respond to these issues and needs.
We appreciate that Gemma is currently far more familiar with Studio One than Ableton Live - but we want to encourage her to try Live more and to begin to appreciate its more sophisticated functionality and creative potential.
We discussed that Craig could start by comparing Studio One and Ableton Live like for like - so that Gemma can transfer the understanding she already has from one piece of software to the other.
Craig should also focus on the less familiar Live ‘session’ view - working up a demo to show how to build scenes and then move between these using Gemma’s Korg nanoPAD2 and Live’s MIDI mapping functionality to easily trigger an individual scene.
Later sessions should focus on getting ready for Gemma’s Metal residency in mid-February - actually working up creative projects that might be useful for the residency and in the process develop her knowledge and skills in using Live.
Craig suggested he could work up a flexible lesson/session plan on a week-by-week basis.
Generally we agreed that we need to encourage Gemma to use Ableton Live more - to nudge her out of her comfort zone; to try things out but not dismiss things too quickly; to gently challenge her understandable avoidance techniques; and to be a bit more critical in attempting to help move her forward.
Perhaps, as Craig suggests, we could work towards a deadline where Gemma agrees to delete Studio One off her computer.
Lewis will also contact Gemma and arrange a meet up soon to update her with developments and get her feedback, input and buy in of our plans.
thenashesizer-blog · 7 years
Text
Lewis - Retrospective Notes from The Nashesizer meeting on 6th November 2017, 4pm Salford Eagle Labs
Present: Lewis, James, Mike + Craig - Gemma came along to the DM Lab North meet up later on.
Development Notes
Everyone is generally happy with ‘The Nashesizer Development’ document Lewis worked up for Gemma and Nicole Rochman at Sound and Music - including the proposed schedule, allocated days per team members and overall budget.
Introduction
We discussed the prototype developed through the Drake Music Lab North West Challenge - detailing what worked, what didn’t and what we’ve learned through the process - then outlining ideas for next stage development. Lewis later summarised key points for Gemma and got her feedback and input.  
Joystick
The principle of using the joystick as a key navigation mechanism within Ableton Live seems to work and make sense for Gemma. The prototype and its associated default Ableton Live file (including all the relevant MIDI mappings) enabled movement left and right between tracks and up and down between controls (volume, pan, aux sends 1-4 etc.) in Live’s ‘session’ view. The next iteration should also allow similar navigation between audio/MIDI clips within Live’s ‘session’ view and perhaps also by marker and track in Live’s ‘arrangement’ view.
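Under the hood this kind of joystick navigation is little more than a pair of clamped indices - one for the selected track, one for the selected control within it. Something like the following plain C++ sketch, where the struct name and the track/control counts are arbitrary examples rather than the prototype’s actual code:

```cpp
// Clamp a value into [lo, hi].
static int clampTo(int v, int lo, int hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

// Navigation state for Live's 'session' view: horizontal joystick
// moves step between tracks, vertical moves step between controls
// (volume, pan, aux sends...).
struct NavState {
    int track, control;
    int numTracks, numControls;

    NavState(int nTracks, int nControls)
        : track(0), control(0), numTracks(nTracks), numControls(nControls) {}

    // dx = -1/0/+1 from left/right, dy = -1/0/+1 from up/down;
    // movement stops at the edges rather than wrapping.
    void move(int dx, int dy) {
        track   = clampTo(track + dx,   0, numTracks - 1);
        control = clampTo(control + dy, 0, numControls - 1);
    }
};
```

Extending this to clips or to markers in the ‘arrangement’ view is then just another pair of indices driven by the same joystick events.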
Gemma likes the arcade joystick - even though and perhaps because it’s large and basic and so easy for her to use. However, this means that any 3D printed casing that accommodates the existing joystick will also be relatively bulky.
James did attempt to deconstruct and adjust the height of the shaft for the prototype - but despite being a relatively simple mechanism with four microswitches it’s actually well-designed for purpose and not straightforward to adapt.
Generally the thinking is that we should stick with this type of joystick - Gemma agrees she doesn’t have the fine motor control to make effective use of a progressive joystick or one with a higher number of (up to 8) positions.
It would be worth looking at slightly smaller alternatives - such as this Small Arcade Joystick via Proto-Pic or more ergonomic and compact designs such as this APEM 100113 2-axis, 4-position joystick via Farnell.
Lewis has since found single axis joysticks - such as this Penny & Giles JC100-002-5K via Farnell - which although quite expensive (£42.39) may well suit Gemma and be worth trying out.
Motorised Fader vs Rotary Encoder
Apart from the joystick, the prototype incorporated both a chunky (and particularly high resolution) 1024 pulse per rotation, quadrature rotary encoder and a 100mm motorised fader.
The key thinking here - in part based on Gemma’s feedback of her Akai Pro MPK mini mkII - was that a controller with multiple knobs and small buttons didn’t really suit her. What she needed was an alternative method to the track ball to navigate quickly to a given element within Live she wanted to tweak (via the joystick) and then a single fader and/or knob that worked well and was easy for her to control to adjust the value.
Lewis’s initial thinking was that the fader and encoder should be effectively interchangeable - having navigated to a specific control in a given track using the joystick, Gemma could then set a value with either the encoder or the fader module and the other module would update accordingly. In practice this didn’t work out - in part because the two modules were developed separately (the encoder by Lewis and the fader by Mike) but more significantly sequentially - one after the other. Accordingly the code developed for the early Teensy master and module slaves was a ‘rapid prototype’ and integration of the final set of inputs was somewhat bodged - they got caught in a feedback loop that was never really resolved.
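One common way to break this kind of loop - offered here as a sketch rather than a fix we’ve implemented - is for the device to remember the last value it sent for each control and drop any incoming update that merely echoes it back. A minimal plain C++ illustration:

```cpp
#include <map>

// Suppress feedback echoes between two linked controls: an update
// is only applied if it differs from the last value we sent.
class EchoFilter {
public:
    // Record what we just sent out for this control.
    void noteSent(int controlId, int value) { lastSent_[controlId] = value; }

    // True if an incoming update carries new information
    // (i.e. it is not an echo of our own last send).
    bool shouldApply(int controlId, int value) const {
        std::map<int, int>::const_iterator it = lastSent_.find(controlId);
        return it == lastSent_.end() || it->second != value;
    }

private:
    std::map<int, int> lastSent_;
};
```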
The upshot is that we’ve now reconsidered this initial approach - while the next version will likely include multiple inputs, they won’t be interchangeable. Each input module will likely drive a specific and consistent control/element within a track/clip.
We’re also going to design the core code for both the master and slave devices to be far more robust and extensible. Mike’s suggested a ‘cooperative system’ approach in which each module within the overall design is part of a sequential state machine but with a secondary feedback system so that each is allocated the appropriate time it needs to complete its task.
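A minimal interpretation of that idea in plain C++ - the master services one module per pass of the loop and only advances to the next when the current module reports its task complete, so each gets the time it needs. The names and structure here are our own illustration, not Mike’s actual design:

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// A module exposes a service routine that returns true when its
// current task is finished.
struct Module {
    std::function<bool()> service;
};

// Cooperative round-robin scheduler: no module is skipped mid-task,
// and no module can starve the others indefinitely between tasks.
class Scheduler {
public:
    void add(Module m) { modules_.push_back(m); }

    // One pass of the main loop.
    void tick() {
        if (modules_.empty()) return;
        if (modules_[current_].service())
            current_ = (current_ + 1) % modules_.size();  // advance on completion
    }

private:
    std::vector<Module> modules_;
    std::size_t current_ = 0;
};
```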
These issues aside, Gemma also likes the fader far more than the encoder - which despite its high resolution, size and 3D printed custom knob she still found difficult to use. We thought that making a single large encoder with an ergonomic knob would overcome how fiddly Gemma finds arrays of small, close packed knobs on commercial controllers. But on reflection it seems to be more about Gemma’s restricted movements - she actually finds the wrist/whole hand twisting action required to turn a knob more difficult than say the combination of thumb/first finger movements required to control a track ball or the single finger/whole arm movement required to move a fader. We didn’t appreciate this initially and needed to build the prototype rotary encoder module to find this out.
This thinking will have an impact on our choice of future input modules. The Korg nanoPAD2 and microKEY that Craig selected during the prototype phase provide Gemma with a very suitable MIDI keyboard, XY-pad and buttons - so we don’t need to duplicate these. Motorised faders do seem to be a good choice - albeit that we need to control when they update so that they’re not in constant motion as Gemma moves between tracks (a simple timeout after a track change should sort this).
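That timeout could be as simple as the following plain C++ sketch - only command a fader to move once the selected track has been stable for a set number of milliseconds. The class name and the 300ms hold-off used in the example are arbitrary illustrations:

```cpp
#include <cstdint>

// Hold-off so motorised faders don't chase every intermediate track
// while scrolling: movement is only allowed after the selection has
// been stable for holdMs milliseconds.
class FaderHoldoff {
public:
    explicit FaderHoldoff(uint32_t holdMs) : holdMs_(holdMs), lastChangeMs_(0) {}

    // Call whenever the joystick changes the selected track.
    void trackChanged(uint32_t nowMs) { lastChangeMs_ = nowMs; }

    // Poll each loop (nowMs would come from millis() on a Teensy).
    bool okToMove(uint32_t nowMs) const {
        return (nowMs - lastChangeMs_) >= holdMs_;
    }

private:
    uint32_t holdMs_, lastChangeMs_;
};
```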
Lewis suggested we should buy and hack a used Behringer BCF2000 MIDI controller which has 8 motorised faders (~£120 second hand via eBay). From a Google image search it looks as though the faders and adjacent right hand side button bank are on a separate PCB to the top set of encoder knobs and buttons, which also makes the possibility of adapting and rehousing these faders more likely. Even if we end up desoldering the motorised faders from the PCB, it’s potentially a cost effective alternative to buying new components.
Isotonik Studios have a series of M4L control surface scripts and automapping utilities which integrates the BCF2000 into Ableton Live and are worth investigating further.
We’re also planning to test an alternative rotary encoder set up where the encoder is turned through 90 degrees - essentially providing a rotating disk or one-axis track ‘wheel’ which should suit Gemma better. These can be standard 24 pulse per rotation quadrature rotary encoders - far cheaper and more compact than the encoder from the prototype. We’re also going to try mounting the encoder on a bearing so that the angle of the wheel can be adjusted to best suit Gemma’s hand position and thumb and finger movements. Choosing rotary encoders with an inbuilt ‘click’ should also make using them more ergonomic and haptic.
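Reading such an encoder is the textbook quadrature decode: the previous and current 2-bit (A,B) states index a 16-entry lookup table of -1/0/+1 steps, with invalid transitions ignored. A plain C++ sketch - which direction counts as positive is arbitrary:

```cpp
#include <cstdint>

// Decode one quadrature transition. prevAB and currAB each hold the
// two encoder channel bits (A in bit 1, B in bit 0). Returns -1, 0
// or +1; impossible two-bit jumps return 0 and are simply ignored.
int8_t quadratureStep(uint8_t prevAB, uint8_t currAB) {
    static const int8_t table[16] = {
         0, -1, +1,  0,
        +1,  0,  0, -1,
        -1,  0,  0, +1,
         0, +1, -1,  0 };
    return table[((prevAB & 0x3) << 2) | (currAB & 0x3)];
}
```

Summing the returned steps gives a running position; on a 24 pulse per rotation encoder that is 96 counted steps per full turn of the wheel.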
Other input modules
Additionally we’re thinking about testing and integrating a distance/capacitive sensor for gestural control and an Adafruit 3.5" TFT 320x480 + Touchscreen to provide not only a larger on-device display than the prototype’s (a late addition to the initial design, this actually worked very well) but also the potential for an additional yet simple touchscreen interface.
We also discussed integrating a basic set of transport buttons into the controller - play, stop/pause and record - so that Gemma can control Live’s playback and record functions from the device itself.
A modular system
An initial conception for The Nashesizer was that it be modular - a ‘base’ board with a set of independent modules that could be rearranged and reconfigured for different uses. While the prototype did realise this approach to some extent - the Teensy 3.6 in the base board acts as the I2C master with an Arduino Nano inside each module as I2C slave - the engineering required to fabricate a robust but flexible connection system between base board and modules was just too demanding. We’ve since decided that ‘modular’ in this context means configurable during fabrication - the layout of the various input modules integrated into The Nashesizer can be customised for Gemma’s preference while it’s being built - but once fabricated, modules will be essentially fixed and can’t be reconfigured.
One issue with the prototype was power requirement. The relatively large current demand of the motorised fader (~0.6A) meant that the OLED display frequently didn’t work when it was plugged directly into a laptop USB port (which conventionally only provides 500mA). The solution was to use a powered USB hub with a 1A port - fine for a permanent home set up but not ideal in a live setting requiring more kit, cables and power sockets. Mike has since suggested we integrate a 12V PSU into the next version and step-down convertor(s) to adjust the voltage to the required values for all of the various modules and components.
The prototype is currently connected to the computer via the Teensy’s micro-USB socket. This is fiddly to plug in and not robust - the socket has developed an intermittent fault due to the strains of continual plugging and unplugging. So we’re going to integrate a far more robust USB Type-A socket into the device and then connect this internally to the Teensy’s micro-USB socket.
Gemma and Ableton Live
Gemma currently uses and is familiar with PreSonus’s Studio One. However, she’s agreed that The Nashesizer should be aimed at Ableton Live, which she also owns a copy of and is keen to get more familiar with. So a key aspect of the project is for her to shift to working in Live and to develop her understanding and abilities in using it. Craig will support Gemma through a series of ongoing one-to-one sessions to get to grips not only with its functionality but also its creative potential.
Next Steps
Lewis to upload these notes to the project Tumblog and invite feedback from the team. Then to arrange one-to-one meetings with team members to discuss and plan next steps.
Please add comments and additional things we discussed if I’ve omitted anything or gotten the wrong end of the stick.
thenashesizer-blog · 7 years
Text
The Nashesizer Development - Gemma Nash for Sound & Music ‘Pathways’ Programme
thenashesizer-blog · 7 years
Text
PHASE 2
Following on from the initial ‘The Nashesizer’ MIDI controller prototype produced for the Drake Music Lab North West Challenge we’re now going to move to a next stage development of a working production version as part of Gemma Nash’s ‘Pathways’ programme for Sound and Music.