A flexible live-mixing tool for sound engineers that enables better interaction possibilities and more efficient workflows.
PRODUCT / UX CONCEPT
2014
IN COLLABORATION WITH SVEN
HFG SCHWÄBISCH GMÜND
ABOUT
The concept of hybmix is based on the idea of combining haptic and touch interactions in a single product. This improves the user experience and opens up new interaction possibilities, while simplifying the information architecture and improving both software and hardware ergonomics. hybmix provides a clear view of the essential functions and processes on the console and lets the mixer's setup and settings be adjusted to the context of use. With a large display as the basis, information and feedback can change dynamically and adapt to the situation, while the physical interface stays unchanged.
USER RESEARCH AND CONTEXT ANALYSIS / UX REVIEW / JOURNEY MAPPING / SCENARIOS & USE CASES / OPPORTUNITY AREAS / PAPER PROTOTYPING / WIREFRAMING / INTERFACE DESIGN / INTERACTION DESIGN / BRANDING / PHYSICAL MODEL
CONCEPT
To tailor the console to the typical workflow of sound engineers, we divided the interface into two essential modes, making it easier to get a quick overview and keeping information hierarchies flat. The Setup-Mode contains all fundamental functions that need to be set up before an event and won't have to be modified again. The Show-Mode focuses solely on the live situation: it enables fast interactions and direct access to the relevant parameters and functions of the console.
CONCEPT
One of the main characteristics of hybmix is the combination of hardware and touch interactions. Knobs and faders not only offer a familiar input situation to an audio engineer but also enable quick, direct and very precise interactions, which are essential for the most important mixing functions during a live show. Primary parameters are always available for fast access via haptic encoders, providing accurate and immediate feedback in combination with a reduced, context-sensitive visual language. Furthermore, each encoder has an expressive, parameter-specific graphical representation that bridges the gap between hardware and screen while staying highly legible and transparent about the current action being performed.
In addition, a detail area makes it possible to display settings on a larger scale in order to compare and adjust them via touch gestures.
We kicked off the project believing that the core problem was clarity and that we could solve it by reducing “everything” to one flexible knob…
As a first step, we researched mixing consoles, their context of use and the requirements for the users.
We followed several audio engineers at different events to understand their workflows and their mixing processes.
In parallel, we interviewed experts to get qualitative insights about the field, their approaches and the problems they had encountered. Talking to them helped us focus our thinking and frame the users' needs and core competencies, such as stress resistance, social competence, technical knowledge and musical creativity.
All of this led us to frame two main questions: “How can haptic and digital input elements be combined and designed to allow quick action?” and “How can the console be structured with flat hierarchies that support the process and provide clear overviews?”
The research allowed us to focus on the most relevant context, live mixing, where the identified issues have the greatest impact.
The analysis of typical workflows and interaction situations when mixing live shows led us to identify a key opportunity area: the need to divide the mixing process into phases.
Our findings led us to revisit the problems we had initially identified and to frame them into four categories: clarification, accessibility, consistency and flexibility.
We sketched holistic concepts in fast sprints to learn what works and what the dependencies are. We built use cases to validate our initial concepts, which then defined the product direction.
We quickly learned that our original idea of having just one knob had no future because what counts most for the user is fast and direct access to the parameters of the console!
Cardboard prototyping allowed us to get an ergonomic feel for the interface at an early stage. It forced us to think in hybrid-UI terms and helped us answer questions of sizes, angles, mental models and interaction patterns.
Through multiple rounds of testing and evaluation, we eventually created and realised the final concept, applying an integrative design approach. Knob studies helped us find the right synergy of sizes, spacing, relations, interaction and feedback in ergonomic terms.
A crucial part of the GUI thinking was the digital visualisation of the physical interaction with the knobs. We started off with radial visualisations, only to find that linear graphics represent our EQ parameters best. To reduce confusion with neighbouring channels and to support the reading direction, we decided to left-align the digital feedback. This also came in handy for right-handed users.
For the final model, we built a metal frame and fitted a 4K display with a resolution of 3840x2160 px. On top, we fitted two acrylic panes divided by a layer of polycarbonate. We laser-cut the acrylic to fit the faders and encoders, which were attached with magnets.
We simulated the dark GUI in After Effects, ran it on our model and presented it through use cases in a role play.
The design process can sometimes be a messy one.
There were times along the process when we felt a bit disoriented, forced to make decisions among a vast number of possible directions while working in the highly complex technical environment of product, usage and ecosystem.
For us it was helpful, first of all, to accept that it's not about an easy linear path; sometimes it's important to take a step back, break the problem down into small, solvable chunks and be even more intentional about the next steps. Especially in teamwork, it was important to be open about those feelings, to trust your colleague and embrace the process.
We took a few rounds to define our problem properly and develop hypotheses. Framing our scope and the constraints of the ecosystem made it easier to determine the product's placement and specification.
The only way to learn is to do it: exploring possibilities and raising questions through prototyping.