Real-time tracking for broadcast

Converge

The Project

In 2019, while working as a designer on the Blacktrax team, I received an urgent call from a business developer. "Hey, I'm heading to Japan in four weeks and we really need to wow Panasonic. What can you do?" he asked. Rising to the challenge, I designed and delivered a UI prototype for a sophisticated real-time robotic camera tracking system built on infrared camera technology, all within that four-week deadline.

Blacktrax infrared tracking cameras - Industrial design by Fredicus
project insights

Rushed Discovery

The idea was sold to the client before we had the final product! To tackle this challenge, I collaborated with a broadcast engineer whose industry expertise proved invaluable.

Blacktrax 1.0's interface was built on data-table patterns, which weren't intuitive for broadcast professionals new to infrared tracking. Its core functionality carried over from the previous version: users could assign objects to trigger actions, such as making cameras automatically track a performer's movements in real time.

Blacktrax 1.0 UI - interaction model based on data tables and configurations
user interface concepts

Design Phase

I quickly realized that, at its core, the goal was to find the best way for operators to visualize a network of inputs and connected nodes, each with its own characteristics and nuances.

I needed to make it easier for those building these relationships to visualize the layout and make changes on the fly. This required a better interaction model, one that went beyond data tables.
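To make that model concrete, here is a minimal TypeScript sketch of the kind of object graph the new view had to represent. All names here (StageNode, Connection, StageGraph) are illustrative assumptions, not part of the actual Blacktrax product.

```typescript
// Illustrative sketch only; these types are hypothetical, not Blacktrax's API.

// Every item on the stage network is a node with its own characteristics.
type NodeKind = "beacon" | "camera" | "sensor";

interface StageNode {
  id: string;
  kind: NodeKind;
  label: string;                       // operator-facing name, e.g. "Lead actor"
  properties: Record<string, unknown>; // kind-specific characteristics
}

// A relationship the operator builds, e.g. "this camera tracks this beacon".
interface Connection {
  from: string; // id of the input node (e.g. a beacon)
  to: string;   // id of the consuming node (e.g. a camera)
}

// The full layout the 2D view visualizes: nodes plus their connections.
interface StageGraph {
  nodes: StageNode[];
  connections: Connection[];
}

// Example: one robotic camera tracking one beacon.
const graph: StageGraph = {
  nodes: [
    { id: "b1", kind: "beacon", label: "Lead actor", properties: { batteryLife: 87 } },
    { id: "c1", kind: "camera", label: "Robo cam 1", properties: { port: 5004 } },
  ],
  connections: [{ from: "b1", to: "c1" }],
};
```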



I brought this concept to life by displaying objects alongside their related sensors and connections. For example, selecting a beacon would display its corresponding properties, such as battery life and visibility, in a panel on the right, enabling intuitive adjustments and immediate understanding.

If the selected object is a camera, the panel instead shows its network and port and lets you set auto-iris (the ability to compensate for large variations in light levels), ultimately treating every object like an actor on a stage.
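As a sketch of this contextual inspector pattern, the snippet below swaps the panel's contents based on what kind of object is selected. The property names (batteryLife, autoIris, and so on) are assumptions for illustration, not the product's real data model.

```typescript
// Hypothetical, kind-specific property sets surfaced by the right-hand inspector.
interface BeaconProperties {
  batteryLife: number; // percent remaining
  visible: boolean;    // currently seen by the infrared cameras
}

interface CameraProperties {
  network: string;     // e.g. "Studio A LAN"
  port: number;
  autoIris: boolean;   // compensate for large variations in light levels
}

type Selection =
  | { kind: "beacon"; props: BeaconProperties }
  | { kind: "camera"; props: CameraProperties };

// Selecting an object in the 2D view swaps the inspector's contents.
function inspectorRows(selection: Selection): string[] {
  switch (selection.kind) {
    case "beacon":
      return [
        `Battery: ${selection.props.batteryLife}%`,
        `Visible: ${selection.props.visible ? "yes" : "no"}`,
      ];
    case "camera":
      return [
        `Network: ${selection.props.network}`,
        `Port: ${selection.props.port}`,
        `Auto-iris: ${selection.props.autoIris ? "on" : "off"}`,
      ];
  }
}
```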

2D view - beacon selection and contextual right panel (inspector)
2D view - camera selection and contextual right panel (inspector)

visualizing actors on a stage

I created a second view in 3D to show the system's impact in real time. In this example, the red circle represents the beacon on the actor's neck, while the blue box shows the camera's field of view. If the actor's head were cut off in the frame, it would read as a mistake. To prevent this, I designed a feature that lets the operator manually offset the camera frame in relation to the sensor, ensuring precise, professional framing.
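The offset idea reduces to simple vector math: the robotic camera aims at the tracked sensor position plus an operator-set adjustment. The sketch below is a minimal illustration under that assumption; the numbers and names are made up.

```typescript
// Illustrative only: the camera target is the tracked beacon position
// plus a manual offset the operator dials in to fix the framing.
interface Vec3 { x: number; y: number; z: number }

function add(a: Vec3, b: Vec3): Vec3 {
  return { x: a.x + b.x, y: a.y + b.y, z: a.z + b.z };
}

// The beacon sits on the actor's neck; without an offset, the top of
// the head falls outside the frame.
const beaconPosition: Vec3 = { x: 2.0, y: 1.5, z: 0.0 };

// The operator raises the aim point ~25 cm so the full head stays in frame.
const framingOffset: Vec3 = { x: 0, y: 0.25, z: 0 };

// The robotic camera tracks this adjusted point in real time.
const cameraTarget = add(beaconPosition, framingOffset);
console.log(cameraTarget); // { x: 2, y: 1.75, z: 0 }
```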

3D view - what the camera sees (the actor's head is cut off)
3D view - adjusting the camera frame in relation to the sensor (offset)
Deployment and impact

The Outcome

• Successful pitch led to a partnership with Panasonic, later expanding to Sony, Canon, and NCamera/Zeiss
• 22% year-over-year revenue increase, driven by adoption from major clients
• Adoption by major entertainment venues and virtual studios, including the Steve Jobs Theater at Apple
• Established the foundation for new AR/XR features, opening up new creative expression for users
