Real-time tracking for broadcast

Converge

The Project

In 2019, while working as a designer on the Blacktrax team, I received an urgent call from a business developer. "Hey, I'm heading to Japan in four weeks and we really need to wow Panasonic. What do you got?" he asked. Rising to the challenge, I designed and delivered a UI prototype for a sophisticated real-time robotic camera tracking system that utilized infrared camera technology—all within the tight four-week deadline.

Blacktrax infrared tracking cameras - Industrial design by Fredicus
project insights

Rushed Discovery

The idea was sold before we had it. To tackle this challenge, I collaborated with a broadcast engineer whose industry expertise proved invaluable. Blacktrax 1.0's interface was based on data table patterns and wasn't intuitive for broadcast professionals new to infrared tracking.

Its core functionality remained the same as before: users could assign objects to trigger actions, such as making cameras automatically track people's movements in real-time.
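The assignment idea above can be sketched as a tiny object model: tracked beacons are mapped to cameras that follow them. This is a minimal illustrative sketch; the names and structure are assumptions, not BlackTrax's actual API.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical sketch: a beacon is a tracked object, and a camera can be
# assigned to follow it. Names here are illustrative only.

@dataclass
class Beacon:
    name: str
    position: Tuple[float, float, float]  # (x, y, z) in stage coordinates

@dataclass
class Camera:
    name: str
    target: Optional[Beacon] = None  # beacon this camera is assigned to follow

    def follow(self, beacon: Beacon) -> None:
        self.target = beacon

    def current_aim(self) -> Optional[Tuple[float, float, float]]:
        # In a real system this would drive the robotic head; here we
        # simply report the position the camera should point at.
        return self.target.position if self.target else None

actor = Beacon("actor-1", (2.0, 0.0, 1.7))
cam = Camera("robo-cam-1")
cam.follow(actor)
print(cam.current_aim())  # (2.0, 0.0, 1.7)
```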

Blacktrax 1.0 UI - interaction model based on data tables and configurations
user interface concepts

Design Phase

I realized this was about how you visualize a network of inputs. Every object was a node, each with its own characteristics and preferences.

I needed to make it easier for the people building these relationships to visualize the layout and make changes on the fly. This required a better interaction model than tables.



I reflected this concept literally by displaying objects alongside their related sensors and connections. For example, selecting a beacon would display its properties—such as battery life and visibility—on the right side, enabling intuitive adjustments.

If it's a camera, you can see the network and port, and set auto-iris (the ability to compensate for large variations in light levels). This treats all the objects like actors on a stage.
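The node-and-inspector pattern described above can be sketched as follows: every object is a node carrying its own properties, and selecting one surfaces those properties for the contextual right panel. This is a hypothetical sketch; the property names are assumptions drawn from the examples in this case study.

```python
# Hypothetical sketch of the node/inspector interaction model: every
# object on the stage is a node with its own properties, and selecting
# one returns what the right-hand inspector panel would display.

class Node:
    def __init__(self, name, **properties):
        self.name = name
        self.properties = properties

    def inspect(self):
        # The contents of the contextual inspector panel.
        return {"name": self.name, **self.properties}

beacon = Node("beacon-7", battery_life="82%", visibility="good")
camera = Node("cam-2", network="10.0.0.12", port=5000, auto_iris=True)

print(beacon.inspect())
# {'name': 'beacon-7', 'battery_life': '82%', 'visibility': 'good'}
```

Treating cameras, beacons, and sensors uniformly as nodes is what lets one inspector panel serve every object type.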

2D view - beacon selection and contextual right panel (inspector)
2D view - camera selection and contextual right panel (inspector)

visualizing actors on a stage

I created a second view in 3D to show the impact of the system in real time. In this example, the red circle represents the beacon on the actor's neck, while the blue box shows the camera's field of view. If the actor's head were cut off in the frame, the shot would look like a mistake. To prevent this, I designed a feature allowing the human operator to manually adjust the camera frame in relation to the sensor, ensuring precise, professional framing.

3D view - what the camera sees; the actor's head is cut off
3D view - adjusting the camera frame in relation to the sensor (offset)
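The offset feature above amounts to simple geometry: the sensor sits on the actor's neck, so aiming the camera directly at it can crop the head, and an operator-set offset shifts the aim point. A minimal sketch, assuming stage coordinates in metres; the function and values are illustrative, not the product's actual implementation.

```python
# Hypothetical sketch of the manual framing offset: the camera frames
# the sensor position plus an operator-chosen offset, here raised on
# the vertical (z) axis so the actor's head stays in frame.

def aim_point(sensor_pos, offset=(0.0, 0.0, 0.0)):
    """Return the point the camera should frame: the tracked sensor's
    position shifted by the operator's offset."""
    return tuple(s + o for s, o in zip(sensor_pos, offset))

neck_sensor = (2.0, 1.0, 1.5)   # metres, stage coordinates (illustrative)
head_offset = (0.0, 0.0, 0.25)  # raise the frame centre by 25 cm

print(aim_point(neck_sensor, head_offset))  # (2.0, 1.0, 1.75)
```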
Deployment and impact

The Outcome

This prototype led to a new partnership between Panasonic and Cast, showcased at NAB 2019, paving the way for Cast to enter the broadcast market. In just four weeks, we delivered a functional design, drawing on my expertise with the first system. That system had established a key product line, later expanded with features like augmented reality and new hardware such as Helios.

NEXT CASE

Blacktrax Website

Educating leads to buy a real-time tracking solution

View Project