Architectural Lighting Control Software — Technical Overview

Architectural lighting control software provides the logic, programming environment, and playback engine for façade and media lighting systems. It is responsible for:

  • defining lighting scenes and show timelines,
  • mapping virtual pixels to real fixtures on the façade,
  • converting media content (video, generative graphics, data streams) into lighting data,
  • scheduling and synchronizing playback across controllers and networked devices.

Modern systems typically distinguish between:

  • architectural scene design software for static and dynamic façade lighting, and
  • media façade / pixel-mapping software for video, generative content, and large pixel arrays.

In a typical architectural façade stack, these roles correspond to tools such as DITRA LightFORM Studio (scene design / timelines) and DITRA LightMEDIA / pixel-mapping environments used with controllers such as ArchiCORE.

 

Roles of Architectural Lighting Software

Architectural and media façade lighting software typically covers several functional areas:

  1. Fixture patching & addressing
  2. Scene and timeline programming
  3. Pixel mapping for media façades
  4. Content ingestion and conversion (video → light)
  5. Playback and scheduling
  6. Live control and overrides
  7. Diagnostics, logging, and system monitoring

These functions allow designers and integrators to move from a conceptual lighting design to a fully deployed, maintainable control solution.

 

Architectural Scene Design Software

Architectural scene design tools focus on static and dynamic lighting scenarios rather than full video playback. Typical capabilities include:

Fixture Management and Patching

  • Definition of fixture types (channels, color model, control protocol).
  • Assignment of DMX, DALI, or Ethernet-based addresses to physical fixtures.
  • Grouping by façade zones, architectural elements, or luminaire types.
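
As a rough sketch of this patching model, the Python example below defines a minimal fixture library and patch table; the fixture types, zones, and addresses are hypothetical placeholders for whatever a real tool stores internally.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class FixtureType:
        """Describes a luminaire model: channel footprint and color model."""
        name: str
        channels: int          # DMX footprint per fixture
        color_model: str       # e.g. "RGB", "RGBW", "Dim+CCT"

    @dataclass
    class PatchedFixture:
        """One physical fixture patched to a universe/address and a façade zone."""
        fixture_type: FixtureType
        universe: int          # Art-Net/sACN universe number
        address: int           # DMX start address (1-512)
        zone: str              # architectural grouping, e.g. "north-facade"

    # Hypothetical fixture library and patch for a small façade section.
    RGB_NODE = FixtureType("rgb-pixel-node", channels=3, color_model="RGB")
    COVE_BAR = FixtureType("linear-cove-bar", channels=4, color_model="RGBW")

    patch = [
        PatchedFixture(RGB_NODE, universe=0, address=1, zone="north-facade"),
        PatchedFixture(RGB_NODE, universe=0, address=4, zone="north-facade"),
        PatchedFixture(COVE_BAR, universe=1, address=1, zone="entrance-cove"),
    ]

    def fixtures_in_zone(zone: str):
        """Grouping by façade zone, as a scene would address it."""
        return [f for f in patch if f.zone == zone]

    print([f.address for f in fixtures_in_zone("north-facade")])  # [1, 4]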

Scene and Timeline Programming

  • Creation of scenes: fixed intensity, color, or CCT combinations for all or part of the façade.
  • Arrangement of scenes on timelines, with transitions, fades, and overlaps.
  • Use of keyframes to define color/intensity at specific time points.

These tools are often compared to non-linear video editors: the user arranges lighting cues and layers on a time axis.
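
A minimal sketch of the keyframe approach, assuming simple linear fades between cue points (real tools add easing curves, layers, and per-fixture overrides):

    from bisect import bisect_right

    # Keyframes: (time in seconds, (R, G, B) levels 0.0-1.0) for one fixture group.
    keyframes = [
        (0.0,  (0.0, 0.0, 0.0)),   # blackout
        (5.0,  (1.0, 0.4, 0.1)),   # warm amber scene
        (15.0, (0.1, 0.2, 1.0)),   # cool blue scene
    ]

    def color_at(t: float):
        """Linear interpolation between the surrounding keyframes."""
        times = [k[0] for k in keyframes]
        i = bisect_right(times, t)
        if i == 0:
            return keyframes[0][1]
        if i == len(keyframes):
            return keyframes[-1][1]
        (t0, c0), (t1, c1) = keyframes[i - 1], keyframes[i]
        a = (t - t0) / (t1 - t0)
        return tuple(x0 + a * (x1 - x0) for x0, x1 in zip(c0, c1))

    print(color_at(10.0))  # halfway through the amber-to-blue fade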

Effect Engines

  • Built-in effects such as waves, gradients, chases, flicker, or noise-based patterns.
  • Parameter control (speed, direction, amplitude, color palettes).
  • Layering of multiple effects over the same fixtures to achieve complex looks.
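
Effects such as waves and chases are typically parametric functions of fixture position and time. The sketch below layers a traveling sine wave over a constant base level; the speed, wavelength, and amplitude parameters are illustrative defaults, not values from any particular product.

    import math

    def wave_effect(position: float, t: float,
                    speed=0.5, wavelength=4.0, amplitude=1.0):
        """Sine wave traveling along the façade; position in fixture index units."""
        phase = (position / wavelength - speed * t) * 2 * math.pi
        return amplitude * 0.5 * (1.0 + math.sin(phase))   # normalized 0..1

    def layered_intensity(position: float, t: float):
        """Layer a slow wave on top of a 20% architectural base level."""
        base = 0.2
        return min(1.0, base + 0.8 * wave_effect(position, t))

    # Intensities for ten fixtures at t = 3 s
    frame = [round(layered_intensity(i, 3.0), 2) for i in range(10)]
    print(frame)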

Visualization and Pre-Programming

  • 2D or basic 3D previews of façades and fixtures.
  • Real-time preview of scenes and timelines before deployment.
  • Offline programming while the actual installation remains in operation.

Deployment and Integration

  • Export or direct upload of show data to controllers (e.g., playback controllers in cabinets).
  • Synchronization with time-of-day, astronomical clock, or event schedules.
  • Optional integration with higher-level city/CMS systems via API or gateway.
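
Time-of-day synchronization can be reduced to a loop like the one sketched below, which switches between hypothetical scene names based on local time; production controllers add astronomical clocks, calendar exceptions, and persistence across restarts.

    from datetime import datetime, time
    import time as clock

    def active_scene(now: datetime) -> str:
        """Pick a scene name (hypothetical) from the local time of day."""
        t = now.time()
        if time(18, 0) <= t < time(23, 0):
            return "evening-show"
        if t >= time(23, 0) or t < time(6, 0):
            return "late-night-dim"
        return "daytime-off"

    # Minimal scheduler loop: re-evaluate once per minute and (conceptually)
    # tell the playback engine which scene should be running.
    if __name__ == "__main__":
        current = None
        while True:
            scene = active_scene(datetime.now())
            if scene != current:
                print(f"{datetime.now():%H:%M} -> activating scene '{scene}'")
                current = scene
            clock.sleep(60)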

 

Media Façade / Pixel-Mapping Software

Media façade software is optimized for pixel-level control at video frame rates. It is functionally closer to a media server than to a traditional lighting console.

Pixel Mapping

  • Import of façade geometry (2D or basic 3D), including locations of pixel nodes, strips, or grids.
  • Definition of virtual canvases that match the resolution and shape of the physical installation.
  • Mapping of each LED pixel or group to coordinates on the virtual canvas.
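
A minimal sketch of pixel mapping, assuming a rectangular virtual canvas held as a nested list of RGB tuples and a hypothetical pixel map with normalized façade coordinates:

    # Virtual canvas: HEIGHT x WIDTH grid of (R, G, B) values 0-255.
    WIDTH, HEIGHT = 32, 18
    canvas = [[(0, 0, 0) for _ in range(WIDTH)] for _ in range(HEIGHT)]

    # Hypothetical pixel map: each physical LED node has a normalized (x, y)
    # position on the façade plus its universe/address from the patch.
    pixel_map = [
        {"x": 0.10, "y": 0.50, "universe": 0, "address": 1},
        {"x": 0.55, "y": 0.25, "universe": 0, "address": 4},
        {"x": 0.90, "y": 0.80, "universe": 1, "address": 1},
    ]

    def sample(canvas, x: float, y: float):
        """Nearest-neighbor sample of the canvas at normalized coordinates."""
        col = min(int(x * WIDTH), WIDTH - 1)
        row = min(int(y * HEIGHT), HEIGHT - 1)
        return canvas[row][col]

    def render_frame(canvas):
        """Turn one canvas frame into per-pixel output values."""
        return [
            (p["universe"], p["address"], sample(canvas, p["x"], p["y"]))
            for p in pixel_map
        ]

    print(render_frame(canvas))  # all black until content is drawn on the canvas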

Content Ingestion and Generative Engines

  • Import of image sequences, pre-rendered video, or stills.
  • Support for high-resolution video formats and multiple layers.
  • Generative modules that create effects in real time (noise fields, audio-reactive patterns, particle systems).
  • Integration with external sources (NDI, screen capture, real-time data feeds).
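
As a sketch of a generative module, the function below fills a virtual canvas with a time-varying, plasma-style pattern using only the standard library; real engines rely on GPU shaders, audio analysis, or live data feeds.

    import math

    WIDTH, HEIGHT = 32, 18

    def generate_frame(t: float):
        """Simple plasma-like pattern: overlapping sine fields over x, y and time."""
        frame = []
        for row in range(HEIGHT):
            line = []
            for col in range(WIDTH):
                x, y = col / WIDTH, row / HEIGHT
                v = (math.sin(6 * x + t) + math.sin(6 * y - t * 0.7)
                     + math.sin(6 * (x + y) + t * 1.3)) / 3.0     # -1..1
                r = int(255 * (0.5 + 0.5 * v))
                g = int(255 * (0.5 + 0.5 * math.sin(v * math.pi + t)))
                b = 255 - r
                line.append((r, g, b))
            frame.append(line)
        return frame

    frame = generate_frame(t=2.0)
    print(frame[0][:3])  # first three pixels of the top row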

Output Protocols and Performance

Media façade software typically outputs via:

  • Art-Net, sACN (E1.31), KiNet, or similar Ethernet-based protocols,
  • sometimes via proprietary or vendor-specific pixel protocols,
  • often at frame rates of 25–60 fps, spanning hundreds to thousands of DMX universes.
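
As an illustration of Ethernet-based output, the sketch below builds a raw ArtDMX (Art-Net) packet and sends one universe of data over UDP; the controller IP is a placeholder, and real mapping software additionally handles sequencing, sync packets, and unicast/broadcast policy.

    import socket

    ARTNET_PORT = 6454

    def artdmx_packet(universe: int, dmx_data: bytes, sequence: int = 0) -> bytes:
        """Build an ArtDMX packet (OpCode 0x5000, protocol version 14)."""
        data = dmx_data[:512]
        if len(data) % 2:                 # DMX payload length must be even
            data += b"\x00"
        packet = bytearray(b"Art-Net\x00")        # ID
        packet += (0x5000).to_bytes(2, "little")  # OpCode: ArtDMX
        packet += (14).to_bytes(2, "big")         # Protocol version
        packet += bytes([sequence, 0])            # Sequence, Physical
        packet += bytes([universe & 0xFF, (universe >> 8) & 0x7F])  # SubUni, Net
        packet += len(data).to_bytes(2, "big")    # Data length
        packet += data
        return bytes(packet)

    # Send one frame of universe 0 to a placeholder controller address.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame = bytes([255, 64, 16] * 170)            # 170 RGB pixels = 510 channels
    sock.sendto(artdmx_packet(universe=0, dmx_data=frame),
                ("192.168.1.50", ARTNET_PORT))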

Performance considerations include:

  • network topology and bandwidth,
  • universe mapping and segmentation,
  • synchronization across multiple controllers/zones.
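
A back-of-the-envelope capacity check, assuming RGB pixels, 510 usable channels per universe, and one full Art-Net packet per universe per refresh (actual traffic depends on packet overhead and unicast vs. broadcast):

    import math

    def plan_universes(pixel_count: int, channels_per_pixel: int = 3,
                       channels_per_universe: int = 510):
        """How many DMX universes a pixel array needs."""
        total_channels = pixel_count * channels_per_pixel
        return math.ceil(total_channels / channels_per_universe)

    def estimate_bandwidth_mbps(universes: int, fps: int = 30,
                                bytes_per_packet: int = 530):
        """Rough Art-Net traffic: one ~530-byte packet per universe per frame."""
        return universes * fps * bytes_per_packet * 8 / 1_000_000

    pixels = 20_000                              # hypothetical media façade
    u = plan_universes(pixels)
    print(u, "universes,", round(estimate_bandwidth_mbps(u, fps=30), 1), "Mbit/s")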

Scheduling and Show Control

  • Time-based playlists for daily, weekly, or seasonal content.
  • Triggering via calendar events, control commands, or show control protocols (OSC, HTTP, DMX input).
  • Priority handling between “base” architectural scenes and temporary event content.
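
Priority handling often comes down to a "highest active priority wins" rule, as sketched below with hypothetical source names; real systems add hold/release times and fallback behavior.

    from dataclasses import dataclass

    @dataclass
    class Source:
        name: str
        priority: int        # higher wins
        active: bool

    sources = [
        Source("base-architectural-scene", priority=10, active=True),
        Source("seasonal-playlist",        priority=20, active=True),
        Source("event-takeover",           priority=50, active=False),
    ]

    def select_output(sources):
        """Pick the active source with the highest priority."""
        active = [s for s in sources if s.active]
        return max(active, key=lambda s: s.priority) if active else None

    print(select_output(sources).name)   # seasonal-playlist

    sources[2].active = True             # event content triggered (e.g. via OSC/HTTP)
    print(select_output(sources).name)   # event-takeover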

 

Typical Software Workflow

A typical architectural / media façade project using such software follows these steps:

  1. System modeling 
    - Define fixtures, controllers, and network layout.
    - Map physical positions into the software (2D/3D view).
  2. Programming / Content authoring
    - For architectural scenes: program timelines, cues, and effects.
    - For media façades: design or import video/generative content and assign it to the pixel canvas.
  3. Simulation and validation
    - Run scenes and content in preview mode.
    - Validate smoothness, readability, and synchronization.
  4. Deployment
    - Upload shows to controllers or connect live output to gateways.
    - Verify correct addressing and timings on site.
  5. Operation and maintenance
    - Use scheduling for regular operation.
    - Apply overrides for special events.
    - Monitor logs, alarms, and device status via the software or integrated CMS tools.

 

Integration with Controllers and CMS

Architectural and media façade software often operates in conjunction with:

  • Playback controllers (e.g., DIN-rail or rack-mount units with internal timelines).
  • Lighting CMS / city platforms that manage monitoring, alarms, asset information, and dispatching.

Typical integration models: 

  • Direct show deployment: Software exports show data into a controller, which then runs autonomously.
  • Live streaming: Software remains the active output engine, streaming Art-Net/sACN/KiNet to pixel and DMX nodes.
  • Hybrid operation: Static/daily scenes handled by CMS or architectural controllers; special shows driven by media façade software on demand.
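
As a sketch of the hybrid model, a CMS or show-control script might hand playback over to a stored show via a plain HTTP call; the gateway endpoint and payload below are entirely hypothetical.

    import json
    import urllib.request

    # Hypothetical gateway endpoint; real controllers and CMS platforms expose
    # their own, vendor-specific APIs.
    GATEWAY = "http://192.168.1.60/api/playback"

    def trigger_show(show_id: str, mode: str = "event") -> int:
        """Ask the playback controller/gateway to start a stored show."""
        body = json.dumps({"show": show_id, "mode": mode}).encode()
        req = urllib.request.Request(GATEWAY, data=body,
                                     headers={"Content-Type": "application/json"},
                                     method="POST")
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status

    # trigger_show("new-year-countdown")   # uncomment on a live system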
       

 

Key Technical Considerations

When selecting or deploying architectural/media façade software, engineering teams typically evaluate:

  • Protocol support: DMX512, RDM, Art-Net, sACN, KiNet, DALI, SPI, etc.
  • Universe and channel capacity: ability to handle required pixel counts and universes.
  • Timeline / show logic: complexity of scheduling, conditional logic, and long-running operation.
  • Visualization: quality of 2D/3D preview relative to real façade geometry.
  • Interoperability: APIs, OSC/HTTP triggers, ability to coexist with consoles or other show systems.
  • Reliability: logging, backup strategies, watchdogs, and support for recovery after power/network failures.