Display Stream Info

Display stream metadata and diagnostic information as an overlay on the video stream.

Overview

Overlays stream metadata and diagnostic information, such as the effective frame rate, frame count, clock, custom text, and detected objects, on the video stream.

Inputs & Outputs

  • Inputs: 1, Media Format: Raw Video
  • Outputs: 1, Media Format: Raw Video
  • Output Metadata: None

Properties

| Property | Description | Type | Default | Required |
| --- | --- | --- | --- | --- |
| show_fps | Display the effective frame rate at which the video is being processed. | bool | null | No |
| show_frame_count | Display frame count? | bool | null | No |
| show_clock | Display clock? | bool | null | No |
| clock_format | Clock format. Conditional on show_clock being true. | string | %d %B %Y, %H:%M:%S %p %z | No |
| timezone | Timezone. Conditional on show_clock being true. Options: us/pacific, us/central, us/mountain, us/eastern, other | enum | us/pacific | Yes |
| timezone_offset | Timezone offset from GMT. Conditional on timezone being other. | float | null | Yes |
| show_text | Display text? Options: disabled, top, bottom | enum | disabled | No |
| text | Text to display. Allows templates, see documentation for supported variables. | text | null | No |
| show_objects | Display objects? If enabled, sets the location, relative to each detected object's bounding box, where the label is displayed. Options: disabled, bottom_left, bottom_right, top_left, top_right | enum | null | No |
| show_object_detail | Display object detail? Options: disabled, tracking_id, confidence, all | enum | null | No |
| object_types | Display object types. If specified, only objects of these types are displayed; otherwise all objects are displayed. Example: car,person. Accepted formats: object_label matches any object of this type, with or without a classifier attribute (e.g. car); object_label.class_type matches any object of this type with a specific classifier attribute (e.g. car.red); object_label.* matches any object of this type with at least one classifier attribute (e.g. car.* matches car.red, car.yellow, etc.). | model-label | null | No |
| highlight_object_types | Highlight object types. If specified, objects of these types are highlighted; otherwise no objects are highlighted. Example: car,person. Accepts the same formats as object_types above. | model-label | null | No |
| color_scheme | Object color scheme. Use * to match all object types, *.attribute for objects with the specified attribute, or object_type.attribute for a specific object type with the specified attribute. Colors can be specified as R,G,B values or names (see the example configuration below this table). | json | {"*": "51,197,255", "highlight": "242,109,109"} | No |
| show_metadata | Display stream metadata contained in the video frame. | bool | null | No |
| log_metadata | Log stream metadata contained in the video frame to the Console, so it can be viewed under the Deployment -> Logs tab. | bool | null | No |
| meta_list | Metadata filter. Comma-separated list of metadata properties to display/log. The set of available properties is the combination of the metadata properties added by all nodes upstream of this node. Example: video1.source_name displays the stream name on the video. | string | null | No |
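
For illustration, the sketch below shows one hypothetical combination of these properties as a Python dict. The property names and the color_scheme default come from the table above; the specific values are invented for the example and are normally set through the node's property panel.

# Hypothetical Display Stream Info configuration, for illustration only
display_stream_info_properties = {
    "show_fps": True,
    "show_frame_count": True,
    "show_clock": True,
    # With this format the clock renders roughly as "05 March 2024, 14:30:00 PM +0000"
    "clock_format": "%d %B %Y, %H:%M:%S %p %z",
    "timezone": "us/pacific",
    "show_objects": "bottom_left",         # draw labels at the bottom-left of each bounding box
    "show_object_detail": "confidence",
    "object_types": "car,person",          # only draw cars and people
    "highlight_object_types": "car.red",   # highlight cars classified as red
    # Default color for all object types plus the highlight color, as "R,G,B" strings
    "color_scheme": {"*": "51,197,255", "highlight": "242,109,109"},
    "show_metadata": True,
    "log_metadata": False,
    "meta_list": "video1.source_name",     # only show the stream name metadata
}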

Metadata

Access metadata using a Function Node, or via the API from a snapshot or clip saved downstream of this node. A minimal Function Node sketch is shown after the table below.

| Metadata Property | Description |
| --- | --- |
| None | None |
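
As a minimal starting point, the hypothetical Function Node sketch below reads whatever metadata upstream nodes have attached to each frame. It relies only on the frame.meta() and get_all() calls that also appear in the Customize example that follows; the printing logic is illustrative.

from lumeopipeline import VideoFrame  # Lumeo lib to access frame and metadata

def process_frame(frame: VideoFrame, **kwargs) -> bool:
    # Read all metadata attached to this frame by upstream nodes
    meta = frame.meta()
    if meta is not None:
        for key, value in meta.get_all().items():
            if value is not None:
                # Print each metadata property to the Console log
                print("{} : {}".format(key, value), flush=True)
    # Return True, as in the Customize example below, so the frame keeps flowing
    return True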

Customize

To customize this node for your use case, use the code below inside a Function Node.

from lumeopipeline import VideoFrame  # Lumeo lib to access frame and metadata
from lumeopipeline import Utils
import time

FPS_MEASUREMENT_INTERVAL = 30   # Recompute the FPS estimate every 30 frames
frame_count = 0                 # Frames processed so far
fps = 0                         # Most recent FPS estimate
last_epoch = time.time()        # Time of the last FPS measurement
meta_keys_to_show = None        # Parsed from meta_list on the first frame

def process_frame(frame: VideoFrame, show_fps=False, show_metadata=False, log_metadata=False, meta_list=None, node_id=None, **kwargs) -> bool:

    global frame_count
    global fps
    global last_epoch
    global meta_keys_to_show

    # Node properties are passed in as strings, so normalize them to booleans
    show_fps = (show_fps in ["true", "True", True])
    show_metadata = (show_metadata in ["true", "True", True])
    log_metadata = (log_metadata in ["true", "True", True])

    # Parse the comma-separated metadata filter once, on the first frame
    if meta_keys_to_show is None:
      meta_keys_to_show = meta_list.replace(" ", "").split(",") if meta_list is not None else []

    # Estimate the effective frame rate once every FPS_MEASUREMENT_INTERVAL frames
    frame_count = frame_count + 1
    if frame_count % FPS_MEASUREMENT_INTERVAL == 0:
      time_elapsed = time.time() - last_epoch
      fps = int(FPS_MEASUREMENT_INTERVAL / time_elapsed)
      last_epoch = time.time()
    
    with frame.data() as mat:
      yidx = 50   # Vertical pixel offset of the next label drawn on the frame
      if show_fps:
        (label_width, label_height) = Utils.write_label_on_frame(mat, 25, yidx, "FPS: " + str(fps))
        yidx = yidx + label_height

      # Overlay and/or log the metadata attached to the frame by upstream nodes
      if (show_metadata or log_metadata):
        try:
          meta = frame.meta()
          if meta is not None:
            for (key, value) in meta.get_all().items(): 
              if value is not None and (len(meta_keys_to_show) == 0 or key in meta_keys_to_show):
                # Show "key : value" when no filter is set, otherwise just the value
                if len(meta_keys_to_show) == 0:
                  label = key + " : " + str(value)
                else:
                  label = str(value)

                if show_metadata:
                  (label_width, label_height) = Utils.write_label_on_frame(mat, 25, yidx, label)
                  yidx = yidx + label_height

                if log_metadata:
                  print("[{}] {}".format(node_id, label))

        except Exception as error:
          print(error, flush=True)

    return True