The challenge

Users can currently search assets only by file name or creator name. This helps in some cases, but users often do not remember who created a specific clip or what the file was called, especially when clips have generic or random names like “image123”.
Our approach

Improve the existing asset search functionality so users can find assets based on additional database-backed metadata, especially:
- Spoken content from video clips, using the transcript
- Notes added to footage items
- Other searchable content or context associated with an asset
This would allow users to search for details they remember about a clip instead of needing to remember the file name or creator.
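As a rough sketch of the behavior described above, the search could match a query against all of an asset's text fields rather than only file name and creator. The `Asset` fields and the simple case-insensitive substring match below are illustrative assumptions, not the actual data model or ranking logic:

```python
from dataclasses import dataclass

@dataclass
class Asset:
    # Hypothetical asset record; field names are assumptions for illustration.
    filename: str
    creator: str
    transcript: str = ""
    notes: str = ""

def search_assets(assets, query):
    """Return assets whose filename, creator, transcript, or notes
    contain the query (case-insensitive substring match)."""
    q = query.lower()
    return [
        a for a in assets
        if any(q in text.lower()
               for text in (a.filename, a.creator, a.transcript, a.notes))
    ]

assets = [
    Asset("image123.mp4", "alice",
          transcript="We are at the trade fair in Hamburg today."),
    Asset("clip_final.mov", "bob", notes="B-roll from the Berlin office"),
]

# A transcript match surfaces a clip whose file name says nothing useful.
print([a.filename for a in search_assets(assets, "Hamburg")])  # → ['image123.mp4']
```

In a real implementation this would likely run as a database query (e.g. full-text search over the indexed metadata columns) rather than an in-memory scan, but the matching idea is the same.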
Example

If a user remembers that they created clips at a trade fair in Hamburg, and the protagonists in the video talk about Hamburg, searching for “Hamburg” should return those clips because the transcript contains that word.
Expected outcome

Users can rediscover relevant footage faster and more reliably by searching across meaningful asset context such as transcripts, content, and notes, not just file names and creator names.