Search within a video with transcription and timestamps
Improving search UX for video content
Fixing video’s biggest frustration
As more and more content is delivered in engaging video formats, searching for information within a video has never been more difficult.
Unless a learner knows exactly which course, lesson and module a clip appears in, plus the time at which it occurs within the video, the clip often goes unfound.
This can be a deeply frustrating experience for your learners, and the lack of searchability reduces the value your e-learning is able to deliver.
With synchronised video transcriptions, the spoken word in your videos is converted into written text and added to the lesson. This text can then be indexed by your LMS, meaning it will show up in search results.
To supercharge this even further, YouTube-inspired chapter timestamps and search tagging let learners jump straight to the sections of the video most relevant to them. In the example above, the user can jump to the sections of the video where their search term appears.
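Under the hood, synchronised transcripts and chapter markers are commonly stored as timed text cues, for example in the WebVTT format. The sketch below is purely illustrative; the cue text and timings are invented, and the exact format your LMS produces may differ:

```vtt
WEBVTT

00:00:01.000 --> 00:00:05.000
Welcome to this lesson. Today we cover the basics of budgeting.

00:00:05.500 --> 00:00:10.000
Let's start with the first chapter: setting financial goals.
```

Because each cue pairs a span of text with a start and end time, a search hit in the transcript can be mapped directly to a point in the video the player can seek to.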
As a bonus, within an open course, the transcript is now indexable by Google, which improves your discoverability within search engines. Pretty neat.
This works with videos embedded from YouTube and Vimeo, as well as self-hosted video content.
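For self-hosted content, one common way to attach a synchronised transcript is the HTML `<track>` element. This is a hypothetical sketch; the file names are placeholders, and a real LMS may wire this up differently:

```html
<!-- Self-hosted video with a synchronised transcript attached
     as a WebVTT captions track. File names are placeholders. -->
<video controls width="640">
  <source src="lesson-video.mp4" type="video/mp4">
  <track kind="captions" src="lesson-transcript.vtt"
         srclang="en" label="English" default>
</video>
```

Embedded YouTube and Vimeo players carry their own caption tracks, so the same transcript-based search can sit alongside them without extra markup.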