iVideoSmart Bets on Video to Give Publishers a Boost
Nowadays, hardly a week goes by without fresh news of media companies struggling to adapt to the daunting economic landscape of the digital age. Even online news sources are grappling with how best to balance the costs of producing content against the revenue generated from advertisers, subscribers, and other sources. iVideoSmart (IVS), a three-year-old startup based in Singapore, proposes an advertising business model built on the selling power of video.
In short, IVS enables publishers to automatically match their article content with relevant videos on which advertising space can be sold. Loong Chee-Yuh, its chief technology officer, uses the example of an online newspaper: if the paper publishes a story about cars, IVS's AI-powered widget will recommend a relevant video based on an analysis of the article, and advertisers will be able to buy space on the video, which appears adjacent to the text. IVS's main insight is that video views command more advertising dollars because a user's engagement with video can be tracked more accurately than it can with traditional online banner ads. That could be a boon for publishers.
“We look at an article’s contents, find the keywords, do some intelligence scripting, and run things through our natural-language-processing engine” in order to match an appropriate video with the text, Chee-Yuh says. The footage in the video might be provided by the publisher or licensed by IVS from its network of over 165 content providers globally, but all of it is “premium content,” he says, not user-generated, as is common on YouTube and Facebook. In fact, IVS built a B2B content exchange to enable this automated syndication and distribution of content. Because publishers host the videos on their own online properties rather than posting them to third-party sites, they keep more of the advertising profits. Meanwhile, he says, the publisher “doesn’t have to operate or maintain anything”—the widget does all the work.
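The matching step Chee-Yuh describes, extracting keywords from an article and scoring them against a video catalogue, can be sketched with a toy TF-IDF similarity search. This is purely illustrative, assuming a bag-of-words approach; IVS's proprietary NLP engine is not public, and the video titles and article text below are invented:

```python
# Toy article-to-video matching via TF-IDF + cosine similarity.
# Not IVS's actual engine -- an illustrative sketch only.
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split on non-letters (crude keyword extraction)."""
    return re.findall(r"[a-z]+", text.lower())

def tf_idf_vectors(docs):
    """Build a TF-IDF weight vector for each tokenized document."""
    doc_freq = Counter()
    for tokens in docs:
        doc_freq.update(set(tokens))
    n = len(docs)
    vectors = []
    for tokens in docs:
        tf = Counter(tokens)
        vectors.append({
            term: (count / len(tokens)) * math.log((1 + n) / (1 + doc_freq[term]))
            for term, count in tf.items()
        })
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    norm_a = math.sqrt(sum(w * w for w in a.values()))
    norm_b = math.sqrt(sum(w * w for w in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def best_video(article, videos):
    """Return the video description most similar to the article text."""
    docs = [tokenize(article)] + [tokenize(v) for v in videos]
    vecs = tf_idf_vectors(docs)
    scores = [cosine(vecs[0], v) for v in vecs[1:]]
    return videos[scores.index(max(scores))]

# Invented mini-catalogue of video descriptions:
videos = [
    "Electric car road test: range and charging compared",
    "Street food tour of Singapore hawker centres",
    "Smartphone camera shoot-out",
]
article = "The latest electric cars promise longer range and faster charging."
print(best_video(article, videos))
# → Electric car road test: range and charging compared
```

In practice a production system would add stemming, multilingual tokenization, and learned embeddings, but the shape of the pipeline, keywords in, ranked videos out, is the same.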
Chee-Yuh, who began his career in tech at the Info-Communications Media Development Authority, a Singapore government agency, says that IVS is delivering over 210 million video views a month across 900 million page views and reaching 92 million unique users. Most of its clients are based in Indonesia, the Philippines, Malaysia, Hong Kong, and Taiwan. The company recently completed a $4.5 million Series A+ round.
Chee-Yuh sees both challenges and opportunities ahead. Processing human language to make the sharpest algorithmic pairing of video and text can be tricky. Written Chinese, for example, does not put spaces between words, “so we have to find ways to make sense of [the characters],” he says. Additionally, because media companies own the information about the users who visit their sites, IVS can refine its algorithm only so far; Chee-Yuh wants to tailor the selection of videos not only to the surrounding content but also to data about the user.
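The segmentation problem Chee-Yuh alludes to can be illustrated with greedy forward maximum matching, a classic baseline for splitting unspaced Chinese text into words. The mini-dictionary and example sentence here are invented for illustration; IVS has not disclosed its actual method:

```python
# Toy illustration of Chinese word segmentation: written Chinese has no
# spaces, so a segmenter must recover word boundaries. This is a simple
# greedy forward maximum-matching sketch, not IVS's approach.
def segment(text, dictionary, max_word_len=4):
    """At each position, take the longest dictionary word starting there;
    fall back to a single character if nothing matches."""
    words = []
    i = 0
    while i < len(text):
        for length in range(min(max_word_len, len(text) - i), 0, -1):
            candidate = text[i:i + length]
            if length == 1 or candidate in dictionary:
                words.append(candidate)
                i += length
                break
    return words

# Invented mini-dictionary: 电动车 "electric vehicle", 市场 "market",
# 快速 "rapid", 增长 "growth"
dictionary = {"电动车", "电动", "市场", "快速", "增长"}
print(segment("电动车市场快速增长", dictionary))
# → ['电动车', '市场', '快速', '增长']
```

Note the ambiguity the greedy rule resolves: the first three characters could be read as 电动 + 车 ("electric" + "vehicle") or as the single word 电动车; maximum matching prefers the longer word, which is why real systems layer statistics or neural models on top of dictionaries.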
The biggest challenge, he says, is knowing which aspects of the product to refine. The goal of total optimization wasn’t a good use of company time, Chee-Yuh found: “There would be times when we would be trying to optimize our code, to reduce the network load, and then we’d get just a measly 1% performance improvement.” Instead, he now focuses on the refinements that account for the biggest gains in customer satisfaction.
The outlook for IVS is promising, and not only because of the success of its latest funding round. The company has also decided to employ a lean DevOps team in a push to “develop new products and innovate rapidly.” Chee-Yuh says its goal is always to “find the right customer at the correct pain point and offer the correct solution.”