Shauli Zacks
Updated on: May 5, 2024
In a recent interview with SafetyDetectives, Yaron Litwin, Chief Marketing Officer at Canopy, shared insights into the company’s innovative approach to online safety. With over 15 years of experience in consumer-tech companies, Litwin brings a unique perspective to Canopy’s mission of protecting the mental health of families in a digital age. Discover how Canopy’s AI technology is revolutionizing parental control and internet safety, and learn about the challenges and trends shaping the future of digital parenting.
Can you tell us about your background and what led you to your role as CMO at Canopy?
For the past 15+ years I’ve led consumer-tech companies to drive growth and build strong brands. As a father of young teens, I understand firsthand the challenges, frustrations, and fears parents have with kids glued to their phones. In Canopy I saw an opportunity to build a mission-driven AI tech brand that helps parents protect the mental health of their loved ones.
What inspired the founding of Canopy, and how does the company’s mission drive your strategies in marketing?
Canopy’s AI-based technology was developed over 15 years ago in Israel (where it protects almost all school computers in the country), and it is constantly improving. The need was originally identified in the religious community – they were the first to recognize the potential dangers of the internet, yet they still needed it for education and work. The same need exists, of course, in America. Canopy was developed as a parental control app for the USA and other countries, and over time we saw that it wasn’t only parents looking to protect their children who were interested – many adults wanted to use our technology to give themselves a cleaner online experience. We began marketing to them as well – not only to the faith-based community, but also to others looking to improve their mental health and productivity, or simply to gain more control over their own internet experience.
Canopy uses AI to enhance internet safety. Could you explain how this technology works and what makes it unique in the marketplace?
Most parental control solutions work off a blacklist (blocking known problematic websites) or a whitelist (allowing only specific, safe websites). Canopy is unique in that a child or adult can access any website they want through their browser, and AI scans all images and videos before they appear on the screen, blocking anything that contains nudity (or even partial nudity, depending on the settings). This is especially useful on websites where you may not expect to see explicit content, and on social media platforms whose own safety measures tend to be inadequate. The same technology powers our unique Sexting Alerts feature, which can notify parents if their kid is sending or receiving nude pictures. Such a feature is an incredible safety measure as we see increased threats of sextortion, revenge porn, and cyberbullying.
What are some of the most significant challenges you face in marketing a product that deals with sensitive issues like online safety and parental control?
While there is a growing awareness of the dangers posed by the internet, I wouldn’t call it a full awareness. We grew up in a completely different world than our children, and even younger parents who had the internet as kids did not experience the same threats their children face today. That is the main issue we are facing – driving parental awareness. Beyond that, we aim to educate parents on the need for a well-rounded online safety strategy, one that encompasses the safety features put in place by device manufacturers and social media platforms, parental control apps like Canopy, and open, honest dialogue with our kids about their online activities and potential dangers.
What trends in digital consumption should parents be aware of today?
The growing spread of deepfake pornography is something that all parents should take seriously. Children are being cyberbullied and sextorted with explicit pictures and videos of themselves that are not even real! It is easier and cheaper than ever to produce such content, and it will only become more widespread. Kids need to be taught not to believe everything they see, certainly not to create and spread it, and to approach their parents if they find themselves victims of this worrying threat.
What role do you see AI playing in the future of online safety and digital parenting tools?
At Canopy, we believe that bad AI can be fought with good AI. As explicit content, including deepfakes, spreads across the internet at a rate we have never known, only an AI-based tool like Canopy can filter and block such content at scale. The only alternative is to severely restrict internet use with a limited whitelist. That solution may work for our younger children, but it is not enough for older kids who need the internet for school and other purposes, and it is certainly not enough for adults who wish to avoid online pornography.