An Interview With the CEO of Neuromorphic Vision Leaders, Prophesee
Luca Verre, CEO and Co-founder of Prophesee, speaks with Automate Pro Europe’s editor, Joel Davies, about the last six months and a number of new developments the company’s been working on.
For those readers who are new or can’t remember our last interview, would you mind recapping who you are and what Prophesee does?
I co-founded Prophesee seven years ago in Paris. We are a spinoff of the Vision Institute, a research institute working on neuromorphic vision technologies – technologies inspired by the way the human eye works. At Prophesee we develop a vision sensor that mimics the human eye, and computer vision and AI algorithms that mimic the brain. We develop these technologies because they are fundamentally different from the way conventional technology works, and they are very well suited to applications in machine vision and AI.
People have realised that conventional frame-based technology is not very efficient, because you keep acquiring the same data over and over again even when nothing in the scene has changed. At the same time, you typically lose information between two frames, because acquisition happens at fixed points in time and in the interval between them you are effectively blind.
When you think about the way the biological system works, you realise that it is, by definition, a machine vision system. The eyes are not there to please our brain; they’re there to make us safer and smarter, and they do it with photoreceptors that work continuously, in analogue and in parallel, extremely efficiently. At Prophesee we do the same.
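The principle Verre describes – pixels that respond only to change, independently of one another – can be sketched in a few lines. The following is a toy frame-difference simulation of an event camera; the function name, threshold and event format are illustrative assumptions, not Prophesee’s sensor or Metavision code:

```python
import numpy as np

def generate_events(frames, timestamps, threshold=0.15):
    """Emit (t, x, y, polarity) tuples wherever a pixel's log-intensity
    has changed by more than `threshold` since the pixel last fired.
    A toy simulation of the event-camera principle: pixels report
    changes individually instead of whole frames at a fixed rate."""
    eps = 1e-6
    reference = np.log(frames[0] + eps)  # last reported level per pixel
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_frame = np.log(frame + eps)
        delta = log_frame - reference
        ys, xs = np.nonzero(np.abs(delta) >= threshold)
        for y, x in zip(ys, xs):
            events.append((t, int(x), int(y), 1 if delta[y, x] > 0 else -1))
            reference[y, x] = log_frame[y, x]  # only fired pixels update
    return events

# A static scene produces no events at all...
static = [np.ones((4, 4)), np.ones((4, 4))]
assert generate_events(static, [0, 1]) == []

# ...while a single brightening pixel produces exactly one ON event.
moving = [np.ones((4, 4)), np.ones((4, 4))]
moving[1][2, 3] = 2.0
assert generate_events(moving, [0, 1]) == [(1, 3, 2, 1)]
```

The key property is visible in the assertions: an unchanging scene generates no data, which is exactly the efficiency argument made above.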
We have more than 50 patents on the technology and have raised about $100 million so far, with investment partners such as Sony, Intel, Bosch, Renault, Nissan, the European Investment Bank, Xiaomi and Sinovation, a large venture capital fund focused on AI. Our business model is very simple: in the end, we are a fabless semiconductor company that develops and sells a vision sensor.
How did the deals with those investors come about? How important are they going forward?
The investment announcement brought together existing investors (for example Bosch, iBionext, 360 Capital and Intel) with new investors who are investing in Prophesee with, I would say, a strategic angle. Why? Because now that we can open up to consumer applications, China of course becomes a very attractive market, where mobiles and drones are very large segments. We had the opportunity to work with some of these leading VCs, collaborate in the space and receive their investment, which is a good opportunity for us to network and grow the business in China.
That’s why we raised this investment: to boost the company’s presence and development in the Chinese market. The Macnica announcement was part of the distribution activities we are building worldwide, mostly in the US, Europe, China and Japan. We made this announcement with Macnica because they are definitely among the key players in the European space. They are a global distribution partner, but for the type of agreement we have, their focus will be particularly on Europe.
It’s good for us to have that type of partner, together with FRAMOS, and RISTAR in Japan, who have good knowledge of bringing new technology to market and are willing to invest to push it. They are distributors that can bring a lot of value. They also invest in training their sales teams in field applications, since the technology is still a little more complex to sell than conventional frame-based technology and the ecosystem is still emerging.
You mentioned consumer-based products. Are you moving into that space?
We are looking for applications in the consumer space. Today, when you look at the conventional image sensor business, what is driving the majority of it? Almost 70% of conventional image sensors are sold into the mobile space. So the mobile space is driving the growth of the conventional image sensor market, which is estimated at a little more than $20 billion.
There has been big growth in the mobile market over the past 10 years, but of course it’s now slowing down because not everyone buys a new phone every year, even though there has been a clear trend towards more and more cameras in phones, on the back and the front.
You can also clearly see amazing growth in automotive and robotics. That is the panorama of the conventional market. We started with industrial because, along with the medical industry, there was a clear product-market fit on one side, and on the other there were constraints because of the size of the sensor. We couldn’t really go after markets that were more cost-sensitive, or markets with mechanical and physical constraints that we couldn’t solve with our Generation Three, which was too large.
With Generation Four, we will definitely open up new opportunities. We have already engaged with many customers on evaluations of our latest Generation Four, starting field tests on usages like driver monitoring and smart access control for people detection, counting and tracking. Also, given the recent privacy regulations that governments are trying to enforce, there is an argument certain customers seem to grasp: event-based sensors don’t produce an image, so they have a sort of privacy embedded in the acquisition process. So we see some interest in that direction.
The last time that we spoke you had just released OpenEB for free. How did that go?
Very good. We released OpenEB in March and, six months later, we have more than 400 unique users of it and more than 2,000 unique users of the software overall. Those are good numbers almost a year on from the first commercial release of our Metavision offer, and they show the interest in and adoption of the technology.
I think we’re seeing more and more users experimenting, inventing and developing applications. There is also a portion of users from academia and research, where we see a large number of papers based fundamentally on our technology being published at top-tier conferences. It’s good to be cited in those papers. So we are contributing to and stimulating the research that pushes AI technology overall.
Then, of course, there is a growing number of users in the industrial space for more commercial purposes including aerospace and defence, IoT and mobiles. Today we have hundreds of accounts that are experimenting and implementing the technology in their system and more and more with a certain level of autonomy. We have a large application engineering team in place and the partners that I’ve already mentioned are investing to build internal knowledge.
[Guillaume Butin, Marketing & Communication Director, Prophesee]: Related to user-led development, we are announcing our new Third Generation Evaluation Kit (EVK), which features the Generation 4.1 high-resolution sensor we collaborated on with Sony. It’s more cost-efficient than our original EVK Two and lets you test event-based vision applications right away, because it’s fully compatible with the software we already have available today. That includes over 95 algorithms, 11 applications that work out of the box, and hundreds of pages of tutorials and recordings. It’s a fantastic way to get started and discover event-based vision by yourself, whether that’s in a factory or wherever you need to experiment.
Over the summer you picked up several accolades. Is praise like that important for Prophesee? Does it ever add any pressure?
I think it’s important in the sense that it is good to have some recognition and acknowledgement, because that’s a way to build credibility in the market and strong equity value. It’s also good for brand awareness, so people will refer to Prophesee as the company building this innovative technology. As for pressure – for me, there’s much higher pressure coming from the market and the customers, in the need to deliver. We might hopefully add one more accolade soon, as we’ve been shortlisted for the Vision Award.
What will you be showing at VISION 2021?
[Guillaume]: We will have our booth and be among friends, because FRAMOS, Macnica and IMAGO will be there. It’s the first time we’ll all get the chance to meet together in one place physically, after almost a year and a half of working remotely, so we are super excited about getting back on the ground. At our booth we will show event-based vision through one-on-one demonstrations, so people can understand the fundamentals of the technology and experiment for themselves.
They can do a demo where they see what the sensor sees: they can move around in front of it and the events will react accordingly. One demo shows the ‘invisible’ between the frames and how we fill that gap. There will also be a demo on vibration monitoring: how we use almost 1 million pixels to build a super-high-precision frequency map, with a pixel-by-pixel response, to predict machine failure.
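The vibration-monitoring demo rests on a simple idea: because each pixel timestamps its events with microsecond precision, a dominant frequency can be estimated directly from the spacing between events. A minimal sketch, assuming one ON event per oscillation cycle (a simplification; the function name and approach are illustrative, not Prophesee’s algorithm):

```python
def pixel_frequency(timestamps_us):
    """Estimate a vibration frequency in Hz for one pixel from the
    timestamps (in microseconds) of its successive ON events,
    assuming one ON event per oscillation cycle."""
    if len(timestamps_us) < 2:
        return 0.0  # not enough events to measure a period
    intervals = [b - a for a, b in zip(timestamps_us, timestamps_us[1:])]
    mean_period_us = sum(intervals) / len(intervals)
    return 1e6 / mean_period_us  # convert mean period (µs) to Hz

# A pixel firing every 10,000 µs corresponds to a 100 Hz vibration.
assert pixel_frequency([0, 10_000, 20_000, 30_000]) == 100.0
```

Running this per pixel over a megapixel sensor yields exactly the kind of pixel-by-pixel frequency map described above.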
For the first time, we will showcase a particle-size monitoring demo, where we use compressed air to project very small particles of different sizes in front of the sensor at super high speed, counting and measuring them at the same time with high precision. We will also run a demo showing that sensing and displaying motion pixel by pixel is extremely efficient compared to traditional approaches.
Can you tell me about the collaboration with Sony?
The Sony product uses the same fundamental event-based technology as our previous generation. Then, of course, there have been great improvements in several respects; I’ll mention the top three. For me, the first one is the size. Until Generation Three, we had been using a front-side-illuminated 2D process: one wafer carrying the photodiode and a lot of intelligence around it, with drawbacks in size and fill factor, because the light only partially reached the photodiode. Roughly 25% of the light was collected and the rest was lost, which reduced low-light performance.
When you lose 75% of the light, you have some drawbacks. So this is really the first benefit: with Sony we gained access to a backside-illuminated 3D stacking process. Sony has one of the most advanced processes in the industry; there are not many players equipped with one. It’s publicly known that Samsung has one, as do TSMC, STMicro and SK Hynix, but Sony has its own, fully optimised for image sensors, and for us this has been a major step.
The second is electro-optical performance. Because we now stack, we have almost 100% fill factor: all the light reaches the photodiode, giving better performance in low light, and the photodiode itself is optimised in an image sensor process. In that respect, we have improved quantum efficiency, improved contrast sensitivity and improved bandwidth, because we also redesigned the readout. This has been the second key aspect: great work, with Sony’s help, optimising the process to push the sensor’s electro-optical performance to its best.
The third is that, until Generation Three, we had a sensor with the basic functionality of event-change detection, while certain digital functionalities for formatting sat in a separate companion FPGA. The work we have done with Sony makes it all one chip, so there is no FPGA anymore. Then, and this has been mostly our work, we developed what we call event-signal processing: a digital processing pipeline embedded inside the sensor that takes the events and prepares them in a way that is better optimised for interfacing with the digital world of conventional SoCs.
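The point of event-signal processing – turning sparse, asynchronous events into a compact stream a conventional SoC can consume – can be illustrated with a hypothetical 32-bit packing. The field widths below are purely illustrative, not Prophesee’s actual event format:

```python
def pack_event(x, y, polarity, dt_us):
    """Pack one event into a 32-bit word:
    bits [31:21] x (11 bits) | [20:10] y (11 bits) | [9] polarity |
    [8:0] time delta in microseconds (9 bits). Widths are illustrative."""
    assert 0 <= x < 2048 and 0 <= y < 2048
    assert polarity in (0, 1) and 0 <= dt_us < 512
    return (x << 21) | (y << 10) | (polarity << 9) | dt_us

def unpack_event(word):
    """Recover (x, y, polarity, dt_us) from a packed 32-bit word."""
    return (word >> 21, (word >> 10) & 0x7FF, (word >> 9) & 1, word & 0x1FF)

# Round-trip one ON event from a 1280x720 region of the sensor.
assert unpack_event(pack_event(1279, 719, 1, 42)) == (1279, 719, 1, 42)
```

Fixed-width words like this are trivial for a standard SoC interface to consume, which is the integrability benefit described here.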
The first benefit shows the technology is scalable, including for the mass market. The second is that we have the best event-based sensor out there. The third is about integrability: the sensor can now be integrated with any compute platform. The first is the most important step, because showing the market that event-based technology is scalable is the key message. It means people no longer see event-based technology as a niche technology constrained by size.
Has the chip shortage affected you at all? What is your feeling about it generally?
For the sensor itself, no, not at all. We have felt some effects because we also produce evaluation kits and cameras, and some of their components are affected, but not in large enough volumes to have an impact on us.
For me and for the business, I think this is definitely creating a big push in the semiconductor industry. Governments are realising that semiconductors are a key sector, and that’s why you now see all these tens of billions in investment everywhere: Intel, the US government, the European community. I was reading that SMIC in China is investing 10 billion to build a new foundry. In Japan, Sony, the Japanese government and TSMC are building a foundry, and TSMC has also committed to building foundries in the US.
Everyone understands that today 60 to 70% of high-end chips are built in Taiwan. There is this high dependency on Taiwan, which is Chinese according to China and not Chinese according to everyone else. Everyone, including policymakers, is putting semiconductors at the top of their agenda. So we, as a European semiconductor company, will be more visible, and I think we can attract more investors; in that respect, it’s positive right now. As a European company we can also leverage our position in between China and the US, and that’s good to some extent.
In fact, we have a very balanced shareholding, with investors coming from all over the place, and we have business in many places, which is good. People know that semiconductors are a cyclical business, and you can see it in the market cap valuations of all these companies, which have really skyrocketed. Ambarella and Nvidia, for example, doubled their market caps in less than six months, from levels that were already huge.
There is a risk that, because it’s a cyclical industry, an overreaction to the shortage will create massive stock, and at some point this will cause a bounce-back in the supply chain and maybe some deflation in interest, and therefore in valuations. In our case, we’re not in a commodity business: image sensors, and in particular those related to AI, are projected to grow for the next 10 to 20 years, because the major trends behind them will not stop despite the cyclical nature of the semiconductor industry.
How do you feel about the last 9 months and how do you feel about the next year?
Positive, despite Covid, which is still creating some challenges. But yes, it’s been very positive. We managed to build teams in China and the US despite never physically meeting the people, and we’ve done business with companies we have never really met. That’s a challenge, but we’ve done very well, I think much better than we were expecting. We have also managed the relationship with Sony for the past 18 months since Covid started, although we knew the people before. It’s not easy with an eight-hour time difference and big cultural differences, but we did well, and I think we put in the effort to do so.
Hopefully, in the second half of next year, we will be able to travel more. From the business side, I am very confident we will grow because of this big trend we were talking about before. For the company what I am looking for is to push business and push sales, to keep growing the team, to keep developing more products and more technology and eventually raise more money and keep going.
This interview originally appeared in Automate Pro Europe magazine 2. All information was correct at the time of publication.
You can find more information about Prophesee on its website.