Bertrand Serlet arrived at Building 8 in Rocquencourt in the early eighties to study the programming of integrated circuits, and went on to become Senior Vice President of Software Engineering at Apple. He speaks of his enthusiasm for scientific renewal, his keen interest in spotting emerging technologies, and his unique professional experience.
PhD in Computer Science at Orsay University
Researcher at Xerox PARC
Software Engineer then Software Director at NeXT and Apple
From Rocquencourt to Cupertino
When I was ten, I got a JR01 computer for Christmas. The IBM-built mechanical and electrical toy had three horizontal strips that you could move and a button that lit up four little lights: three input bits and four output bits. That’s when I discovered that a problem could be modelled, although it was pretty hard with so few bits! I was fascinated with the toy. Without it, I wouldn’t have discovered Boolean algebra until much later.
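The JR01's actual wiring isn't described here, but the kind of problem such a toy poses can be sketched as a Boolean function from three input bits to four output bits, i.e. a truth table with eight rows. The increment function below is a hypothetical example of such a mapping, not the toy's real logic.

```python
# A 3-bit -> 4-bit Boolean function, written out as a truth table.
# Hypothetical example: increment a 3-bit number. (The JR01's real
# logic is not documented here; this just illustrates the idea of
# modelling a problem with three input bits and four output bits.)

def increment(a: int, b: int, c: int) -> tuple:
    """Map input bits (a, b, c) to the 4-bit binary value of (abc) + 1."""
    n = (a << 2) | (b << 1) | c          # assemble the 3-bit input
    m = n + 1                            # the "problem" being modelled
    # split the 4-bit result back into individual output bits
    return ((m >> 3) & 1, (m >> 2) & 1, (m >> 1) & 1, m & 1)

# The full truth table: 2**3 = 8 input rows, each with 4 output bits.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            print((a, b, c), "->", increment(a, b, c))
```

With only three input bits there are just eight cases to enumerate, which is exactly what makes such a toy a hands-on introduction to Boolean algebra.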
In the 1970s, the French Education Authority launched a project to put 50 Mitra 15 computers in 50 high schools. The school in my little provincial town was one of those selected. I spent a lot of time on that computer. Once again, without this experience, I would have learned about computing differently. There was no manual and no course. We figured it all out on our own. I went on to study mathematics and physics, and then began a DEA (French Master of Advanced Studies) in computing at IUT Orsay. I was taught by Jean Vuillemin, one of the greats of Inria, and I loved the course. I learned how to use algorithms properly and develop my own sense of style, which is important in computing. I followed in Jean’s footsteps and became an intern at the institute, which was on the cutting edge in France. I worked in the famous Building 8, where Gilles Kahn, Philippe Flajolet, Jean-Marie Hullot and Jérôme Chailloux also worked. The atmosphere was casual but intense. As a young researcher, it was highly stimulating and formative. I naturally followed up with a thesis on the representation of VLSI integrated circuits. At the time, these chips had thousands of transistors. Today they have billions.
Around the same time, I saw a presentation on Xerox PARC and was fascinated by it. Through Jean’s contacts, I took the opportunity to do an internship there for a few months. On another trip to the United States, Louis Monier, who had formerly worked at Inria and had just arrived at PARC, asked me if I’d like to come work with him. Having just completed my thesis, it was the perfect time for an adventure, and I started in Palo Alto in 1984. I stayed there for four years and worked with incredible people. Then, in 1989, Jean-Marie Hullot recruited me to join NeXT, the start-up created by Steve Jobs, where he himself had just started working. I stayed with NeXT until it was bought by Apple in 1997. Then came the years at Apple, where, from 2003 to 2011, I was Senior Vice President of Software Engineering. Our teams were almost always in start-up mode, with a lot of weekends spent in Cupertino. When I was heading up the Mac OS project, Steve Jobs asked me to take a hundred of my best engineers and have them work with Scott Forstall on the iPhone project, and that’s what I did. We were all very aware that we were working on revolutionary products.
I left Apple in 2011, exactly ten years after Mac OS X was released. Apple is a fascinating and exceptional company, but I wanted to discover the world of start-ups. First, I created a cloud computing company called Upthere. I no longer work there, but I’m still on the board. Right now I have two projects in the works. The first, “grokable.ai”, is in the field of artificial intelligence, with a remarkable French scientist, Williams Paquier. It’s a purely scientific and highly innovative project that involves imitating animal intelligence. My second project, which takes up most of my time, is called Fungible. We’re working on integrated circuits for data centres. So I’ve come back to the silicon world!
Robot, man's best friend
No other science has developed as spectacularly as computer science. The improvement in chip performance predicted by Moore’s law has led to impressive advances. The fastest-adopted object in human history isn’t the wheel, the car, the refrigerator, or the television. It’s the smartphone! Apple released its first iPhone just ten years ago, and today several billion people use smartphones every day. This accelerated progress is remarkable!
I’m a bit sceptical about some technologies, such as the quantum computer. It’s probably because I don’t fully understand the technology, but I have the impression that these kinds of machines will only ever be able to do certain types of computing. If it doesn’t lead to applications in general-purpose computing, it’s just silicon and chips! There are still huge possibilities for optimising the use of silicon to compute more with less energy.
Artificial intelligence is the field that will certainly evolve the most in the next few decades. We know how biological neural networks work; what we don’t know is what algorithms they use. An insect has enough intelligence to fly and find its way in a 3D world. Soon we’ll know how to do that with software. We’ve come a long way in the last ten years, but the next few decades will see even more incredible advances. All scientific and social fields will be affected by AI. Two centuries ago, humans were mainly concerned with food production and agriculture. After the industrial revolution, we spent our time manufacturing objects. In the twentieth century, much of humanity moved into office work, with a lot of paperwork that doesn’t contribute much to society. In 50 years, the mechanical aspects of office work will no longer exist; they will have been replaced by artificial intelligence.
A large part of our unfulfilling day-to-day tasks, such as driving or operating any machine for that matter, will have disappeared: anything that wastes time or involves risk will be automated. Access to information will become ever easier, advancing towards total ubiquity.