It could herald dangers like powerful autonomous weapons and ways for the few to oppress the many, he said, as he called for more research in the area.
But if sufficient research is done to avoid the risks, it could help in humanity's aims to 'finally eradicate disease and poverty', he added.
He was speaking in Cambridge at the launch of The Leverhulme Centre for the Future of Intelligence, which will explore the implications of the rapid development of artificial intelligence.
All great achievements of civilisation, from learning to master fire to learning to grow food to understanding the cosmos, were down to human intelligence, he said.
'I believe there is no deep difference between what can be achieved by a biological brain and what can be achieved by a computer.
'It therefore follows that computers can, in theory, emulate human intelligence - and exceed it.'
Artificial intelligence was progressing rapidly, he said, and there were 'enormous' levels of investment.
He said the potential benefits were great and the technological revolution could help undo some of the damage done to the natural world by industrialisation.
'In short, success in creating AI could be the biggest event in the history of our civilisation,' said Prof Hawking. 'But it could also be the last unless we learn how to avoid the risks. Alongside the benefits, AI will also bring dangers, like powerful autonomous weapons, or new ways for the few to oppress the many. It will bring great disruption to our economy. And in the future, AI could develop a will of its own - a will that is in conflict with ours. In short, the rise of powerful AI will be either the best, or the worst thing, ever to happen to humanity. We do not know which.'
So Stephen Hawking just used 1,000 words to state the most obvious thing I've heard since I learned what artificial intelligence was. I could have given that whole speech in one sentence: "If we don't put a 'kill immediately' switch on all of these things, we're fucked." My number one fear in life is robots becoming smarter than humans and enslaving all of us. "The rise of powerful AI will be either the best, or the worst thing, ever to happen to humanity." Uhhhh, yeah, no shit, Stephen. That's like saying "me betting all of my money on the Philadelphia Eagles winning the Super Bowl will be either the best or the worst thing ever to happen to me." If I win, my team wins a championship and I'm a much richer man; if I lose, I'm shit out of luck financially and my team loses. If we can make robots that do everything for us and make our lives significantly easier, then that's great. If we make robots that become sentient, get smarter than us, and start attacking and enslaving us and shit, then we're fucked. It's not a very difficult concept to grasp.
Honestly, I feel like the level of technology we're at now is pretty good. I have access to pretty much all of the information I need right on my cell phone. I'm not a big "press-your-luck" guy, and I really feel like we're pushing it by making these robots and computers that are smarter than us. I guess I just have to trust that the people who make robots are going to make them nice, helpful robots instead of mean robots that are going to make me their slave.
Completely unrelated side note: how ridiculous is it that Stephen Hawking's voice has never changed after all these years? I definitely think if they can make technology that lets me access all of the information I could ever want, they could make Stephen Hawking a voice box that sounds a little more human and a little less incredibly robotic. Is it Hawking's choice to keep it the same? He knows that having the robot voice is part of his #brand, so he's kept it that way for all of these years. That's the only reasonable explanation here; otherwise, technology has failed Stephen Hawking.