It's our animal survival instinct that pushes us humans forward.
A sentient artificial intelligence might need something similar... but that could make it genuinely dangerous to us humans!
This is where Asimov's Three Laws of Robotics would come in:
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.