The standard definition of AI is that which we don't understand.
Every kid coming out of Harvard, every kid coming out of school now thinks he can be the next Mark Zuckerberg, and with these new technologies like cloud computing, he actually has a shot.
I would therefore like to posit that computing's central challenge, how not to make a mess of it, has not yet been met.
We believe we're moving out of the Ice Age, the Iron Age, the Industrial Age, the Information Age, to the participation age. You get on the Net and you do stuff. You IM (instant message), you blog, you take pictures, you publish, you podcast, you transact, you distance learn, you telemedicine. You are participating on the Internet, not just viewing stuff. We build the infrastructure that goes in the data center that facilitates the participation age. We build that big friggin' Webtone switch. It has security, directory, identity, privacy, storage, compute, the whole Web services stack.
The cloud serves companies of all sizes; the cloud is for everyone. The cloud is a democracy.
On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
Steve Wozniak and Steve Jobs founded Apple Inc., which set the computing world on its ear with the Macintosh in 1984.
More computing sins are committed in the name of efficiency (without necessarily achieving it) than for any other single reason - including blind stupidity.
The interesting thing about cloud computing is that we've redefined cloud computing to include everything that we already do.
A distributed system is one in which the failure of a computer you didn't even know existed can render your own computer unusable.
The utility model of computing - computing resources delivered over the network in much the same way that electricity or telephone service reaches our homes and offices today - makes more sense than ever.
The computer industry is the only industry that is more fashion-driven than women's fashion. Maybe I'm an idiot, but I have no idea what anyone is talking about. What is it? It's complete gibberish. It's insane. When is this idiocy going to stop?
Because of its vitality, the computing field is always in desperate need of new clichés: Banality soothes our nerves.
If you stay up late and you have another hour of work to do, you can just stay up another hour later without running into a wall and having to stop. Whereas it might take three or four hours if you start over, you might finish if you just work that extra hour. If you're a morning person, the day always intrudes a fixed amount of time in the future. So it's much less efficient. Which is why I think computer people tend to be night people - because a machine doesn't get sleepy.
In the practical world of computing, it is rather uncommon that a program, once it performs correctly and satisfactorily, remains unchanged forever.
In the next 50 years, the increasing importance of designing spaces for human communication and interaction will lead to expansion in those aspects of computing that are focused on people, rather than machinery.
Microsoft and Dell have been building, implementing and operating massive cloud operations for years. Now we are extending our longstanding partnership to help usher in the new era of cloud computing, by giving customers and partners the ability to deploy the Windows Azure platform in their own datacenters.
Each time you toss out a 'singing' greeting card, you are disposing of more computing power than existed in the entire world before 1950.
It has long been my personal view that the separation of practical and theoretical work is artificial and injurious. Much of the practical work done in computing, both in software and in hardware design, is unsound and clumsy because the people who do it have not any clear understanding of the fundamental design principles of their work. Most of the abstract mathematical and theoretical work is sterile because it has no point of contact with real computing.
Long term, the PC and workstation will wither because computing access will be everywhere: in the walls, on wrists, and in 'scrap computers' lying about waiting to be grabbed as needed.
The computing world is very good at things that we are not. It is very good at memory.
Java and C++ make you think that the new ideas are like the old ones. Java is the most distressing thing to hit computing since MS-DOS.
The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.
Computer literacy is a contact with the activity of computing deep enough to make the computational equivalent of reading and writing fluent and enjoyable. As in all the arts, a romance with the material must be well under way. If we value the lifelong learning of arts and letters as a springboard for personal and societal growth, should any less effort be spent to make computing a part of our lives?
At Microsoft, we're investing heavily in security because we want customers to be able to trust their computing experiences, so they can realize the full benefits of the interconnected world we live in.