Every kid coming out of Harvard, every kid coming out of school now thinks he can be the next Mark Zuckerberg, and with these new technologies like cloud computing, he actually has a shot.
Object-oriented design is the Roman numerals of computing.
I would therefore like to posit that computing's central challenge, how not to make a mess of it, has not yet been met.
A distributed system is one in which the failure of a computer you didn't even know existed can render your own computer unusable.
Because of its vitality, the computing field is always in desperate need of new clichés: Banality soothes our nerves.
We believe we're moving out of the Ice Age, the Iron Age, the Industrial Age, the Information Age, to the participation age. You get on the Net and you do stuff. You IM (instant message), you blog, you take pictures, you publish, you podcast, you transact, you distance learn, you telemedicine. You are participating on the Internet, not just viewing stuff. We build the infrastructure that goes in the data center that facilitates the participation age. We build that big friggin' Webtone switch. It has security, directory, identity, privacy, storage, compute, the whole Web services stack.
The computer industry is the only industry that is more fashion-driven than women's fashion. Maybe I'm an idiot, but I have no idea what anyone is talking about. What is it? It's complete gibberish. It's insane. When is this idiocy going to stop?
In the practical world of computing, it is rather uncommon that a program, once it performs correctly and satisfactorily, remains unchanged forever.
Cloud services are for companies of all sizes; the cloud is for everyone. The cloud is a democracy.
On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
More computing sins are committed in the name of efficiency (without necessarily achieving it) than for any other single reason - including blind stupidity.
Steve Wozniak and Steve Jobs founded Apple Inc., which set the computing world on its ear with the Macintosh in 1984.
The utility model of computing - computing resources delivered over the network in much the same way that electricity or telephone service reaches our homes and offices today - makes more sense than ever.
The interesting thing about cloud computing is that we've redefined cloud computing to include everything that we already do.
If you stay up late and you have another hour of work to do, you can just stay up another hour later without running into a wall and having to stop. Whereas it might take three or four hours if you start over the next day, you might finish if you just work that extra hour. If you're a morning person, the day always intrudes a fixed amount of time in the future. So it's much less efficient. Which is why I think computer people tend to be night people - because a machine doesn't get sleepy.
In the next 50 years, the increasing importance of designing spaces for human communication and interaction will lead to expansion in those aspects of computing that are focused on people, rather than machinery.
Microsoft and Dell have been building, implementing and operating massive cloud operations for years. Now we are extending our longstanding partnership to help usher in the new era of cloud computing, by giving customers and partners the ability to deploy the Windows Azure platform in their own datacenters.
It has long been my personal view that the separation of practical and theoretical work is artificial and injurious. Much of the practical work done in computing, both in software and in hardware design, is unsound and clumsy because the people who do it have not any clear understanding of the fundamental design principles of their work. Most of the abstract mathematical and theoretical work is sterile because it has no point of contact with real computing.
Each time you toss out a 'singing' greeting card, you are disposing of more computing power than existed in the entire world before 1950.
Long term, the PC and workstation will wither because computing access will be everywhere: in the walls, on wrists, and in 'scrap computers' lying about waiting to be grabbed as needed.
The computing world is very good at things that we are not. It is very good at memory.
If everybody would agree that their current reality is A reality, and that what we essentially share is our capacity for constructing a reality, then perhaps we could all agree on a meta-agreement for computing a reality that would mean survival and dignity for everyone on the planet, rather than each group being sold on a particular way of doing things.
Items of interest will be located, identified, monitored, and remotely controlled through technologies such as radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters - all connected to the next-generation internet using abundant, low-cost, and high-power computing.
Computer literacy is a contact with the activity of computing deep enough to make the computational equivalent of reading and writing fluent and enjoyable. As in all the arts, a romance with the material must be well under way. If we value the lifelong learning of arts and letters as a springboard for personal and societal growth, should any less effort be spent to make computing a part of our lives?
"Not fair? Oh, I'm sorry I get this lovely laptop computing device when all you get is the ability to walk, control your hands, and know you'll survive until your eighteenth birthday." Then the kid was going, "Uh, I didn't mean..." But Tad wasn't done yet. While the whole class watched in horror, he put his hands through the metal support braces on the arms of his wheelchair and forced himself to stand up. Then he took a shaky little step to the side, gestured toward the chair, and said, "Why don't you take a turn with the laptop? You can even have my seat."