
Today we are in the middle of a revolution in business thinking! In around 2700 BC the Sumerians invented the abacus to make arithmetic easier, faster and less prone to error. Ever since, mankind has been inspired by the power of technology to change our lives. From Leonardo da Vinci’s drawings of a mechanical calculator in around 1502 AD to Charles Babbage’s difference engine in the early part of the 19th century, there have been paradigm shifts in thinking – each one significantly changing our relationship with technology!

Cloud computing – revolution in computing

As thinking has become more abstract, advances in technology have accelerated. Alan Turing and Tommy Flowers realized that the mind – the software – should live separately from the body – the hardware. And arguably the most significant democratizing innovation in human history came when Tim Berners-Lee saw names on networks rather than physical or electrical plugs, and from that devised a system that enabled disparate computers to communicate with each other anywhere.


To understand technology’s latest revolution we have to go back to the year 1879. When Thomas Edison invented the light bulb he immediately knew that his invention had the power to change the lives of every person on the planet. The light bulb was an amazing invention, but there was a problem: there was no publicly available electricity. Less than four years later Edison built the first power stations, in London and on Pearl Street, New York. People didn’t want the power station; they just wanted the light. That realization was the real light-bulb moment. Utility is the foundation of the largest single change in computing thinking this century – a change so great you can call it the computing equivalent of the industrial revolution; a revolution that, like the original, was fueled by utility.


It all started with water: water wheels were used to grind flour or power machines. But water power wasn’t scalable, and factories had to be built at the source of that power. If a business grew and needed more power, it had to upgrade to a bigger wheel or move the entire factory to a more powerful river. Steam power meant that factories could move away from rivers, but workers were still needed to run the power plant – and every worker tending the power plant was one not working on the output of the factory. So when power – electricity, gas and water – became a utility, industry had what it needed, when it was needed. The people who had once run the power station could now spend their time making better use of the power and making the factory more productive.


Now, as with electricity, water and gas, computing is also a utility; it is a resource to be switched on and off. Users consume, and pay for, as much as is needed, when it is needed. This is the essence of cloud computing. Utility-based computing enables businesses to align technology expenditure exactly with requirements, bursting instantly from one user to tens of thousands and back down again. IT departments can dynamically scale processing, memory and storage, paying only for the amount used.
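The economics of paying only for what is used can be made concrete with a small sketch. This is a minimal, hypothetical illustration – the price, the demand figures and the function names are all invented for the example, not taken from any real provider’s pricing.

```python
# Hypothetical illustration of utility-style billing versus
# up-front provisioning. Rates and demand are invented examples.

PRICE_PER_INSTANCE_HOUR = 0.05  # hypothetical rate in dollars

def utility_cost(instances_per_hour):
    """Cost when capacity scales elastically: pay per instance-hour used."""
    return sum(instances_per_hour) * PRICE_PER_INSTANCE_HOUR

def fixed_cost(instances_per_hour):
    """Cost when capacity is bought up front to cover peak demand."""
    peak = max(instances_per_hour)
    return peak * len(instances_per_hour) * PRICE_PER_INSTANCE_HOUR

# A bursty day: 2 instances for 20 hours, a spike to 100 for 4 hours.
demand = [2] * 20 + [100] * 4
print(utility_cost(demand))  # 440 instance-hours -> 22.0
print(fixed_cost(demand))    # peak capacity for 24 hours -> 120.0
```

The same bursty workload costs a fraction under elastic billing, because the idle peak capacity is never paid for – which is the “switched on and off” property the paragraph above describes.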


Before the revolution, in the dark ages of computing, companies bought technology based on what they guessed they might need in three years’ time. If they bought too little they ran the risk of under-performing systems, meaning unhappy customers and possibly lower profits. So it was better to buy more, just in case. Today a business of any size can have access to the security, compliance, transparency, scalability and reliability of systems that until now have only been affordable to the largest companies on the planet. And all of this is possible through cloud computing.