Thursday 10 April 2014

India: Why is "Horn OK Please" painted on the back of almost every truck in India?

The OK was used as a safety measure on the highways.



Most highways in India were single lane, which meant there was always a risk of running into an oncoming vehicle while attempting an overtaking maneuver.

The "OK" was accompanied by a bulb above it, which the truck driver would switch on to signal to the vehicle behind that there was no oncoming traffic and it was therefore okay to overtake. This arrangement was placed in the center.

Also painted on the back was a courteous 'Horn, please', placed so that 'Horn' sat on the left side and 'please' (the continuation of the same message) on the right.

Gradually the practice of using the bulb to signal other drivers faded away (probably because the bulbs went unmaintained and multi-lane highways emerged), and the 'OK' remained, sandwiched between 'Horn' and 'please'.

Sunday 6 April 2014

Adopting the cloud: The network strikes back

With every passing day, more and more IT managers are taking the plunge and opting for the world of cloud. This comes as no surprise when you consider the benefits it brings: adopting cloud gives employees the freedom to work where they want, whilst still cutting costs for businesses.
But as with any new technology, as soon as the honeymoon period is over, the real work begins. Cloud traffic is growing at an alarming rate, and the pressure to deliver is on the network operators carrying the cloud, not the enterprises using it.



More cloud computing means more data centres, especially as cloud providers look to distribute them globally to reduce latency for regional customers.
As we well know, data centres can't just be set up and left to their own devices; a much more intricate cloud ecosystem is needed to connect each of these data centres together. The resulting large-scale machine-to-machine traffic between data centres is now the biggest bandwidth consumer after video, putting increasing pressure on an already strained network.
And this will only increase as cloud becomes easier and easier to access through IaaS, PaaS, SaaS or network as a service.
Cloud will provide the extra storage businesses need, but the bandwidth has to be available for data to get there in the first place. Whilst in some ways scalable cloud services can take pressure off the network by hosting data-hungry services on local servers rather than streaming straight from the network each time, there is still going to be pressure on the network core to connect data centre to data centre.
But bandwidth is only part of the cloud puzzle, and an increasing number of enterprises and organisations are using the cloud to deliver new services and applications. In fact, 46 percent of all IT spending is expected to go on cloud-related platforms and applications by 2016.
As network operators are expected to evolve in line with customer demand, their challenge is twofold: meeting increasing bandwidth demands and rapidly deploying new services for customers. This means that network operators need not only a scalable network but an intelligent one.
The cloud is one of the key factors driving internet demand and it’s now increasing at a rate of between 30 and 50% year on year. In the coming years, we’ll see network operators look to converged network and software technologies that can provide super-channels to meet this bandwidth demand, and automated transport layer technologies (like FlexROADMs and Software Defined Networking) to monitor traffic demands and manage the rapid provisioning of new services.
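To make the 'rapid provisioning' point concrete, here is a minimal sketch of what programmatic service turn-up against an SDN controller's northbound REST API might look like. The endpoint, authentication scheme and JSON fields are assumptions for illustration; real controllers each define their own service models.

```python
import requests

# Hypothetical SDN controller endpoint -- illustrative, not a real product API.
CONTROLLER_URL = "https://sdn-controller.example.net/api/v1/services"


def provision_optical_service(a_end, z_end, bandwidth_gbps, token):
    """Ask the controller to set up a transport service between two data centres."""
    payload = {
        "a_end": a_end,                  # e.g. "DC-London-01"
        "z_end": z_end,                  # e.g. "DC-Frankfurt-02"
        "bandwidth_gbps": bandwidth_gbps,
        "protection": "1+1",             # assume a protected path is required
    }
    response = requests.post(
        CONTROLLER_URL,
        json=payload,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()               # assume the controller returns the service record


if __name__ == "__main__":
    service = provision_optical_service("DC-London-01", "DC-Frankfurt-02", 100, token="example-token")
    print("Provisioned:", service)
```

The point of the sketch is the operational model: capacity between data centres becomes something requested and released through software in minutes, rather than planned and cabled over weeks.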
These automated technologies will also dramatically reduce operational costs, meaning operators can pass cost savings on to those using the cloud and reinvest in more innovative solutions for their customers.
So as the cloud takes a more front-of-stage role in enterprise computing, it's essential that network operators have this scalability, flexibility and intelligence, not only to support cloud-based applications but also to deliver greater responsiveness to changing demands.
Savvy cloud services providers are well aware that without the right partner in place, the network has the potential to strike back.

Thursday 3 April 2014

The Internet of Things Ecosystem: The Value is Greater than the Sum of its “THINGS”

The Internet of Things is more than just Glasses, smartphones and smartwatches. It's more than just smart cars, smart cities and the other "things" that are connected or understood by today's usage models.

By 2020, it is predicted that the entire Internet of Things will have a market value of $8.89 trillion. Many factors contribute to reaching that figure: revenue from the sale of wearables and "things", software licensing, hardware, and reduced operating costs in manufacturing, information technology, research and development, marketing and corporate operations.

Research firm IDC expects the globally installed base of IoT to reach around 212 billion things by the end of 2020, including 30.1 billion installed connected autonomous things. Intel predicts there will be 31 billion connected devices. Cisco, a notable leader in IoT research and awareness, predicts 50 billion objects will be connected to the Internet. Gartner predicts that the economic value added by these billions of connected "things" will be $1.9 trillion in 2020.

With this many zeros and connected things being predicted by smart people and forward-thinking companies, it's enough to make a skeptic of anyone to think that this many computing devices, and trillions of dollars of market value, will be generated in the next six years. Yet given the advances in smartphones, tablets and Big Data, could the expected market value actually be underestimated?

It would be easy, maybe even lazy, to assume the smartphone will be at the heart of the Internet of Things. But with 30.1 billion autonomous things sending and receiving information, and acting on pre-defined manufacturing, marketing or even personal preferences, the smartphone (or any other modern device) will not be a part of the equation.

In a way, the smartphone is to the Internet of Things as the beeper was to the mobility revolution. The challenge for marketers, communicators and stewards of brands is to understand that getting mobile right today is important to establishing the best practices and foundational expertise needed to manage an automated future.

It's important to understand the different segments that will make up the Internet of Things ecosystem. After reviewing industry research, corporate press releases, blogs and news reports, the following IoT ecosystem framework was built to reconcile the many IoT announcements.

Hopefully, with this ecosystem framework combined with the IoT use-case framework, we can start to rationalize what each new development in this emerging technology trend means.
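As a purely illustrative sketch of how such a framework might be used day to day (the segment names and keywords below are assumptions made for this example, not the framework from the research itself), each announcement can be tagged against a handful of ecosystem segments:

```python
# Illustrative only: segment names and keywords are assumptions for this sketch.
ECOSYSTEM_SEGMENTS = {
    "devices": ["sensor", "wearable", "gateway", "smartwatch"],
    "connectivity": ["cellular", "wi-fi", "bluetooth", "lpwan"],
    "platforms": ["device management", "data platform", "cloud platform"],
    "analytics": ["machine learning", "predictive", "big data"],
    "applications": ["smart city", "smart home", "connected car", "industrial"],
}


def classify_announcement(text):
    """Return the ecosystem segments whose keywords appear in an announcement."""
    lowered = text.lower()
    return [
        segment
        for segment, keywords in ECOSYSTEM_SEGMENTS.items()
        if any(keyword in lowered for keyword in keywords)
    ]


print(classify_announcement(
    "Vendor X launches an LPWAN-connected sensor line with a predictive analytics dashboard"
))
# -> ['devices', 'connectivity', 'analytics']
```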

Tuesday 1 April 2014

How important is big data?

If you want proof that big data is a big thing, just look at the jobs pages. Inside, you'll find desperate appeals for statistics-savvy data scientists, and the salaries are often eye-watering.
They're evidence that businesses in the UK and beyond are looking at ways of using the data they already collect – whether that's about customers, their industry or about financial markets – with as much efficiency as possible.

Digital markers

Data is everywhere, it's doubling every two years, and it's fast becoming a business's biggest resource. The digital markers we all create – whether a GPS-tagged photo, a check-in on Facebook or a click on a website – could last forever.
So far, we've got about 1.8 zettabytes of data on planet Earth, but it's how we link up, analyse and use all of this data – and much more besides – that's going to completely change our world. And that means business.
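To put 'doubling every two years' in perspective, here is a quick back-of-the-envelope projection. It assumes the 1.8 zettabyte figure is the 2014 starting point and that the doubling rate holds, both of which are simplifications:

```python
# Back-of-the-envelope projection: global data volume if it doubles every two
# years, starting from the ~1.8 ZB figure quoted above (assumed as of 2014).
start_year = 2014
start_zettabytes = 1.8
doubling_period_years = 2

for year in range(start_year, start_year + 11, 2):
    volume = start_zettabytes * 2 ** ((year - start_year) / doubling_period_years)
    print(f"{year}: ~{volume:.1f} ZB")
# 2014: ~1.8 ZB, 2016: ~3.6 ZB, 2018: ~7.2 ZB,
# 2020: ~14.4 ZB, 2022: ~28.8 ZB, 2024: ~57.6 ZB
```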
Some say that by mastering big data, companies can stop wasting time on useless initiatives and pitches that go nowhere - and increase margins by a whopping 60 per cent.
Even if it's a fraction of that figure, harnessing big data properly can make the difference between a profitable business and a non-starter.
Big data is big money. ABI Research states that global spending on big data by organisations exceeded US$31 billion in 2013, and will reach a staggering US$114 billion in 2018 – and that includes salaries as well as hardware and software.
It's a data-rich, skills-poor environment at present, but the future will be one of increasingly sophisticated hardware and ever-better data analytics that collect, calculate, predict and much more besides.

The iceberg problem

Unfortunately, big data is scattered. It's estimated that IT departments spend about 70 to 80 per cent of their time maintaining existing systems and databases, largely because there are so many of them.
Separate, created for narrow purposes and often built on databases that don't talk to each other, these systems create what's known as 'the iceberg problem': they can't exchange big data, which therefore goes untapped.
"One of the companies we're working with has 6,670 'icebergs' used by various arms of the business," says Dr Andrew Sutherland, Senior Vice President of Technology at Oracle EMEA.
"Some are very important, each one was built for a reason, and all had and still have a business case; and they were all built by very clever, experienced people – but each application has been built to fulfil one need, and they use completely different databases."

False economy

It's a common problem, but businesses who've invested a lot in such icebergs often don't want to replace them. For a business wanting to embrace big data, that viewpoint represents a false economy.
Petabytes of big data are now flooding into businesses, and a consistent, common platform is required – one capable of exchanging data and running apps, whatever their source, using a single stack that can be managed, maintained, recovered and backed up as one.
It's all about consolidation. "Whether you're a police force, a gas extractor or a retailer, there will be areas of your business that you want to be able to do more efficiently or cleverly – and you can only do that if you consolidate and simplify your IT, so you can concentrate on the apps that add value to your business," says Sutherland.
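As a toy illustration of what consolidation buys you (the tables, figures and queries are invented for this example), compare two isolated 'icebergs' with a single consolidated store that can answer a cross-silo question in one query:

```python
import sqlite3

# Toy example: two "iceberg" silos that hold related data but never talk to
# each other, versus one consolidated store. Table names and rows are invented.
silo_sales = sqlite3.connect(":memory:")
silo_sales.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
silo_sales.executemany("INSERT INTO sales VALUES (?, ?)",
                       [(1, 120.0), (2, 85.5), (1, 40.0)])

silo_crm = sqlite3.connect(":memory:")
silo_crm.execute("CREATE TABLE customers (customer_id INTEGER, region TEXT)")
silo_crm.executemany("INSERT INTO customers VALUES (?, ?)",
                     [(1, "North"), (2, "South")])

# On a consolidated platform both datasets live in one place, so a single
# query answers a question that previously needed manual stitching.
consolidated = sqlite3.connect(":memory:")
consolidated.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
consolidated.execute("CREATE TABLE customers (customer_id INTEGER, region TEXT)")
consolidated.executemany("INSERT INTO sales VALUES (?, ?)",
                         silo_sales.execute("SELECT * FROM sales").fetchall())
consolidated.executemany("INSERT INTO customers VALUES (?, ?)",
                         silo_crm.execute("SELECT * FROM customers").fetchall())

revenue_by_region = consolidated.execute(
    """SELECT c.region, SUM(s.amount)
       FROM sales s JOIN customers c ON s.customer_id = c.customer_id
       GROUP BY c.region"""
).fetchall()
print(revenue_by_region)  # e.g. [('North', 160.0), ('South', 85.5)]
```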
One problem here is that there often exist huge teams of IT consultants whose job it is to specify and commission more icebergs; asking them to give their blessing to using one unified, interoperable system that best takes advantage of big data is like asking turkeys to vote for Christmas.
But if you do get big data into one place, what happens to it next?

Machine learning & super-computers

The big data revolution will be led by machine learning and super-computers. They've been called the 'coal in the furnace' of the internet that will drive Web 3.0, but the sole reason for the new generation of artificially intelligent super-computers is the massive rise in personal and business data.
Eventually enabling websites and apps to track and predict our every action and desire, super-computers feed off big data.
Reasoning, perception, social and even some degree of cultural awareness are the selling points for Watson, IBM's super-computer that runs on 16 terabytes of RAM, and brings a much-needed injection of artificial intelligence to big data analytics.
Its smart learning DeepQA software is able to search databases, spot patterns and make predictions, which could revolutionise the financial and medical industries, both of which are currently drowning in data.
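The underlying retrieval idea can be illustrated with a drastically simplified sketch. This is not IBM's DeepQA; it is just keyword overlap with a small inverse-document-frequency weight over a handful of invented documents, to show how 'search, spot patterns, rank candidates' looks in miniature:

```python
import math
import re
from collections import Counter

# Invented mini-corpus; not real trial data.
documents = {
    "trial-104": "Phase III trial shows drug A reduces blood pressure in adults",
    "trial-221": "Drug B showed no significant effect on blood pressure",
    "review-09": "Meta-analysis of statin therapy and cholesterol outcomes",
}


def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())


doc_tokens = {doc_id: tokenize(text) for doc_id, text in documents.items()}
doc_freq = Counter()
for tokens in doc_tokens.values():
    doc_freq.update(set(tokens))


def score(question, tokens):
    """Sum inverse-document-frequency weights of question words found in a document."""
    q_words = set(tokenize(question))
    n_docs = len(doc_tokens)
    return sum(math.log(n_docs / doc_freq[w]) for w in q_words if w in tokens)


question = "Which drug reduces blood pressure?"
ranked = sorted(doc_tokens, key=lambda d: score(question, doc_tokens[d]), reverse=True)
print(ranked[0])  # best-matching document: 'trial-104'
```

A real system layers evidence scoring, confidence estimation and domain knowledge on top, but the shape of the problem (retrieve candidates, weigh evidence, rank answers) is the same.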

Big data in medicine

As well as reducing the need for investment bankers, big data analytics could also banish bad decisions by doctors.
"Watson could be used to search through millions of pages of academic research and drug trials, something a human could never do or keep up with, and in double-quick time," says Joe Peppard, a professor at the European School of Management and Technology in Berlin, Germany, which consults on IT strategy for large corporations.
"It could provide a doctor or nurse with the required diagnosis and even the most appropriate medication plan, having taken into consideration all the latest thinking on any specific health problem."

Call centres streamlined

Such data analysis could also be used to vastly improve call centres, where about 60 per cent of enquiries end in frustration for callers.
"Watson is able to search a far greater amount of data in a much faster time," says Peppard, "so in many cases it could pre-empt enquiries so that the answer is available before the question has been asked."
"Machine learning, and its application in advanced analytics, is one area that will make both the public and private sectors data-savvier than anything we've seen so far," says Dan Shey, Practice Director at ABI Research.
"Big players such as IBM and HP are understandably moving to this direction, but at the same time we can also see analytics startups, like Ayasdi and Skytree, that have machine learning in their very DNA. Eventually, such innovations will put analytics within any domain expert's reach. At that point, data will stop being big."

Uncovering the Universe

Harnessing big data properly can mean new discoveries and possibilities, and nowhere more so than in deep space.
The European Space Agency's Gaia mission to create a complete three-dimensional map of the Milky Way involves taking super-high-resolution photos of the heavens, but making any use of this data requires a powerful information system.
Archiving and processing scientific data collected by the Gaia satellite, the Italian National Institute of Astrophysics (INAF) uses a data management system from Oracle to store its "very precious heritage of astronomical data that will have to be stored for the whole 21st century and beyond," says Roberto Morbidelli, Scientific Operation Manager at INAF.
Processing petabytes of big data from Gaia is a huge task, but Oracle has history; it also works with the operator of the Large Hadron Collider, CERN.
The sources and uses of big data in business are now being revealed, but if we know one thing it's that the hyper-efficient collection and analysis of data is already becoming a major tool.
"In five or 10 years' time, big data will not be a source of competitive advantage, it will just be a necessary part of normal business," says Eddie Short, UK and EMEA Leader for Data and Analytics at KPMG Management Consulting. "Without exploiting it, a business will die."