Intel uses big data to replace tarot cards

Intel IT has developed a predictive analytics system which it claims can reduce chip test time by 25 percent.

If the testing system works, it could save Intel more than $30 million over the next year.

Chipzilla claims that the new testing process does not mean taking any short cuts. It has been spending a fortune on Big Data infrastructure for some time to gain new insights into its design process.

But last year the company used a new multi-tenant Big Data platform combining a third-party data warehouse appliance with Apache Hadoop. This gathered all the historical information generated during manufacturing and combined it with new sources of data which had previously been too unmanageable to use.
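To give a rough idea of what merging warehouse data with previously unmanageable sources on Hadoop can look like, here is a minimal sketch in PySpark. Intel has not published the details of its platform, so the HDFS paths, column names and aggregation are purely illustrative assumptions.

```python
# Hypothetical sketch, not Intel's actual pipeline: join historical test
# results (exported from a warehouse appliance) with raw tester sensor logs
# landed on Hadoop, then summarise test time per manufacturing lot.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("test-time-analysis").getOrCreate()

# Historical test results previously held in the data warehouse appliance
historical = spark.read.parquet("hdfs:///warehouse/test_results")

# A newer, previously unmanageable source: raw tester sensor logs
sensor_logs = spark.read.json("hdfs:///raw/tester_sensor_logs")

# Combine the two on a shared unit identifier and summarise by lot
combined = (
    historical.join(sensor_logs, on="unit_id", how="inner")
    .groupBy("lot_id")
    .agg(
        F.avg("test_seconds").alias("avg_test_seconds"),
        F.count("unit_id").alias("units_tested"),
    )
)

combined.write.mode("overwrite").parquet("hdfs:///analytics/test_time_by_lot")
```

The point of such a platform is simply that both kinds of data sit in one place, so a small team can run this sort of join without building bespoke infrastructure each time.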

Using this information, a small team of five people was able to slash $3 million off the cost of testing a single line of Intel Core processors.

Chipzilla apparently rolled out the idea to more products and thinks it can achieve $30 million in cost avoidance in 2013-2014.

The upside is that the testing process is also a lot quicker: Intel thinks it will cut post-silicon validation time by a quarter.

According to Techcentral, the improvements build on the value Intel has already been getting from Big Data and Business Intelligence.

Intel is now spending more on a range of on-demand self-service business intelligence systems to perform new analyses of its operations.

It is not clear if this means that its move to fashion bag making is doomed. Any kid with a slide rule could tell you that move was pants.

Chris Shaw, Intel EMEA IT director, said that big data was helping develop new predictive systems.

Until now, Intel had to get data, put it into a database and crunch the numbers to see what had happened in the past.

Shaw said that big data could take information from real-time sensors and work out what will happen in the future.
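As a rough illustration of the difference Shaw is describing, the sketch below trains a model on past sensor readings so that new readings can be scored as they arrive, rather than only being analysed after the fact. It is not Intel's system; the features, labels and data are invented stand-ins.

```python
# Hypothetical sketch: predict from sensor readings whether a unit is likely
# to fail its full test, instead of only reporting historical results.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for historical readings: temperature, voltage, frequency
X = rng.normal(size=(5000, 3))
# Synthetic label: whether the unit eventually failed its full test
y = (X[:, 0] * 0.8 + X[:, 1] * 0.3 + rng.normal(scale=0.5, size=5000)) > 1.0

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Fresh readings from "real-time sensors" can now be scored immediately,
# flagging units that probably do not need the full, expensive test pass.
new_readings = rng.normal(size=(5, 3))
print(model.predict_proba(new_readings)[:, 1])  # estimated failure probability
```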