Friday, January 16, 2009

New Algorithm to Revolutionize Data Processing?

Science is all about collecting and making sense of data through hundreds, if not thousands, of experiments. With many large-scale projects such as the LHC coming online in recent years, scientists around the globe are being flooded with data. Until now, their only choice was to divert money that could have gone into the experiments toward purchasing a supercomputer.

Developers at the University of California, Davis sought to change that, and they have succeeded. A paper describing and announcing the algorithm was published in the November-December issue of IEEE Transactions on Visualization and Computer Graphics. The algorithm can not only analyze the data but also create images from it. Something that would have required a supercomputer can now be done on a high-end laptop.

The algorithm works by splitting the data into tiny parcels. It then analyzes each parcel separately using what’s known as the Morse-Smale complex. The results are then combined, and anything that isn’t needed is simply discarded. This drastically cuts down both the file sizes and the computing power required.
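To give a rough sense of the "split, analyze, combine, discard" idea described above, here is a minimal Python sketch. It is not the actual UC Davis implementation: the real Morse-Smale analysis extracts topological features and is far more involved, so analyze_parcel, its threshold, and the parcel size here are made-up placeholders for illustration only.

import numpy as np

def analyze_parcel(parcel, threshold=0.1):
    # Placeholder for the real analysis step: keep only the values
    # deemed significant and throw the rest away. The actual algorithm
    # extracts topological features (via the Morse-Smale complex) instead
    # of using a simple magnitude threshold.
    return parcel[np.abs(parcel) > threshold]

def process_in_parcels(data, parcel_size=1_000_000):
    # Split one large array into small parcels, analyze each parcel
    # independently, then combine whatever survives. The full dataset
    # never has to be held or processed in memory all at once.
    kept = []
    for start in range(0, data.size, parcel_size):
        parcel = data[start:start + parcel_size]
        kept.append(analyze_parcel(parcel))  # unused data is discarded here
    return np.concatenate(kept) if kept else np.empty(0)

if __name__ == "__main__":
    # Synthetic example: 10 million noisy points shrink to only the
    # values that pass the (placeholder) analysis step.
    data = np.random.normal(0, 0.05, size=10_000_000)
    reduced = process_in_parcels(data)
    print(f"{data.size} points reduced to {reduced.size}")

The point of the sketch is simply that processing small parcels one at a time, and keeping only what matters, is what lets a laptop handle data that would otherwise demand a supercomputer.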

Using algorithms like this, scientists can hopefully overcome the problem of information overload. It will also cut the cost of running large-scale experiments, since the cost of processing the data will drop dramatically, allowing scientists to keep working toward the ultimate goal of finding out just how the universe works.

More information can be found on the University of California, Davis’ website at: http://www.news.ucdavis.edu/

4 comments:

Dan said...

More often than not, scientists today are producing so much data that it takes teams of researchers and valuable time to analyze it all. This new software created by the computer scientists at the University of California allows scientists to easily analyze these large amounts of data from the comfort of their own laptops, running with as little as 2 GB of RAM.

JFilipe17 said...

Smaller is always better. This is a great way to minimize the amount of space used by files. If this becomes something everyone uses, it would speed up computers everywhere. The way this algorithm uses the Morse-Smale complex is amazing. Simply discarding the data that isn't used instead of keeping it is great. This will help out scientists all over the world and save them the trouble of getting a supercomputer.

David said...

Computers are a major part of many people's lives. If a computer malfunctions, it can ruin everything. Minimizing the files stored on the computer and getting rid of all the data that isn't used helps save memory.

Dr. Fox-Billig said...

Nick,

The use of this algorithm is a great breakthrough in data analysis with the benefit of speed. What is the Morse-Smale complex?

Also, please proofread before you publish :-)