With memory, as with real estate, location matters. A group of researchers from Advanced Micro Devices and the Los Alamos National Laboratory has found that the altitude at which SRAM (static random access memory) resides can influence how many random errors the memory produces.
A network researcher at the U.S. Department of Energy's Fermi National Accelerator Laboratory has found a potential new use for graphics processing units -- capturing data about network traffic in real time.
Supercomputing power is being concentrated in a smaller number of machines, according to the latest Top500 list of high-performance computers. Keepers of the list are uncertain how to interpret that trend.
Helping scientific supercomputing take advantage of emerging big-data technologies, high-performance computing manufacturer Cray is releasing a set of packages promising to optimize the process of running Hadoop on its XC30 machines.
China has maintained its lead in the twice-yearly ranking of the world's most powerful supercomputers, with the Chinese National University of Defense Technology's Tianhe-2 system bringing 33.86 petaflop/s (quadrillions of calculations per second) to the contest, nearly double the performance of the runner-up, the Cray Titan system run by the U.S. Department of Energy's Oak Ridge National Laboratory.
As companies take steps to develop private clouds, mainframes are looking more and more like good places to house consolidated and virtualized servers. Their biggest drawback? User provisioning is weak.
Major market shifts in the database world don't happen often. When they do, they're massive, creating an impact that can last 10 to 20 years. I entered the job market right at the tail end of the last major shift, from the mainframe to client/server.