Someone seems to be testing a new kind of graphics card technology with 520-bit color, and we managed to catch it on one of our websites.
A standard resolution of 1280×1024 gives a total of 1,310,720 pixels. Most of us use 32-bit color, i.e. 4 bytes per pixel, which means 5,242,880 bytes (~5 MB) per screen.
My Google Analytics reported today a 520-bit color depth (it could be some kind of error). If the report is right, a new technology is out there: 520 bits means 65 bytes per pixel, or 85,196,800 bytes (~81 MB) per screen.
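As a quick sanity check, the figures above follow directly from pixels × bytes-per-pixel (65 bytes is just 520/8). A minimal sketch:

```python
# Back-of-the-envelope check of the per-screen memory figures.
# Resolution and color depths are the ones quoted in the text.

def screen_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Raw framebuffer size in bytes for one full screen."""
    return width * height * (bits_per_pixel // 8)

print(1280 * 1024)                   # 1310720 pixels
print(screen_bytes(1280, 1024, 32))  # 5242880 bytes (~5 MB)
print(screen_bytes(1280, 1024, 520)) # 85196800 bytes (~81 MB)
```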
The second interesting thing is that the visitor with the 520-bit color depth also has a 10,000×10,000 screen resolution.
I tried to find out more about this technology, and it seems to be used for 2D/3D supercomputing simulations. Here is a short list of some public supercomputers:
CCRT (CEA Supercomputing Center):
– 256 quad-processor Alpha nodes running HP/OSF1
– 128 quad-processor AMD Opteron nodes running Linux
IDRIS (CNRS Supercomputing Center):
– a 1,024-processor Power4 cluster
MareNostrum (Barcelona Supercomputing Center):
– 2,406 dual 64-bit-processor nodes @ 2.2 GHz, ~42 teraflops
– they say this is the most powerful supercomputer in Europe
This could mean that a supercomputer is now connected to the Internet as part of a special project. If you want to find out more about supercomputers and their global network, you should check out DEISA.
My stats showed me a clue about the University of North Carolina, and that is how I found out that they also have a supercomputer. Here is something they do with the technology:
The nanoManipulator, developed at the University of North Carolina at Chapel Hill, is an interface to scanning probe microscopes allowing users to see, feel, and manipulate samples ranging in size from DNA to single atoms. The interface controls the microscope, provides interactive 3D visualizations of scanning probe microscope data, and allows the user to actually feel the shape of the sample through a force-feedback device. An enhanced nanoManipulator can be used collaboratively by teams of scientists in a "virtual laboratory" environment that allows remote access to a shared microscope and previously collected data. Because the collaborative nanoManipulator handles multiple data flows (e.g. video and system control data, all having different bandwidth, loss, and latency requirements), a network Quality of Service of 100–150 Mbps is required for optimal visualization and haptic/touch response. In contrast to some high-bandwidth applications that have "bursty" bandwidth demands, the typical scientific experiment using the nanoManipulator lasts for many hours, creating a long-lived high demand on the network.
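To put that long-lived bandwidth demand in perspective, here is a rough sketch of the data volume such a session would move, assuming a sustained stream at the upper 150 Mbps figure quoted above and a hypothetical four-hour experiment:

```python
# Rough data volume for a sustained nanoManipulator session.
# 150 Mbps is the upper QoS figure quoted above; the 4-hour
# session length is a hypothetical example.
mbps = 150
hours = 4
total_bits = mbps * 1e6 * 3600 * hours
print(total_bits / 8 / 1e9)  # 270.0 GB over four hours
```

Even a single multi-hour experiment moves hundreds of gigabytes, which explains why these applications run over dedicated research networks rather than the ordinary Internet.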