Quantum Computers

Quantum computers can solve puzzles that we cannot yet imagine. They may be no faster than a desktop computer at traditional computing, but with a register of 512 qubits instead of 64 bits, a quantum computer can solve certain mathematical problems faster because it needs fewer steps to reach the answer.

http://en.wikipedia.org/wiki/Qubit

http://en.wikipedia.org/wiki/Quantum_computing

http://en.wikipedia.org/wiki/Quantum

A 64-bit computer can store a maximum signed integer value of 9,223,372,036,854,775,807, which is 2^63 − 1 (a 32-bit computer can store 2^31 − 1, or 2,147,483,647).
http://en.wikipedia.org/wiki/9223372036854775807

A 512-bit computer could store a maximum signed value of 2^511 − 1 ≈ 6.7 × 10^153, a number with 154 digits.

This opens up higher memory addressing, even down to addressing locations at the Planck scale in physics simulation: http://en.wikipedia.org/wiki/Planck_scale
A processor with 64-bit memory addresses can directly access 2^64 bytes (= 16 exbibytes) of byte-addressable memory, while a 512-bit address space of 2^512 bytes would be larger than any physically conceivable memory.
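As a rough sanity check on the figures above, the maximum values and address-space sizes can be computed directly (a sketch; Python's arbitrary-precision integers make even the huge numbers exact):

```python
def max_signed(bits):
    """Largest signed integer in a two's-complement word of `bits` bits."""
    return 2 ** (bits - 1) - 1

def address_space_bytes(address_bits):
    """Bytes reachable with byte addressing and `address_bits`-bit addresses."""
    return 2 ** address_bits

print(max_signed(32))                     # 2147483647
print(max_signed(64))                     # 9223372036854775807
print(address_space_bytes(64) // 2**60)   # 16 (exbibytes)
print(len(str(max_signed(512))))          # 154 digits in 2^511 - 1
```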

 

The use may not be obvious at first, but it can be applied in areas such as simulated reality and physics simulation, high-resolution MRI imaging, and stem cell research, where it can hold the values needed.

Here is one manufacturer of quantum computers: http://www.dwavesys.com/
http://en.wikipedia.org/wiki/D-Wave_Systems
The system is said to cost $10 million USD.

High-resolution MRI imaging

Unlimited Computing is focusing on high-resolution MRI imaging.

http://en.wikipedia.org/wiki/Magnetic_resonance_imaging

We focus on building high-resolution 3D models of organs for 3D printing, but also for diagnostic and healthcare use.

http://en.wikipedia.org/wiki/3D_modeling

The path we are walking is opening up some really interesting new ways of applying medical solutions, with the breakthrough of new technologies and methods of doing things that are not yet known to the public.

 

Cells in the body = information. Atoms = even more information.

Cells in the body = information, a lot of information! 

It has been suggested that the human genome, across all the cells in your body, stores as much as 150 000 000 000 000 000 000 000 bytes of data (see "How Much Information is Stored in the Human Genome?", Bitesize Bio). The number can be disputed, but let's go with the most extreme figure that has come up.

http://phenomena.nationalgeographic.com/2013/10/23/how-many-cells-are-in-your-body/

http://en.wikipedia.org/wiki/List_of_distinct_cell_types_in_the_adult_human_body

http://en.wikipedia.org/wiki/Composition_of_the_human_body

If you converted those bytes into bibles (everybody knows roughly how thick a bible is, about 3 cm), you would get 638 297 872 340 426 000 bibles of text. Pile them on top of each other and you would get a stack 19 148 936 170 212.80 km tall, or 127 659.57 times the distance from the Earth to the Sun.
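The bible arithmetic can be reproduced in a few lines (a sketch; the roughly 235 kB of text per bible is the value implied by the article's own totals):

```python
GENOME_BYTES = 150 * 10 ** 21   # upper-bound estimate used in the text
BYTES_PER_BIBLE = 235_000       # plain-text size of one bible, implied by the totals
BIBLE_THICKNESS_CM = 3
EARTH_SUN_KM = 150_000_000      # average Earth-Sun distance, rounded

bibles = GENOME_BYTES / BYTES_PER_BIBLE
pile_km = bibles * BIBLE_THICKNESS_CM / 100 / 1000   # cm -> m -> km
print(f"{bibles:.3e} bibles")                        # ~6.383e+17
print(f"{pile_km:.3e} km of stacked bibles")         # ~1.915e+13
print(f"{pile_km / EARTH_SUN_KM:.0f} Earth-Sun distances")  # ~127660
```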

If you transformed that amount into solid-state drives, you would need 312 500 000 000 drives at a total cost of $78 125 000 000 000 USD. If you piled those 0.7 cm tall SSDs on top of each other, the stack would reach 2 187 500 km, about 5.69 times the distance to the Moon. You can get a further sense of the scale at http://htwins.net/scale2/
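The SSD comparison works the same way (a sketch; the $250 per 480 GB drive is the per-unit price implied by the article's total):

```python
GENOME_BYTES = 150 * 10 ** 21
SSD_BYTES = 480 * 10 ** 9       # one 480 GB drive
SSD_PRICE_USD = 250             # implied by the $78.125 trillion total
SSD_HEIGHT_CM = 0.7
MOON_KM = 384_400               # average Earth-Moon distance

drives = GENOME_BYTES // SSD_BYTES
cost_usd = drives * SSD_PRICE_USD
stack_km = drives * SSD_HEIGHT_CM / 100 / 1000   # cm -> m -> km
print(drives)                         # 312500000000
print(cost_usd)                       # 78125000000000
print(round(stack_km))                # 2187500
print(round(stack_km / MOON_KM, 2))   # 5.69
```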

This information can be used in the future for diagnostics and medical care.

Now let’s look at atoms in the body = a lot more information.

A 70 kg body contains approximately 7 × 10^27 atoms. That is, 7 followed by 27 zeros:

7,000,000,000,000,000,000,000,000,000
http://education.jlab.org/qa/mathatom_04.html 

If you stored one byte per atom, that would be 7 000 yottabytes. So we go from 150 zettabytes at the cell level to 7 000 yottabytes at the atom level.
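In decimal SI units these two figures convert as follows (a quick sketch):

```python
ZETTABYTE = 10 ** 21
YOTTABYTE = 10 ** 24

cell_level_bytes = 150 * 10 ** 21     # genome data across the whole body
atom_level_bytes = 7 * 10 ** 27       # one byte per atom in a 70 kg body

print(cell_level_bytes // ZETTABYTE, "zettabytes")  # 150
print(atom_level_bytes // YOTTABYTE, "yottabytes")  # 7000
```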

Now, to store information about each atom you would need to record things like spin, temperature, location, movement speed and movement direction "(inside a matrix)", so you would need a lot more storage capacity than given above. By "inside a matrix" we mean inside a cube with x, y, z dimensions that contains the human body.
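A minimal sketch of what one record in such a matrix might look like; the field list (position, velocity, spin, temperature) and the 64-bit-per-field layout are illustrative assumptions, not a real simulation format:

```python
import struct

# Hypothetical per-atom record: x, y, z, vx, vy, vz, spin, temperature,
# each stored as a 64-bit float -> 64 bytes per atom.
ATOM_FORMAT = "<8d"
ATOM_BYTES = struct.calcsize(ATOM_FORMAT)

atoms_in_body = 7 * 10 ** 27
total_bytes = atoms_in_body * ATOM_BYTES
print(ATOM_BYTES)                             # 64 bytes per atom
print(total_bytes // 10 ** 24, "yottabytes")  # 448000, far beyond the 7000 above
```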

What could we eventually achieve if we could handle all this information? We could cure nearly any disease known on the planet, if we could also apply new "information" to the "matrix". Then the dead would live, the crippled would walk and the blind would see. The only thing we need is the ability to understand that huge amount of information and to manipulate it, and thereby change the body.

 

Super-computing standards draft

Unlimited Computing is currently working on draft standards for how super-computing machines should be designed, both in hardware and software, plus how the services should be designed to give optimal performance and compatibility with the future.

The standards could also apply to some degree to normal computers, because today we are reaching a ceiling for how much data a normal desktop computer can transfer and store per day.

The process started as a result of a design where more than 64-bit computing power is needed. With quantum computers, the 64 bits can grow to as much as 512 qubits.

Today, ever larger storage capacity, computing power, memory capacity and network speed are needed to handle the exponential growth in data that must be processed and accessed.

The need for standards arises when you want to write software for different OS platforms and different bit lengths, like 32-bit, 64-bit and beyond. A standard is needed to compile software for hardware running more than 64 bits. This would ease the work of software companies around the world in their development of operating systems and software for different hardware.

The need for standards also arises for data integrity in storage solutions and in network connections.
The reason is that, for example, solar storms can affect both storage and network connectivity. In one test we transferred more than 800 gigabytes over a gigabit CAT 6E network and found, by file comparison, roughly 1 bit error per 2 gigabytes of data (statistically). The TCP/IP checksum should catch such errors, and the chance of one slipping through is very small, but it is like the lottery: hand in a trillion lottery tickets and you are bound to win more than once. This can cause serious corruption of data sets, software bugs, and database/software crashes.
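The TCP checksum is only 16 bits, so an application-level check is a common safeguard against exactly this kind of silent corruption. A minimal sketch using SHA-256 (the payload and the single flipped bit are just illustrations):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """End-to-end content hash, far stronger than the 16-bit TCP checksum."""
    return hashlib.sha256(data).hexdigest()

# Simulate a transfer in which a single bit is flipped in flight.
sent = b"server backup payload " * 1000
received = bytearray(sent)
received[12345] ^= 0x01          # one flipped bit

print(sha256_hex(sent) == sha256_hex(bytes(received)))  # False: corruption caught
```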

 

And most important: the need for standards to lower the cost of servicing hardware and software, and the initial cost of software and hardware solutions, for the large-scale computing industry.


The computing power of the human brain.

Reports of near-death experiences describe a movie flashing before your eyes with pictures from your entire life. The data amount involved for a 30-year-old person is estimated at 3.4 exabytes.

That is 3.4 exabytes of data in one second. You would need 123 621 120 CPU cores at 3 GHz, or, at 8 cores per CPU at 3 GHz, 15 452 640 computers. The total cost for the CPUs alone would be $19 532 136 960 USD. At today's technology prices, a human brain would thus be worth an enormous $19 532 136 960 USD. Take care of your brain; there is no known mechanic today who can fix it or build a new one.

It would require a total of 7 726 320 kW of power at 500 watts per computer.
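The hardware figures above follow from a few multiplications (a sketch; the $1 264 per 8-core CPU is the per-unit price implied by the article's total):

```python
CORES_NEEDED = 123_621_120
CORES_PER_CPU = 8
CPU_PRICE_USD = 1_264        # implied by the $19.5 billion total
WATTS_PER_COMPUTER = 500

computers = CORES_NEEDED // CORES_PER_CPU
cpu_cost_usd = computers * CPU_PRICE_USD
power_kw = computers * WATTS_PER_COMPUTER // 1000
print(computers)      # 15452640
print(cpu_cost_usd)   # 19532136960
print(power_kw)       # 7726320
```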

The brain does it more elegantly than we can imagine, with a heat release of only 50-100 watts for the whole human body.

The numbers get so high that we redo the calculations again and again and still find them hard to believe.

The question that arises is whether the human brain operates on another quantum level than we know of in physics today, for example in its voltage usage and signaling. Does it get its energy from a hidden source that we do not yet know how to tap into, one that does not produce heat as a by-product? After all, 7 726 320 kW of power would require about the same amount of energy again to cool the server room down. Of course, the brain does not function at this level all the time (these values would only occur during an adrenaline rush lasting a second or two), because the chemical and electrical stores of energy and potential energy would be depleted and would need to be recharged. Does it use quarks for signaling and storing information? The brain has many mysteries, and it is formed to function as an I/O device operating under this universe's physical rules. One thing is for sure: the questions that arise give cause to continue doing more research.

So these numbers apply to extreme, life-threatening panic situations and are not valid for normal daily operation of the brain.

When it comes to bit operations per second, we have the following to offer:
28 672 000 000 bits at 1-30 Hz, and that is just the data stream from the eyes (close to 29 billion bits, or nearly 1 billion bits per Hz at 30 Hz).

http://en.wikipedia.org/wiki/Hertz

A computer today operates at 64 bits at 3 GHz. Compared to the brain, you would need about 9.55 such computers to match its computing power in normal operation.
As for the eyes, it is estimated that the full field of view has a resolution of 512 megapixels. That is the same as 256 HD flatscreens or camcorders combined to reach the resolution of the eyes. That generates a raw data stream to the brain of 25 088 megabits per second, that is, 25 gigabits per second, or 3 136 megabytes per second, roughly 12 terabytes an hour. Here the brain stores information faster than any known storage solution today (except high-end server farms).
A computer's network card, by comparison, is only 1 gigabit. (This is something that has to increase dramatically in the computer industry, or we must move to optical networks, because copying a "server backup" of, for example, 200 gigabytes over a 1 gigabit network connection takes about 27 minutes. So here the brain and eyes perform many times better than our desktop computers do.)
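The eye-bandwidth numbers above can be reproduced as follows (a sketch; the 49 bits per pixel is the value implied by the 25 088 megabit figure):

```python
MEGAPIXELS = 512
BITS_PER_PIXEL = 49          # implied by 25,088 Mbit / 512 Mpx
BACKUP_GB = 200
LINK_GBIT_PER_S = 1

mbit_per_s = MEGAPIXELS * BITS_PER_PIXEL       # 25088 megabits/s
mbyte_per_s = mbit_per_s // 8                  # 3136 megabytes/s
tb_per_hour = mbyte_per_s * 3600 / 1_000_000   # ~11.3 TB/h
backup_min = BACKUP_GB * 8 / LINK_GBIT_PER_S / 60
print(mbit_per_s, mbyte_per_s)
print(round(tb_per_hour, 1))   # 11.3
print(round(backup_min, 1))    # 26.7 minutes for a 200 GB backup
```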

We have reached a critical amount of data because of HD video from camcorders and high-resolution pictures. Storage capacity and transfer speed have to go up by a factor of at least 10 very soon.

The brain's storage capacity

Our research and estimates show that during a period of 30 life years the brain stores about 3.4 exabytes of data. http://en.wikipedia.org/wiki/Exabyte

This is mainly video data from the eyes. Other data such as thoughts, sounds and feelings is not included.

In a life time of 90 years we are talking about 10.2 Exabyte of data stored in the brain. This finding propels further research.

The brain's storage capacity is enormous compared to the storage solutions around today; they can hardly even be compared. Over a period of 90 years the brain would fill something like 21 192 192 480 GB solid-state drives, at a total cost of $4 980 165 120 USD. Before you bang your head against the wall over life's problems, remember that your brain costs $4 980 165 120 USD.
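The drive-count arithmetic checks out as follows (a sketch; the $235 per drive is the per-unit price implied by the article's total):

```python
DRIVES = 21_192_192
SSD_BYTES = 480 * 10 ** 9    # one 480 GB drive
PRICE_USD = 235              # implied by the $4.98 billion total

total_bytes = DRIVES * SSD_BYTES
total_cost_usd = DRIVES * PRICE_USD
print(round(total_bytes / 10 ** 18, 2))  # ~10.17 exabytes over 90 years
print(total_cost_usd)                    # 4980165120
```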

We can clearly see that the hard drive, memory and solid-state markets are heading into a future with big changes. Computer hardware can't be worse off than nature (biological material) when it comes to capacity.

The price per terabyte is going to go down while speed and capacity skyrocket in the future.

http://en.wikipedia.org/wiki/Human_brain 

The New Era of Supercomputing