Simulated reality – MATRIX

Simulated reality, just like in the movie The Matrix.

http://www.warnerbros.com/matrix

http://en.wikipedia.org/wiki/The_Matrix

http://en.wikipedia.org/wiki/Simulation_hypothesis

http://en.wikipedia.org/wiki/Simulated_reality

http://thematrix101.com/ 

 

Benefits of running a world simulation inside a simulated reality, just like in the movie The Matrix:

# You could run a safe world without accidents.
# Resources can't get depleted.
# The environment can't get polluted.
# No wars.
# You never get lost.
# No dust that needs cleaning up.
# Good weather all the time.
# No financial crises.
# No poverty.
# No land conflicts. There is an endless amount of land, and people could even live at the same address, just in different dimensions.
# No diseases.
# A think-and-create reality.
# Time travel.
# Free space travel.
# The possibility to roll back what you have done, all the way to the start.
# You can have simulated workers to help you build things, and fill up the environment with a lot of sim people. You are never alone.
# Faster-than-light travel.
# No corrupt governments.
# Build houses, roads, buildings, and cars by installing them as applications, in a matter of seconds.
# You can stack billions of people into the reality, where each body only needs a bed.
# Universities could be open to all, with extended learning capabilities. The system helps you learn, understand, and memorize things.

The benefits are many, and ideas for how things could be arranged come in a never-ending stream.

The idea is that a simulated reality needs close to "Unlimited Computing" power, depending on the complexity of the world and the number of people in the simulation.

https://www.youtube.com/watch?v=HVrGMnk5E_M

Is the Matrix real? Do we live in a simulated reality?

http://nickbostrom.com/ 

http://www.simulation-argument.com/

CERN particle acceleration

The European Organization for Nuclear Research.

Unlimited Computing follows the work at CERN with interest, anticipating future discoveries from the world's largest science lab. The reason for following this work is that it might lead to new discoveries and products in both computer technology and the medical field.

The particle accelerators work to find the smallest particles and the true nature of the basic elements of our reality. We do not yet know the true nature of these elements; two examples: what creates gravity, and what creates mass?

http://home.web.cern.ch/

http://en.wikipedia.org/wiki/CERN

http://en.wikipedia.org/wiki/ATLAS_experiment

http://en.wikipedia.org/wiki/Compact_Muon_Solenoid

http://en.wikipedia.org/wiki/Large_Hadron_Collider

Nanotechnology

What is nanotechnology? It is the manipulation of matter on an atomic, molecular, and supramolecular scale.

http://en.wikipedia.org/wiki/Nanotechnology

Scale of nanotechnology – The Scale of the Universe: http://htwins.net/scale2/

iPhones use nanotechnology: the microchips, memory, and CPU are all made with it.

Computers have CPUs with only 22 nanometers of distance between the copper wires.

Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering.

Future medical solutions will use nanotechnology: specialized nanorobots, nanotech pills – the future will show what comes.

It can be used to access parts of the brain that cannot be reached with traditional surgical instruments without damaging the brain.

Stem cell research and design software

Unlimited Computing is looking into stem cell research with fresh eyes on how to tackle the information stored in stem cells.

What is a stem cell?

http://en.wikipedia.org/wiki/Stem_cell

http://stemcells.nih.gov/info/basics/pages/basics1.aspx

We are currently looking for partners willing to let us write custom software for stem cell research and stem cell design, to cover a growing global market need.

If you want to take the next huge step in stem cell technology, please feel free to contact us.

Our team has 27 years of computing experience. We have been making software for a long time.

 

Project CADIS: Cybernetics Artificial Digital Intelligent Systems

Unlimited Computing is currently working on a project called CADIS: Cybernetics Artificial Digital Intelligent Systems. In short, you can call it AI – artificial intelligence.

http://en.wikipedia.org/wiki/Cybernetics
http://en.wikipedia.org/wiki/Artificial_intelligence

We have seen that AI is the solution for handling large amounts of information that transform, like DNA/stem cell activity.

We also see that it can be used in diagnostic imaging, to pre-scan examinations before radiologists view them and flag the cases with findings of disease, so those patients can be treated first.

We would also need AI in a simulated reality (like in the movie "The Matrix").
This AI is needed to keep the simulation alive and make the experience of the simulated reality dynamic.

With AI we can simulate how a specific drug would affect cells in the body without testing it on a living patient. Trial and error can then be run many millions of times, with each event recorded for later learning.

Advanced pattern recognition uses AI.

More about AI:

Siri on the Apple iPhone uses AI.

There are projects like IBM's Watson computer.

http://en.wikipedia.org/wiki/Watson_(computer)
http://researcher.watson.ibm.com/researcher/view_group.php?id=2099
http://www.ibm.com/smarterplanet/us/en/ibmwatson/

Quantum Computers

Quantum computers can solve puzzles that we cannot yet imagine. In traditional computing tasks they might not be any faster than a desktop computer, but with a register of 512 qubits instead of 64 bits they can solve certain mathematical problems faster, because they need to take fewer steps to reach the answer.

http://en.wikipedia.org/wiki/Qubit

http://en.wikipedia.org/wiki/Quantum_computing

http://en.wikipedia.org/wiki/Quantum

A 64-bit computer can store a maximum signed value of 2^63 − 1 = 9,223,372,036,854,775,807 (a 32-bit computer can store 2^31 − 1, or 2,147,483,647).
http://en.wikipedia.org/wiki/9223372036854775807

A 512-bit computer could store values up to 2^511 ≈ 6.7039 × 10^153,
that is, approximately: 6 703 903 964 971 300 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000

This opens up higher memory addressing, and addressing of locations at the Planck scale in physics simulations: http://en.wikipedia.org/wiki/Planck_scale
For comparison, a processor with 64-bit memory addresses can directly access 2^64 bytes (= 16 exbibytes) of byte-addressable memory; a 512-bit address space could in principle cover 2^512 bytes.
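These figures are easy to check. A quick sketch in Python (whose integers are arbitrary precision), using only the numbers quoted above:

```python
# Illustration of the integer ranges discussed above. Python integers are
# arbitrary precision, so 2**511 can be computed exactly.
max_32 = 2**31 - 1           # largest signed 32-bit value
max_64 = 2**63 - 1           # largest signed 64-bit value
val_512 = 2**511             # largest magnitude in a signed 512-bit word

print(max_32)                # 2147483647
print(max_64)                # 9223372036854775807
print(f"{val_512:.4e}")      # 6.7039e+153

# Address space: a 64-bit address reaches 2**64 bytes = 16 exbibytes.
print(2**64 // 2**60, "exbibytes")   # 16 exbibytes
```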

 

The use might not be obvious at first, but it can be applied in areas like simulated reality/physics simulation and high-resolution MRI imaging, and for stem cell research it can hold the values needed.

Here is one manufacturer of quantum computers: http://www.dwavesys.com/
http://en.wikipedia.org/wiki/D-Wave_Systems
The system is said to cost 10 million USD.

High resolution MRI imaging

Unlimited Computing is focusing on high-resolution MRI imaging.

http://en.wikipedia.org/wiki/Magnetic_resonance_imaging

We focus on building high-resolution 3D models of organs for 3D printing, but also for diagnostic and healthcare use.

http://en.wikipedia.org/wiki/3D_modeling

The path we are walking is opening up some really interesting new ways of seeing how medical solutions can be applied, with breakthroughs in new technologies and methods of doing things not yet known to the public.

 

Cells in the body = information. Atoms = even more information.

Cells in the body = information, a lot of information! 

One suggestion is that the human genome, across all the cells in your body, stores as much as 150 000 000 000 000 000 000 000 bytes of data (see "How Much Information is Stored in the Human Genome?", Bitesize Bio). The number can be disputed, but let's go with the most extreme figure that has come up.

http://phenomena.nationalgeographic.com/2013/10/23/how-many-cells-are-in-your-body/

http://en.wikipedia.org/wiki/List_of_distinct_cell_types_in_the_adult_human_body

http://en.wikipedia.org/wiki/Composition_of_the_human_body

If you converted that data into bibles (everybody knows how thick a bible is – about 3 cm), you would get 638 297 872 340 426 000 bibles of text. Pile them on top of each other and you would get a stack 19 148 936 170 212.8 km tall, or about 127 659.57 times the distance from the Earth to the sun.

If you instead stored that data on solid state drives, you would need 312 500 000 000 drives at a total cost of $78 125 000 000 000 USD. If you piled those 0.7 cm tall SSDs on top of each other, the stack would reach 2 187 500 km, about 5.69 times the distance to the moon. You can get a further sense of the scale at The Scale of the Universe: http://htwins.net/scale2/
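The stacking arithmetic can be reproduced in a few lines. A sketch under the text's own assumptions (3 cm per bible, 0.7 cm per SSD; the roughly 235,000 bytes per bible and 480 GB per drive are back-calculated from the totals quoted, so they are implied rather than stated):

```python
# Back-of-the-envelope numbers for 150 zettabytes of genome data.
genome_bytes = 150_000_000_000_000_000_000_000

# Bibles: ~235,000 bytes and 3 cm of thickness each (implied by the text).
bibles = genome_bytes // 235_000
stack_km = bibles * 0.03 / 1000              # metres -> kilometres
print(bibles)                                # ~6.38e17 bibles
print(stack_km / 150_000_000)                # ~127,660 Earth-sun distances

# SSDs: 480 GB and 0.7 cm each (implied by the text).
ssds = genome_bytes // 480_000_000_000
ssd_stack_km = ssds * 0.007 / 1000
print(ssds)                                  # 312500000000 drives
print(ssd_stack_km / 384_400)                # ~5.69 trips to the moon
```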

This information can be used in the future for diagnostics and medical care.

Now let’s look at atoms in the body = a lot more information.

A 70 kg body contains approximately 7 × 10^27 atoms. That is, a 7 followed by 27 zeros:

7,000,000,000,000,000,000,000,000,000
http://education.jlab.org/qa/mathatom_04.html 

If you stored one byte per atom, that would be 7 000 yottabytes. So we go from 150 zettabytes at the cell level to 7 000 yottabytes at the atom level.

Now, to store information about each atom – spin, temperature, location, movement speed, movement direction ("inside a matrix") – you would need a lot more storage capacity than given above. "Inside a matrix" here means inside a cube with x, y, z dimensions, with the human body inside it.
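As a rough sketch of these conversions (the seven fields and 8 bytes per field in the second half are hypothetical choices of mine, just to show how quickly the total grows):

```python
# Atom-level storage estimate: ~7e27 atoms in a 70 kg body (jlab.org figure).
atoms = 7 * 10**27
yottabyte = 10**24

# One byte per atom:
print(atoms / yottabyte)          # 7000.0 yottabytes

# Hypothetical richer record: spin, temperature, x, y, z, speed, direction
# = 7 fields at 8 bytes each = 56 bytes per atom.
bytes_per_atom = 7 * 8
print(atoms * bytes_per_atom / yottabyte)   # 392000.0 yottabytes
```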

What could we eventually achieve if we can handle all this information? We could cure nearly any disease known on the planet, if we could also apply new "information" to the "matrix". Then the dead would live, the crippled would walk, and the blind would see. The only thing we need is to be able to understand huge amounts of information and to manipulate it, to then change the body.

 

Super-computing standards draft

Unlimited Computing is currently working on draft standards for how super-computing machines should be designed, both hardware and software, plus how the services should be designed to give optimal performance and compatibility with the future.

The standards could also apply to some degree to normal computers, because we are reaching a ceiling for how much data a normal desktop computer can transfer and store per day – today.

The process started as a result of a design where more than 64-bit computing power is needed. With quantum computers, the bit length can reach as much as 512 qubits.

Today, larger storage capacity, computing power, memory capacity, and network speed are needed every day to handle the exponential growth in data that must be processed and accessed.

The need for standards arises when you want to write software for different OS platforms and different bit lengths – 32-bit, 64-bit, and beyond. A standard is needed to compile software for hardware running at more than 64 bits. This would ease the work of software companies around the world in their development of operating systems and software for different hardware.

The need for standards also arises for data-integrity security in storage solutions and data integrity over network connections.
The reason is that solar storms can affect both storage solutions and network connectivity. As an example, in a test transfer of more than 800 gigabytes over a gigabit CAT 6E network, we found by file comparison roughly 1 bit error per 2 gigabytes of data (statistically). The TCP/IP checksum should catch such errors, and the chance of one slipping through is very small, but it is like the lottery: hand in a trillion lottery tickets and you are bound to win more than once. This can cause serious corruption of data sets, software bugs, and database/software crashes.
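The lottery intuition can be put into rough numbers. A toy model, assuming the observed rate of about 1 bit error per 2 GB and that the 16-bit Internet checksum misses a corrupted segment with probability about 1/65536 (the real miss rate depends on the error pattern, so this is only an order-of-magnitude sketch):

```python
# Toy model of undetected-error odds on a gigabit link.
transferred_gb = 800
corrupted_segments = transferred_gb // 2     # ~1 bit error per 2 GB observed
p_slip = 1 / 65536                           # chance a 16-bit checksum misses it

expected_undetected = corrupted_segments * p_slip
print(expected_undetected)                   # ~0.006 per 800 GB transfer

# A gigabit link running flat out moves ~125 MB/s; over a year that is
# thousands of such transfers, so silent corruptions become expected:
transfers_per_year = 365 * 24 * 3600 * 125_000_000 / (800 * 10**9)
print(transfers_per_year * expected_undetected)   # ~30 per year
```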

 

And most important, standards are needed to lower the cost of servicing hardware and software, and the initial cost of software and hardware solutions, for the large-scale computing industry.