MORE ABOUT QUANTUM CODES
Some are saying we should base the web on quantum physics, i.e. when you take two subatomic motes like electrons and twirl them in the fields, it's well known they are linked. So by the (no doubt) unusual motifs of subatomic physics, when you separate them by a distance and change one, the other one "knows," even if it's 10 million miles away, and it knows fast. (Despite appearances, no usable faster-than-light signal can be sent this way; the linkage only shows up when the two records are compared.) The idea is to use this linkage, well proven by experiment, to make it so no one would be able to tamper with information en route between source and destination. If the electrons "know," they say, it would be safe to send and retrieve web-based information via this motif.
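The tamper-evidence claim can be sketched in ordinary code. Below is my own toy simulation of the simpler prepare-and-measure version of a quantum key scheme (an assumption on my part; the proposal above describes the entanglement version, but the tamper-detection logic is the same): a sender encodes random bits in random bases, and an eavesdropper who measures in a guessed basis disturbs the motes and leaves a statistical fingerprint.

```python
import random

def measure(bit, prep_basis, meas_basis):
    # Quantum rule of thumb: measuring in the same basis returns the
    # encoded bit; measuring in the wrong basis gives a random outcome.
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def error_rate(rounds=20000, eavesdrop=False):
    kept = errors = 0
    for _ in range(rounds):
        orig = random.randint(0, 1)            # sender's secret bit
        send_basis = random.randint(0, 1)      # sender's random basis
        bit, basis = orig, send_basis
        if eavesdrop:
            eve_basis = random.randint(0, 1)   # eavesdropper guesses a basis
            bit = measure(bit, basis, eve_basis)
            basis = eve_basis                  # her measurement re-prepares the mote
        recv_basis = random.randint(0, 1)      # receiver's random basis
        result = measure(bit, basis, recv_basis)
        if recv_basis == send_basis:           # keep only matching-basis rounds
            kept += 1
            if result != orig:
                errors += 1
    return errors / kept

print(error_rate(eavesdrop=False))  # 0.0: untouched link, no errors
print(error_rate(eavesdrop=True))   # about 0.25: the tap shows up as errors
```

The point of the sketch is that the line cannot be read without changing it; the roughly one-in-four error rate on a tapped line is the "knowing" the proposal counts on.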
I think this is way ahead of our time, even if it's Memorial Day 1979! My belief is that any physical system like this must have some loophole that would make it fail, and the only really safe codes may come from the same number-based method already in use to make the web safe.
Here's why I think the above idea, that using this motif supposedly makes a link safe, is unsound.
Einstein believed there must be some way around the Uncertainty Principle, which is derived from experiment in subatomic physics. The Uncertainty Principle says that because the probe and the seen particle are at about the same level of energy, you can't measure a particle's position without ricochets that disturb its momentum, and you can't pin down its energy without losing track of the time. Einstein believed that if you had a probe of small enough energy, you would be able to find out the time and space occupied by the particle, and its other motifs, to any level of resolution, just as a beam of light is only a small disturbance when it illuminates a sofa, and then our eye. From this Einstein may have gotten his belief that it was just a coincidence that the probe and the baryons or mesons were at about the same energy level.
I agree with Einstein, because gravity is the foundation field; all the other forces "speak" gravity, but gravity does not necessarily speak the other forces. The history of gravity has influenced science more than any other physics. And the conservation laws in subatomic physics go from simple to more and more conserved properties: all that's true of each force is true of the forces below it, plus more, a sort of pile of power. The lowest energy field would energise up to the one above it, gravity to electromagnetism, then electromagnetism to the forces higher up, and so on, explaining the uniformity of nature and the conservation of energy. All the physics is from the lower field, at higher or lower levels. If gravity creates the other fields by bunching up the lines of force, then the strength and relative power of each force is determined by the strength of gravity. If gravity were more powerful it would cause more bunching, and the ratio of the strong force to the electric force, or of the electric force to gravity, would be changed. This would change the energy of a probe like light relative to the measured particle, so it would be more like the light relative to the sofa cushion, which has no Uncertainty Principle, and I think this is what Einstein was saying. There is no necessity for the probe to be at the same energy level as the observed particle; the moon doesn't have to be at the same energy as the light, because the way gravity energises up to the levels of the other forces, and their relative energies, are arbitrary. Since the strength of gravity determines the relative strength of the probe and the measured subatomic mote, and its strength is by axiom, the Uncertainty Principle would not be the deeper physics, which is based on gravity being the foundation. (See Four Possible Ways Around the Uncertainty Principle.)
Schrodinger pointed out that if the Uncertainty Principle were so, and you couldn't say where or whether a meson was, then if you had a cat in a box it might be both alive and dead, with no way to say which; this is the famous Schrodinger's Cat. Einstein got lots of hoots when he asked, "Do you seriously think the moon is not up there when you don't look and see?" The Uncertainty Principle also says that because the probe and the measured particle are about the same energy, you can't measure the mesons or other motes of subatomic physics without changing them. So it was assumed that to observe would be to disturb, which Einstein believed was not enough proof.
Einstein's way out of this was like what is now being used in the low energy quantum experiments: if you can't see the cat, send a mouse by and see if the mouse moves like a cat is in the box. He believed a low power probe might see without changing what's seen so much. While the experiments up to about 2000 seemed to disprove Einstein, the more recent low power experiments of this sort are indeed in favor of Einstein. This makes sense if the 20th century experiments were assuming what they were trying to prove, by using a higher power probe to see with.
If the low power experiments are proven, this would mean there was no worth in basing the web on quantum physics: if any physical event has a way to encode, it has a way to decode, and this can be done anywhere. A signal must have a change, and any change causes other change, which can be seen by another machine of the same type. Events in physics are mostly simple, but the number line is more complex, so tougher to solve. Any signal, in its basic physics, is just made of linear and angular motion, and is easy to solve by a like motif.
Some say high speed computers may be bad for the web. I think quantum computers are not a threat to the web, even if they're super high speed, so long as the codes are based on the number line more than on physics, because of the way codes are made and decoded. This is by the usual modus operandi of the codes. It's a bit complex, but the essence is that it's much easier to multiply up to a huge number and make a code out of it than it is to divide the code back out. In the Oct. 2007 Scientific American article about "Diamond Spintronics," the authors think that if much higher speed computers are developed, they would be able to divide (and unmultiply) so fast that the web, and the others who use the codes, would be brought down. But I believe this is in error, since the number line allows any level of complexity just by multiplying a more huge number up to make the codes. No matter how fast the computers may divide, it will always be easier to multiply than to divide the numbers that make the web safe. This might be achieved by changing the codes more often, with higher and higher numbers, the way they used to change automobile locks. A new code a month saves more than the same password all of us are using!
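The multiply-versus-divide asymmetry the paragraph leans on can be seen in a few lines. Here is a minimal sketch; the two primes are my own illustrative picks, far smaller than the numbers real web codes use, and the trial-division factorer stands in for the "dividing out" direction:

```python
def smallest_factor(n):
    # Naive trial division: the slow "dividing out" direction.
    if n % 2 == 0:
        return 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d
        d += 2
    return n

# Making the code: one multiplication, instant at any size.
p, q = 1000003, 1000033          # illustrative primes (assumed prime)
n = p * q

# Breaking the code: roughly half a million trial divisions even for
# these modest numbers, and the work grows steeply as p and q grow.
print(smallest_factor(n) == p)   # True
```

Doubling the size of p and q barely changes the cost of the multiplication but multiplies the dividing work enormously, which is the point about always being able to stay ahead by multiplying more huge numbers.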