Wednesday, October 26, 2011

Are Cybersecurity Problems a Mirage?

Remember word processors? Memos are great! Scott Borg, in what I think is an amazing article because of a simple omission he seems unaware of, says malware is what he calls a "cybersecurity nightmare." To quote:

"The world of cybersecurity is is starting to resemble a spy thriller. Shadowy figures plant 'malicious software' on our computers. They slip it into emails. They transmit it over the Internet. They infect us with it through corrupted Web sites. They plant it in other programs. they design it to migrate from device to device-laptops, flash drives, smartphones, servers, iPods, gaming consoles, copy machines-until it's inside our critical systems. As even the most isolated systems periodically need new instructions, new data, or some kind of maintanence, any system can be infected."

He then goes on to list sabotage scenarios like planes crashing, trains colliding, medical machines failing, banks being discredited, poisoned drinking water, and so on. Then he continues with how little we can know about malware in order to stop it; for instance, we can't just read the code right before our eyes and see the problem, right?

We can't find it in hiding because there are many places where the machine can't read the code, the code used to check for bad code may itself already be tainted, and so on. Sharing is all right if it's health, not illness. Hope you receive a card from Blue Cross that says, Hope You Feel Rich By This Month! If you go to the doctor and she says, the ophthalmologist will see you now, your eyes are brown and you see just fine...

My simple question about malware is:

Why weren't there the same problems in the old days, say with the security of the banks or the power lines, my old word processor, or when Mom calls? (Thanks, Mom, for each Mother's Day and each time you call me.)

The answer could be that simplicity is predictable only up to the point where you can no longer tell whether you win or lose (if you don't know how you know what you know, you lose more than otherwise). Consider the three things computers are sent, per the quote above: instructions, data, or some sort of maintenance. The "wiser" machines of olden times were sent mostly data, not instructions or maintenance. People did all the machine operation and maintenance themselves, with things like business agreements and handshakes. The wires were secure, and the copy machines were secure, because there wasn't all the extra add-on nonlinear stuff between the machines that can't be known so well. I always thought the slow upgrades that would take up hours of my time, by surprise, were idiotic. Why do I need these? Going to the library to read, or sending a letter, was simple and no problem in the old days.

This is why I say the middle of the three (instructions, data, and maintenance), the data, is too complex when sent as code. My solution would be to stop sending data as code by converting it to analog first and sending it at high speed; the machine then reads it much like an old-fashioned letter, with an onsite sensor, etc. Edison imagined the wires of the Ma Bell machine being used to send automated letters. A letter is a letter, and if you ask me it doesn't have to be a number so much. This method, using microfilm to store the web sites once edited, would simplify the process of sending documents like memos: all one simple type of machine. And it's secure. This would solve the problem of tainted malware web pages. There are mechanical computers that are far faster and more energy-saving than electronic machines. I was a bit too aware, like Mom the journalist, when we found we couldn't buy a simple word processor anymore. It was simple, without all the idiotic complications of a PC.

Once data was more sound, like a great library, the question of instructions and maintenance might be solved by simply doing without them. We didn't need them then, and we don't need them now, or we certainly need a way to adapt to the need for change so the foundation is sounder. I was a musician for years to no avail; I tried many ways to remember and be good. Finally, 25 years later, I got wise, realizing about 10 years ago that slower but surer is the more the merrier: evolution takes time, and knowing what you get is also how the chess machines win. If a machine can make a move where it understands the payoff, it will choose that move over moves where it doesn't know the payoff; this is wiser, and this is how it wins. As I listen to my better older songs, I found that if I would improvise so that for any one mistake I played the more sound way three times, even if I was unaware of exactly what I got, it kept evolving, a sort of Bayesian method used by many computers. Loose lips sink ships, so even if this makes a great song, using both the method of improvement by question and answer ("like science") for the song, and then memorizing by saying the words of what I wanted to remember in each loop of the rep won by the science method, to set boundaries and structure on the improvisations won by the three-out-of-four method, this method is not the same as what you would use for something like a cheap war machine or anywhere a person's life is at risk.
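Just to make the chess point concrete, here's a little sketch of preferring moves whose payoff is known over moves where it isn't; the move names, the payoff numbers, and the evaluate() helper are all made up for illustration, not taken from any real chess engine.

```python
# A minimal sketch of "prefer moves whose payoff you actually know."
# Everything here (moves, payoffs, evaluate) is a hypothetical placeholder.
from typing import Optional

def evaluate(move: str) -> Optional[float]:
    """Return a payoff estimate, or None when the machine can't see far enough."""
    known_payoffs = {"Nf3": 0.3, "e4": 0.5, "h4": -0.2}   # hypothetical values
    return known_payoffs.get(move)                        # unknown moves -> None

def pick_move(candidates: list[str]) -> str:
    scored = [(m, evaluate(m)) for m in candidates]
    known = [(m, s) for m, s in scored if s is not None]
    if known:                                   # prefer any move with a known payoff
        return max(known, key=lambda ms: ms[1])[0]
    return candidates[0]                        # otherwise fall back blindly

print(pick_move(["h4", "e4", "Qxf7", "a3"]))    # -> "e4", the best *known* payoff
```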

Because slower but surer is more like the old days. What about the "modem wars," you may ask? The modem wars are where the army is being attacked millions of times, faster than humans can think, via modems. If the US stopped all the codes by the method above, it would take a while to convert all the machines to a complete system without code between any of them. This is a step that would slow our machines a bit, but the malware problem would be solved. As we are vulnerable due to the web, any nation without a secure system is vulnerable too. Thus, while it would take a while to convert out the code, we would be more able in war, not less, if we do. If other nations didn't also convert, we might lose some business, and the machines would cost more. I think this is like the Middle Ages: if they had a castle and it was secure, though not a complete luxury like the web often is here, it would let us make it to the more advanced methods, if we realize that we may pay a lot more if we allow the codes than if we don't.

This would involve much the same method as ye olde word processor: careful checks by simple, more reliable machines we know we can trust, no demons or dungeons; the older ways were best. Descartes would say in the old days, I think therefore I am (they spelled his name wrong on his 400-year commemorative stamp in the Old World www). The idea here is that the simplicity of our machines is predictable, but only up to a certain level; if the machines are always below this level of complexity, we win more than we would without it. This also relates to the problem that the business of making chips has gotten so international that bad characters can sometimes put in hardware that may cause problems months or years later. Why not head that off entirely? If all our machines are made our own way, as in the royal days, and the code can't be sent, all shall be well, I hope.


The hardware problem could be solved by standardizing the chips on one common type so there's no hardware where bad code could hide. I think a good solution might be the IP of molecular crossbar chips. These are a simple mesh of wires with a bit stored where each pair of wires meets: a really simple chip to make. Some might object to the lower speeds this type of chip has, even if it's cheap to make. A solution here that's been invented might be to use a particle as the bit where the wires meet; as it rotates around in mostly continuous space, it would store the angle well. Each small cell has three components around the particle that moves ad libitum: a write component; a brake on the particle, by another particle, to store the angle; and a read component that only reads the brake, which itself doesn't give way much when being read. Since space, unlike a quantum, is continuous, and the field that is stored or changed is much more precise than a quantum, even one of these cells, by storing the angle, could hold a number with perhaps 80,000 distinguishable values. And it's nonlinear, so even two of these small cells could hold 80,000 x 80,000 states, versus four for two binary bits. This would be super low in power consumption, high in efficiency, and super fast. So the standard chip would be of value even if simple.
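As a quick check on that arithmetic, here's a small sketch, taking the 80,000-level figure above at face value (it's an assumption, not a measurement), of how much information such angle-storing cells would hold compared with ordinary binary bits.

```python
# Back-of-the-envelope check of the storage claim, assuming one crossbar cell
# can hold an angle distinguishable to about 80,000 levels (the post's figure).
import math

levels_per_cell = 80_000
bits_per_cell = math.log2(levels_per_cell)          # ~16.3 bits of information
states_two_cells = levels_per_cell ** 2              # 6.4 billion combinations
bits_two_cells = math.log2(states_two_cells)         # ~32.6 bits

print(f"one cell  : {levels_per_cell:,} levels  ~ {bits_per_cell:.1f} binary bits")
print(f"two cells : {states_two_cells:,} states ~ {bits_two_cells:.1f} binary bits")
```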

About the code, there are three levels. The top level is the general method of using prime numbers to agree to a transaction. This level is probably secure, because it's based on the well-known idea that it takes much more number crunching and division to factor a code back out than it does to multiply two numbers together and create it in the first place; this is the standard method of cryptography currently used. It's based on the ancient science of number theory, and since it involves both division and multiplication it also involves addition and subtraction, and new operations beyond these four are not likely to be found. If it were not mostly sound, the web would have long since crashed.
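Here's a minimal sketch of the asymmetry that top level leans on: multiplying two primes together is one quick step, while getting them back out takes a long search. The primes below are tiny toy values picked for the example; real systems use primes hundreds of digits long, which is exactly why the factoring direction becomes impractical.

```python
# Multiplying two primes is instant; recovering them from the product takes
# brute-force trial division. Toy values only, far smaller than real keys.
p, q = 104_729, 1_299_709          # two small primes
n = p * q                          # creating the "code" is one multiplication

def factor(n: int) -> tuple[int, int]:
    """Recover p and q the slow way: trial division up to sqrt(n)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

print(n)            # 136117223861
print(factor(n))    # (104729, 1299709) -- found only after ~100,000 divisions
```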

The third level is just the binary ones and zeros that the machines encode the prime numbers into, for ease of making simple chips.

The second level, however, is where the problem seems to be; after all, a number is just a number until it gives instructions to a machine. This is why even a single number out of place can bring a machine down. The machine can't understand whether it's under attack; it's mostly a robot, per se.

At the dawn of the internet we would hear of the wonders of all computers speaking the same language. Actually, this isn't a language yet, just a series of instructions, which leads to my thought about the code: how is it that when we speak we understand, and we know the other person isn't trying to hit us (bosses are the exception; hint, boss, don't read this)? The answer is that we share a language with rules of grammar and vocabulary, etc. This isn't just code we always accept without hope for more. To say computers speak a common language seems like saying a copy machine is reading and understands the volume it copies. And here may be an important solution, or at least improvement, to the problem of bad code.

If we re-edit the code and make it like a real language, with steps the way we speak in sentences, in two-part components (the noun-verb pair found in all languages), with rules of grammar for each step, the machine could perhaps know what's allowed and what's to be stopped. In essence, the noun-verb would always be about physical action, just as in brain research any change in ideas has a change in physiology in parallel, etc. Another form of the noun-verb might be question and answer, sort of like Plato's method of dialogues. (Why are you doing this? Like a claim and proof in rhetoric, the straight line and the punch line in comedy, a setup, a bluff, and a move in the NCAA, or Noether's idea that all there is can be classified by what changes and what stays the same, important to physics and the foundations of math.) If the computer doesn't agree to a claim, it could stop just that claim and save the rest, leading to a big reduction in computer idiocy.
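To make the noun-verb idea a bit more concrete, here's a minimal sketch of a machine that accepts only the claims its grammar explicitly allows and stops just the claims it doesn't, keeping the rest. The message names and the whitelist are invented for the example; this isn't any real protocol.

```python
# A toy "noun-verb grammar" filter: each step in a message must be a
# (noun, verb) pair the receiver's grammar explicitly allows. A refused claim
# is stopped by itself; the rest of the message survives. Names are hypothetical.
ALLOWED = {
    ("printer", "print"),
    ("report", "save"),
    ("mail", "send"),
}

def filter_message(steps: list[tuple[str, str]]) -> list[tuple[str, str]]:
    accepted = []
    for noun, verb in steps:
        if (noun, verb) in ALLOWED:
            accepted.append((noun, verb))            # claim agreed to
        else:
            print(f"stopped claim: {noun} {verb}")   # claim refused, rest kept
    return accepted

message = [("report", "save"), ("disk", "erase"), ("mail", "send")]
print(filter_message(message))   # [('report', 'save'), ('mail', 'send')]
```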

My mom, the English professor, says: speak and write well, really well, not real good. You look up the definitions of usage and, like, who cares; there is no standard usage, just a hope by schoolmarms to control the language, which has never won and never will. With computers, on the other hand, her utopia might be realized! The machines might not fail if all definitions are set at the outset. This is essentially what we failed to achieve when the codes of the second level were founded. Like synapses, which cover much more area than the neurons alone, making the code a real language might be both more reliable and more economical than doing away with codes as above. The only problem here may again be standard usage, since presumably there is more worth in letting the language evolve, just as with real language, e.g., for the economy of making cheap machines. Even so, I believe redefining the web by this method may save it from malware.


I've defined the three levels of the code. As with the "simple language," there is the possibility of higher levels of the code; this would be like a spy who sends a letter with code words or letters embedded in the usual sentences. However, if the letter is only read at the middle level and no higher, because what the machine reading the letter knows is well proven, no bad code would get through. This would be like a third person, like me, reading the spy letter and saying, well, duh! No doubt the machine might call the boss so the higher code is stopped; even so, real language might well stop malware.
..

Monday, October 17, 2011

Problems and Solutions About "Linked Pages"

A comic "Rate Schedule" on a sign I read:


Answers: $1

Answers That Require Thought: $2

Right Answers: $3

Dumb Looks Are Free



Tim Berners-Lee founded the World Wide Web in 1989. Around Christmas Day 1990, so the story goes, he had the first web page running on one computer; that was the size of the web back then, and high speed is all there was and ever will be, or whatever.

Tim is up to his tricks yet again, yes! His hope is for "linked pages," where you type in a question and get an actual answer, not one based just on the popularity of a site, as with Google. Tim offers the example of some type of enzyme that solves a problem in GE or DNA: type it into Google and you find nothing, but type it into the linked pages and you get 8 possible compounds that could more definitely be the answer, all of them addressing the problem directly. This could be of huge worth, e.g., much better for the millions seeking health solutions; it might literally save millions of lives and billions of dollars. Linked pages may be a solution. Now, if I type in "linked pages, when will they be available," I get not much, but with linked pages I would. While high speed is a relief for web woes, studies show so many people are so frustrated with their computers that this may also be a real improvement for most people who want to use the web more efficiently, like me!



THE PROBLEM:

As with the "rate schedule," we can find answers, but how do we know if the answers are true? This could cause problems; for example, each year thousands of doctors, lawyers, and Pizza Hut fans are slandered, they don't know why, and their business is harmed. You know, there's even a site for the defense of these people, reputation.com, devoted just to protecting their business against the people who slander them on the web, etc.

With linked pages there is the possibility that literally anyone could just go online and type in any lie they like, and then anyone else who asks about them, like a potential employer running a background check, finds all the answers, and they are all lies.


SOLUTIONS

Though this is bad, and though I believe linked pages are better than not, I think there may be ways to find the truth:

One would be to separate, on the search engine, the linked pages from the popularity pages. This has the problem that 90% of the books published back when there were actual volumes were nonfiction, that is, problem solving, and so most people would seem to skip the popularity pages and go right to the linked pages; thus another strategy might be best...

In the ancient art of rhetoric there is a discussion, and when, often inevitably, a dispute arises, there are two ways to resolve it; not even Tim Berners-Lee typing in 200 million pages and typing out is worth more! The two ways are either to ask an expert, or for the person making the disputed claim to improve it and resubmit it for consideration; then, if it's not accepted, to either ask another expert or improve it some more, resubmit, etc.

For web pages, to know the truth, using the popularity of sites may be one way to ensure that most people have already agreed that the page is improved enough for consideration, and the other way is to ask an expert, i.e., authority sites like Wikipedia for science and sites like WebMD for medical matters. This would be based on a more complex ranking system, the best of both linked pages and popularity.
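As a rough illustration of that kind of blended ranking, here's a small sketch; the weights, the scores, and the list of authority sites are all invented for the example, not how any real search engine works.

```python
# Toy blended ranking: combine a linked-data answer score, a popularity score,
# and an authority bonus. All numbers, weights, and site names are made up.
AUTHORITY_SITES = {"wikipedia.org", "webmd.com"}

def rank(page: dict, w_answer: float = 0.5, w_pop: float = 0.3, w_auth: float = 0.2) -> float:
    authority = 1.0 if page["site"] in AUTHORITY_SITES else 0.0
    return (w_answer * page["answer_score"]      # how directly it answers the query
            + w_pop * page["popularity"]         # how many people already agree
            + w_auth * authority)                # expert / authority bonus

pages = [
    {"site": "randomblog.net", "answer_score": 0.9, "popularity": 0.1},
    {"site": "webmd.com",      "answer_score": 0.7, "popularity": 0.6},
    {"site": "bigportal.com",  "answer_score": 0.2, "popularity": 0.9},
]
for p in sorted(pages, key=rank, reverse=True):
    print(f"{p['site']:15s} {rank(p):.2f}")
```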

..

Thursday, October 13, 2011

Why Do Galaxies Spin as a Solid Body?

Hubble thought that the type of galaxy called a giant elliptical (a round oval!) is in transition to the spirals or barred spirals. In this scenario the newer galaxies are the ones with quasars, and then they lose power and settle down to a disc with many small spiral arms. Over time these could somehow evolve into fewer, larger spiral arms. Later research disproved this simple evolution scheme, because spiral arms don't actually form in ellipticals, which have too little dust for more star formation. Ellipticals have quasars and the spirals don't, so it would seem the lower-energy spirals have been without a quasar continually and from the start: the more massive galaxies are massive from the start, and the lower-power galaxies are shaped by lower mass from the start.

Perhaps the quasars are so radiant that they push all the dust away, and though the mass is great, fewer new stars may form, unlike in spiral galaxies, which continue to form new stars from the dust.

The problem of solid-body rotation is common to all types of galaxies, and so is the problem that they all spin faster than the gravity of their visible mass can hold against the centrifugal force, so they should fly apart.
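For a sense of the size of that mismatch, here's a small sketch comparing the orbital speed you'd expect from the visible mass alone with the roughly flat curve actually observed in many spirals; the mass figure and the 220 km/s number are round illustrative values, not measurements of any particular galaxy.

```python
# Expected orbital speed from visible mass alone, v = sqrt(G*M/r),
# versus the roughly flat ~220 km/s curve observed in many spirals.
# M and the flat speed are round illustrative numbers only.
import math

G = 6.674e-11                      # m^3 kg^-1 s^-2
M_visible = 1e41                   # ~5e10 solar masses of visible matter (assumed)
kpc = 3.086e19                     # meters per kiloparsec

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * kpc
    v_kepler = math.sqrt(G * M_visible / r) / 1000    # km/s, falls off as 1/sqrt(r)
    print(f"r = {r_kpc:2d} kpc   Keplerian {v_kepler:6.1f} km/s   observed ~220 km/s")
```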

It's been assumed either that there is the missing mass of dark matter holding the galaxies together, or that perhaps gravity is just stronger at great distances than usual. If you look at spiral galaxies before they are old, you see a wheel of solid light, and around the outside are smaller streamers; these then accrete into more and more massive dust arms with regular spacing. These small arms seem to be what become the larger arms by consolidation. The cause of the streamers and the consolidation of the arms, I believe, may be black holes, and this may also be why all the galaxies spin as a solid body and spin too fast.


A Possible Mechanism of the Creation of Black Hole Jets

We could well assume the jets are just a carryover from the same physics as dense stars and other superdense magnetic fields. There would be problems with standard relativistic physics here, because the escape velocity is so high: if nothing can move faster than light, no jet can reradiate, and yet the jets we see are often hugely energetic. A simple explanation of a barred spiral is just that the jet is so massive it makes the galaxy spin as a solid body; like a solid, well-unified line of mass, the jet is so strong that the rotation is faster. This might also be where the solid rotation of other galaxies arises: if you look at the dust lanes of spiral galaxies, they might be seen as low-energy bars, a rounder, looser continuation of this mechanism. Thus the spirals are connected more loosely, but they still have enough inner attraction to spin both as a solid body and faster than they would if not held together by the more massive cohesion of the smaller black holes.

This would be how the spin is unified: the arms are ploughing through the field with stronger cohesion, and the resistance against this somewhat less resilient solidity than the more massive jets of barred spirals is what makes the spiral; this is why the spiral arms always lead and never trail the motion of the spin. When the galaxies are older, an odd number of lanes is allowed, because there are an odd number of central black holes not yet merged; if there were just massive poles, there would always be an even number of lanes. Ultimately this idea predicts that later in the evolution there would tend to be more even numbers of poles.


How Do the Jets Move Faster Than Light, and What Other Physics Might Be Involved?

Stephen Hawking used the idea that the field is so dense near the event horizon of a black hole that virtual particle-antiparticle pairs are created; some fall in, because of more mass, while others radiate out, because of more energy, leaving the black hole with an ever-dwindling mass. Somehow, with time, this causes more and more evaporation, leading to the explosion of small black holes, e.g., those left over from the big bang. I think that if a black hole were losing mass this way it would have less and less gravity with which to make more radiation, and so there would be no exploding black holes; Hawking radiation has reportedly already been demonstrated in analogue form with strong fields in the lab. If Hawking had looked at the supermassive bodies then known, he would have seen no radiation out of the event horizon, and the jets would be disproof of his other idea, that all the matter and information that falls into a black hole is lost.

The mechanism he used, though not his own discovery and though not a quantum theory of gravity as hailed by the press in those days, can still be useful to physics, perhaps. After all, knowledge of how common matter interacts with gravity is not a quantum theory of gravity either, yet this was the basis of Newton's improvements in physics, and perhaps, I think, of some of relativity and gravity. I think of supermassive fields as ionizing machines, perhaps by a method like Hawking's. Inside the event horizon there would be great tidal forces, and this could lead to the creation of particle-antiparticle pairs, as in Hawking's idea. This not only creates lots of particles, it also separates them fast, and thus perhaps causes lots of charge separation, then unified toward the poles. This could be much the same as the physics of more common stars like pulsars, but with the twist that the gravity is almost strong enough to completely overcome the quanta; that is, for a lower-power star there's a certain delay that takes time to connect, a real measure of inefficiency. Thus gravity beyond a certain point might have more ions created. There would be many more leptons than heavy particles inside the event horizon, because having less mass they can be more easily created. Deeper in, heavier particles would also be created; these, however, have lots of inward radiation pressure keeping the tidal force from causing separation, unlike the leptons in free fall above, which accelerate more freely. This heavier area of compression can't stand up to the gravity by any force known yet here.

't Hooft and I agree there is no proof of the destruction of mass-energy, and I believe there is here a fifth force, and a reacting sixth force by Newton's Third Law, inside massive fields; this would explain why the jets have unusual spectra not seen in the accelerators, and also, of course, the source of the jets, which cannot be fusion. One important thing to look for in the particle accelerators might be this fifth and sixth force, to explain these anomalies and unify the earth and heaven as in old Christmas celebrations, old celebrations all month! Perhaps the results from the LHC are involved. Though there are no long-range forces to be seen other than gravity and electromagnetism, more force at close range might have something to do with the unexplained bonding of heavy particles recently seen by the LHC. By the energy of the jets we may then find these elusive particles, though presumably they are not stable. (As we go from lower to higher mass in common physics, beyond the leptons all the particles generally have shorter and shorter lifetimes for more mass, so more massive forces may not be stable except under extreme gravity. They would mostly radiate out with super power only via gravity, and probably not be useful for bombs, because if there is no stability except for short times there is no fuel like U238 nearby.)

As I believe, the jets perhaps need faster-than-light relative motion to reradiate out against the black hole's speed-of-light inward motion of the field. Feynman believed faster-than-light signals were the only way to explain black hole causation by unifying the outside with the inside, or indeed energy conservation would have the energy destroyed as it falls in. I agree with Van Flandern, and no doubt with Einstein's own idea of the EPR, a faster-than-light connection. See the upper left of my page Physics Synopsis for my reasons why I believe gravity and the EPR may be faster than light.

Note that the polar gravity of a star is stronger than elsewhere, so the inward acceleration is even greater at the polar outflow than it is inward at the disc; and if it's already at the speed of light there, just using something like the "antigravity" of the polar magnetic field is not enough to make the inward acceleration less than the speed of light, since the gravity there is stronger, not weaker. Thus the jets have no way out without faster than light.
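For reference, here's the standard escape-velocity arithmetic the jet argument runs up against, worked for a made-up 10-solar-mass black hole: at the Schwarzschild radius the escape velocity comes out to exactly the speed of light.

```python
# Escape velocity v = sqrt(2*G*M/r); at the Schwarzschild radius r_s = 2*G*M/c^2
# this equals c exactly. The 10-solar-mass figure is just an illustrative choice.
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m/s
M = 10 * 1.989e30      # 10 solar masses, in kg

r_s = 2 * G * M / c**2
for r in (10 * r_s, 2 * r_s, r_s):
    v_esc = math.sqrt(2 * G * M / r)
    print(f"r = {r/1000:8.1f} km   v_esc = {v_esc/c:.2f} c")
# At r = r_s the escape velocity is 1.00 c, so nothing slower than light gets out.
```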

If we have a bucket and try to lift it in the Earth's field, we don't need to lift it at escape velocity; so too, it might seem that the field of the jets wouldn't even need to move at near the speed of light to exit the black hole. A less dense field of gravity would perhaps give way to more pressure from the leptons. But we do need a way to lift the bucket, and that would be a wire moving faster than the Earth's field. Thus the only way for the jet to connect, I believe, would not be the common electromagnetic field, but rather the familiar matter-wave field Feynman, the Quantum Man, found in the foundations; he seemed to realize well that there was a need for the field connection of massive fields to be faster than light. Thus there would be the basic tidal force to create lots of leptons, which are then separated by the magnetic jets, an outlet for the field. After separation the particles would be drawn apart and inward to the poles by the magnetic force that causes cohesion of most of them, and beyond a certain point, as the charges rise, they would perhaps start to repel and also attract, and this would draw more charges up by a sort of induction. It wouldn't just be gravity that powers black holes; this upward attraction from the poles would allow radiance to rise. If there were no outlet there would be nowhere for the leptons to go and the process would stop. If the main motion of the jets is mostly leptons, they, being lighter, can rise even in strong fields, while the heavier particles stay nearer the center. None of this would violate lepton number or baryon number, etc. At lower energy, as in common black holes of, say, 10 solar masses, no superfusion would take place, while at high enough densities, as in barred spirals, the lights would turn on, with higher power for more density.

The source of much of the power, other than the lepton reradiance of the jets, would be just the gravity converting the field into leptons by this mechanism, somewhat like Hawking radiation inside the event horizon. This would be why the huge jets of some realms are seen without any disc to fuel the machine. Indeed, the implosion of the field could be a sort of dark matter, as I say in the two posts below. The dark matter particles would flow in at lower power and then become the leptons. As for the question of why the implosion of gravitons here would give only half the increase of mass over time that astronomers observe, compared with the predictions of this motif, I believe it may be refined to merely say that the gravity always has to win, and thus that the electric reradiance of the poles being only half the gravitational implosion is mostly coincidence; there are few numbers that aren't vast or huge in any kind of connection between forces in subatomic physics like this. It seems there always has to be inward radiance by gravity and then reradiance, but the gravity will always win, or there would be things like gravity shielding. Sooner or later, with increases in the power of astronomy and of the field, there would be nearer levels of gravity with the electromagnetism. (This is what's interesting about the LHC finding of the anomalous union of the higher-energy force: it might be near enough to the density of the field of a neutron star or a black hole to be near the 5th and 6th force.)

Even so, if we add more mass, as with more massive black holes, then, like Hawking, who I believe agrees with Einstein here, more mass would increase more with time. For example, the older galaxies are thousands of times more massive than young galaxies, often with no visible way seen to increase the mass; if the field itself were the cause of ion power, this could then unify the spiral arms of the galaxies. The outflow would be highly ionized, and the attraction would make the galaxies solid and make them spin fast, and so on. We might expect to see zones of north and south leptons, with a black hole at the center of each zone. The ion outflow attracts the stars. You might ask, if the electric charge is so much stronger than gravity in ions, why don't the black holes move together? Actually they do, to form the larger arms later; even so, the interstellar distances may be so large it takes a long time, and most of the ions may combine outside with other ions, neutralizing much of the attraction. Most star systems are binaries, and so most black holes bound to a star would have twice as much force to move them nearer to other black holes.

The same process of cohesion would occur in elliptical galaxies, except they would have more random angles, jumbled up by the quasar; this would be why they're round while other massive systems are not. Superclusters, for instance, would be more oval because they have more random angles among the jets of each member of the system.
..

Monday, October 03, 2011

If Mass Is Shieldable by "Renormalization," Why No Eclipse Shielding?

Renormalization, the most accurate physical theory ever devised (until recently, when the pulsar slowing rate predicted by General Relativity was confirmed by eclipses), is not complete. Initially there was the problem that the loops of the field, by the Uncertainty Principle, seem to get stronger with reduced distance, down to zero distance, where the mass would be infinite. This was then solved by finding the opposite component of the field; subtraction leaves a finite residue, with the right mass and charge of the electron, not infinite. While this is no problem for the electric charge, since it shields anyway, the mass itself would otherwise be infinite, so renormalization works by assuming mass can be cancelled. The problem here is that this is also a form of shielding, and gravity doesn't shield. (The inertia of the electron seems to be no problem, because inertia, as in GR, has no source and is internal.) The mass, however, would be huge, and if it could be subtracted it would seem to shield, as in eclipses. I think of the friction as a sort of antigravity: the tides of the moon are a short-range result of the mass of the electron and other particles, and this makes the moon slowly spiral out; or, more exactly, of the shielding of the electric charges by the renormalization. If we can shield gravity by way of the electric charges, why not gravity itself? Something else is going on with gravity.

In the next post I consider the evidence for an intermediate-energy field, an aurora around each mass, neither as dense as the electric field nor as strong as the mass field of gravity. This would be important to dark matter, etc. Here may be the solution to the problem of gravity and renormalization: if gravity were to interact directly with the mass of most subatomic motes, it would have to be at their wavelength, and so be a particle, and so it would not only not be gravity, and be much more massive, it would also already shield. However, if each particle and mass has an aurora, it has higher volume, and the volume itself would be lower energy, more like gravity: gravity lite. Only locally is the mass huge, and only the subtracted value is seen. Even so, some of the gravity waves would hit the actual masses, even if just the small actual zone of the electron, and so at super high densities, as in the eclipses of black holes, we might see some slowdown via the shielding. Though possible, this would be tough to find, because a singularity is not large and the distances are somewhat vast.


The mechanism of the field would seem to need reversed entropy, since gravity winds up the cosmos when it would seem to have wound down over infinite time, with energy neither being created nor destroyed; you might say entropy is not a change in the energy of the cosmos, just a rearrangement of it. But high-energy physics, like the quanta, is mostly about conservation of information, and if all the matter were rearranged into a cosmic fizz this would essentially violate energy conservation. With the low-energy interaction of the gravity waves with the more non-waved but still mostly wavelike components of each halo, the entropy would be easier to generally overpower, gravity imploding and not reradiating so much (except at lower energy, as Einstein thought, to conserve field momentum; if the earth were imploding only, it would gain two earth masses a second, and the lower-energy reradiant field wouldn't exert as much force as the higher-energy fields, so we don't fall off the earth). Thus gravity, unlike the other forces, seems to be a sort of one-way valve for force, and the force it exerts is by overpowering the entropy. Particles alone couldn't achieve this, because they are disconnected; so the field would be mostly wavelike, and this would be continuous attraction. In order to directly impact a renormalized zone, gravity would need to be a dense particle, and so by definition non-implosive, a contradiction; so the halo would explain the low-energy modus operandi. Each mass has the field, which would interact with the lower-energy waves, and the waves don't mostly have to be particles for the field to find out each zone of mass and unify it. The entropy reversing is easier to achieve via waves than particles because waves are continuous, even though some particle force is needed for general discontinuity, like the heavier particles, and to explain why, when you fall, there's no force in your rest frame. The particles flowing along with you are in mostly discontinuous connection and exert pressure once you are at rest on the roof, by their acceleration of just 32 feet per second squared.

I believe, no doubt, experiments are being done to find the Higgs; even so, I disagree with Hawking's idea that there is probably no unified field. Imagine a city where we have all these roads and steps and bridges and other definite ways to reach the other side by travel: taxicab geometry lives! The city and its ways are set, and we might say each special road and zone is basically disconnected from the other side; we can't ever reach the other side without obeying all the signs, and no faster-than-light travel is allowed as far as trains and stations and old, old watches are involved. If the connections are not obvious, we might say the street is unchangeable, and so all we will find is not a unified field, etc., rather just a set of mosaics that are disconnected, loosely connected, and not unified. Imagine a low-energy Higgs, like the low-energy particle Einstein used to get around the Uncertainty Principle. If this is a fundamental particle with which to build the city, you have a new block and new streets, a whole new world of states of matter, and perhaps the constants are changed and explained. All the fundamental reality is proven or controlled. I truly believe in one field, because energy conservation is well proven for all physics; this is a good reason to believe Hawking is wrong here. He's good at popularizing science, though I think he's wrong in this case.


Some believe in many Higgs particles, and I believe the two main things the Higgs is predicted to be used for are in contradiction. A higher-energy Higgs may be used to unify electromagnetism, as is needed to "make the muon weigh," but the claim that it can also be the fundamental source of mass seems contradictory. High mass and low mass are not the same. Thus the predicted high-energy Higgs, as at the LHC, may not itself be the low-energy mass particle. There are many masses of subatomic particles, and it would take a lower-energy, wavelike Higgs not to cause a huge quantum jump in the masses as measured; in other words, the graviton, the particle that would be the Higgs.

It could be argued that renormalization could be used to make the Higgs have low mass externally while keeping the higher mass just for the heavier fields, but if there were a much larger number of the heavy particles, each small bit of shielding by gravity could cause a delay in eclipses that is not seen. A smaller number of non-eclipsing particles makes it easier to explain why no eclipse shielding has yet been found, compared with more particles of much larger surface area, etc.

More About the Low-Energy Particle: Recent Experiments



(Link to site about the twisting light experiment).