
| Vacuum - Information - Measurement |

Is there any Vacuum?

To determine whether any vacuum exists, we must get information about the coordinates of that vacuum. And getting information is itself a process.
-         For example:
If we want to know which kind of element is placed at which coordinates, we must send (dispatch) some waves and get information from the effects of the element on those waves. Or we can do it by receiving the waves rising from the element. (Like a spectrograph.)

If we want to get information about a vacuum:
First, it is impossible, because vacuum = null (empty coordinates), and getting information from null is meaningless. (There is nothing there, not even information.)
Second, the vacuum would be destroyed, because dispatching waves of any kind will destroy it. Waves have unlimited reach (wave functions).
So supposing a vacuum (in the exact sense) is wrong.
So much for the exact meaning of vacuum.

In physics, vacuum (in the exact sense) is not accepted. Physics accepts that there is no 100% vacuum (like zero friction; indeed we can treat vacuum and friction = 0 alike in this respect).
Concepts such as vacuum fluctuation (which discuss creation from nothing) do not actually talk about vacuum.
Energy (strings of energy), waves, and elementary particles are swinging there.
In physics, vacuum means no matter (such as protons or neutrons).
-         For example, the spaces between electrons and protons are vacuum, but they contain lots of waves and photons, and of course information.
In vacuum, elementary particles swing, and they transfer and receive information.

So null and empty (vacuum) in the exact sense are impossible.
And the place of a vacuum is meaningless; empty coordinates are also meaningless, because there is information there.
Because of this, creation from nothing is meaningless.
-         Using the word "from" means there is something.

(According to measurement and information theory, particle physics, the quantum vacuum, vacuum fluctuations, etc.)








| DarkMatter/Energy - Measurement - Quantum Gravity |

Gravity in Quantum Scales


Description of Dark Matter/Energy;
|| Since any model of structure formation must explain both the tiny ripples in the Cosmic Microwave Background temperature across the sky, and the large-scale structures we see in the universe today, the combination of these two probes is especially powerful. Together they enable us to probe the spectrum of fluctuations over about 4 decades in length scale and its evolution over almost the entire age of the universe. This information, plus information from smaller, high-redshift (very early time) surveys of galaxies and neutral gas, will enable us to piece together the mechanism for the formation and mode of evolution of all of the structure we see around us in the universe.

Galaxy surveys with well-defined selection criteria enable us to extract information about the clustering pattern. (Surveys during the early 1980s showed that galaxies are not distributed randomly throughout the Universe. They are found to lie in clusters, filaments, bubbles and sheet-like structures.)

To characterize the distribution of galaxies, a number of statistical tools have been developed. The most widely used approach to quantifying the degree of clustering observed is to measure correlation functions. For example, the two-point correlation function is the probability, in excess of random, of finding a galaxy at a fixed distance from a random neighbor. Beyond this measurement, one can investigate higher-order correlation functions or, for example, the distribution of counts in cells: the distribution of the number of galaxies found in cells of a given size which one lays down atop the survey.
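(An illustrative aside, not part of the quoted description: the two-point statistic above can be sketched with the simple Peebles–Hauser estimator DD/RR − 1 on an invented clustered toy catalogue. All names and numbers below are hypothetical.)

```python
import random
import math

def pair_counts(points, r_lo, r_hi):
    """Count pairs whose 2D separation falls in [r_lo, r_hi)."""
    n = 0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            if r_lo <= math.hypot(dx, dy) < r_hi:
                n += 1
    return n

def two_point_xi(data, n_random, r_lo, r_hi, box=1.0, seed=0):
    """Peebles-Hauser estimator: xi = (nR(nR-1))/(nD(nD-1)) * DD/RR - 1."""
    rng = random.Random(seed)
    rand = [(rng.uniform(0, box), rng.uniform(0, box)) for _ in range(n_random)]
    dd = pair_counts(data, r_lo, r_hi)
    rr = pair_counts(rand, r_lo, r_hi)
    if rr == 0:
        return float("nan")
    norm = (len(rand) * (len(rand) - 1)) / (len(data) * (len(data) - 1))
    return norm * dd / rr - 1.0

# A clustered toy "survey": points bunched around a few cluster centres.
rng = random.Random(1)
centres = [(rng.random(), rng.random()) for _ in range(5)]
data = [(cx + rng.gauss(0, 0.02), cy + rng.gauss(0, 0.02))
        for cx, cy in centres for _ in range(40)]

xi_small = two_point_xi(data, 1000, 0.0, 0.05)   # cluster scale: excess pairs
xi_large = two_point_xi(data, 1000, 0.3, 0.35)   # large scale: near random
print(xi_small, xi_large)
```

On cluster scales the estimator finds a large excess of pairs over random; on scales much larger than the clusters it drops toward zero, which is exactly the signal the surveys measure.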

As the new era of surveys catalogues the 3D positions of millions of galaxies the biggest uncertainty will become the mapping from the clustering of different galaxy types to the clustering of the underlying matter which the theories most straightforwardly predict. There is no reason to expect that the galaxies will trace the matter exactly. The fact that the light does not trace the `mass' is usually called galaxy bias (one of several technical uses of the term bias in large-scale structure) and poses a fundamental problem on which much work remains to be done.

Clusters of galaxies are the largest gravitationally bound systems in the universe, with sizes of a few Mpc (a Mpc is about 3 million light-years). A typical cluster contains hundreds or thousands of galaxies, but most of the mass is in the form of a hot intra-cluster gas. This gas is heated to high temperatures in the potential well of the cluster. Clusters are rare objects: less than 1 in 10 galaxies in the universe resides in clusters, the rest are said to be field galaxies.

The two most obvious means of studying clusters of galaxies are by observing the light emitted from the constituent galaxies or the X-ray emission from the hot intra-cluster gas. Recently it has proved possible to observe clusters of galaxies in two other ways which (in combination with the traditional methods) should prove exceptionally powerful. The first, known as the Sunyaev-Zeldovich effect after the people who first proposed it, is to observe the cluster as a hole in the microwave sky. Due to the free electrons in the hot intra-cluster gas, photons from the microwave background are up scattered in energy. This leaves a decrement or deficit in the number of photons at low-frequency or a hole in the microwave sky.

Obviously clusters trace out the large-scale structure of the universe just as galaxies do. However, there are several cluster properties that are interesting in and of themselves.

The present number density of clusters is a measure of the amplitude of fluctuations in the universe on scales of around 8 Mpc.

The evolution of this number density (vs mass or temperature) with red-shift can determine the mass density parameter Omega. Recently, attention has also turned to the number of giant arcs caused by strong gravitational lensing of background galaxies by the cores of clusters.

The Bolshoi supercomputer simulation, the most accurate and detailed large cosmological simulation run to date, gives physicists and astronomers a powerful new tool for understanding such cosmic mysteries as galaxy formation, dark matter, and dark energy.

Meeting this challenge, the scientists ran a simulation (the Bolshoi supercomputer simulation, the most accurate and detailed large cosmological simulation to date) that traces the evolution of the large-scale structure of the universe, including the evolution and distribution of the dark matter halos in which galaxies coalesced and grew. Initial studies show good agreement between the simulation's predictions and astronomers' observations.

These huge cosmological simulations are essential for interpreting the results of ongoing astronomical observations and for planning the new large surveys of the universe that are expected to help determine the nature of the mysterious dark energy.

The standard explanation for how the universe evolved after the Big Bang is known as the Lambda Cold Dark Matter model, and it is the theoretical basis for the Bolshoi simulation. According to this model, gravity acted initially on slight density fluctuations present shortly after the Big Bang to pull together the first clumps of dark matter.

Although the nature of dark matter remains a mystery, it accounts for about 82 percent of the matter in the universe. As a result, the evolution of structure in the universe has been driven by the gravitational interactions of dark matter. The ordinary matter that forms stars and planets has fallen into the gravitational wells created by clumps of dark matter, giving rise to galaxies in the centers of dark matter halos.

We also have a new and different technique that has allowed astronomers to observe radio light from hydrogen gas dating from when the universe was about half its current age. This is the farthest scientists have ever observed such gas.

The method, called intensity mapping, could eventually reveal how large-scale structure has changed over the last few billion years, giving insight into which theory of dark energy is the most accurate.

The project mapped hydrogen gas to greater cosmic distances than ever before, and shows that the techniques they developed can be used to map huge volumes of the Universe in three dimensions and to test the competing theories of dark energy.

Since the early part of the 20th century, astronomers have traced the expansion of the universe by observing galaxies. The new technique allows them to skip the galaxy-detection step and gather radio emissions from a thousand galaxies at a time, as well as all the dimly glowing material between them.

This is a demonstration of an important technique that has great promise for future studies of the evolution of large-scale structure in the Universe.

We can also see black holes in the picture of large-scale structure: a new study of the early universe shows that black holes formed earlier than expected and that structure in the pattern of galaxies extends to larger distances than expected.

A key feature of this model is the idea that the birth of black holes in the centers of supergiant galaxies is strongly influenced by the large-scale distribution of matter in the universe. This conjecture can successfully explain two observed phenomena: the alignment of the radio, optical and infrared axes of high-redshift radio galaxies, and the alignment of present-day cD galaxies with their environments.

The ruling paradigm says that galaxies formed when hydrogen gas and dark matter slowly clumped together under their own gravitational pull. Stars formed and continued to collapse together into galaxies. The early stars, which were large, would die quickly and form black holes, which would coalesce into super-massive black holes at the centers of galaxies.

The process was seeded by density perturbations in the gas that existed at the time of the last light scattering. The effects of these perturbations are seen in the cosmic microwave background and are very familiar to cosmologists. They are believed to be due to fluctuations during the inflationary epoch and they have the right scale invariant spectrum to fit that hypothesis.

The paradigm predicts that the black holes form after the stars, yet we see quasars appearing in the early universe containing huge black holes that must have formed much earlier.

We also observe structure in the distribution of galaxies that extends out to very large scales. This is not predicted by the cold dark matter theory of structure formation. An example is the Great Sloan Wall, a vast planar structure covering 5% of the size of the observable universe.

One possible answer is that they did not form through gravitational collapse at all, but instead by a process of caustic focusing of dark matter by gravitational waves.

We know very little about how the inflationary epoch ended. The vacuum state would have changed as the inflationary scalar field dropped into a broken phase. There may have been a phase transition but it may have been a soft second order transition or even a smooth crossover. We don’t even know when it happened.

It may have been the electro-weak transition or something earlier.

It is likely that the transition did not happen simultaneously at all points in space. Fluctuations would mean that inflation continued a little longer in some places than others. This would leave a remnant gravitational wave background in the universe which in time would have cooled and weakened as the universe expanded more slowly. It would be hard to detect directly today because of its very low frequency and weak amplitude, but in the early universe during baryogenesis it would have been stronger.

||

-         To be continued…
Physicsism will complete this by studying the expansion of the universe, and the relevance of red-shift and of fluctuations of elementary particles at the edge of the universe to quantum information concepts…


We do not have good research or discussions that compare information and dark matter, or that look at dark matter through the peephole of information.
(For example, mini and micro black holes were proposed when physicists looked at black holes informationally. That was not exactly based on quantum information theory, but it was a good first step.)
Now I want your opinion and viewpoint on dark matter/energy and quantum information theory,
and also red-shift…
What are the effects of information on red-shift?
...



| Dynamics system - Information - Spin |

System of Information 

Description of Asymmetry & Symmetry by Miriam Strauss
||The term symmetry derives from the Greek words sun (meaning with or together) and metron (measure), yielding summetria, and originally indicated a relation of commensurability. It quickly acquired a further, more general, meaning: that of a proportion relation, grounded on (integer) numbers. From the outset, then, symmetry was closely related to beauty and unity, and this was to prove decisive for its role in theories of nature. In Plato's Timaeus, for example, the regular polyhedra are afforded a central place in the doctrine of natural elements for the proportions they contain and the beauty of their forms: fire has the form of the tetrahedron, earth the form of the cube, air the form of the octahedron, water the form of the icosahedron, while the dodecahedron is used for the form of the entire universe.

From a modern perspective, the regular figures used in Plato's and Kepler's physics for the mathematical proportions they contain (and the related properties and beauty of their form) are symmetric in another sense that does not have to do with proportions. In the language of modern science, the symmetry of geometrical figures such as the regular polygons and polyhedra is defined in terms of their invariance under specified groups of rotations and reflections. Where does this definition stem from? In addition to the ancient notion of symmetry used by the Greeks and Romans (current until the end of the Renaissance), a different notion of symmetry emerged in the seventeenth century, grounded not on proportions but on an equality relation between elements that are opposed, such as the left and right parts of a figure. Crucially, the parts are interchangeable with respect to the whole. They can be exchanged with one another while preserving the original figure. This latter notion of symmetry developed, via several steps, into the concept found today in modern science.

The first explicit study of the invariance properties of equations in physics is connected with the introduction, in the first half of the nineteenth century, of the transformational approach to the problem of motion in the framework of analytical mechanics. Using the formulation of the dynamical equations of mechanics due to W. R. Hamilton (known as the Hamiltonian or canonical formulation), C. G. Jacobi developed a procedure for arriving at the solution of the equations of motion based on the strategy of applying transformations of the variables that leave the Hamiltonian equations invariant, thereby transforming step by step the original problem into new ones that are simpler but perfectly equivalent. Jacobi's canonical transformation theory, although introduced for the "merely instrumental" purpose of solving dynamical problems, led to a very important line of research: the general study of physical theories in terms of their transformation properties.

The application of the theory of groups and their representations for the exploitation of symmetries in the quantum mechanics of the 1920s undoubtedly represents the second turning point in the twentieth-century history of physical symmetries. It is, in fact, in the quantum context that symmetry principles are at their most effective. Wigner and Weyl were among the first to recognize the great relevance of symmetry groups to quantum physics and the first to reflect on the meaning of this. As Wigner emphasized on many occasions, one essential reason for the increased effectiveness of invariance principles in quantum theory is the linear nature of the state space of a quantum physical system, corresponding to the possibility of superposing quantum states. This gives rise to, among other things, the possibility of defining states with particularly simple transformation properties in the presence of symmetries.
The first non-spatiotemporal symmetry to be introduced into microphysics, and also the first symmetry to be treated with the techniques of group theory in the context of quantum mechanics, was permutation symmetry (or invariance under the transformations of the permutation group). This symmetry, discovered by W. Heisenberg in 1926 in relation to the indistinguishability of the identical electrons of an atomic system, is the discrete symmetry (i.e. based upon groups with a discrete set of elements) at the core of the so-called quantum statistics (the Bose-Einstein and Fermi-Dirac statistics), governing the statistical behavior of ensembles of certain types of indistinguishable quantum particles (bosons and fermions).
The permutation symmetry principle states that if such an ensemble is invariant under a permutation of its constituent particles, then one does not count separately those permutations which merely exchange indistinguishable particles; that is, the exchanged state is identified with the original state.

The starting point for the idea of continuous internal symmetries was the interpretation of the presence of particles with approximately the same value of mass as the components (states) of a single physical system, connected to each other by the transformations of an underlying symmetry group. This idea emerged by analogy with what happened in the case of permutation symmetry, and was in fact due to Heisenberg (the discoverer of permutation symmetry), who in a 1932 paper introduced the SU(2) symmetry connecting the proton and the neutron (interpreted as the two states of a single system).

Symmetry can be exact, approximate, or broken. Exact means unconditionally valid; approximate means valid under certain conditions; broken can mean different things, depending on the object considered and its context. Symmetry breaking was first explicitly studied in physics with respect to physical objects and phenomena. This follows naturally from the developments of the theory of symmetry, at the origin of which are the visible symmetry properties of familiar spatial figures and everyday objects. There are two different types of symmetry breaking of the laws: explicit and “spontaneous”.
Explicit symmetry breaking indicates a situation where the dynamical equations are not manifestly invariant under the symmetry group considered. This means, in the Lagrangian (Hamiltonian) formulation, that the Lagrangian (Hamiltonian) of the system contains one or more terms explicitly breaking the symmetry.
Spontaneous symmetry breaking, occurs in a situation where, given a symmetry of the equations of motion, solutions exist which are not invariant under the action of this symmetry without any explicit asymmetric input (whence the attribute “spontaneous”). A situation of this type can be first illustrated by means of simple cases taken from classical physics.
In quantum physics SSB actually does not occur in the case of finite systems: tunneling takes place between the various degenerate states, and the true lowest energy state or “ground state” turns out to be a unique linear superposition of the degenerate states.
The prototype case of SSB is Heisenberg's theory of the ferromagnet as an infinite array of spin-1/2 magnetic dipoles, with spin-spin interactions between nearest neighbors such that neighboring dipoles tend to align.
The concept of SSB was transferred from condensed matter physics to QFT (quantum field theory) in the early 1960s; its application to particle physics in the 1960s and successive years led to profound physical consequences and played a fundamental role in the edification of the current Standard Model of elementary particles.
 According to a “mechanism” established in a general way in 1964, in the case that the internal symmetry is promoted to a local one, the Goldstone bosons “disappear” and the gauge bosons acquire a mass. The Goldstone bosons are “eaten up” to give mass to the gauge bosons, and this happens without (explicitly) breaking the gauge invariance of the theory.
||


 Asymmetry & Symmetry

Irregularity (high entropy) - lowest energy (use)

Regularity (low entropy) - highest energy (use)

- Decreasing entropy > fewest bits.

- Increasing entropy > most bits.

A causal (non-accidental) sequence, that is, a sequence with probability 1, carries the fewest bits.

A non-causal (accidental) sequence, that is, a sequence with unknown probability, carries the most bits.
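These statements match Shannon's measure of information: a step that is certain carries zero bits, while a maximally uncertain step carries the most. A minimal sketch (my own illustration; the example distributions are invented):

```python
import math

def shannon_bits(probs):
    """Shannon entropy in bits: H = -sum p * log2(p), with p = 0 terms contributing 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

causal = shannon_bits([1.0])        # probability-1 step: 0 bits (fully determined)
coin   = shannon_bits([0.5, 0.5])   # maximally uncertain binary step: 1 bit
biased = shannon_bits([0.9, 0.1])   # partly predictable: between 0 and 1 bits
print(causal, coin, biased)
```

So the more "accidental" (chaotic) the source, the more bits each outcome carries, which is the sense in which the text links high entropy to high bit counts.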

More information means a more powerful structure. It means harmony comes from chaos.

It is the fact of Nature to be chaotic.

It is reality.

It is nature.

The total state is obtained by combining the spin state with the spatial state of the particles; each part can be symmetric or antisymmetric, but the total wave function HAS TO be symmetric for bosons (and antisymmetric for fermions)...

(Bosons are the force-carrying particles.)

"Bell-State Analyser"

For two particles, the wave function can be non-zero with a symmetric spatial state only if the spin state is antisymmetric (in the fermionic case).

- It means a fully symmetrical universe cannot exist.

The universe can be symmetrical, but only in spatial states, as a virtual parameter, not in reality.
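The symmetric and antisymmetric combinations discussed above can be made concrete. Below is a minimal sketch (my own toy illustration; the two-state basis and amplitudes are invented) of projecting a two-particle amplitude onto its symmetric (bosonic) and antisymmetric (fermionic) parts. Note how the antisymmetric part vanishes when both particles occupy the same state, which is the Pauli exclusion principle:

```python
# Two particles, each in a two-state basis {0, 1}; a two-particle
# amplitude is a 2x2 table amp[a][b] = amplitude for (particle1=a, particle2=b).

def symmetrize(amp, sign):
    """Project amp onto its symmetric (sign=+1) or antisymmetric (sign=-1) part."""
    return [[0.5 * (amp[a][b] + sign * amp[b][a]) for b in (0, 1)] for a in (0, 1)]

product = [[0.0, 1.0],   # particle 1 in state 0, particle 2 in state 1
           [0.0, 0.0]]

bosonic   = symmetrize(product, +1)   # (|01> + |10>)/norm: allowed for bosons
fermionic = symmetrize(product, -1)   # (|01> - |10>)/norm: allowed for fermions

same_state = [[1.0, 0.0], [0.0, 0.0]]   # both particles in state 0
pauli = symmetrize(same_state, -1)      # antisymmetric part is identically zero
print(bosonic, fermionic, pauli)
```

Exchanging the two particles leaves the bosonic table unchanged and flips the sign of the fermionic one, which is exactly the exchange behavior the quoted description attributes to permutation symmetry.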
-----


Asymmetric Supernovae: Not All Stellar Explosions Expand Spherically;
http://www.sciencedaily.com/releases/2011/02/110224145803.htm


Surprising New Evidence for Asymmetry Between Matter and Antimatter;
http://www.sciencedaily.com/releases/2010/05/100524161338.htm


Using Chaos to Model Geophysical Phenomena;
http://www.sciencedaily.com/releases/2010/12/101207091811.htm


Physicists propose mechanism that explains the origins of both dark matter and 'normal' matter;
http://www.physorg.com/news/2010-12-physicists-mechanism-dark.html


Dark Matter Could Transfer Energy in the Sun;
http://www.sciencedaily.com/releases/2010/12/101201095822.htm
& …


If the laws of nature were perfectly symmetrical, we could not now observe any asymmetry, which is the basis of chaos.
-          For two particles, the wave function can be non-zero with a symmetric spatial state, so the spin MUST be asymmetric.

Several frames in one frame.
§  Must laws be symmetrical?
NO.
But we suppose at first that laws are always symmetrical. (And we take this supposition as a principle!)
-          For every frame and every time, laws are symmetrical. But if you suppose several frames inside one frame,

it becomes impossible to have symmetry for the primal frame. You walk on the floor, someone else walks on the wall, and I walk on the roof…
All of us measure our walking side as the floor!
You can understand it easily; just pay attention when you look at your LCD/LED screen.
Angles are important for measuring (for getting information).
(However, an LCD/LED is not an exact and complete example!)

Here we arrive at the Angle of Effect.
Laws act at pi/2 (a right angle) on a frame, but if the frame sits at an angle a relative to the primal frame, the laws will act on it at |pi/2 + a|.
So from another frame one will measure something different.
Now we can understand the concept of the hypercube (tesseract) better, and can explain the events inside it, and they will no longer seem exotic and marvelous to us.
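The idea that observers in mutually rotated frames measure different components of the same effect can be illustrated with an ordinary 2D rotation (a hypothetical sketch; the "effect" vector and the angles are invented for the floor/wall/roof picture above):

```python
import math

def measured_in_frame(vector, frame_angle):
    """Components of the same effect vector as seen from a frame rotated
    by frame_angle relative to the primal frame (passive 2D rotation)."""
    x, y = vector
    c, s = math.cos(frame_angle), math.sin(frame_angle)
    return (c * x + s * y, -s * x + c * y)

effect = (0.0, -9.8)                                 # "down" in the primal frame
floor_view = measured_in_frame(effect, 0.0)          # walker on the floor
wall_view  = measured_in_frame(effect, math.pi / 2)  # walker on the wall
roof_view  = measured_in_frame(effect, math.pi)      # walker on the roof
print(floor_view, wall_view, roof_view)
```

Each observer finds the same effect pointing along a different one of their own axes, which is the sense in which "all of us measure our walking side as the floor."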

But we can sometimes see symmetry in nature. What is it?
-          Yes, it is exactly symmetry, but only in some coordinates, for some viewpoints.
            Sometimes, in some special frame, for some observer, we have symmetry.

-         To be continued…


Next essay; Universe Expanding/Collapsing; Done!




| Measure theory - Quantum Mechanics - Frame concept |

Quantum Zeno Effect

Zeno was a disciple of Parmenides of Elea. Parmenides went around telling people that reality was an absolute, unchanging whole, and that therefore many things we take for granted, such as motion, space, and plurality, were simply illusions. At this period of time, motion and space were major philosophical questions. A philosophical (Gedanken or subjective) question is a real question which cannot be answered with the conceptual resources available at the time the question is posed.

During this time, from the Greek thinkers' viewpoint, Parmenides of Elea realized that people believed in too many fictional concepts. Fictional concepts like holes, space, time, and motion are all ideas that do not refer to anything and therefore are illusions.
His perception of space, motion, and time as fictional concepts can be interpreted in two different ways. Parmenides could be expressing that there is no space, time, or motion because reality is spaceless, timeless, and changeless.
Space, at this point in Greek philosophy, was thought to be either discrete or continuous. If space is discrete, then there are smallest units of it. These smallest units of space are called space atoms (Pyne). If space is continuous, then it is infinitely divisible; something that is infinitely divisible can be divided up an infinite number of times. Zeno disproves the ideas of both discrete and continuous space through his paradoxes.
Zeno's Arrow paradox disproves how space can be discrete. Imagine a bow and arrow where the arrow is pointing at a target. An arrow is either in motion or at rest in one "space atom". An arrow cannot move, because for motion to occur, the arrow would have to be in one "space atom" at the start of an instant and in another at the end of the instant. However, this means that the instant is divisible, which is impossible because, by definition, instants are indivisible. Hence, the arrow is always at rest in one of the "space atoms" (Pyne). Space therefore cannot be thought of as discrete, so space must be continuous.
This essay will put forward the view that Zeno’s paradoxes represent something fundamentally wrong with our understanding of the structure of space and time (as dependent parameter of macro EM field effect).
This forms a problem that runs through the whole of human mathematics.
We briefly mention two verbal matters. Should the effect be called quantum Zeno? The original Zeno paradox was expressed in different forms, all of which are based on the difficulty of building up an idea of motion from a series of instantaneous snapshots. The quantum Zeno effect is based on the idea of measurement freezing change. Thus there are rather superficial similarities but rather more deep-seated differences between the two sets of ideas.

Zeno’s paradox can be expressed in many ways but for the purpose of this post it is best to think of dropping an object and measuring the time (as dependent parameter of macro EM field effect) it takes to reach the ground.
Zeno rejected the idea of infinity, and so he had a paradox; he believed moving and changing to be an illusion, and that was the only thing that mattered!
I am now going to explain that this view does not reject infinity but explains it as a universal process that forms the arrow of the geometry of space-time, taking into account that Zeno's paradox is not wrong.
In this theory the forward passage of time (as dependent parameter of macro EM field effect) is formed, by the forward motion or momentum of light, forming the geometry of space-time.
The probabilistic nature of the wave particle duality of light forms the flow of time (as dependent parameter of macro EM field effect) itself. This is explained by the Schrodinger equation that represents the quantum wave particle function.
Therefore Heisenberg’s Uncertainty Principle is the same uncertainty that we have with any future event and represents potential future possibilities.
The answer to the problem of infinity is that it only has the potential probability to exist. Aristotle was the first to introduce the idea of something being potentially infinite.
In this theory we have a potential infinity of probabilities at every degree and angle of space-time because one thing after another is always coming into existence as part of the time continuum.
After many experiments to demonstrate the relation between Zeno's paradox and quantum theory, where we can find both defenders and refuters, we can draw some interesting conclusions:
The quantum Zeno effect is to be regarded as a genuine result of quantum theory, at least of the form of quantum theory we now possess. This leads to the intriguing possibility that experimental disproof of the effect could cause a reconsideration of fundamental aspects of quantum theory.
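The freezing effect of repeated measurement can be sketched numerically (my own illustration; the two-level Rabi model and the parameter values are assumptions, not from the text). Between projective measurements, the amplitude to stay in the initial state evolves as cos(ω·dt), and each collapse multiplies the per-step survival probabilities:

```python
import math

def survival_probability(total_time, rabi_freq, n_measurements):
    """Two-level system driven from |0> toward |1>.  Between measurements the
    amplitude to remain in |0> evolves as cos(omega * dt); each projective
    measurement collapses the state, so per-step survival probabilities multiply."""
    dt = total_time / n_measurements
    p_step = math.cos(rabi_freq * dt) ** 2
    return p_step ** n_measurements

omega, t = 1.0, math.pi / 2   # unwatched, the state fully decays: cos^2(pi/2) = 0
p1   = survival_probability(t, omega, 1)
p10  = survival_probability(t, omega, 10)
p100 = survival_probability(t, omega, 100)
print(p1, p10, p100)          # more frequent measurement -> survival approaches 1
```

As the number of measurements grows, the survival probability approaches one: this is "measurement freezing change," and it follows directly from standard quantum theory, in line with the conclusion above.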

The paradoxical nature of the early discussions related to the effect of an external, macroscopically separated measuring device on an evolving microscopic system. It therefore seems sensible to restrict the use of the term Quantum Zeno effect to experimental situations of this nature.

While the arguments about continuous measurement affect some aspects of the discussion of quantum Zeno, we do not believe that they throw genuine questions on the theoretical prediction itself, which certainly does not require the possibility of continuous observation.

To get more information about measure theory and the concept of frames in the observation of events, and about the Holographic Universe, see also:

Next essay; Asymmetry & Symmetry

| Human - Q.Computer - Information |

We are Quantum Computer -Part1;


We are, first of all, computers.
Hardware + software (tools, features, etc.)
A body with information.
When a human is born, it has basic information about simple things: how to use its eyes, how to use its mouth to drink milk (a baby recognizes the breast), a little about how to use its hands and legs; its body knows how to excrete what it cannot profit from, etc.
At first a human has a very weak personality. (This personality comes from the surrounding conditions.)
But a human is free to change and choose its own personality.
A human grows up according to the state of its education (update and upgrade).
Initial knowledge is our initial operating system (OS); after that, according to how the human is educated, another OS is installed on the human. (Its own personality.)
DNA can build all parts of the body, and based on personality (phase, capacity, etc.) DNA can build lots of different sensors. Some humans will grow up wrongly and build a sensor (something like genes and hormones) of emotion.
Since a human can build different sensors (sensors that might not have existed at first), a human is a quantum computer. A human does not have to choose from just two answers, yes or no.
The quantum OS is based on fuzzy logic: several answers, and a quantum computer can make up further answers of its own. This is the difference between a computer and a quantum computer: the latter can update or upgrade its own hardware (make sensors, or destroy some of them, to change its mentality or personality).
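The fuzzy-logic contrast with hard yes/no answers can be sketched in a few lines (a toy illustration; the membership functions and the "room comfort" example are invented):

```python
# Instead of a hard yes/no, fuzzy logic gives each statement a degree of
# membership between 0 and 1, and degrees can be combined and blended.

def triangular(x, lo, peak, hi):
    """Degree of membership in a triangular fuzzy set rising to 1 at `peak`."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def fuzzy_and(a, b):  # a classical choice: minimum
    return min(a, b)

def fuzzy_or(a, b):   # a classical choice: maximum
    return max(a, b)

warm  = triangular(22.0, 15.0, 25.0, 35.0)   # "room is warm" to degree 0.7
humid = triangular(60.0, 40.0, 80.0, 100.0)  # "room is humid" to degree 0.5
comfortable = fuzzy_and(warm, humid)         # "warm AND humid" to degree 0.5
print(warm, humid, comfortable)
```

The answer is neither yes nor no but a graded degree, which is the "several answers" behavior the paragraph attributes to a fuzzy-logic OS.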
We are the programmers of ourselves.
We program ourselves with each bit we can get from our surroundings.
We write our own OS.
We can be quantum computers, but for some people these terms may sound unscientific. Terms are the programming language. We can change it, but at first family and society give us the software of the programming language.

You must not be like others. You as a system must not, and cannot, be the same as other systems in order to communicate well and make friends. You are a system in a network and must connect to others, not make yourself completely like them. Even two computers with different operating systems can communicate in a network. So you can be an individual system and connect to others.


 We are Quantum Computer -Part2;






| NanoAssemblers – Information – N.BioTechnology |

What will Assemble the Future?

Description of NanoTechnology * NanoAssemblers;
|| Nanotechnology is an essentially modern scientific field that is constantly evolving as commercial and academic interest continues to increase and as new research is presented to the scientific community. The field’s simplest roots can be traced, albeit arguably, to 1959 but its primary development occurred in both the eighties and the early nineties.
The first use of the concepts in 'nano-technology' (but predating use of that name) was in "There's Plenty of Room at the Bottom," a talk given by physicist Richard Feynman at an American Physical Society meeting at Caltech on December 29, 1959.
 Feynman suggested that it will eventually be possible to precisely manipulate atoms and molecules. Moreover, in an even more radical proposition, he thought that, in principle, it was possible to create "nano-scale" machines, through a cascade of billions of factories. According to the physicist, these factories would be progressively smaller scaled versions of machine hands and tools. He proposed that these tiny machine shops would then eventually be able to create billions of tinier factories.
In the 1980s the basic idea of this definition was explored in much more depth by Dr. K. Eric Drexler, who promoted the technological significance of nano-scale phenomena and devices through speeches and his books.
In 1981, Drexler published his first article on the subject in the prestigious scientific journal Proceedings of the National Academy of Sciences. Titled "Molecular engineering: an approach to the development of general capabilities for molecular manipulation," Drexler's publication essentially expanded the idea of molecular manufacturing by integrating modern scientific ideas with Feynman's concepts. Drexler states that molecular manufacturing and the construction of "nano-machines" is a product of an analogous relationship "between features of natural macromolecules and components of existing machines." In addition, Drexler includes a table that outlines, by function, the molecular equivalents of macroscopic technologies.
Drexler pointed out that the resulting nanotechnology can help life spread beyond Earth, a step without parallel since life spread beyond the seas; it can let our minds renew and remake our bodies, synthesizing a kind of immortality.
 Essentially, Drexler presented, albeit simplistically, that if atoms are viewed as small marbles, then molecules are a tight collection of these marbles that "snap" together, depending on their chemical properties. When snapped together in the right way, these molecules could represent normal-scaled tools such as motors and gears. Drexler suggested that these "atomic" tools and machines would operate just as their larger counterparts do.

Some scientists have criticized Drexler's visions as impossible and harmful. Richard Smalley has led this movement against Drexler's almost sensationalist vision of molecular manufacturing. In their open debate in 2003, Smalley writes almost scathingly, "you cannot make precise chemistry occur as desired between two molecular objects with simple mechanical motion along a few degrees of freedom in the assembler-fixed frame of reference."

In addition to the criticism of Drexler's vision of molecular manufacturing, three important developments that were independent of Drexler's paper helped turn nanotechnology into the broad field it is today.
The Scanning Tunneling Microscope: with this technology, individual atoms could be clearly identified for the first time. Despite its limitations (conducting materials only), this breakthrough was essential for the development of nanotechnology, because what had previously been concepts were now in view and testable. It also reinforced the idea of the ribosome as an example of a natural molecular machine, and the observation that atomically precise final products do not require precise control of every aspect of the chemical reaction.
While nanotechnology came into existence through Feynman's and then Drexler's vision of molecular manufacturing, the field has evolved in the 21st century to largely include research in chemistry and materials science as well as molecular engineering. As evidenced by Smalley's debate, this evolution is partly a response to the criticism of Drexler's views in Engines of Creation and at the Foresight Institute.

Drexler, and for that matter Feynman, did not have a direct role in the three most important breakthroughs in nanotechnology: the invention of the STM, the invention of the AFM, and the first manipulation of atoms.

It is also interesting to point out one of the biggest recent discoveries, concerning Majorana particles: Dutch scientists think they may finally have seen evidence for a famously elusive quarry in particle physics. The Majorana fermion was first predicted 75 years ago, a particle that could be its own anti-particle.

This new discovery could lead to scientific developments that change our current binary computer language into a revolutionary new type of quantum language, enabling the compression of data hundreds if not thousands of times thanks to the Majorana fermion, once its discovery enables us to use it for a purpose.
||


HowStuffWorks: How Nanotechnology Works?
||There's an unprecedented multidisciplinary convergence of scientists dedicated to the study of a world so small, we can't see it -- even with a light microscope. That world is the field of nanotechnology, the realm of atoms and nanostructures. Nanotechnology is so new, no one is really sure what will come of it. Even so, predictions range from the ability to reproduce things like diamonds and food to the world being devoured by self-replicating nanorobots.
In order to understand the unusual world of nanotechnology, we need to get an idea of the units of measure involved. A centimeter is one-hundredth of a meter, a millimeter is one-thousandth of a meter, and a micrometer is one-millionth of a meter, but all of these are still huge compared to the nanoscale. A nanometer (nm) is one-billionth of a meter, smaller than the wavelength of visible light and a hundred-thousandth the width of a human hair [source: Berkeley Lab].
As small as a nanometer is, it's still large compared to the atomic scale. An atom has a diameter of about 0.1 nm. An atom's nucleus is much smaller -- about 0.00001 nm. Atoms are the building blocks for all matter in our universe. You and everything around you are made of atoms. Nature has perfected the science of manufacturing matter molecularly. For instance, our bodies are assembled in a specific manner from millions of living cells. Cells are nature's nanomachines. At the atomic scale, elements are at their most basic level. On the nanoscale, we can potentially put these atoms together to make almost anything.
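The scale comparison above can be checked with a couple of lines of arithmetic; the hair width used here (about 0.1 mm) is an assumed round figure consistent with the "hundred-thousandth" ratio in the passage:

```python
# Subdivisions of the metre mentioned in the passage, in metres.
centimetre = 1e-2                 # one-hundredth
millimetre = 1e-3                 # one-thousandth
micrometre = 1e-6                 # one-millionth
nanometre = 1e-9                  # one-billionth
atom_diameter = 0.1 * nanometre   # about 0.1 nm

# A hair of roughly 0.1 mm spans about a hundred thousand nanometres,
# so 1 nm is a hundred-thousandth the width of a human hair.
hair_width = 1e-4
print(round(hair_width / nanometre))   # 100000
```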
In a lecture called "Small Wonders: The World of Nanoscience," Nobel Prize winner Dr. Horst Störmer said that the nanoscale is more interesting than the atomic scale because the nanoscale is the first point where we can assemble something -- it's not until we start putting atoms together that we can make anything useful.
In this article, we'll learn about what nanotechnology means today and what the future of nanotechnology may hold. We'll also look at the potential risks that come with working at the nanoscale.
||


Who/What are NanoAssemblers;

Nano Technology – Genetic Engineering – bioTech – NanoBioTech – Material Engineering – Quantum Physics – Quantum Information Technology – Information theory – Computers & Robots – Measurement of Nature – BlackHoles – Schwarzschild radius – Consciousness theory – Smart Materials

What will happen when BioTechnology & NanoTechnology & Genetic Engineering & Artificial Intelligence work together?
We will have a new kind of life based on nano-robots instead of our biological cells. They just need to be made smart, with the ability to learn. And nowadays we have this artificial-intelligence technology. When your software finds new data and gets it (called Auto Update), it means it understands its own requirements, so it updates itself and gets new features and abilities. For example, your OS cannot understand Chinese, but it can get a language package to add Chinese. The ability to detect updates (new data) is a smart ability.
When NanoAssemblers, as small computers, get this ability, they can do everything they want.

Assemblers will define new kinds and methods of life, based on strings of tiny robots like DNA & RNA. It is a new consciousness of matter as it interacts with data, waves, etc.: the consciousness of smart materials. We already have lots of smart materials, on glasses, smartphones, clothes, drugs, etc.: a drug that detects the exact coordinates of a problem and attacks it, or smart materials that react (move) according to light or sound.

For assemblers we have two general models for creating them: bacteria-like and virus-like.
Bacteria-like ones can only do what we order and confirm, without growing up (getting new data; updating). But virus-like ones can upgrade themselves if necessary: they can get new data, and so new abilities and features. And they can copy their structure to redesign new structures continually, which is what we call cloning.
There is no need to give examples, because you can imagine anything.
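A toy sketch of the two models; the class and method names here are invented for illustration, not a real nanotech or robotics API:

```python
class BacteriaAssembler:
    """Executes only pre-confirmed orders; cannot learn new ones."""
    def __init__(self, orders):
        self.orders = set(orders)          # fixed capability list

    def execute(self, order):
        return order in self.orders        # refuses anything unknown

class VirusAssembler(BacteriaAssembler):
    """Can upgrade itself: learns new orders and clones its structure."""
    def update(self, new_order):
        self.orders.add(new_order)         # the "auto update" ability

    def clone(self):
        return VirusAssembler(self.orders)  # copy structure to redesign

b = BacteriaAssembler({"bind", "cut"})
v = VirusAssembler({"bind", "cut"})
v.update("repair")                          # virus-like: gains a new ability
print(b.execute("repair"), v.execute("repair"))  # False True
```

The bacteria-like class simply refuses any order outside its fixed set, while the virus-like subclass can extend that set and copy itself, mirroring the update/cloning distinction in the text.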

NanoAssemblers could govern Nature, with super abilities, because they can grow so fast and make themselves more and more compatible, since they live at the quantum scale and are more quantum than us. So they could live anywhere from ice to the Sun, from labs to the Schwarzschild radius. They could use elementary particles like neutrinos to communicate, and so send us messages from places we cannot imagine, from our own bodies to black holes.
And that is the measurement of Nature |

*Physicsism will work on nanotechnology more, and specially:
from basic and simple concepts for introduction, up to advanced levels.

(Next essay; Quantum Zeno Effect)


| Entanglement – Information – Correlation |

What is classical/quantum Entanglement?


Description of Entanglement;
|| Entanglement is a term used in quantum theory to describe the way that particles of energy-matter can become correlated to predictably interact with each other regardless of how far apart they are.
Normally, when two or more particles are entangled (and seem to communicate with each other instantaneously), they not only share quantum correlations, but also classical correlations. Although physicists don’t have an exact definition for classical correlations, the term generally refers to local correlations, where information does not have to travel faster than the speed of light.
So if entangled particles demonstrate correlations across large distances, you might assume that they will also have correlations across shorter distances. After all, if entangled particles can communicate at faster-than-light speeds, they might be able to communicate at slower-than-light speeds.
Particles such as photons, electrons, or qubits that have interacted with each other retain a type of connection and can be entangled with each other in pairs, in the process known as correlation. Knowing the spin state of one entangled particle, whether the direction of the spin is up or down, allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state.
Quantum entanglement allows Qubits that are separated by incredible distances to interact with each other immediately, in a communication that is not limited to the speed of light. No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.
The beginning of the 20th century marked the golden era of theoretical physics. This summit was obviously achieved as a result of the epic contributions by scientists who dared to break sacred conventions. However, credit must also be given to the editors of scientific journals who similarly had the audacity to publish such unconventional theories. To this very day, the ideas developed by Schrodinger and Heisenberg are difficult to comprehend.

Einstein had deep reservations about his colleagues' work and sensed that physicists were overlooking some element that would synchronize all the theories into a coherent whole. He felt that theoretical physics had failed to offer an adequate explanation for vast formations, but also admitted that his own research was far from flawless.
Einstein rebelled against the notion of quantum entanglement, derisively calling it "spooky action at a distance." And he was correspondingly single-minded in the principal argument he used in his efforts to establish this incompleteness. Einstein's weapons in this battle were thought experiments that he designed to highlight what he believed were the inadequacies of the new theory. The argument depended essentially on a highly non-classical element of quantum theory that Schrödinger, in the 1930s, called entanglement. As he wrote, when two states become entangled, a complete account of the properties of one of the systems is not possible if it does not include the other system; and this will be true no matter how far apart the two systems may be spatially. Einstein pointed out that according to special relativity this was impossible, and therefore quantum mechanics must be wrong, or at least incomplete.
The earliest fully developed and published version of Einstein's argument against the completeness of quantum mechanics appeared in a 1935 article known as EPR. The EPR paper considered two entangled particles, call them A and B, and pointed out that measuring a quantity of particle A will cause the conjugate quantity of particle B to become undetermined, even if there was no contact, no classical disturbance. The EPR paradox stumped Bohr and was not resolved until 1964, long after Einstein's death, when CERN physicist John Bell resolved it by thinking of entanglement as an entirely new kind of phenomenon, which he termed "nonlocal."
Heisenberg's principle was an attempt to provide a classical explanation of a quantum effect we call non-locality.
The German physicist Werner Heisenberg's idea was that the position and the velocity of an object cannot both be measured exactly at the same time, even in theory. The very concepts of exact position and exact velocity together, in fact, have no meaning in nature.
Ordinary experience provides no clue of this principle. It is easy to measure both the position and the velocity of an automobile because the uncertainties implied by this principle for ordinary objects are too small to be observed. The complete rule stipulates that the product of the uncertainties in position and velocity is equal to or greater than a tiny physical quantity, or constant. Only for the exceedingly small masses of atoms and subatomic particles does the product of the uncertainties become significant.
Any attempt to measure precisely the velocity of a subatomic particle, such as an electron, will knock it about in an unpredictable way, so that a simultaneous measurement of its position has no validity. This result has nothing to do with inadequacies in the measuring instruments, the technique, or the observer; it arises out of the intimate connection in nature between particles and waves in the realm of subatomic dimensions.

Heisenberg's leading idea was that only those quantities that are in principle observable should play a role in the theory, and that all attempts to form a picture of what goes on inside the atom should be avoided. Thus, Heisenberg was led to consider the 'transition quantities' as the basic ingredients of the theory. Max Born, later that year, realized that the transition quantities obeyed the rules of matrix calculus, a branch of mathematics that was not as well-known then as it is now. In contrast to Schrödinger, he declared: we believe we have gained understanding of a physical theory if, in all simple cases, we can grasp the experimental consequences qualitatively and see that the theory does not lead to any contradictions, but rather results in an increased volume of concepts.
As for the name of his principle, Heisenberg never seems to have endorsed the word 'principle' for his relations. His favorite terminology was 'inaccuracy relations', and only once did he mention that his relations "are usually called relations of uncertainty or principle of indeterminacy".

We can describe his principle as simply as this: the position and momentum of a particle cannot be simultaneously measured with arbitrarily high precision. There is a minimum for the product of the uncertainties of these two measurements. There is likewise a minimum for the product of the uncertainties of the energy and time.

Einstein never accepted the uncertainty principle, and Heisenberg failed to get Einstein's endorsement of his principle as a fundamental physical law.
||
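In standard notation, the two minima described in the quoted passage are usually written as:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
\Delta E \,\Delta t \;\ge\; \frac{\hbar}{2}
```

where $\hbar$ is the reduced Planck constant; the product of the two uncertainties can never fall below this bound, however good the instruments are.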


 What is Entanglement?

Spin – Quantum Mechanics – Space and Time – Entanglement – Q.Information – System – Coherent – Correlation – Information – Double Slit Experiment – Superposition – AstroPhysics & etc.

System A is in state "a" and system B is in state "b". These two parameters are related by equation(s) in nature's base frame. So when one of them gets a new state (is initialized), it forces the other one to change.
We have two electrons "n" km apart (as far as from one side of the galaxy to the other).
It doesn't matter which one will end up in the down-spin state. Two systems should have connection(s) in order to be related and to affect each other. Systems can connect through a network (wired or wireless), or in other words by touching, seeing, smelling, etc.; all of these mean sending and receiving information. The connections exist in space, so they should have parameters based on and dependent on place, such as speed, coordinates, time, and scale (size); and when something is transferred by particles or electromagnetic waves we will also have momentum, etc.
Two systems being correlated means they send and receive information (data processing). So we should check the spatial parameters of information.
Does the passing of time affect information?
What is the scale of a bit of information? And so on.
All events are actions and have reaction speeds: seeing, smelling, networking, etc. All of these occur in a place-based frame; the parameters depend on place, and so on speed and the local gravity field (time).
Physicsism studied "What is time?" and argued that time is not actual, so it cannot affect information. For restoring or reading some info, it is not important when it was saved. (All structures carry external and internal info about their system and won't be destroyed unless an external or internal force compels them to change shape.)
For a file, it is not important whether it is saved at the first level (layer or step) or somewhere else. It doesn't matter to which coordinates you want to transfer and save files; you can read the file without any problem. And it doesn't matter on what kind of device you saved the files (as long as it has enough free space). We explained that place (coordinates) doesn't affect information at the classical scale. But what about at a scale where place and time are completely meaningless?! There is no right, left, up, down, on, under, behind, etc. for information.
-          So there are no spatial parameters for information.
When coordinates (place) are meaningless for information, transferring bits from coordinates Z to Y is meaningless too, because you cannot define to information what (x, y, z) is. When we send info from Z to Y (like a file through the internet), it means we don't know what information is exactly, so we cannot control it and have exact IT; thus we have to transfer info using EM waves, etc. But in nature, at the quantum scale, information can be transferred without any limitation, with speed greater than the speed of light, because speed is meaningless for information at the quantum scale. So quantum systems like two photons can become correlated and affect each other at the moment of changing states. It is like using some other ports to tunnel and make connections: so quantum tunneling, then quantum entanglement, and then superposition.
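The perfect anti-correlation between entangled spins can be caricatured with a toy simulation. Note that this is a classical model with pre-assigned values, exactly the kind of "hidden variable" picture that Bell's work distinguishes from real quantum entanglement, so it only illustrates the anti-correlation itself, not the quantum behaviour:

```python
import random

def entangled_pair():
    """Return a pair of perfectly anti-correlated spins."""
    a = random.choice(["up", "down"])
    b = "down" if a == "up" else "up"   # the partner is always opposite
    return a, b

# However far apart the two particles are carried, measuring one
# immediately tells you the other's spin in this toy model.
pairs = [entangled_pair() for _ in range(1000)]
print(all(a != b for a, b in pairs))   # True: opposite in every trial
```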

    *We can also observe entanglement at the classical scale in projectile systems. When you shoot a rocket and calculate its movement, you find it will arrive at the ground at point B (starting from point A), and this always comes true (you can test it!). But if the rocket has two parts and they separate during flight, one of them (for example) will reach the ground at B-5 and the other at B-2. This means the separated rocket parts affect each other after separation, and you can take this as classical entanglement, which you can study and explain in terms of speed, force, height, weight, mass, etc., the parameters that affect this projectile system.
And in this way you can study what occurs in the Double Slit Experiment.
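Ignoring air resistance, the "correlation" between the two separated parts is just conservation of momentum: the centre of mass still lands at B, so one part's landing point fixes the other's. A minimal sketch (the masses and distances are made-up numbers):

```python
def partner_landing(x_cm, m1, m2, x1):
    """Given that the centre of mass lands at x_cm, part 1 (mass m1)
    landing at x1 forces part 2 (mass m2) to land at the returned point,
    because (m1*x1 + m2*x2) / (m1 + m2) = x_cm."""
    return (x_cm * (m1 + m2) - m1 * x1) / m2

B = 100.0                                  # the intact rocket's landing point
x2 = partner_landing(B, 1.0, 1.0, 95.0)    # equal halves, one lands 5 short
print(x2)                                  # 105.0: the other lands 5 long
```

With drag included, both parts can land short of B (as in the B-5 / B-2 example in the text), but the constraint linking the two landing points is still the same classical bookkeeping of momentum and forces.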

| Information - Relativity - Time |

What is the Time exactly?


Description of the Nature of Time;
||  There are many questions about the nature of time. What is time? What causes time? Why does time slow in gravity? Why does time slow in motion? Is time a dimension? Because it is a long subject, I will explain just some basic but important points, trying to answer the questions above.
Aristotle had speculated that time may be motion. Einstein was working to develop a theory of general relativity and proposed the revolutionary idea that mass curves space, but he did not know that the universe was expanding.
We may have made the concept of time more complicated than what it really is. Measurement of time started early in human development. There are plenty of clues in every language, in the greetings and the meetings. The time of day is related to the position of the sun in the sky, or the stars at night. There is dawn, sunrise, early morning, morning, mid-morning, noon, afternoon, late afternoon, evening, sunset, dusk, night and midnight. Then there are years, months and weeks, based on Earth's yearly orbit around the sun, the changing seasons, and the precession our world finds itself in.
To understand time, we have to understand the mechanism that brings about this continuous change, from which our mind creates the illusion of the flow of time. This change has a tendency, now said to be confirmed, to speed up, mainly as a consequence of an ever-growing quantity of information within the same space-time consciousness.
The human brain, as a computer, is being challenged continually to adapt to the increased volume of information. Our world operates nowadays at a level of nanosecond reactions, far exceeding human capabilities.
Time becomes evident through motion. The aging process is a reminder that molecular motion and interactions are also at work and are a part of time. Another very important aspect of time is the motion of particles like the photon, and motion at the atomic level.
(*When a system grows and improves, if there is no external force it continues to improve, increasing information and energy, so it becomes stronger, won't get old, and the aging process won't work. But humans become old!)
Time can be defined from many perspectives. From the perception viewpoint, time is an emergent concept that our mind creates. The present is the consciousness or awareness of the recording of memory into the brain. From the point of view of physics, time is just the presence of motion and forces permitted by the expansion of space.
||

Information – entropy – Chaos theory – Relativity theory – Quantum Mechanics – Energy – Quantum Fluctuation - & etc.

Is it important for information when it was stored? Or when, or even where, it will be stored?
Because of defects in our technology, after several years the information stored on a hard disk will be destroyed. The defects are due to reactions (photoelectric). But in fact it is the hard disk that is destroyed, not the information: the HD simply won't respond when you try to read info from it, and the info is liberated into space. So "how long", etc., is not important for information. You delete some files from your HD, but afterwards you can restore them. After you delete, you have free space, which seems to mean no info on the HD, yet you can still restore the deleted files. (We are not talking about normal deleting; this is about Shift+Delete.)
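The recovery point can be illustrated with a toy disk model. On real filesystems, a "permanent" delete typically removes only the index entry; the data blocks survive until they are overwritten, which is what recovery tools exploit. The class and method names below are invented for illustration:

```python
class ToyDisk:
    """A caricature of a filesystem: an index of names plus data blocks."""
    def __init__(self):
        self.blocks = {}   # block id -> bytes actually on the platter
        self.index = {}    # filename -> block id
        self.free = set()  # blocks marked as reusable

    def write(self, name, data):
        block = len(self.blocks)
        self.blocks[block] = data
        self.index[name] = block

    def delete(self, name):
        # Like Shift+Delete: only the index entry goes away; the bytes
        # stay in self.blocks until some later write reuses the block.
        self.free.add(self.index.pop(name))

    def undelete(self, name, block):
        if block in self.free:            # data block not yet reused
            self.free.discard(block)
            self.index[name] = block

d = ToyDisk()
d.write("notes.txt", b"hello")
d.delete("notes.txt")
print(d.blocks[0])                 # b'hello': the data is still there
d.undelete("notes.txt", 0)
print("notes.txt" in d.index)      # True: the file is back
```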
Sometimes you feel time passing very fast, and sometimes slowly. Your clock moves fast or slow! This perception is a matter of your performance. When your brain does data processing fast, you think time is going slowly, because you can do more tasks in less time. Or when your measuring process has a problem (your mood gets out of its standard, normal state), you also feel time passing faster or slower. But in all situations and states, time is an illusion, not an independent parameter that can affect us.
Time is just the reactions and actions that we interpret as time. Humans invented clocks based on the Earth's movement. But for a theory that wants to explain universal phenomena, an invention tied to the Earth cannot do anything except produce paradoxes and meaningless infinite results. Time is not an independent parameter, a separate dimension.
Time is just measurement according to the gravity field (a complex of EM fields). All events on Earth happen based on the strength of gravity: micro-reactions and classical motion. So twins will grow up differently when one of them lives on Earth and the other is in space. (Chemical reactions will occur in different states.)
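For comparison, standard relativity puts a number on the twin effect. The sketch below uses only the special-relativistic velocity term, with an approximate ISS orbital speed; the essay interprets the underlying cause differently, so this is offered just as the textbook figure:

```python
import math

C = 299_792_458.0            # speed of light, m/s

def dilation_factor(v):
    """Special-relativistic rate of a clock moving at speed v."""
    return math.sqrt(1.0 - (v / C) ** 2)

v_iss = 7_660.0              # approximate ISS orbital speed, m/s
year = 365.25 * 24 * 3600
lag_ms = (1.0 - dilation_factor(v_iss)) * year * 1000
print(f"{lag_ms:.1f} ms per year")   # roughly 10 ms/year slower (velocity term only)
```

The gravitational term acts in the opposite direction for a high orbit, so the net effect for a real astronaut twin depends on altitude; only the velocity contribution is computed here.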
All this illustrates that time is not an independent parameter (a separate dimension of the universe). What we call time is our account of how events occur. So different persons have different perceptions of the passing of time, which you could name a Psy. coefficient or parameter: a psychological parameter, meaning a frame for each person.

| Holographic Universe – Consciousness Theory - Information |

Information in Holographic Universe

Description of Holographic Universe Theory;
|| The twentieth century began with Einstein's theory of relativity. The observer was no longer external to the phenomena being studied. In fact, all patterns could be described only relative to the frames.
The role of independent mind in the construction of reality became an issue of concern. In fact, for some scientists, it had now become a central theme.
In 1961, Paul Pietsch set out to disprove the holographic theory of the brain. After performing thousands of operations on salamanders, he became convinced that the mind perceives and stores information by encoding and decoding complex interference patterns.
The most remarkable was the idea that a photographic plate containing a laser image could be broken in two, and each half would contain the complete image of the object, but with less resolution. This was identical to the way that memory in the brain seemed to be operating. Regardless of how many times the photographic plate was broken each piece contained the information necessary to reconstruct the entire image.
The problem with the holographic model comes when we try to understand what the brain is actually perceiving. The holographic model implies that our perceptions are merely an illusion. If we are perceiving an interference pattern, what is the true nature of what we are perceiving? The hologram consists of both a reflected and a reference beam. What is the nature of the reflection? Is it equally illusive; and what is the brain's equivalent of a reference beam?
Quantum physics has presented us with a puzzling picture of the nature of reality. Physicists have demonstrated that quanta can manifest themselves as either particles or waves. When scientists are not looking at electrons, they always exist as a wave, and whenever they design an experiment to observe the electrons, they always appear as particles.
 The holographic model of reality stresses the role of beat frequencies in our construction of reality. Suppose the fifth dimension consists of extremely high frequency energy far outside our range of normal perception. When two or more wave fronts interact, a third frequency is created that consists of the difference in frequencies between the two waves. Since the beat frequency is all we can perceive, we construct reality based on these illusory waves without any awareness of their true source.

 ‘The true nature of reality remains hidden from us. Our brains operate as a holographic frequency analyzer, decoding projections from a more fundamental dimension. Bohm concludes that even space and time are constructs of the human brain, and they may not exist as we perceive them.’

But now we are living surrounded by different electromagnetic waves and subsequent sub-waves emitted by modern technologies, interfering with the natural environment humans had lived in for eons. These high ranges of microwave frequencies have a direct effect on our quantum field of dreams. The patterns of planetary alignments and their respective hyper-dimensional transmissions are also part of the equation. ||
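The beat frequency mentioned in the quoted passage is simply the difference of the two interfering frequencies; for instance, two tuning forks at 440 Hz and 444 Hz produce a 4 Hz beat:

```python
def beat_frequency(f1, f2):
    """Perceived beat when two nearby frequencies interfere: |f1 - f2|."""
    return abs(f1 - f2)

print(beat_frequency(440.0, 444.0))   # 4.0 Hz
```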

Where is the sound of a tree when it falls and no observer (human) observes it?
It is of course not a scientific question, because there are plenty of things there to do the measurement.

HowStuffWorks: How dreams work?
|| Our dreams combine verbal, visual and emotional stimuli into a sometimes broken, nonsensical but often entertaining story line. We can sometimes even solve problems in our sleep. Or can we? Many experts disagree on exactly what the purpose of our dreams might be. Are they strictly random brain impulses, or are our brains actually working through issues from our daily life while we sleep -- as a sort of coping mechanism? Should we even bother to interpret our dreams? Many say yes, that we have a great deal to learn from our dreams.
In this article, we'll talk about the major dream theories, from Freud's view to the hypotheses that claim we can control our dreams. We'll find out what scientists say is happening in our brains when we dream and why we have trouble remembering these night-time story lines. We'll talk about how you can try to control your dreams -- both what you're dreaming about and what you do once you're having the dream. We'll also find out what dream experts say particular scenarios signify. Finding yourself at work naked may not mean at all what you think it does!
Perchance to Dream
For centuries, we've tried to figure out just why our brains play these nightly shows for us. Early civilizations thought dream worlds were real, physical worlds that they could enter only from their dream state. Researchers continue to toss around many theories about dreaming. Those theories essentially fall into two categories:
·         The idea that dreams are only physiological stimulations
·         The idea that dreams are psychologically necessary
||

In a Gedankenexperiment, scientists supposed that a tree falls in a forest but there is no human observer to hear the sound of it falling, and they said this occurrence won't make a sound because there is no observer. So reality is based on who observes it, and how?!
But I should say that this holographic viewpoint of nature, presented this way according to the concept of frames in relativity, is without information!
They forgot to bring in the information viewpoint and the concepts of measurement. It is not important whether a human hears it or not. Hearing is a kind of measuring, and we should examine the measurement in this occurrence. Other trees, animals, particles, etc. can measure. So they will hear the sound of the tree!

Holographic Universe & Dreaming;
Information Theory – ElectroMagnetic – BioEM reaction – Consciousness Theory – Relativity Theory, Frames & etc.

We are full of electromagnetism because of the reactions in our bodies. Chemical reactions are based on what we eat and what we observe. For example, pain is a surface phenomenon, but a headache could be caused by eating something, or by seeing or hearing (measuring) something that affects your mood badly, and vice versa.
Both happen through electromagnetism. So it is better to say we are full of bio-electromagnetic reactions. So we have an EM field, and other EM fields can affect us (our DNA, chemical reactions, etc.).
Our brain is a holographic system that can complete sketchy images or, more generally, observations. The brain even tries to complete some observations by guessing, based on relevant information from the same topic or category.
We dream because of background processes. We can see new places in dreams that look so real because the brain can mix and combine data.
-          How can you predict the future?
Of course it is possible, but a little difficult. The next sentence is in the future, and so is even the next character. But you can guess some words with 100% accuracy after observing just their first letter. That means you are predicting the future.
How does it happen?
You know English perfectly, or maybe you are a native speaker, so you have lots of words on your hard disk (your brain), and that store comes from the past. For example, you memorized a word 7 years ago, and now you are using stored data to predict the future. The same thing happens in dreaming. If you studied the price of gold in the markets for the last 20 years, you have the data, and you know enough about what is happening around you now, so you can predict the future price of gold. (As you know, many smart people become millionaires by carefully studying the past and present and making logical predictions about the future.)
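The word-completion example can be sketched in a few lines of Python. This is only an illustration of the idea of predicting the "future" (the rest of a word) from stored past data; the vocabulary and prefixes here are made up, not taken from the text.

```python
# Toy sketch: a stored vocabulary stands in for years of past learning.
vocabulary = ["parallel", "particle", "photon", "probability", "prediction"]

def predict(prefix):
    """Return every stored word consistent with what we've observed so far."""
    return [w for w in vocabulary if w.startswith(prefix)]

print(predict("p"))    # one letter observed: still five candidates
print(predict("pho"))  # more observed data: only "photon" remains
```

The more past data you have observed (a longer prefix), the fewer candidates remain and the more confident the prediction becomes.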
And this process can happen while you sleep, or watch, or drive, and so on.
What you eat can affect your dreaming. That is why, during exam season at university, others advise you to eat or drink something beneficial for your memory and calm: 97% cocoa chocolate!
That is because when you sleep and begin to dream (because of background data processing), you can experience very real sensations of the past, the present, and the future, or dream of fanciful things very realistically. You can fly with a dragon and touch it! And the same data shared across a society can produce the same dream in someone known as a prophet or an important person in that society.
Background processing can create very real events for you. Something happens in your mind but affects your body. It can even force you to die!
But the point here is that all of these are in nature and have a scientific reason, or at least a scientific justification.
When someone at the moment of death talks with some luminous circle, it does not mean that ghosts fly to the sky and talk with angels.
(Someone said: "I talked to some luminous dots and told them I don't want to die because I want to dance more! And then that light, those luminous dots, allowed me to be alive again so I could dance!" But maybe in hell or heaven you can find some party to dance at; and if you want to dance, just say you want to. Why make up a fancy story that unwise people then compare with the holographic universe?! – from The Holographic Universe by Michael Talbot)
You can direct your background data processing to work on the parts you need. You can do it by thinking a lot and increasing your knowledge (information).


| Measurement - Information - Parallel Universe |

Measuring System in Parallel Universe

Description of Time-Travel & Parallel Universe;
|| Do parallel universes exist, exactly like our universe? 
In 1954, a young scientist, Princeton University doctoral candidate Hugh Everett III, came up with a radical idea: the existence of parallel universes exactly like our universe.
Notions of parallel universes or dimensions that resemble our own have appeared in works of science fiction and have been used as explanations for metaphysics.
With his Many-Worlds theory, Everett was attempting to answer a rather sticky question in quantum physics, a field that began when the physicist Max Planck first introduced the concept to the scientific world. Planck's study of radiation yielded some unusual findings that conflicted with the laws of classical physics. These findings suggested that there are other laws at work in the universe, operating on a deeper level than the one we know.
We can also find, according to recent developments in quantum physics, the hypothesis of the stark nature of reality: that reality in a dynamic universe is non-objective.
Understand parallel universes, multiple reality, and the hypotheses advanced by scientists such as Hugh Everett, Bryce DeWitt, David Deutsch and others.
The Universe and Multiple Reality presents an understandable view of parallel universes and quantum physics––and explains what this means in our daily lives.

The Universe and Multiple Reality explains to the non-scientist reader in understandable, non-mathematical language the paradox of Schrödinger's Cat, the two-slit experiment and recent developments in quantum physics and cosmology.
There are a variety of competing theories based on the idea of parallel universes, but the most basic idea is that if the universe is infinite, then everything that could possibly occur has happened, is happening, or will happen.

According to quantum mechanics, nothing at the subatomic scale can really be said to exist until it is observed. Until then, particles occupy uncertain superposition states, in which they can have simultaneous up and down spins, or appear to be in different places at the same time. The mere act of observing somehow appears to nail down a particular state of reality. Scientists don’t yet have a perfect explanation for how it occurs, but that hasn’t changed the fact that the phenomenon does occur.

But if parallel worlds do exist, there is a way around these troublesome paradoxes. Deutsch argues that time-travel shifts happen between different branches of reality. The mathematical breakthrough bolsters his claim that quantum theory does not forbid time travel; rather, it sidesteps it: you go into another universe, he said. But he admits that a lot of work remains before we can manipulate space-time in a way that makes such hops possible. While it may sound fanciful, Deutsch says that scientific research is continually making the theory more believable.

We can point to phenomena such as black holes, curved space, and the slowing of time at high speeds (even around Earth), which were all once rejected as scientific heresy before being confirmed through experiment, even though some remain beyond the grasp of direct observation.


Is there a footprint we can discover in our mechanical laws, originating from quantum-level law and dissipating through to our scale, that would prove the existence of exotic worlds outside our homely universe? ||

-----
Disproof of Parallel Universe;
QED – ElectroMagnetic – PhotoElectric - Q.Superposition - Q.Entanglement - Probability theory - Measure Theory - Information theory & etc.

-          When scientists couldn't explain the double-slit experiment, the Gedanken of Schrödinger's Cat, and other gedankens and hypotheses like these, they tried to justify them with yet another hypothesis: the Parallel Universe. When they couldn't understand exactly what superposition and entanglement are, they made the Parallel Universe.
Multiverse universes, or worlds, or multiverses of multiverses, or ...!!!
All of these come from fanciful viewpoints.
You want to drink a glass of water. But 1) when, 2) how, 3) where, 4) which glass, etc.? Each of these is an event with a 50% probability of happening. The cat might be dead or alive: 50% probability.
Optimistically, for each event with 50% probability there is one universe! One universe for the photon in the right slit and one universe for the photon in the left slit.
For example, you have three places to drink, ten possible times, 2 glasses, and 2 kinds of drinking water! How many 50% probabilities will you have? So how many universes will you create with one simple action?
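This counting can be made concrete with a short Python sketch. The numbers of places, times, glasses, and waters come from the example in the text; the per-day decision count is my own illustrative guess, not a measured figure.

```python
from itertools import product

# Choices named in the text: 3 places, 10 times, 2 glasses, 2 kinds of water.
places, times, glasses, waters = 3, 10, 2, 2

# Every distinct combination of choices is one possible outcome.
outcomes = len(list(product(range(places), range(times),
                            range(glasses), range(waters))))
print(outcomes)  # 120 distinct ways to drink one glass of water

# If every binary (50/50) decision split off a new universe, n decisions
# would already require 2**n universes.
decisions = 1000  # an illustrative guess at one person's daily decisions
print(2 ** decisions)  # an astronomically large number of branches
```

Even one mundane action multiplies into dozens of outcomes, and the branch count grows exponentially with each further decision, which is the point being made above.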
Again, I should say that these questions cannot disprove the Parallel Universe; they just say the Multiverse is a little too silly! But we need a disproof:
The parallel-universe hypothesis sits in a frame without any information, and without measurement!
When there is no one to measure the states of the photon and the cat, they will be in superposition. But it does not mean that, after someone opens the box and sees the cat is dead, another, living cat sits in another universe and jumps out of the box there! Because all actions and events are measurements: every change in the structure of a system is a measurement (sending and receiving information). So the changing state of the can or bottle of cyanide, or of the trigger of the gun, will affect the system that includes the cat, and we know this change, so the cat will die, be sure, even if you test and repeat n times. All waves and particles perform this process of measurement, so there are no empty coordinates, even inside the Schwarzschild radius, because there always exists something for the wave function to collapse against: me, you, waves, molecules, atoms, quarks, neutrinos, bits, etc.
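The claim that any interaction, not just a human observer, fixes the outcome can be illustrated with a toy Python sketch. This is my own illustration under the text's assumptions, not a physical simulation; the class name, partners, and 50/50 probability are invented for the example.

```python
import random

class ToySystem:
    """A two-outcome system that stays superposed only until ANY interaction."""

    def __init__(self, p_alive=0.5):
        self.p_alive = p_alive
        self.state = None  # None means "still in superposition"

    def interact(self, partner):
        # Any partner counts as a measurement: a photon, an air molecule,
        # or a human opening the box. The first interaction fixes the state;
        # later interactions just read the same state back.
        if self.state is None:
            self.state = "alive" if random.random() < self.p_alive else "dead"
        return self.state

cat = ToySystem()
print(cat.state)  # None: nothing has interacted yet
first = cat.interact("air molecule")      # the environment measures first
second = cat.interact("human opens box")  # the human only confirms it
print(first == second)  # True: one outcome, no second universe required
```

Whichever interaction comes first collapses the state, and every later observer finds the same single outcome, which mirrors the argument that nature picks one probability rather than spawning a universe per branch.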
So nature always chooses one of the probabilities, and there is no need for lots of fancy universes. |