What would the world be like if the speed of light were not a limit?

Before we answer the question, let's entertain a wild idea:

Our world is a virtual environment running on some kind of computational mainframe, like in The Matrix;

and this mainframe has the same floating-point arithmetic characteristics as the computers of our real world.

Looking at it from that angle, we can approach this question in an interesting way.

Now that the question of the real world's numerical precision has been raised, how could we, living inside this Matrix-like world, perceive the numerical computation going on underneath it, and thereby learn something about the characteristics of its floating-point arithmetic?

In fact, we can reason by analogy with the virtual worlds we ourselves have built.

What is the most sophisticated virtual world humans have built?

The answer is 3D games, especially 3D MMORPGs.

Although the 3D MMOs humans create today are still very simple and a far cry from the real Matrix, some of their characteristics are comparable.

If we lived inside a 3D game, how would we figure out how that game works?

As players, we have no way to inspect the game's underlying computation, but we can observe graphical phenomena associated with floating-point calculation; for example, we can analyze the precision of the computation from how the game engine handles fine details of the models.

In a 3D game, all image rendering and three-dimensional geometry is computed with floating-point arithmetic, which consumes a great deal of system resources. Hardware manufacturers therefore design graphics cards with powerful floating-point chips specifically for this computing demand, while good programmers carefully optimize their algorithms to conserve computing resources and keep the system from stuttering.

In 3D games, one of the most important targets of algorithmic optimization is spatial collision.

In plain words, this is collision detection for 3D objects: determining whether two objects in the game have collided. For example:

Given two little balls in the game world, how do you decide whether they have collided?

That's something you learn in junior high school.

Compute the distance d between the two centers and compare it with the sum of the radii r1 + r2; if d ≤ r1 + r2, the balls have collided.

In 3D coordinates, the distance is d = √((x1-x2)² + (y1-y2)² + (z1-z2)²).
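This junior-high check can be sketched in a few lines of Python (the function name and the sample coordinates are illustrative, not from the original):

```python
import math

def spheres_collide(c1, c2, r1, r2):
    """Collision if the distance between centers is at most r1 + r2."""
    dx, dy, dz = c1[0] - c2[0], c1[1] - c2[1], c1[2] - c2[2]
    d = math.sqrt(dx * dx + dy * dy + dz * dz)
    return d <= r1 + r2

# Two unit spheres 1.5 apart: 1.5 <= 2, so they collide.
print(spheres_collide((0, 0, 0), (1.5, 0, 0), 1.0, 1.0))  # True
# 3.0 apart: 3.0 > 2, no collision.
print(spheres_collide((0, 0, 0), (3.0, 0, 0), 1.0, 1.0))  # False
```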

But with three objects it gets a bit more troublesome: we have to compare them pair by pair, which takes three calculations; four objects naturally take six.

And with 100 objects we need 4,950 calculations. Games often contain far more moving objects than that, and their shapes are often very irregular (walls, rivers, trees, people, weapons, and so on), so the amount of computation grows rapidly, quadratically in the number of objects.
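The counts above are simply n(n-1)/2, the number of unordered pairs; a quick check:

```python
from math import comb

# Number of pairwise collision checks for n objects: n * (n - 1) / 2
for n in (3, 4, 100):
    print(n, "objects ->", comb(n, 2), "checks")
# 3 objects -> 3 checks, 4 -> 6, 100 -> 4950
```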

At this point, programmers need all kinds of algorithmic techniques to reduce the amount of work, such as quadtrees and octrees:
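To illustrate the idea behind such spatial-partitioning structures, here is a rough sketch using a uniform grid rather than a real quadtree (all names and numbers are hypothetical): objects are bucketed into cells, so only objects in the same or a neighbouring cell ever need the exact distance check.

```python
from collections import defaultdict
from itertools import combinations

def grid_candidate_pairs(positions, cell=2.0):
    """Bucket 2D points into grid cells; only points sharing a cell or a
    neighbouring cell become candidate pairs for the exact distance test."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[(int(x // cell), int(y // cell))].append(i)
    pairs = set()
    for (cx, cy) in grid:
        nearby = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nearby.extend(grid.get((cx + dx, cy + dy), []))
        pairs.update(combinations(sorted(set(nearby)), 2))
    return pairs

# Two close objects and one far away: only the close pair survives
# the broad phase, instead of all three pairs being distance-checked.
print(grid_candidate_pairs([(0.0, 0.0), (0.5, 0.0), (100.0, 100.0)]))  # {(0, 1)}
```

A quadtree achieves the same pruning adaptively, subdividing only where objects cluster, but the principle is identical: cheap bucketing first, expensive distance math only for nearby candidates.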

Of course, even with these methods we cannot compute distances with unlimited precision, so when two objects get very close, for the sake of computational cost, we generally keep the floating-point precision within a given range.

What happens when we cap the precision of the calculations like this?

When collision detection lacks precision, a characteristic display bug often appears, known as "clipping": one model passing through another.

This bug has two main causes:

One is a model-setup problem like the one described above, where some models simply do not get collision-checked;

the other is that the object models are too small and the collision-detection precision is inadequate. For example, two characters stand so close together that one's handheld weapon model pokes into the other's body.

The detection precision for these very-close collisions is closely tied to the floating-point characteristics of the game. In theory, if the game's collision precision could be computed down to the smallest pixel, clipping would never occur, but that would cost too much, so programmers generally tolerate a few of these little display bugs; they don't affect the player experience much anyway.

But it should be clear to us that the essential cause of this phenomenon is the limited precision of the collision algorithms in the game engine.

So, can we observe similar phenomena in the real world?

We really can. There is a very similar phenomenon in quantum physics: quantum tunnelling.

Quantum tunnelling is the phenomenon in which, under certain conditions, a microscopic particle passes directly through a potential barrier (a region of high potential energy). This is absolutely impossible in the macroscopic world and in classical physics, but at small enough scales in the microscopic world it occurs with a certain probability. It is completely inexplicable by traditional classical physics.

By ordinary intuition, a small ball cannot pass through even a thin sheet of paper without enough energy to break it. But in the quantum world, particles are weird enough to cross a sufficiently thin barrier.

Quantum physics's own explanation is also quite obscure.

Quantum physics explains it via the uncertainty in a microscopic particle's position and energy: a particle's energy fluctuates within some uncertain range, and occasionally it can "borrow" some energy from nothing, momentarily exceeding the barrier height and thereby passing through the wall.

Doesn’t that sound fantastic?

However, the phenomenon is real, and it has even been used to build a number of practical high-tech devices, such as the scanning tunnelling microscope.

And the side effects of this phenomenon seriously limit our control over the microscopic world. In microelectronics, for example, it is precisely quantum tunnelling that has erected a physical barrier in front of chip processes as they shrink toward 1 nm. Once the structures in a chip that are supposed to block electrons shrink below about 5 nm, failures caused by the tunnelling effect can no longer be ignored. Shrink further and the problem only worsens: electrons randomly cross the thin barriers, rendering the chip's logic circuits dysfunctional.

This has become the biggest obstacle to the continued advance of chip technology.

Doesn't this sound exactly like the clipping bug with small objects in a 3D game? When an object is small enough, the collision-detection algorithm's floating-point precision becomes insufficient, and the small object sometimes penetrates a wall or another object when it gets close.

In theory, if the microscopic world used the same collision algorithm as the macroscopic world, this should never happen as long as the precision of the calculation is sufficient.

For example, if the precision of the calculation reached the Planck scale, it would be absolutely impossible to see tunnelling.

Moreover, the scale at which tunnelling occurs is really far from the Planck scale: we are talking about scales around 1 nm = 1e-9 m, while the Planck length is about 1.6e-35 m. That is a difference of 26 orders of magnitude!
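That gap is easy to verify with one line of arithmetic:

```python
import math

nm = 1e-9            # scale where tunnelling matters in chips, ~1 nm
planck = 1.6e-35     # Planck length in metres
print(round(math.log10(nm / planck)))  # 26 (orders of magnitude apart)
```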

So just how coarse is our universe's computational precision?

Next, to demonstrate the usefulness of this irreverent way of understanding things, let's look at one of the hardest-to-understand puzzles of quantum tunnelling.

The most inexplicable phenomenon in the tunnelling effect is "faster-than-light" traversal.

According to the energy-time uncertainty principle of quantum theory, the time a particle takes to tunnel through a barrier is inversely related to the barrier's energy height: the higher the barrier, the shorter the traversal time. If the barrier is wide enough, a sufficiently high barrier will make the particle traverse it at a speed exceeding that of light, which conflicts with relativity's claim that the speed of light is the universe's maximum speed.

Theoretical physicists have argued endlessly about this phenomenon, putting forward all kinds of new hypotheses and formulations to explain it, trying on the one hand to firmly defend the speed of light as the absolute speed limit of the universe, and on the other hand to account for this apparent quantum superluminality. These advanced theories are convoluted and complex, and I generally recommend not trying to understand them, to avoid symptoms of mental overexertion.

So let's come back down to earth and think about how to understand this cutting-edge scientific problem with our junior-high-school knowledge.

Let's run an imaginary scenario dialogue:

Suppose you are the boss of an online racing-game company, and today you are angry because the leaderboard of one of your games has been refreshed to an astonishing degree: some players are finishing the race in a few seconds. It is obviously an exploit, so you call the operations manager and the R&D manager into a meeting.

"Explain to me how this freakish record was set." As the boss, you have the right to demand a reasonable explanation from your subordinates.

The operations manager hurries to say, "I've looked into it. The players did it by exploiting a bug."

The R&D manager finds this strange: "That shouldn't be possible. This bug is theoretically impossible."

So you ask, "Why can't it happen?"

The R&D manager says, "We have a speed limit in the game, and no matter how the player cheats, they cannot exceed that limit."

"Why can't it be exceeded? Isn't there some way for a player to get around the restriction?"

"It can't be bypassed, because the speed limit is not something we set to stop players from exploiting bugs; it comes from the game's underlying mechanism. Racing in our game requires continuously changing position, and since both the smallest unit of distance an object can move and the smallest unit of time are fixed, there is in theory a maximum speed, and no player can exceed it by any means. At that limit, the car is already moving one smallest spatial unit per smallest time step; any faster and, down in our underlying algorithm, the car would have to jump across cells rather than move through them one by one, and that simply cannot happen."

Having listened, you feel the R&D manager has a point, so you turn to the operations manager, puzzled: "Then how did our players do it?"

The operations manager says, "I don't understand the internals, but I can show you what the players did."

So the operations manager logs into the game, picks the track, and starts the race. He finds a suitable spot on the track and rams the car straight into a cliff beside the road, over and over. On one attempt, almost instantaneously, the car does not bounce back off the cliff face properly but appears on the other side of the cliff, seemingly taking no time at all. The operations manager has reproduced the players' freakish feat.

A moment of silence falls over the office as you and the R&D manager stare at each other.

The R&D manager is, after all, a graduate of a 211 university, and after a moment's thought he works it out: "I hadn't considered this. It happened because the cliff beside the track is too thin."

"Why does the cliff being thin cause this bug?" You are completely confused.

"Here's the thing," says the R&D manager, who has now fully understood the problem. "Collision detection in our game runs at fixed time intervals: every so often, the program checks the distance between the center of the car model and the various obstacles, and once that distance falls below a certain value it counts as a collision and bounces the car back. But this cliff wall is too thin. When the player's speed is high enough, the car crashes into the cliff in the interval between two checks, and its center passes all the way through before the next check runs. Because the model has already crossed to the far side, the collision routine pushes the car out on the other side, and so the car passes through the wall. That displacement is produced by the collision algorithm; unlike normal movement, it is not bound by the minimum movement distance per step, and therefore it exceeds the game's maximum speed."
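The R&D manager's explanation, that discrete-timestep collision checks let a fast object skip a thin wall, can be demonstrated with a toy 1D simulation (all numbers and names here are made up for illustration):

```python
def crosses_wall(x0, v, dt, wall_left, wall_right, steps):
    """Advance position in discrete timesteps and test for the wall only
    at each sampled position: a fast object can 'tunnel' straight through."""
    x = x0
    for _ in range(steps):
        x += v * dt
        if wall_left <= x <= wall_right:
            return False  # collision detected inside the wall: bounce back
    return x > wall_right  # reached the far side without ever being caught

# Thin wall from 10.0 to 10.5; the slow car is caught, the fast one tunnels:
# the fast car samples positions ..., 9.0, 12.0, ... and never lands inside.
print(crosses_wall(0.0, v=1.0, dt=0.1, wall_left=10.0, wall_right=10.5, steps=200))   # False
print(crosses_wall(0.0, v=30.0, dt=0.1, wall_left=10.0, wall_right=10.5, steps=200))  # True
```

Real engines fix this with continuous (swept) collision detection, which checks the whole path travelled between two timesteps rather than only the endpoints, at extra computational cost.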

"How can the collision algorithm be so powerful that it pushes the car past the game's maximum speed?"

"Yes. The underlying reason the game has a maximum movement speed is that an object cannot jump across the smallest spatial unit; otherwise it would effectively teleport, and if teleportation were possible, two objects could end up in the same position at the same moment. So the requirement that objects move continuously exists, at bottom, to prevent different objects from occupying the same smallest unit of space at the same time. But the underlying purpose of the collision algorithm is also to prevent different objects from occupying the same space at the same time. The game's maximum speed is therefore neither a cause nor an end in itself; it is a phenomenon. The more fundamental underlying mechanism is the avoidance of spatial overlap between different objects, and once such an overlap occurs, ejecting the object brooks no delay: the system moves the model out at the fastest possible pace. Of course it is not an instantaneous, zero-time move, since redrawing the object takes a little time, but it is still far above the maximum speed."

"Oh." You and the operations manager finally understand the root of this bug, and you ask, "So how do we stop players from exploiting it?"

The R&D manager thinks it over: "Simple. Just make the cliff a bit thicker."

"That doesn't seem to solve the root of the problem, does it?"

The R&D manager shrugs: "It's the most cost-effective way to settle this once and for all. If you want to fix it at the bottom layer, first of all I don't think it's necessary, because shortening the detection interval would put a huge burden on the system, we'd have to buy stronger and more expensive servers, and in the vast majority of cases the players' experience wouldn't noticeably improve. Second, changing the underlying algorithm is very risky. It might break the whole game, boss."

"Then have the level designers and the artists make the walls thicker, and check whether any other tracks need the same fix." You feel that making decisions as the boss is pretty easy.

Just before adjourning, you feel a little uneasy and ask the R&D manager, "Are you sure that once the cliff is thicker, this bug can never happen again?"

The R&D manager is a rigorous man. He thinks for a moment: "In theory there is still a chance of passing through, because our detection times also have some randomness. As long as players try enough times, there is some chance of passing through even a thick wall, but the probability is extremely small."

"Fine, that settles it." As the boss, you deeply understand the simple principle of not worrying about extremely small probabilities; let the nitpickers fret over that one.

Although the scene above is fiction, you should see what it is driving at.

From the conversation between the programmer and the boss, we actually arrive at a rather startling view of the universe's basic laws: we can stop treating the speed of light as a bottom-level upper limit on speed in spacetime, and instead treat it as a phenomenon, one that some deeper underlying rule of the universe must give rise to.

Beyond the speed of light, we can even look at other fundamental physical constants from a similar perspective, especially those with measurable scales: they are probably not absolutely fixed primitive variables of the universe, but rather the result of some lower-level physical laws. Examples include the charge of the electron, the mass of the proton, and so on.

On the other hand, once we suspect there are deeper rules beneath a constant, the constant no longer looks so unchallengeable.

Take the speed of light: although at the bottom it can be computed from the Planck length and Planck time, the more basic constraining law is likely that the universe absolutely avoids any possibility of different matter overlapping in the same place at the same time, and this light-speed limit may therefore be broken in certain special circumstances (for example, in an emergency operation to resolve an overlap caused by insufficient computational precision at quantum scales). This is also easy to understand in programming terms: lower-level logic must obey higher-level logic.

Finally, back to the point: since we can picture quantum tunnelling as a matter of the world's computational precision, we have in effect inferred, by extrapolation, that the world's underlying floating-point arithmetic has limited output precision.

This means our world does not express the exact position and state of every particle at every moment down to the smallest unit of measure; rather, it expresses a particle's approximate state through outputs whose floating-point precision has been drastically reduced, and these outputs constrain one another. This trade-off is precisely what physics calls the "uncertainty principle": when you try to obtain a more precise value for a particle's position, you cannot obtain a precise value for its momentum.

We can only suppose that this is an intrinsic feature of our world's underlying algorithm: when our world's creators (or programmers) built its underlying algorithms, they may have drastically reduced the output precision of the particle-motion algorithms to save resources, using probabilistic microscopic outputs to support the precision the macroscopic world requires. If this really was done to save hardware resources, then, as a programmer, I must express my sincere admiration: what a beautiful and practical algorithm!

This resource-saving underlying algorithm is, of course, the reason the world can run properly on its existing mainframe, and naturally it also lies behind the weird experimental results of the quantum world, such as wave-particle duality, delayed-choice experiments, the indistinguishability of identical particles, particle spin, quantum entanglement, and so on.

If we can really figure out the essence of this underlying algorithm, it will surely help us understand our world, and perhaps one day we can uncover the truth the Creator has hidden.

Author of this article: Abu Gyro-Present: YX11lv1WQVg
