As others have pointed out, at one level this question is about Universal Turing machines. So that answer is probably unsatisfying.
Can you do something a little narrower that is not a UTM? Since you are talking about "modeling" and "simulation," I assume you mean modeling of physical reality, not modeling of computing systems, since that is covered by UTM theory (any computing system can be exactly modeled/emulated by a UTM, so that is not particularly interesting; it is when the model is necessarily an impoverished representation of the referred-to reality that modeling and simulation get interesting).
Unfortunately, I think the answer to this question is basically "No."
The devil is in the details, and unfortunately each kind of modeling language requires specialized handling at the "simulation algorithm" level you are thinking of. For example, tuning an ode45 simulator for things described with ODEs takes some local, specialized skill, as does tuning the expressivity of your algorithm's general purpose data structures for representing objects in the open modeling formalisms that represent object universes. Yes, you could create a "general" simulation algorithm that does all of it, but there will be no real "sum greater than parts" advantage, because all the complexity is down in the details. You will get very little extra leverage by doing it all together. In fact, I'd argue that lumping the capabilities together will create a worse algorithm overall than a more focused algorithm that just tries to be the specialized simulation engine for a single modeling paradigm.
Another way of saying this is that there is stronger coupling than you think between the modeling formalism and a simulator that can run models described in that formalism. The engine must match the car, and an engine that can be used on anything from a Kia to a Porsche is probably not a very good engine.
That said, it is useful to review the most general tools that ARE available, since it sheds light on why more general tools haven't appeared.
The most basic modeling formalism is just symbolic math and logic. Wolfram's Mathematica rules here. This is basically an attempt to replace human mathematicians with an AI to the extent possible. The simulation algorithm for something like theorem proving (yes, theorems are a kind of modeling, and proofs are a kind of simulation) has to be finely tuned to deal with symbol strings efficiently.
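To make the flavor of this concrete, here is a minimal sketch of symbolic modeling using Python's sympy rather than Mathematica (my substitution, not the answer's): an ODE solved symbolically, and an identity "proved" by simplifying it to zero.

```python
# A minimal sketch of symbolic modeling, assuming Python + sympy as a stand-in
# for Mathematica: the "simulation" here is rewriting symbol strings.
import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')

# Solve the harmonic-oscillator ODE x'' + x = 0 symbolically.
ode = sp.Eq(x(t).diff(t, 2) + x(t), 0)
print(sp.dsolve(ode, x(t)))          # x(t) = C1*sin(t) + C2*cos(t)

# "Prove" a trigonometric identity by simplifying it to zero.
theta = sp.symbols('theta')
print(sp.simplify(sp.sin(theta)**2 + sp.cos(theta)**2 - 1))   # 0
```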
The next most basic language is ordinary differential equations. Tools like Simulink within Matlab reduce ODE-based modeling to a GUI-based interface. You still have to do some artistic selection and tuning of the integrator (whether you use ode45 or ode23 depends on the domain: you'd use different-precision integrators for car suspensions vs. the orbit of Jupiter, and this is a matter of judgment about the sensitivity properties of the specific equations, though these judgments are increasingly being automated).
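As a hedged illustration of that integrator-tuning judgment call, here is the same issue in Python's scipy (my stand-in for ode45/ode23, not the answer's tooling): an explicit Runge-Kutta solver versus an implicit stiff solver on a mildly stiff equation of my own choosing.

```python
# A sketch, assuming scipy: the choice of integrator matters far more than the
# "general" framework wrapped around it.
import numpy as np
from scipy.integrate import solve_ivp

def stiff_rhs(t, y):
    # Fast decay toward the slowly varying curve sin(t): a classic stiff setup.
    return [-1000.0 * (y[0] - np.sin(t)) + np.cos(t)]

# Explicit Runge-Kutta (scipy's rough analogue of ode45) needs thousands of steps...
explicit = solve_ivp(stiff_rhs, (0.0, 10.0), [1.0], method="RK45")
# ...while an implicit stiff solver (closer in spirit to ode15s) needs very few.
implicit = solve_ivp(stiff_rhs, (0.0, 10.0), [1.0], method="Radau")

print("RK45 steps: ", explicit.t.size)
print("Radau steps:", implicit.t.size)
```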
If you generalize a little bit and add difference equations, finite state machines, Petri nets, Markov decision processes, etc., you can still use the Stateflow/statechart description languages and stick to tools like Matlab. Try Lafortune's text on discrete event systems.
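For a sense of how small the discrete formalisms can be, here is a toy finite state machine driven by a stream of events, written in plain Python purely as an illustration (the states and events are invented for this sketch, not taken from any particular tool):

```python
# A toy FSM: a traffic light driven by timed events. This is the kind of model
# the statechart tools wrap in a GUI.
TRANSITIONS = {
    ('green',  'timer'): 'yellow',
    ('yellow', 'timer'): 'red',
    ('red',    'timer'): 'green',
    ('green',  'fault'): 'flashing',
    ('yellow', 'fault'): 'flashing',
    ('red',    'fault'): 'flashing',
}

def simulate(initial_state, events):
    state = initial_state
    trace = [state]
    for event in events:
        state = TRANSITIONS.get((state, event), state)   # ignore undefined events
        trace.append(state)
    return trace

print(simulate('red', ['timer', 'timer', 'fault', 'timer']))
# ['red', 'green', 'yellow', 'flashing', 'flashing']
```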
System Dynamics is a cousin of these, and there are tools like iThink that specialize in that. It claims extreme generality, but that comes at the cost of such low precision (which practitioners hand-wave away with "directionally correct what-ifs") that I barely think of it as modeling/simulation.
Once you get away from ODEs, you get what is technically known by us modelers as a "mess." There are a couple of directions where things get messy.
First, if you move from ODEs to PDEs, you end up in one kind of mess. ODEs are well-behaved, often have closed form solutions, and when they don't, can be numerically simulated in lots of useful cases in stable ways.
PDEs generally require extremely careful custom handling, and even then you'll usually get a mess. A classic example is the Navier-Stokes equations of fluid mechanics. There are entire disciplines devoted basically to modeling and simulating these equations. If you write down an arbitrary innocent looking ODE, chances are you will be able to simulate it. If you write down an arbitrary innocent looking PDE, chances are, it will crash your best abilities to model it.
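A minimal sketch of why PDEs bite, assuming a simple 1-D heat equation and an explicit finite-difference scheme (my example, not from the answer): nudge the time step just past the stability limit and the "simulation" blows up.

```python
# Explicit finite differences for u_t = u_xx are only stable if dt <= 0.5*dx**2.
# Innocent-looking parameter choices violate this and the field explodes.
import numpy as np

def heat_step(u, dt, dx):
    u_new = u.copy()
    u_new[1:-1] = u[1:-1] + dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u_new

def run(dt, dx=0.01, steps=5000):
    x = np.arange(0.0, 1.0 + dx, dx)
    u = np.exp(-100 * (x - 0.5) ** 2)   # initial heat bump, ends held near zero
    for _ in range(steps):
        u = heat_step(u, dt, dx)
    return np.max(np.abs(u))

print(run(dt=0.49e-4))   # stable: the bump just spreads and decays
print(run(dt=0.51e-4))   # unstable: round-off noise is amplified to an enormous value
```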
Another direction of messiness is the discrete direction. Once you get beyond simple state machines and Petri nets describing relatively fixed universes, and move to working with open systems, you are essentially doing a specialized kind of object-oriented multi-threaded programming where handling non-sequentiality in time becomes the biggest headache. Again there are no truly general frameworks (C. A. R. Hoare's "Communicating Sequential Processes" will give you a sense of the complexities).
Generally people who deal with this stuff call themselves "agent-based modelers." With or without realizing it, people doing game or VR programming stick to a tractable narrow class within this domain. How different agent-based modeling is from OO programming is a matter of debate. One popular line is "objects do it for free, agents do it for money." Regular programmers are skeptical that there is a useful distinction at all.
There have been attempts like the programming language Swarm (really a library for Objective C, I believe... I haven't used it... I chose to stick to Matlab even for my ABM work) to support this kind of modeling.
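For readers who haven't seen it, here is a hedged, minimal sketch of the agent-based pattern in plain Python (tools like Swarm wrap this same loop in much more machinery; the Walker agent is purely illustrative):

```python
# A toy agent-based model: each agent is an object with state and a step() rule;
# the "simulation engine" is just a loop over agents in time.
import random

class Walker:
    def __init__(self, name):
        self.name = name
        self.position = 0

    def step(self):
        # Each tick, the agent moves left or right at random.
        self.position += random.choice([-1, 1])

agents = [Walker(f"agent{i}") for i in range(5)]
for tick in range(100):
    random.shuffle(agents)        # activation order is one of the ABM headaches
    for agent in agents:
        agent.step()

for agent in sorted(agents, key=lambda a: a.name):
    print(agent.name, agent.position)
```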
If you haven't actually done any of these types of modeling (I've done at least starter models in all these domains, and pretty advanced models in some), it is easy to get tempted by a false sense of potential generality. There are people who get very excited by visions of a "general systems theory" (GST). I admit I used to be one of them, and even took a stab at it many years ago.
I have since decided that this is basically an impossible problem for a variety of technical and philosophical reasons that go beyond what I've mentioned here (these have to do with NP-completeness theory, no-free-lunch theorems in optimization, etc.). I have also concluded that anyone who adopts that label for something they've produced is basically a charlatan.
But don't let me stop you from hunting for a true GST. Who knows, all battle-bruised modelers like me might be mistaken, and you might find one, build Skynet on top of it, and take over the world. And even if you fail, tilting at this particular windmill is a very educational experience. I learned a lot from attempting it.
I won't attempt to answer this question fully here. However I will provide some food for thought and a link to more information. I will here mention some features that are common to all systems and briefly hint at how these might be leveraged to create a general system simulator. The linked article goes into greater depth regarding the specifics of how to construct the simulator.
If one utilises high-level features of systems this means that the resulting simulator can only simulate systems that can be described by such high-level features. For example, one might use fuel-tank-size within a simulation of traffic, but this cannot apply to particles. Or one might use mass and location to simulate particles but this cannot apply to systems of symbols such as a language. And so on.
Thus we must ask: upon what low-level system features do all high-level system features supervene? That is, are there low-level features that can be used to define all high-level features? In fact there are.
To find them we have to step out of the context of phenomenal features that are based on the appearances of systems. Instead we must step into the realm of information and information processes. Only at this level are there features common to all systems.
Some common features:
- All systems have state, both internal state and observable state.
- All systems interact and thereby change state.
- All interactions are mediated by communication.
- All communication consists of the flow of information.
- General computation is the coherent transformation of information.
- System interactions integrate sub-systems into super-systems. Thus a super-system can be considered to be a network of interacting sub-systems. This applies at all levels of the complexity hierarchy. By successive application of this reduction every system can be conceived of as a holarchy (network of networks) of system interactions.
- An interaction network is equivalent to a graph of information flows.
- A graph can be represented as a matrix.
- A system state (classical or quantum) can be represented as a vector.
- When a matrix is multiplied with a vector, the information within the vector flows through the network defined by the matrix and is then integrated into a new vector that represents a new system state; thus we can form the iterative equation V = M.V (a minimal numerical sketch follows this list).
- This brings us into the realm of algebra wherein there are many mathematical manipulations that are possible.
- Using such methods one can model and simulate both classical and quantum systems.
- The equations of quantum mechanics embody these principles even though they were not used to derive the equations of QM.
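Here is the minimal numerical sketch promised above, written in Python with numpy as my own illustration of the V = M.V iteration (the matrix values are arbitrary and purely illustrative, not taken from the book):

```python
# The matrix M encodes the information-flow graph, the vector V is the system
# state, and repeated multiplication advances the simulation one step at a time.
import numpy as np

# Flow matrix for a 3-node network (rows: where information arrives, columns:
# where it comes from). Each column sums to 1 so the total "amount" is conserved.
M = np.array([
    [0.0, 0.5, 0.2],
    [0.7, 0.0, 0.8],
    [0.3, 0.5, 0.0],
])

V = np.array([1.0, 0.0, 0.0])    # initial state: everything at node 0

for step in range(20):
    V = M @ V                    # information flows through the network
print(V, V.sum())                # new state; the total stays 1.0
```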
For more information on how to apply these concepts to create a general system simulator, refer to the ebook System Science of Virtual Reality. This book describes a plausible proposition that deserves closer analysis. The document The Objective Information Process & Virtual Subjective Experiences Hypothesis provides some preliminary clarifications that help in understanding the approach.
The book "System Science of Virtual Reality" includes discussions that are not directly relevant to this question, however the bulk of the book consists of a systematic mathematical development of a general system simulator. It starts from a very simple initial prototype, then in successive stages the limitations are identified and the model is extended to overcome these limitations. This results in a final version that can simulate arbitrary classical or quantum systems.
The book also re-derives the foundations of quantum mechanics using the principles of general system simulation. There are also software prototypes, however a great deal more work is required on these. Furthermore, it would take a quantum computer to efficiently run this simulator - the permutation explosion is exponentially too great for a classical computer to handle. This method may point the way towards a general purpose programming methodology for quantum computers.
This approach implements the abstraction called for in the question because the details of how to simulate a system are standardised into a common algorithm. One then need only describe the range of possible states that a system may occupy and the conditions under which it changes state. The simulation algorithm then simulates the system regardless of the type of system that is described. This description can be defined at any level of detail.
The other discussions in the book indirectly relate to this question via other issues such as: Is the Universe a Simulation?, What is consciousness?, What is sentience?, the Hard Problem of Consciousness, What is matter?, and What is it like to be a quantum computational process?
Yes, absolutely. And what you are describing is a computer.
Why is it we can simulate traffic or particles or Marios and Luigis? Because a computer is a general-purpose machine that can execute code written in general-purpose languages. If you understand the target system thoroughly enough to express its characteristics in executable code, you can simulate anything.
You insist on requiring no programming or simulation knowledge, yet even with a language targeted for a specific purpose, you would still have to learn it. And if you are compiling instructions that are to be run by an interpreter or some algorithm (as you like to call it), what you are doing is essentially programming. Further, if your instructions consist of engineering a simulation and its behavior, you would still need to understand the simulation and would require thorough knowledge of it.
What you are describing is a language and development environment for a targeted purpose, in your case a simulation simulator. There are plenty of examples of targeted systems. Excel and Mathematica come to mind. But if you were to build your simulation simulator, you would do it with computers, and ironically, you would be building a system that would be more limited than using generic programming tools and computers.
So ultimately, the usefulness of your system depends on how narrow your definition of "general system" is. The narrower it is, the more constraints can be added and flexibility removed, ie. it would be more targeted. And this is the nature of all simulations. A traffic simulator factors out everything that isn't about cars, and so on.
This targeted-ness is what would make a traffic simulator better than your general system simulator. By the same token, targeted-ness is also what would make your general system simulator better than using generic programming tools, and is probably why you seek one.
Well, by definition a universal Turing machine can simulate any computation or simulation that can be computed. (For more: http://en.wikipedia.org/wiki/Universal_Turing_machine)
The model is simple: a finite-state machine attached to an infinite tape, with the ability to read and write symbols and move left and right.
There are other simple models of automata that are equivalent to this; as a group they're called "Turing-complete."
Any of them could serve as your universal simulator!
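To make the model concrete, here is a hedged, minimal Turing machine simulator in Python (my own toy, not a canonical implementation); the example machine inverts a string of bits and then halts.

```python
# A tiny Turing machine: a finite-state controller, a (sparse) tape, and a head
# that reads, writes, and moves left or right.
def run_tm(tape, rules, state='start', blank='_', max_steps=1000):
    tape = dict(enumerate(tape))        # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == 'halt':
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == 'R' else -1
    return ''.join(tape[i] for i in sorted(tape))

# (state, read) -> (write, move, next_state)
invert_bits = {
    ('start', '0'): ('1', 'R', 'start'),
    ('start', '1'): ('0', 'R', 'start'),
    ('start', '_'): ('_', 'L', 'halt'),
}
print(run_tm('10110', invert_bits))    # -> 01001_
```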
Len Troncale has discussed this idea in some of his papers: "Is Artificial "Systems" Research Possible?" http://www.necsi.edu/events/iccs/2004proceedings.html
Start with a Turing-complete programming language, and a VERY powerful computer (lots of cores, lots of memory). (Or just load copies of MATLAB and SIMULINK onto a serious workstation.)
After that, it’s a “simple” matter of developing a model of the system you want to simulate. Easy-Peasy.
No, it is not. To quote the British statistician George Box, "all models are wrong, but some are useful". When creating a simulation you must clearly define what it is you are solving for, just as if you were solving by hand. Your model may be good at simulating one thing, but not so good at simulating another. After you define what you need, you must choose the inputs (boundary conditions/initial conditions). How well your simulation works will depend on how good the parameters you select are.
Q: Is it possible to build a "perfect" simulation of anything?
I don’t think so.
At one time I was thinking of writing a story about that which would have focused on the philosophical principle the Identity of Indiscernibles [ https://plato.stanford.edu/entries/identity-indiscernible/ ], which tends to imply that if one could perfectly simulate a thin...
Theoretically - yes.
In practice, the amount of computer power and memory required to simulate even a microgram of matter at the level of protons, neutrons and electrons would overwhelm any computer on the planet.
It’s done all the time. The simulated computers are called Virtual Machines. My laptop can do it, so your computer probably can too.
Simulating a universe is very easy, if the universe is simple enough.
Disco universe consists of a single bit that is either on or off. If it’s on, turn it off. If it’s off, turn it on. This universe even has a nifty conserved quantity: the bit change rate.
Disco universe might not be the simplest possible dynamic universe, but it is surely somewhere near the bottom end of things. Its current state can be represented by a = one bit. To specify its program, however, requires several lines of code, though only one of them does any heavy lifting. How many bits is that? We can get an idea by compressing the program in a lossless format in an optimal manner. Let’s say the result is b. Whatever else we might say about b, it is definitely greater than a.
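Just to make a and b concrete, here is Disco universe as a few lines of Python (my own rendering of the description above):

```python
# The entire Disco universe: one bit of state, plus a program that is many more
# bits than that even after lossless compression.
import zlib

state = 0                       # a: the whole state of Disco universe, one bit
for tick in range(8):
    state = 1 - state           # the only law of physics: flip the bit
    print(tick, state)

program_text = b"state = 1 - state"     # the one line doing the heavy lifting
print(len(zlib.compress(program_text)) * 8, "bits of program vs. 1 bit of state")
```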
Can we embed another simulation inside Disco universe? No. It doesn’t have the computational power to simulate anything. The inside of the thing is just a = one bit.
Can we embed Disco universe within a larger simulation known as Onion? Sure thing. The inside of Onion needs to be at least b in size representing 2^b distinct states. Onion’s program will be size c where c > b.
Can we continue wrapping Onion with larger and larger simulations? Yes, but not an infinite number of them.
The portion of our own universe that we could influence (and therefore program) in a finite amount of time has in turn a finite size, and we presume, a finite information content. At best, it grows at the speed of light.
Not even computational Midas, who turns everything he or his simulation touches into part of a larger computation, can get enough computational power in a finite amount of time to run an infinite number of embedded simulations.
My systems level view of your question, is that the details of your question are in conflict with each other. The "model specification language" you refer to does not obviate the need for programming. It is programming.
There is a problem facing developers of programming languages.
Lower level languages allow the maximum possible control over every detail of the system and may be applied to any conceivable model, but take an extraordinarily long time to program and need a different program for every possible system variant. The Turing machine discussion applies at this level.
Higher level languages allow more general descriptions of systems, solving the development-time problem, but at the expense of restricting the range of systems to which they may be applied, because they impose model structures.
Peeking under the covers of this dilemma, the root cause is the effectively infinite problem space encompassed by your question.
One valid answer (only a little tongue in cheek) is to go create a child or, more challengingly, go create a general purpose AI. These are both general purpose simulators without the need for programming as you describe.
In both cases, their method of operation (speculating on the latter of course) is some variation on the universal evolutionary search algorithm. i.e. try things, find patterns, build a model, try more things ... repeat, die if you fail too often, maybe procreate if you don't... repeat.
This method explores the infinite problem space, reinforcing models that work and suppressing those that do not.
The MATLAB add-on Simulink may provide exactly this. It's a general-purpose simulation framework that works across domains and does not require programming:
http://www.mathworks.com/products/simulink
The challenge with any simulation framework is that the more general it is, the harder it is to use, the slower it runs, and the more that using it starts to look like programming.
It surely is. Try creating a simple web site with just a description of your product and try to promote it. Put a registration form there for early adopters and see what reaction you get. Quite a lot of startups have started like this.
We most certainly can model the universe using the laws of physics. It has been done. We have used computer modeling to test ideas about cosmic inflation, for example. Computer models are also used to generate information on such ideas as merging galaxies and stellar evolution. The greater the capacity of our computers to handle information, the more accurately we can model. Someone once asked whether it was possible to model the motion of every atom in the universe. The quick answer is yes, but no. But computers are extremely useful in helping us to understand natural, dynamic processes.
Why is there no simulation case without a physical system to run it?
You can, of course, run a simulation of an impossible or improbable situation (such as the figure-8 solution to the three-body problem).
However, a simulation can’t be done without something doing the simulating. It’s similar to René Descartes’s proof of “I think, therefore I am”: assume that you don’t exist, and then ask “so what is doing the thinking?” If there isn’t a system running the simulation, where is it running?
(Note that “a brain, a hand, a pencil, and a pad of paper” constitutes a “system” for the purposes of this discussion)
Yes, of course. It’s done all the time. The only limitations are that you share resources with the underlying computer hardware; power, memory, and compute cycles.
These virtual machines (VMs) are fully replicated and fully functional, apart from the shared resources, which are usually planned and managed well in advance.
VMs and emulators are staples in the computer industry.
Yes, of course. Any Turing-complete computational system can emulate any other Turing-complete computational system, including itself. For example, you could have an x86 PC run an emulator of an x86 PC, then emulate running the same emulator, and so on. Or you can run a VM inside another VM inside another VM, etc.
However, with a finite amount of computing power the simulation would eventually become extremely slow. There is always some “frictional loss” in an emulator or VM, even assuming a largely compatible (or identical) instruction set, so the more layers of nested simulations, the slower the whole thing would run. And since any real machine must have a finite amount of RAM, eventually you couldn't fit all the emulators in memory at once and you'd constantly be swapping to disk, etc.
From what I understand, there are two main approaches to simulating dynamic processes: analog and digital. As an example of the former, a nation’s economy has been modeled using water pipes, reservoirs and valves; one such machine (not the only one) is described in a YouTube video that pitches reinstating analog modeling in general.
The digital approach is beautifully and efficiently exemplified by iSee Systems’ Stella.
Both abstract the elements of a system into “stocks” (reservoirs) and “flows”, the behavior of which is governed by “valves”, which in turn are governed by “policies” and “assumptions”. The calculus of differential equations is the mathematical language of modeling change, and trying to code complicated combinations from scratch would indeed be likely to be inscrutable. But the system dynamics tools hide all that, especially Stella and Vensim, whose dynamic graphical user interfaces (GUIs) completely hide the math. It’s odd that Dr. Sheldrake seems completely oblivious to the System Dynamics approach, created by Dr. Jay Forrester at MIT back in the ’60s and ’70s, which dramatically simplified the digital coding that he excoriates and dismisses as incomprehensible.
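As a hedged sketch of what those tools are doing underneath the GUI, here is a single stock with two flows in plain Python (the growth model and numbers are my own illustration, not taken from Stella or Vensim):

```python
# A "stock" is a number, "flows" change it each time step, and "policies" set
# the valves; the tool just integrates the net flow over time.
dt = 0.25                              # years per step
stock = 1000.0                         # the stock: population, inventory, water...
birth_rate, death_rate = 0.03, 0.02    # the valve settings (policies/assumptions)

for step in range(int(40 / dt)):       # simulate 40 years
    inflow = birth_rate * stock        # flow into the stock
    outflow = death_rate * stock       # flow out of the stock
    stock += (inflow - outflow) * dt   # Euler integration of the net flow

print(round(stock, 1))   # ~1491 (the exact answer, 1000*e**0.4, is about 1491.8)
```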
That being said, maybe he is on to something, as many natural processes are wave phenomena that can best be modeled with, yes, wave generators like water wave tables.
Arguably the most impactful system model has been Forrester’s World Dynamics, which was the model upon which Donella Meadows’ books The Limits to Growth and Beyond the Limits were based.
These models express their outputs in graphs of the key stocks and flows over time.
This was furiously attacked by the business community, seen as it was as a threat to their worldview, which I describe as rapaciously voracious. Modelers go on and on about how their models are only suggestive, not predictive, but Meadows’ cautions are being borne out in spades even as we speak.
Yes, but things which yield to “perfect” simulation also yield to simple mathematical analysis—they tend not to have any non-linear relationships. Consequently, such systems are rarely considered for simulation, because simulating is unnecessary when an analysis is available.
Sure. There’s nothing preventing it.
People “nest” virtual machines inside other virtual machines all the time. It’s slow, but for simple stuff like web services, there’s usually enough CPU.
If you want to “simulate” physics, that’s gonna require more CPU than anybody has to hand, but there’s no magic preventing it.
If you’re desperate enough to think that our existing universe is already some kind of simulation, then existing physics simulations are already nested simulations. What value this thought process yields, I cannot say.
Look up Celestia.
I spent some time exploring Mars that way. It works well, but you need to have some knowledge of various coordinate systems and have knowledge of software installation.
Yes, it's possible. Or at least, we can't prove it's impossible.
It might be tempting to try to work out what size of computer it would take, how fast the computer would have to be, etc, as a way of assessing the feasibility. Alas, you can't actually draw any conclusions from that sort of reasoning, because such reasoning implicitly assumes that the laws of physics of the simulators are the same as the laws of physics we see - and there's no good reason to make such an assumption. We often run “simulations” with different “laws of physics”, and sometimes even a different number of dimensions! Imagine intelligent entities in Conway's “Game of Life” discussing the feasibility of them being in a simulation - without even realizing that their 2D “reality” is not the same as our 3D reality.
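For readers who haven’t seen it, here is a minimal Game of Life step in Python (my own sketch): the birth/survival rule below is the entire “physics” of that 2-D universe.

```python
# Conway's Game of Life on an unbounded grid, stored as a set of live cells.
from collections import Counter

def life_step(live_cells):
    """Advance one generation; live_cells is a set of (x, y) coordinates."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live_cells)
    }

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for generation in range(4):
    glider = life_step(glider)
print(sorted(glider))   # the same glider shape, shifted one cell diagonally
```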
So there's no way to know what the real laws of physics are - and thus no way to assess the feasibility of running our “reality” as a simulation.
It should lead to some unexpected problems, as in every system with self-reference. Logical paradoxes arise out of assertions a system makes about itself; Gödel's incompleteness theorems were proved by the self-reference of a mathematical statement to its own provability and truth. What could happen: infinite oscillations of consciousness, breakdown of the whole system (because of that infinity), vanishing of the borders between one simulation and the others, meaning: between observations and reality. I wouldn't want to be involved!
A2A: I suspect that what is probably intended by “like a closed circuit” would be more accurately described as “recursively”. For that, yes. E.g., a Turing machine can simulate a Turing machine simulating a Turing machine. However, I don’t think the relationship can be reciprocal. The role of the simulated is not all that similar to that of the simulator.
The crucial question when involving a simulator is: what are you intending the simulator for?
This is important, because there are many ways to use simulators.
For example, a flight simulator (as people commonly understand it, eg. Microsoft Flight Simulator or Boeing Flight Simulator) might be useful to train pilots, but is useless when evaluating wing aerodynamic performance, or training air traffic controllers.
This is because each simulator simulates an “interface” which the “system under test” uses to interact with the simulator. In the case of the flight simulator example above, the simulated interface is the senses of the pilot (vision, touch, etc.). The pilot interacts with the simulator as they would with the real cockpit. Some flight simulators are so realistic they even simulate the acceleration forces. If the interface were the radar screen of the air-traffic controller, you would use a different simulator to put plane icons on the radar screen.
Another example: a weather simulation, as you are familiar with on your evening newscast. This type of simulation is a model which runs entirely within a computer, and the only interface is the reports of the model to the weather forecaster. There is no interaction with the real world. But, this simulator would be worthless for predicting the reliability of physical structures in case of a hurricane. For that, you would use a “hurricane simulator”, maybe in a wind tunnel, to test your physical structures with different parameters (eg. wind speed).
With that as a background, what do you want the IoT Simulator for?
If you want to model the behavior of an IoT network, you would use a modelling system, as you can easily find via Google.
If you want to test the behavior of an IoT application under different conditions, you would use an IoT simulator that interacts with the IoT application through the interfaces (protocols) that the IoT application uses in the real world.
There are various open-source and commercial packages that simulate different interfaces. When looking for features, make sure they support the one you want, and scale to the scale you want. Small scale is simple, large scale is hard. Duplicating what someone has already done is even harder.
Summary: when looking for a simulator be clear what the interface is that you want to simulate.
I’ve been wanting to build a VR simulator like this, where you can immerse yourself in VR and, in a literal sense, design your own worlds, interactions, and stuff like that without code.
MIT produced something called “Scratch”, which does a little of what you’re talking about: it lets you assemble interactions and things like that.
A calculation or computation must always be simpler than the computer or calculator that performs it.
For a perfectly accurate computer simulation, that means the simulating computer must be more complex than what it simulates.
For an emulation (a simulation of a computer) to work properly, the computer running it must be more complex than the computer inside the simulation.
A simplified simulation can be run on a less complex computer, but then again, it is simplified, and, for example, computers inside that simulation will not work properly.
Rough numbers for estimate/reference: for most computers you would need several thousand times their computing power to run a proper electrical simulation of all their components in real time.
Of course, you can simplify the simulation and turn the computer into a funnily painted block in a videogame, with two images switching back and forth on the screen. That can be run on a less complex computer, but it’s also not an accurate simulation, and the computer in the simulation doesn’t work.
It is mathematically necessary that an accurate simulation of a computer requires more computing power than the simulated computer has. In practice the factor is large, like 1,000 or 10,000; hypothetically it could get down to a factor of 1.00000000000000001, but it cannot be less than 1.
Totally, you don’t even need that much detail, you just need to design them with the intent that they can support sub-simulations.
If you want a more specific answer, you’ll have to ask a more specific question.
A properly written emulator will run other emulators. I saw a screenshot once where someone had run three or four nested emulators each handling a different version of Windows. The deepest emulator, unsurprisingly, ran slowly.
There are lots of ways to reach users.
Simulations are just part of the puzzle when it comes to instructional design.
I hope you enjoy my video response.
Good luck!
Yes, it is called an emulator; check out the QEMU project.
Yes, a simulation can create a simulation.
Just for example, a computer can be regarded as a machine for running simulations; the computer code is the instructions for running the simulation (the computer program). Any computer can in fact be programmed to simulate running another computer. Such a program is called a virtual machine.
The fact that my Linux computer can be programmed to simulate a Commodore 64 computer is why computers are called universal computers.
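As a hedged, toy illustration of one computer simulating another, here are a few lines of Python acting as the CPU of a tiny made-up machine (the instruction set is invented for this example, not any real architecture):

```python
# A toy "virtual machine": Python interprets the instruction set of an imaginary
# two-register computer, one instruction per step.
def run(program):
    regs = {'A': 0, 'B': 0}
    pc = 0                                   # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == 'LOAD':                     # LOAD reg value
            regs[args[0]] = args[1]
        elif op == 'ADD':                    # ADD dst src
            regs[args[0]] += regs[args[1]]
        elif op == 'PRINT':                  # PRINT reg
            print(regs[args[0]])
        pc += 1
    return regs

run([('LOAD', 'A', 2), ('LOAD', 'B', 40), ('ADD', 'A', 'B'), ('PRINT', 'A')])   # 42
```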
Yes. It's one of my long-term (that is, takes a very long time) projects. I intend to emulate some writers and some comedians by understanding their minds and viewpoints and reproducing them as a new simulation having their traits and characteristics. It's a big project and not at all easy. I have some pieces solved and see others that need much more work.
The complexity is illustrated by realizing that it takes humans two decades to mature, during which they take in a lot of knowledge and have a lot of experience. These huge masses of information guide the formation of their beliefs, values, and ways of doing things. It is these things one has to replicate. So if you think about it, a simulation has an awful lot to cover. Existing chatbots are pretty thin, shallow things.
A simulation that could learn a person would have to spend a lot of time observing that person's actions (mental outputs and physical ones), and not just recording what happened but trying to understand why it happened. You can see this is difficult. If you see a woman cry, you would have to try to guess what made her react that way. Well, that's a tough call for humans - just ask any boyfriend puzzling out 'what just happened?' And it will be tough for AIs.
When I propose a system (in my world, giving someone a song to sing), my only simulation is the notes I write and my voice to sing the parts, aided in recent years by software that gives a poor but viable imitation of singers. I keep modeling from what I hear and shape it, often lopping off bad branches, until I get something I hadn’t planned on, but which exceeds my expectations!
At VERY low fidelity, you can simulate an entire universe inside a computer by simplifying it so much that you bring the problem down to a manageable size. As an example, there are mechanical computers that simulate the solar system in low fidelity - you could program that into a computer.
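As a hedged sketch of what “low fidelity” can still buy you, here is a two-body Sun–Earth model in plain Python (my own example, with rounded constants): it ignores every other body and every relativistic effect, yet holds a one-year orbit at roughly 1 AU.

```python
# A crude solar-system "universe": one star, one planet, hourly time steps.
import math

G = 6.674e-11                 # gravitational constant, SI units
M_sun = 1.989e30              # kg
x, y = 1.496e11, 0.0          # start 1 AU from the Sun, in metres
vx, vy = 0.0, 29_780.0        # roughly Earth's orbital speed, m/s
dt = 3600.0                   # one-hour steps

for step in range(int(365.25 * 24)):        # one year of hourly steps
    r = math.hypot(x, y)
    ax, ay = -G * M_sun * x / r**3, -G * M_sun * y / r**3
    vx, vy = vx + ax * dt, vy + ay * dt     # semi-implicit Euler: velocity first,
    x, y = x + vx * dt, y + vy * dt         # then position, for better stability
print(round(math.hypot(x, y) / 1.496e11, 3), "AU from the Sun after one year")
```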
No. As long as you don't care how long the simulation takes and you have enough memory, any Turing complete computer can simulate any other.
This is from the Wikipedia page on Turing completeness:
In computability theory, a system of data-manipulation rules (such as a computer's instruction set, a programming language, or a cellular automaton) is said to be Turing-complete or computationally universal if it can be used to simulate any Turing machine (devised by English mathematician and computer scientist Alan Turing). This means that this system is able to recognize or decide other data-manipulation rule sets. Turing completeness is used as a way to express the power of such a data-manipulation rule set. Virtually all programming languages today are Turing-complete.
A related concept is that of Turing equivalence – two computers P and Q are called equivalent if P can simulate Q and Q can simulate P. The Church–Turing thesis conjectures that any function whose values can be computed by an algorithm can be computed by a Turing machine, and therefore that if any real-world computer can simulate a Turing machine, it is Turing equivalent to a Turing machine. A universal Turing machine can be used to simulate any Turing machine and by extension the computational aspects of any possible real-world computer.
To show this, here is an interesting quote I heard a while back. I'm glad I was able to find it:
"When told that Steve Jobs bought a CRAY to help design the next Apple, Seymour Cray said, 'Funny, I am using an Apple to simulate the CRAY-3.'"
I think you are asking about computer simulation. Let's look at a simple case, free fall. We know the equation of motion and the starting conditions. We create a loop; in each step we increase the value of the time variable a little, calculate the position of the body, and draw this position on the screen. As you can see, we can simulate the motion only in finite steps, therefore the motion on screen will not be continuous. So you can see one of the greatest problems of simulation: it has to be quantized in time. There are other problems: complicated equations, or cases where we have no equation at all. Or the behaviour might be chaotic: a small change in conditions gives a huge change in results. Even a simulation of a three-body interaction can be very hard; you can imagine what problems arise in a simulation of a whole Universe.
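Here is the free-fall loop described above as a minimal Python sketch (printing numbers instead of drawing to a screen):

```python
# Free fall, quantized in time: the simulation only "exists" at discrete steps.
g = 9.81                # m/s^2
dt = 0.1                # the time quantum
y0 = 100.0              # drop height in metres
t, y = 0.0, y0

while y > 0.0:
    t += dt                          # advance time by one finite step
    y = y0 - 0.5 * g * t * t         # position from the equation of motion
    print(f"t={t:4.1f} s   y={max(y, 0.0):6.2f} m")   # "draw" the body here
```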
Plenty of long answers to a simply answered question.
No, you cannot build a general purpose system simulator.
To understand why not, consider the interfaces between elements in your system. Are state changes event-driven, continuous, or both? If you think just a bit about the complexity of a general-purpose simulation engine in that context, you will see that the nature of the question being addressed drives the architecture of the simulation implementation.