Creation of new technology using computer modeling. How computer modeling is performed


Computer models are now commonplace. Computer modeling is used everywhere, making the design and production of real systems, machines, mechanisms, goods, and products economical, practical, and effective. Results are simulated before anything is built.

Man has always built models, but with the advent of computer technology, mathematical, computational, and software methods raised the ideas and techniques of modeling to extraordinary heights and broadened their range of applications: from primitive technical tasks to the level of high art and creativity.

A computer model is not only a way to design a more advanced spacecraft or a conceptual system for understanding public consciousness; it is also a real opportunity to assess climate change on the planet or to estimate the consequences of a comet impact several hundred years from now.

Technical Modeling

Today there is hardly a specialist who does not know this kind of program, and the classic products already compete with a dozen more advanced solutions.

Modeling a modern airplane or bicycle ultimately requires more than automating the production of drawings and the preparation of documentation. Producing drawings and documentation is the technical part, the foundation.

The program must also show the real product in real use over time in three-dimensional space: in flight, in motion, in operation, including possible accidents, changes of energy source, negative impacts from humans or nature, corrosion, climate, and other circumstances.

System Modeling

A model of a machine, a product, or a conveyor is a system, but a system with a clear structure and content that has already been manufactured at least once. For each of these there is experience, knowledge, and examples of computer models in use.

Technical reality is just as much a system as the system of relations in society, an advertising campaign, a model of the human psyche, or the circulatory system.

For example, a reliable diagnosis of a disease today can be obtained as:

  • the result of competent actions by a doctor;
  • the conclusion of a computer program that has built a model of the patient's condition.

These two options increasingly lead to the same result.

Man lives in a world of systems, and these systems require decisions, which in turn require initial data: an understanding and perception of the surrounding reality. Without modeling it is impossible to understand the nature of systems or to make decisions.

Only a computer-based mathematical model makes it possible to evaluate how objective and complete our understanding of the original system is, gradually bringing the created virtual image closer to the original.

Abstraction in Modeling

Computer models and simulation are an extremely promising and dynamically developing area of technology. Here, high-tech solutions are an everyday occurrence, and the possibilities of models and simulations can amaze even the most sophisticated imagination.

However, abstract system modeling has not yet been reached. Examples of computer models in use are examples of real models of real systems. For each direction of modeling, for each type of model, each type of product, conveyor, and so on, there is a separate program, or a separate menu item in a program that supports modeling across a relatively wide range of systems.

Software tools are themselves models. The result of a programmer's work is always a model. Whether a program is good or bad, it is always a model for solving a specific problem: it receives initial data and produces a result.

Classical programming means classical models with no abstraction: an exact task, with no variation in dynamics once development is complete. It is like a real machine or product with strict quantitative and qualitative characteristics: once it is done, you use it within the limits of what is available, but get nothing beyond what was built.

Object-oriented programming produces a system model with a claim to abstraction and to dynamic structure and properties, that is, oriented toward creating a dynamic model whose purpose is determined by the environment in which it is applied or by the problem being solved.

Here the model can “live” on after it finds itself in the application area without its creator (author), independently “collaborating” with users.
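A minimal sketch of this idea in code (the class, its names, and the numbers are hypothetical illustrations, not any particular library's API): an object-oriented model that keeps adapting its own state from the data it meets after deployment.

```python
# A minimal sketch of a "living" object-oriented model: it starts with
# default parameters and keeps adapting as users feed it new observations.
# All names and numbers are illustrative, not a specific library's API.

class AdaptiveModel:
    """Running-estimate model that updates itself after deployment."""

    def __init__(self, initial_estimate: float = 0.0, learning_rate: float = 0.1):
        self.estimate = initial_estimate    # current model state
        self.learning_rate = learning_rate  # how quickly the model adapts

    def observe(self, value: float) -> None:
        """Incorporate one new observation (exponential moving average)."""
        self.estimate += self.learning_rate * (value - self.estimate)

    def predict(self) -> float:
        """Return the model's current best estimate."""
        return self.estimate


model = AdaptiveModel(initial_estimate=100.0)
for demand in [110.0, 95.0, 120.0, 130.0]:  # data arriving after deployment
    model.observe(demand)
print(f"adapted estimate: {model.predict():.1f}")
```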

Modeling: the essence of the process

The concept of a computer model is presented today in various forms and opinions, but all of them agree on one thing: the work of the program, and in this context the model, is equivalent to the result of the actions of a specialist working in the modeling environment of a particular program.

There are three types of models: cognitive, pragmatic and instrumental.

In the first case, modeling expresses above all the desire to obtain a model that embodies knowledge: of a theory, of a global process. A pragmatic model gives an idea of practical actions: a worker, a production management system, a product, a machine. The third type is understood as an environment for constructing, analyzing, and testing all models in general.

Typically, computer modeling is the activity of a specialist in constructing and studying a material or ideal (virtual) object that replaces the system under study while adequately reflecting its essential aspects and its qualitative and quantitative characteristics.

Species diversity of simulated systems

In the field of modeling, as on all frontiers of high technology, science, and programming, there are many opinions on how to classify and define the variety of modeled systems.

But experts and specialists always agree on one thing: the types of computer models can be distinguished by objective criteria:

  • time;
  • method of presentation;
  • the nature of the modeled aspect;
  • level of uncertainty;
  • implementation option.

By time, models are static or dynamic. The former can be refined as much as you like, but dynamic models evolve and differ at each moment in time. The method of presentation is usually understood as either discrete or continuous. The nature of the modeled aspect may be informational, structural, or functional (cybernetic).

Introducing uncertainty parameters into the modeled system is in many cases not only justified but a direct consequence of scientific achievements in related fields of knowledge. For example, building a climate model for a particular geographic region is not feasible without many stochastic factors.
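As a toy sketch of how such stochastic factors enter a dynamic model (the trend, noise level, and random-walk form are invented for illustration and are not a real climate model), a deterministic trend can be perturbed by random terms and averaged over many Monte Carlo runs:

```python
# Toy illustration of stochastic factors in a dynamic model: a deterministic
# temperature trend perturbed by random yearly noise, averaged over many
# Monte Carlo runs. All numbers are invented for illustration only.
import random

def simulate_run(years: int, trend_per_year: float = 0.02, noise_sd: float = 0.5) -> float:
    """One stochastic trajectory: temperature anomaly after `years` years."""
    anomaly = 0.0
    for _ in range(years):
        anomaly += trend_per_year + random.gauss(0.0, noise_sd)
    return anomaly

runs = [simulate_run(years=50) for _ in range(10_000)]
mean = sum(runs) / len(runs)
print(f"mean 50-year anomaly over 10000 runs: {mean:.2f}")
```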

Modern modeling tools

Modeling today embodies the experience of many decades of the computer industry's development, which has captured in algorithms and programs many centuries of modeling in general and of mathematical modeling in particular.

Popular software is represented by a small family of widely known products: AutoCAD, 3D Max, Wings 3D, Blender 3D, SketchUp. There are many custom implementations based on these products.

Besides the well-known products there are significant specialized markets, for example the geographical, cartographic, and geodetic market, or the film and video industry, represented by a large number of little-known software products. Within their fields of competence, the GeoSoft, TEPLOV, Houdini, and similar families are little inferior to the others in quality, usefulness, and efficiency.

When choosing a software tool, the best approach is to assess the area of the proposed modeling and the environment in which the future model will exist. This will help determine the necessary tools.

Small and creative models

Although there is “little creativity left” in the design of a modern airliner, sports car, or spacecraft, programming and the organization of business processes have become the subject of the closest attention and the target of the most expensive and complex modeling efforts.

Modern business is not only hundreds of employees and pieces of equipment but also thousands of production and social connections inside and outside the company. This is a completely new and little-explored direction: cloud technologies, organization of privileged access, protection from malicious attacks and from unlawful actions of employees.

Modern programming has grown so complex that it has become a field with a life of its own. A software product created by one team of developers can itself become an object of modeling and study for another team.

Authoritative example

One can imagine the Windows system or the Linux family as a subject of modeling and commission someone to build adequate models. But the practical value here is so low that it is cheaper simply to work with these systems and ignore their shortcomings. Their developers have their own idea of the development path they need and are not going to deviate from it.

With regard to databases and the dynamics of their development, the opposite can be said. Oracle is a big company: many ideas, thousands of developers, hundreds of thousands of solutions brought to perfection.

But Oracle is, above all, a foundation and a powerful reason for modeling, and it appears that investment in this process would bring a tremendous return.

Oracle took the lead from the very beginning and has been second to none in the field of creating databases, ensuring a responsible attitude toward information: its protection, migration, storage, and so on. Everything required for information service tasks is in Oracle.

The Downside of Oracle

Investment and the work of the best developers to solve pressing problems are an objective necessity. Over the many decades of its leadership, Oracle has completed hundreds of urgent tasks and thousands of implementations and updates.

The scope of information in the context of computer applications has not changed from the 1980s to this day. Conceptually, the databases of the early computer era and those of today are twin brothers, differing in their level of security and implemented functionality.

To achieve the current level of “security and implemented functionality,” Oracle has delivered, in particular:

  • compatibility of large flows of heterogeneous information;
  • data migration and transformation;
  • application verification and testing;
  • generalized relational functionality for universal access;
  • migration of data/specialists;
  • transformation of the fundamental principles of corporate databases into a distributed Internet environment;
  • maximum integration, aggregators, systematization;
  • determining the range of feasibility, eliminating duplicative processes.

This is only a small fraction of the topics that make up the multi-volume descriptions of Oracle's existing software products. In fact, the range of offered solutions is much wider and more powerful. All of them are supported by Oracle and thousands of qualified specialists.

Income model

If in the 1980s Oracle had taken the path of modeling rather than concrete capacity building in the form of real, complete solutions, the situation would have turned out quite differently. By and large, a person or an enterprise does not need that much from a computer information system, and studying the computer model itself is not what interests them.

What is always needed is simply a solution to the problem at hand. How that solution is obtained is of no concern to the consumer. The consumer has no interest in knowing what data migration is, or how application code is tested so that it works on any data and, in an unforeseen situation, calmly reports the problem instead of showing a blue screen or hanging silently.

More can be achieved by modeling each new need programmatically than by investing in yet another specialist who applies his intelligence and knowledge to write the next piece of code.

Even the best specialist produces, first of all, static code: a recording of the best knowledge in the form of a monument to its author. It is just code. The result of the best specialist's work does not develop on its own; its development requires new developers, new authors.

Probability of implementing an income model

Developers and the IT industry as a whole have stopped treating dynamics, knowledge, and artificial intelligence with the enthusiasm that accompanied the waves of interest of previous years.

Formally, many associate their products or areas of work with the topic of artificial intelligence, but in fact they are implementing strictly defined algorithms and cloud solutions, attaching importance to security and protection from all kinds of threats.

Meanwhile, a computer model is dynamics, and computer modeling is its consequence. This objective circumstance has not been canceled, and it cannot be canceled. The example of Oracle shows better than any other how labor-intensive, expensive, and inefficient forced modeling is: really working models built by the labor of many thousands of specialists rather than automatically, by means of the designed information system itself, as models in dynamics, in real practice.

Computer modeling is widely used in various branches of science and technology, gradually replacing real experiments. It has become so firmly established in our lives that it is already hard to imagine abandoning this method of studying the real world. The explanation is simple: this process achieves significant results in the shortest possible time and penetrates areas of reality unreachable for humans.

A computer makes it possible to create a model that, under certain assumptions, has the properties of a real object or process, and the research is then carried out on this created model. To conduct such research, you need to understand precisely why it is being carried out, what its purpose is, and which properties and aspects of the object under study are of interest. Only then can you be confident of a positive result.

Like any other process, computer modeling is built on certain principles, among which the following can be distinguished:

· the principle of information sufficiency: if there is not enough information about the real process or object, research by this method will most likely not succeed;

· the principle of feasibility: the created model must make it possible to achieve the goals set for the researcher;

· the principle of multiplicity of models: to study all the properties of a real object, several models must be developed, since it is impossible to combine all the real properties in one;

· the principle of aggregation: a complex object is presented as a set of separate blocks that can be rearranged in certain ways;

· the principle of parameterization, which allows the parameters of a certain subsystem to be replaced by numerical values; this reduces the volume and duration of modeling but also reduces the adequacy of the resulting model, so its application must be fully justified (see the sketch after this list).
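A minimal sketch of the last two principles in code (the production blocks and numbers are invented for illustration): a complex line is aggregated into rearrangeable blocks, and one subsystem is parameterized by a single numeric coefficient instead of being modeled in detail.

```python
# Sketch of aggregation and parameterization (illustrative, invented numbers).
# A production line is aggregated into blocks; the "oven" subsystem is not
# modeled in detail but parameterized by a single throughput coefficient.

def conveyor(items: int) -> int:
    return items                      # detailed block: passes items through

def oven(items: int, efficiency: float = 0.95) -> int:
    # Parameterized subsystem: internal physics replaced by one number.
    return int(items * efficiency)

def packer(items: int) -> int:
    return items

# Aggregation: the line is an ordered list of blocks, easy to rearrange.
line = [conveyor, oven, packer]

items = 1000
for block in line:
    items = block(items)
print(f"items leaving the line: {items}")
```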

Computer modeling must be performed in a strictly defined sequence. At the first stage, the goal is determined and the model is developed. The model is then formalized so that it can be implemented in software. After this, you can plan model experiments and carry out the planned ones. Once all the previous steps are complete, the results can be analyzed and interpreted.

Recently, computer modeling of physical processes has been performed using various software environments; a large number of such works are done in Matlab. Such studies make it possible to explore physical processes that humans cannot observe in reality.

Computer modeling is widely used in industry. With its help, new products are developed, new machines are designed, their operating conditions are set, and virtual tests are carried out. If the model is sufficiently adequate, the results of real tests can be expected to match the virtual ones. Beyond studying the properties of a system, the appearance of the finished product can be developed on a computer and its parameters set. This minimizes the defects that could arise from inaccurate engineering calculations.

There is no doubt that computer modeling of physical processes has significantly accelerated the development of technical products while saving developers a great deal of money on building test prototypes. With modern computing power and software, engineers can simulate the operation of individual components and assemblies of complex systems, reducing the number of physical tests required before launching a new product. Manufacturers can also estimate development costs after CAD modeling rather than waiting for physical testing of the product to finish.

When launching new products, modern industry faces problems such as the time and cost of developing a new product. In the automotive and aerospace industries it is almost impossible to do without CAD modeling, since modeling significantly speeds up development and reduces costs, which matters greatly in the modern market. Historically, the emergence of modern computing systems capable of simulating the dynamic properties of objects under various influences pushed the modernization of physical test stands, as well as the development of test methods, into the background. Many organizations choose modeling because it requires minimal cost and minimal development time. However, in some studies only physical testing of the product can provide an accurate answer. Without close interaction between electronic models and physical testing, many organizations may become overly dependent on computer models, which, when misused, can later lead to unexpected failures in expensive equipment.

In the automotive industry, computer modeling has become integral, since modern vehicle designs have grown far more complex and computer modeling systems have improved significantly. Unfortunately, however, many manufacturers reduce physical testing of products to a minimum, relying on computer simulation results.

Physical testing processes have not kept pace with computer modeling in improving their techniques. Test engineers usually try to carry out the minimum necessary tests on the product. The result is more frequent repetition of tests to obtain more reliable results or to confirm them. Relying purely on computer modeling without physical testing can lead to very serious consequences, since the mathematical model of the product, from which its dynamic properties are calculated, is created under certain assumptions, and in real operation the product may behave somewhat differently from what was shown on the monitor.

Computer modeling has a symbiotic relationship with physical testing of equipment, which (unlike a computer model) yields experimental data. A lag in testing technologies for finished devices, against such growth in computing capability, can therefore turn into false savings on experimental samples, with subsequent problems in finished products. The accuracy of a model depends directly on the input data describing the model's behavior (its mathematical description) under various conditions.

Of course, the elements of a model cannot cover every possible variant and condition of behavior of every component: the calculations and the mathematical model would become simply enormous. To simplify the mathematical model, assumptions are made that “should not” significantly affect the operation of the mechanism. Unfortunately, reality is always much harsher. For example, a mathematical model cannot predict how a device will behave if there are microcracks in the material, or if a sudden change in the weather leads to a completely different load distribution in the structure. Experimental data and calculated data quite often differ from each other, and this must be remembered.

Physical testing of equipment has another important advantage: it can point engineers to flaws in their mathematical models, and it provides a good opportunity to discover new phenomena and improve old calculation methods. After all, if you put variables into a mathematical formula, the result depends on the variables, not on the formula. The formula always remains constant; only a real physical test can supplement or change it.

The emergence of new materials in all branches of modern industry creates additional problems for computer modeling. If engineers kept using only time-tested materials and their refined mathematical descriptions, there would be far fewer modeling problems. But new materials require mandatory physical testing of the finished products made with them, and new materials appear on the market ever more often, with the trend only growing.

For example, the aerospace and automobile industries quickly adopted composite materials because of their good specific strength. One of the main problems of computer modeling here is the model's inability to accurately predict the behavior of a material that has certain performance drawbacks compared with the aluminum, steel, plastic, and other materials long used in these industries.

Validation of computer models for composite materials is critical during the design phase. After the calculations are carried out, a test stand must be assembled with the real part. When conducting physical tests to measure deformation and load distribution, engineers focus on the critical points determined by the computer model. Strain gauges are used to collect information at these critical points. This process monitors only the expected issues, which can create blind spots in the testing process. Without comprehensive study, the validity of a model may be confirmed when in fact it is not valid.


There is also the problem of gradually aging measurement technologies: strain gauges and thermocouples, for example, cannot cover the entire required measurement range. For the most part, traditional sensors can measure the required quantity only in certain areas, without giving deep insight into what is happening. As a result, engineers are forced to rely on pre-modeled processes that show vulnerabilities and force testers to pay increased attention to one node or another of the system under test. But, as always, there is a catch: this approach works well for time-tested, well-studied materials, yet for designs that include new materials it can be harmful. Therefore, design engineers in all industries are trying to update old measurement methods as much as possible and to introduce new ones that allow more detailed measurements than older sensors and techniques.

Strain gauge technology has remained virtually unchanged since its invention decades ago. New technologies, such as fiber optic sensing, can measure full-field strain and temperature. Unlike legacy strain gauge technologies, which collect information only at critical points, fiber optic sensors collect continuous strain and temperature data. These technologies are much more useful in physical testing, since they allow engineers to observe the behavior of the structure under study at the critical points and between them.

For example, fiber optic sensors can be embedded inside composite materials during manufacture to better understand curing processes. A common defect, for example, is wrinkling in one of the layers of material, which causes internal mechanical stress. These processes are still poorly understood, and there is very little information about stress and deformation inside composite materials, which makes it almost impossible to apply computer modeling to them.

Outdated strain gauge technologies are quite capable of detecting residual strain in composite materials, but only when the strain field reaches the surface and the sensor is installed in exactly the right place. Spatially continuous measurement technologies such as fiber optics, by contrast, can measure full-field strain data at the critical points and between them. As mentioned earlier, fiber optic sensors can also be embedded in composite materials to study internal processes.
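A toy sketch of why continuous measurement matters (the strain profile, peak location, and sensor positions are all invented): sampling a strain field only at a few gauge locations can miss a narrow peak lying between them, while dense fiber samples catch it.

```python
# Toy comparison of point strain gauges vs. a distributed fiber optic sensor.
# The strain profile and sensor positions are invented for illustration.
import math

def strain(x: float) -> float:
    """Hypothetical strain field along a 1 m part, narrow peak at x = 0.37."""
    return 100.0 + 400.0 * math.exp(-((x - 0.37) ** 2) / 0.001)

gauge_positions = [0.1, 0.5, 0.9]                  # a few discrete gauges
fiber_positions = [i / 1000 for i in range(1001)]  # ~1 mm fiber resolution

max_gauge = max(strain(x) for x in gauge_positions)
max_fiber = max(strain(x) for x in fiber_positions)
print(f"peak strain seen by gauges: {max_gauge:.0f} microstrain")
print(f"peak strain seen by fiber:  {max_fiber:.0f} microstrain")
```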

The development process is considered complete when the product has passed all tests and shipping to consumers has begun. However, the modern level of technology allows manufacturers to receive the first reports on their products as soon as users start using them. As a rule, work on modernizing a serial product begins immediately after its release.

Computer models and physical tests go hand in hand; they simply cannot exist without each other. Further development of technology requires maximum interaction between these design tools. Advancing physical test data collection requires large initial investments, but the return will also be pleasing. Unfortunately, most developers try to get their benefits here and now and do not care about long-term prospects, whose benefits are usually much greater.

Those looking to secure the long-term future of their products will implement more innovative and reliable testing methodologies and elements, such as fiber optic measurements. The combination of computer modeling and physical testing technologies will only grow stronger in the future, because they complement each other.

Computer modeling is used in astrophysics, mechanics, chemistry, biology, economics, sociology, meteorology, and other sciences, as well as in applied problems in various fields of radio electronics, mechanical engineering, the automotive industry, and so on. Computer models are used to obtain new knowledge about the modeled object or to approximate the behavior of systems too complex for analytical study.

The construction of a computer model is based on abstraction from the specific nature of the phenomena or the original object being studied, and consists of two stages: first creating a qualitative model, then a quantitative one. Computer modeling consists of conducting a series of computational experiments on a computer, with the purpose of analyzing, interpreting, and comparing the modeling results with the real behavior of the object under study and, if necessary, subsequently refining the model, and so on.

The main stages of computer modeling include: setting the problem and determining the goal, developing and formalizing the model, implementing it in software, planning and conducting model experiments, and analyzing and interpreting the results.

There are analytical and simulation modeling. In analytical modeling, mathematical (abstract) models of a real object are studied in the form of algebraic, differential, and other equations that admit an unambiguous computational procedure leading to an exact solution. In simulation modeling, mathematical models take the form of an algorithm (or algorithms) that reproduces the functioning of the system under study by sequentially executing a large number of elementary operations.
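A minimal sketch of the distinction (exponential decay is an invented example; the constants are illustrative): the analytical model is solved exactly in closed form, while the simulation model reproduces the same process by a large number of elementary steps.

```python
# Analytical vs. simulation modeling of exponential decay dN/dt = -k*N.
# The decay constant and step size are invented for illustration.
import math

k, n0, t_end = 0.3, 1000.0, 10.0

# Analytical model: exact closed-form solution N(t) = N0 * exp(-k*t).
analytical = n0 * math.exp(-k * t_end)

# Simulation model: reproduce the process by many small elementary steps.
dt, n = 0.001, n0
steps = int(t_end / dt)
for _ in range(steps):
    n -= k * n * dt          # one elementary Euler step

print(f"analytical: {analytical:.2f}, simulated: {n:.2f}")
```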

Practical use

Computer modeling is used for a wide range of tasks, such as:

  • analysis of the distribution of pollutants in the atmosphere
  • designing noise barriers to combat noise pollution
  • vehicle design
  • flight simulators for pilot training
  • weather forecasting
  • emulation of the operation of other electronic devices
  • forecasting prices in financial markets
  • study of the behavior of buildings, structures and parts under mechanical load
  • predicting the strength of structures and their destruction mechanisms
  • design of production processes, for example chemical ones
  • strategic management of the organization
  • studying the behavior of hydraulic systems: oil pipelines, water pipelines
  • modeling of robots and automatic manipulators
  • modeling of urban development scenarios
  • transport systems modeling
  • simulated crash tests
  • modeling the results of plastic surgery

Different areas of application place different demands on the reliability of the results obtained from computer models. Modeling buildings and aircraft parts requires high precision and confidence, while models of the evolution of cities and socio-economic systems are used to obtain approximate or qualitative results.

Computer simulation algorithms

  • Component circuit method
  • State variable method
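As one hedged sketch of the state-variable method (component values invented; explicit Euler chosen for brevity), a circuit's dynamics are written as first-order equations in its state variables, here the capacitor voltage of a series RC circuit, and integrated step by step:

```python
# Sketch of the state-variable method for circuit simulation (invented values).
# For a series RC circuit driven by voltage u, the state variable is the
# capacitor voltage v, with state equation dv/dt = (u - v) / (R * C).

R, C = 1_000.0, 1e-6         # 1 kOhm, 1 uF  ->  time constant 1 ms
u = 5.0                      # step input, volts
dt, t_end = 1e-6, 5e-3       # 1 us step, simulate 5 ms

v, t = 0.0, 0.0
while t < t_end:
    dv_dt = (u - v) / (R * C)    # evaluate the state equation
    v += dv_dt * dt              # explicit Euler integration step
    t += dt

print(f"capacitor voltage after {t_end*1e3:.0f} ms: {v:.3f} V (expected ~{u:.1f} V)")
```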



The modeling method as a means of scientific research began to be used in ancient times and gradually captured new areas of scientific knowledge: technical design, construction and architecture, astronomy, physics, chemistry, biology and, finally, information technology. For a long time, modeling methodology was developed independently by individual sciences; there was no unified system of concepts or common terminology. Only gradually did the role of modeling as a universal method of scientific knowledge come to be recognized.

The term “model” is widely used in various spheres of human activity and has many meanings.

A model is a material or mentally imagined object that, in the process of research, replaces the original object so that its direct study provides new knowledge about the original. Modeling is understood as the process of constructing, studying, and applying models. It is closely related to such categories as abstraction, analogy, and hypothesis. The modeling process necessarily includes constructing abstractions, drawing inferences by analogy, and forming scientific hypotheses.

The main feature of modeling is that it is a method of indirect cognition using proxy objects. The model acts as a kind of cognition tool that the researcher puts between himself and the object and with the help of which he studies the object of interest to him. It is this feature of the modeling method that determines the specific forms of using abstractions, analogies, hypotheses, and other categories and methods of cognition.

The need to use the modeling method is determined by the fact that many objects (or problems related to these objects) are either impossible to directly study, or this research requires a lot of time and money.

The modeling process includes three elements:

1) subject (researcher),

2) object of study,

3) a model that mediates the relationship between the cognizing subject and the cognizable object.

Suppose some object A exists or needs to be created. We construct (materially or mentally), or find in the real world, another object B: a model of object A. The stage of constructing a model presupposes some knowledge about the original object. The cognitive capabilities of the model stem from the fact that it reflects essential features of the original. The question of the necessary and sufficient degree of similarity between original and model requires specific analysis. Obviously, the model loses its meaning both if it is identical to the original and if it differs excessively from the original in all significant respects.

Thus, the study of some aspects of the modeled object comes at the cost of declining to reflect other aspects. Any model therefore replaces the original only in a strictly limited sense. It follows that for one object several “specialized” models can be built, each concentrating on certain aspects of the object under study or characterizing it with a different degree of detail.

Fig. 1 – Stages of computer modeling

The stages of computer modeling can be presented in the form of a diagram (Fig. 1).

Modeling begins with the object of study. At the first stage, the laws governing the research are formulated: information is separated from the real object, essential information is retained, and unimportant information is discarded. The transformation of information is determined by the problem being solved. Information essential for one task may be unimportant for another. The loss of essential information leads to an incorrect solution or prevents obtaining a solution at all. Taking irrelevant information into account causes unnecessary difficulties and sometimes creates insurmountable obstacles to a solution. The transition from a real object to information about it is meaningful only once a task has been set; at the same time, the formulation of the problem is refined as the object is studied. Thus, at the first stage, the targeted study of the object and the clarification of the task proceed in parallel and independently of each other. At this stage, information about the object is also prepared for processing on a computer: a so-called formal model of the phenomenon is constructed, which contains:

    a set of constants that characterize the modeled object as a whole and its components, called the static (constant) parameters of the model;

    a set of variables whose values can be changed to control the behavior of the model, called the dynamic (control) parameters;

    formulas and algorithms connecting the quantities in each state of the modeled object;

    formulas and algorithms describing the process of changing the states of the modeled object.
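A minimal sketch of such a formal model in code (a body cooling in air is an invented example; all names and numbers are illustrative): the constants, control parameters, state formula, and state-transition algorithm are kept explicit.

```python
# Sketch of a formal model (invented example: a body cooling in air).
# It separates the four ingredients listed above: constants, control
# parameters, formulas within a state, and a state-transition algorithm.
from dataclasses import dataclass

@dataclass
class CoolingModel:
    # constant (static) parameters of the model
    heat_transfer_coeff: float = 0.05   # per second
    # dynamic (control) parameters, changeable between experiments
    ambient_temp: float = 20.0          # degrees C
    # current state of the modeled object
    temp: float = 90.0                  # degrees C

    def excess_temperature(self) -> float:
        """Formula connecting quantities within one state."""
        return self.temp - self.ambient_temp

    def step(self, dt: float) -> None:
        """Algorithm describing the change of state (Newton's law of cooling)."""
        self.temp -= self.heat_transfer_coeff * self.excess_temperature() * dt

model = CoolingModel()
for _ in range(600):       # simulate 10 minutes in 1-second steps
    model.step(dt=1.0)
print(f"temperature after 10 min: {model.temp:.1f} C")
```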

At the second stage, the formal model is implemented on a computer: suitable software is selected, an algorithm for solving the problem is built, a program implementing this algorithm is written, and the program is then debugged and tested on specially prepared test models. Testing is the process of executing a program with the intent of finding errors. Selecting a test model is something of an art, although some basic testing principles have been developed and are successfully applied. Testing is a destructive process, so a test is considered successful if it finds an error. A computer model can often be checked against the original, to see how well or poorly it reflects the basic properties of the object, with the help of simple model examples whose results are known in advance.
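A small sketch of that verification step (the simulator and the test case are invented): the program is run on a simple model example whose result is known in advance, and the simulated answer is compared with the closed-form one.

```python
# Sketch of verifying a computer model against a case with a known answer.
# Invented example: test an Euler free-fall simulator against s = g*t**2/2.

def simulate_fall(t_end: float, dt: float = 1e-4, g: float = 9.81) -> float:
    """Simulate free fall from rest; return distance fallen after t_end."""
    velocity, distance, t = 0.0, 0.0, 0.0
    while t < t_end:
        velocity += g * dt
        distance += velocity * dt
        t += dt
    return distance

def test_against_exact_solution() -> None:
    g, t = 9.81, 2.0
    exact = 0.5 * g * t**2            # known closed-form result: 19.62 m
    simulated = simulate_fall(t)
    assert abs(simulated - exact) < 0.01, (simulated, exact)

test_against_exact_solution()
print("model reproduces the known analytical case")
```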

At the third stage, working with the computer model, we directly carry out a computational experiment: we study how the model behaves in one case or another, with particular sets of dynamic parameters, and try to predict or optimize something depending on the task.

The result of the computational experiment is an information model of the phenomenon: graphs of the dependence of some parameters on others, diagrams, tables, a demonstration of the phenomenon in real or virtual time, and so on.
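A small sketch of such a computational experiment (an invented cooling example; all numbers illustrative): one dynamic parameter is swept and the dependence of the result on it is tabulated.

```python
# Sketch of a computational experiment: sweep a control parameter and
# tabulate the result. Invented example: cooling time vs. ambient temperature.

def time_to_cool(start: float, target: float, ambient: float,
                 coeff: float = 0.05, dt: float = 1.0) -> float:
    """Seconds for a body to cool from `start` to `target` (Newton's law)."""
    temp, t = start, 0.0
    while temp > target:
        temp -= coeff * (temp - ambient) * dt
        t += dt
    return t

print("ambient C | seconds to reach 30 C")
for ambient in (0, 10, 20):                 # sweep the control parameter
    t = time_to_cool(start=90.0, target=30.0, ambient=ambient)
    print(f"{ambient:9d} | {t:6.0f}")
```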

Modeling is a cyclical process. This means that the first four-stage cycle may be followed by a second, a third, and so on. With each cycle, knowledge about the object under study expands and is refined, and the initial model is gradually improved. Deficiencies discovered after the first modeling cycle, caused by poor knowledge of the object or by errors in model construction, can be corrected in subsequent cycles. Thus, the modeling methodology holds great potential for self-development.

Computer modeling, which arose as a branch of mathematical modeling with the development of information technology, has become an independent and important area of computer application. Today, computer modeling is one of the main methods of cognition in scientific and practical research; major scientific problems can no longer be solved without it. A technology for studying complex problems has been developed, based on constructing and analyzing, with the help of computer technology, a mathematical model of the object under study. This research method is called a computational experiment. Computational experiments are used in almost all branches of science: physics, chemistry, astronomy, biology, ecology, and even such purely human sciences as psychology, linguistics, and philology. Conducting a computational experiment has a number of advantages over the so-called natural experiment:

    the computational experiment does not require complex laboratory equipment;

    significant reduction in time spent on the experiment;

    the ability to freely control parameters, change them arbitrarily, up to giving them unrealistic, implausible values;

    the possibility of conducting a computational experiment where a full-scale experiment is impossible: because the phenomenon under study is remote in space (astronomy), because it extends greatly in time (biology), or because the experiment would introduce irreversible changes into the process being studied.

In these cases, computer modeling is used. Computer modeling is also widely used for educational and training purposes. Computer modeling is the most adequate approach to studying natural science subjects, and studying computer modeling itself opens wide opportunities for understanding the connections between computer science, mathematics, and the other sciences, natural and social. A teacher can use ready-made computer models to demonstrate the phenomenon being studied, be it the movement of astronomical objects, the motion of atoms, the structure of a molecule, or the growth of microbes. A teacher can also ask students to develop specific models: by modeling a specific phenomenon, a student not only masters specific educational material but acquires the skills of posing problems and tasks, predicting research results, making reasoned estimates, identifying major and minor factors in building models, choosing analogies and mathematical formulations, using computers to solve problems, and analyzing computational experiments. Thus, the use of computer modeling in education brings the methodology of educational activity closer to that of scientific research.

The concept of modeling is very broad; it is not limited to mathematical modeling. The origins of modeling lie in the distant past. A rock painting on a cave wall of a mammoth pierced by a spear can be seen as a model of a successful hunt created by an ancient artist.

Elements of modeling are often present in children's games: a favorite children's pastime is to model objects and relationships from adult life using whatever means are at hand. Children grow up, and humanity matures. As humanity learns about the surrounding world, its models become more abstract and lose their external resemblance to real objects. Models reflect underlying patterns established through targeted research. A wide variety of objects can act as models: images, diagrams, maps, graphs, computer programs, mathematical formulas, and so on. If we replace a real object with mathematical formulas, say, describing the motion of a certain body by a system of nonlinear equations according to Newton's second law, or describing the process of heat propagation by a second-order differential equation according to the law of heat conduction, we speak of mathematical modeling; if we replace a real object with a computer program, we speak of computer modeling.
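A small sketch connecting the two (a falling body with quadratic air drag is an invented example; mass and drag constants are illustrative): Newton's second law m dv/dt = mg - c v^2 is the mathematical model, and its step-by-step numerical solution on a computer is the computer model.

```python
# Mathematical model: Newton's second law for a falling body with quadratic
# drag, m*dv/dt = m*g - c*v**2. Computer model: its numerical integration.
# Mass and drag coefficient are invented for illustration.

m, g, c = 80.0, 9.81, 0.25      # kg, m/s^2, kg/m
dt, v, t = 0.01, 0.0, 0.0

while t < 30.0:                  # integrate for 30 simulated seconds
    a = g - (c / m) * v**2       # acceleration from the model equation
    v += a * dt                  # Euler step for velocity
    t += dt

terminal = (m * g / c) ** 0.5    # analytical terminal velocity for comparison
print(f"simulated v after 30 s: {v:.1f} m/s, terminal velocity: {terminal:.1f} m/s")
```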

But no matter what acts as a model, the process of replacing a real object with a model object, in order to study the real object or to convey information about its properties, is always present. This process is called modeling. The replaced object is called the original; the replacing object is called the model (Fig. 2).

Fig. 2 – Modeling elements






