The content-based approach to measuring information


Class: 10.

The purpose of the lesson: To teach how to measure the information volume of a message through a meaningful approach.

Lesson Objectives:

  • Educational: to teach how to measure the information volume of a message using the content-based approach.
  • Developmental: development of thinking, speech, fine motor skills, and imaginative perception.
  • Character-building: instilling a careful attitude toward information and technology, personal responsibility for the results of one's work, accuracy, perseverance, and self-discipline.

Lesson type: Explanation of new material with elements of a workshop.

Textbooks:

  • “Informatics 10” (basic course), ed. N.V. Makarova, “Peter”, 2003.
  • Ugrinovich N.D. Computer science. Basic course 10th grade. - M.: Publishing house "BINOM".

Basic concepts:

  • Half division method;

Course of the lesson

I. Organizational moment

Setting the mood for the work environment.

II. New material

In the last lesson we learned to distinguish informative messages from uninformative ones.

We found out that to determine the amount of information in a message about the occurrence of one event out of more than two equally possible ones, we need the following formulation: "A message that reduces uncertainty by a factor of 2 contains 1 bit of information." We analyzed the coin-toss problem: "Before the toss there were two equally probable outcomes, which determines the uncertainty of the situation. In other words, uncertainty is the number of possible events. After receiving a message about the result, only one option remained. By how much has the uncertainty of the situation decreased?"

Now let's solve the problem of determining the amount of information in a message using the halving (dichotomy) method, in which each search step discards exactly half of the remaining options. We will organize the work as a game, "Guess the answer".

For example, I think of a shelf on which a book lies, but I do not tell you which one. You need to determine which of the 8 shelves the book is on. Questions must be asked in such a way that each answer ("yes" or "no") halves the uncertainty. Then the message about the guessed object carries as many bits of information as the number of questions asked. During the game, Table 2 is filled in, establishing the relationship between the number of events and the amount of information in the message.

Analyzing the solutions of the previous problems, we introduce notation and derive R. Hartley's formula. For example, the chain of reasoning could be as follows:

  1. When guessing the grade, two questions were asked, each of which halved the uncertainty of the situation, and there were four possible options in total. Formalizing the reasoning: 2 · 2 = 4, i.e. 2^2 = 4.
  2. When guessing the location of the book, three questions were asked, each of which halved the uncertainty of the situation, and there were eight possible options in total. Formalizing the reasoning: 2 · 2 · 2 = 8, i.e. 2^3 = 8.
  3. From this we can derive the formula 2^i = N, where i is the amount of information in the message and N is the number of options (events).
  4. Using the resulting formula for the coin toss: 2^1 = 2, so i = 1 bit.

The number 2 in the formula reflects the halving of uncertainty, in accordance with the definition of a bit. Using the formula, we fill out a table of integer powers of two up to 2^10 = 1024. The table establishes the relationship between the amount of information in a message (i) and the number of equally probable events (N), and supports students in solving problems.
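This table is easy to generate programmatically; a minimal Python sketch (the range 1..10 mirrors the lesson's table up to 2^10):

```python
# Table of integer powers of two: i bits <-> N equally probable events.
table = {i: 2 ** i for i in range(1, 11)}

for bits, events in table.items():
    print(f"i = {bits:2d} bits  ->  N = {events} events")
```

The last row, i = 10, gives N = 1024, matching 2^10 = 1024 above.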

Let's create a general diagram:

Let's solve the problem using an example.

Task 1. Classes can take place in one of the rooms, numbered from 1 to 16. How much information does the teacher's message contain that classes will be held in room No. 7?

III. Summarizing

Today we studied:

  • Half division method;
  • Measuring the amount of information in a message in two ways: using a formula and the halving method,
  • Measuring the amount of information in a message over several actions,
  • Measuring the number of events if the information volume of the message is known.

IV. Homework

Solve the problem: There are 16 red apples in a bag. How much information does the message that you got a red apple contain?



1) A person receives a message about some event; the uncertainty of the person's knowledge about the expected event is known in advance. The uncertainty of knowledge can be expressed either by the number of possible variants of the event or by the probabilities of the expected variants. 2) As a result of receiving the message, the uncertainty of knowledge is removed: one of the possible variants has been selected. 3) A formula calculates the amount of information in the received message, expressed in bits.


The formula used to calculate the amount of information depends on the situation, of which there can be two:

  • 1. All possible variants of the event are equally probable; their number is finite and equal to N.

  • 2. The probabilities (p) of the possible variants of the event differ and are known in advance: (p_i), i = 1..N. Here, as before, N is the number of possible variants of the event.

Equally probable events. If the letter i denotes the amount of information in a message that one of N equally probable events occurred, then the quantities i and N are related by Hartley's formula:

2^i = N

1 bit is the amount of information in a message about one of two equally probable events.

Hartley's formula is an exponential equation. If i is an unknown quantity, then the solution to this equation will be:

i = log₂N

These two forms of the formula are equivalent.
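Both forms of Hartley's formula can be checked with Python's standard math module; a small sketch (the function name is illustrative):

```python
import math

def hartley_bits(n_events: int) -> float:
    """i = log2(N): amount of information in a message about one of
    N equally probable events (Hartley's formula)."""
    return math.log2(n_events)

print(hartley_bits(2))   # coin toss
print(hartley_bits(8))   # eight shelves
```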


  • Example 1. How much information does the message that a queen of spades was drawn from a deck of cards contain?

Solution: There are 32 cards in the deck. In a shuffled deck, drawing any particular card is an equally probable event. If i is the amount of information in the message that a specific card (say, the queen of spades) was drawn, then from Hartley's equation:

2^i = 32 = 2^5

From here: i = 5 bits.


  • Example 2. How much information does the message that a six-sided die landed with the number 3 face up contain?

Solution: Considering each face to be an equally probable outcome, we write Hartley's formula:

2^i = 6.

From here: i = log₂6 ≈ 2.58496 bits.


Unequally probable events (probabilistic approach). If the probability of some event is p, and i (in bits) is the amount of information in the message that this event occurred, then these quantities are related by the formula:

2^i = 1/p

Solving this exponential equation for i, we get:

i = log₂(1/p)  (Shannon's formula)
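A sketch of Shannon's formula for a single event of probability p, again using the standard math module (the function name is illustrative):

```python
import math

def info_bits(p: float) -> float:
    """i = log2(1/p): amount of information in a message that an event
    of probability p occurred (Shannon's formula for one event)."""
    return math.log2(1 / p)

# For equally probable events p = 1/N, so this reduces to Hartley's formula.
print(info_bits(1 / 2))   # one of two equally probable events
```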


Qualitative approach

  • Information is the knowledge people receive from various messages.
  • A message is an information flow (data stream) that reaches the receiving subject in the process of information transfer.

Message

An informative message is one that adds to a person's knowledge, i.e. carries information for them.

An uninformative message is one whose content is "old" (the person already knows it) or unclear to the person.


Quantitative approach in the equiprobability approximation

Events are equally probable if none of them has an advantage over the others.

Let's look at an example: "How much information does a message about the result of throwing a six-sided die convey?" From Hartley's equation: 2^i = 6.

Since 2^2 < 6 < 2^3, it follows that 2 < i < 3.

A more precise value (to five decimal places) is i ≈ 2.58496 bits. Note that with this approach the amount of information can be expressed by a fractional number.
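The fractional value of i can be pinned down by the same half-division idea, without a logarithm table; a sketch (the bounds and tolerance are chosen arbitrarily):

```python
def solve_hartley(n: float, tol: float = 1e-7) -> float:
    """Solve 2**i = n for i by halving the search interval,
    in the spirit of the lesson's half-division (dichotomy) method."""
    lo, hi = 0.0, 64.0          # i lies in this range for any practical n
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if 2 ** mid < n:
            lo = mid            # 2**mid too small: answer is in the upper half
        else:
            hi = mid            # answer is in the lower half
    return (lo + hi) / 2

print(round(solve_hartley(6), 5))   # the die example, about 2.58496 bits
```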


Probabilistic approach to information measurement

The probability of some event is a quantity that can take values from zero to one.

The probability of an impossible event is zero (for example: "tomorrow the Sun will not rise above the horizon").

The probability of a certain event is one (for example: "tomorrow the Sun will rise above the horizon").

The probability of some events is determined through repeated observations (measurements, trials). Such measurements are called statistical, and the more measurements performed, the more accurately the probability of an event is determined.


Let's look at a few examples:

Example 3. Two bus routes stop at the bus stop: No. 5 and No. 7. The student is given the task: to determine how much information is contained in the message that bus No. 5 has arrived at the stop, and how much information is contained in the message that bus No. 7 has arrived.


Solution: The student carried out an investigation. Over a full working day he counted 100 bus arrivals at the stop. Of these, bus No. 5 arrived 25 times and bus No. 7 arrived 75 times. Assuming that the buses run with the same frequency on other days, the student calculated the probability of bus No. 5 appearing at the stop, p₅ = 25/100 = 1/4, and of bus No. 7, p₇ = 75/100 = 3/4.

Hence, the amount of information in the message about bus No. 5 is: i₅ = log₂4 = 2 bits. The amount of information in the message about bus No. 7 is:

i₇ = log₂(4/3) = log₂4 − log₂3 = 2 − 1.58496 = 0.41504 bits.
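The student's calculation can be reproduced directly; a sketch (the counts 25 and 75 are the observations from the example):

```python
import math

arrivals = {"No. 5": 25, "No. 7": 75}   # observed arrivals out of 100
total = sum(arrivals.values())

for bus, count in arrivals.items():
    p = count / total
    i = math.log2(1 / p)                # Shannon's formula
    print(f"Bus {bus}: p = {p}, i = {i:.5f} bits")
```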


Example 4 . Let's consider another version of the bus problem. Buses No. 5 and No. 7 stop at the stop. The message that bus No. 5 has arrived at the stop carries 4 bits of information. The probability of bus No. 7 appearing at the stop is two times less than the probability of bus No. 5 appearing. How many bits of information does the message about bus No. 7 appearing at the stop carry?

Solution: Let us write the problem condition in the following form:

i₅ = 4 bits, p₅ = 2p₇

Recall the connection between probability and the amount of information: 2^i = 1/p

From here: p = 2^(−i)

Substituting the values from the problem's conditions: p₅ = 2^(−4) = 1/16, hence p₇ = p₅/2 = 1/32, and i₇ = log₂(1/p₇) = log₂32 = 5 bits.
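A numeric sketch of the same substitution (the relations are exactly those stated in the problem):

```python
import math

i5 = 4                    # bits in the message about bus No. 5
p5 = 2 ** -i5             # p = 2**(-i), from 2**i = 1/p
p7 = p5 / 2               # bus No. 7 appears half as often
i7 = math.log2(1 / p7)

print(p5, p7, i7)
```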


Answer the questions orally:

  • What is meant by information?
  • What can you do with the information?
  • What types of information representation in a computer do you know?
  • What message encoding techniques were used in ancient times?
  • What is code and information encoding?
  • Give examples of various ways of encoding information.
  • List the advantages and disadvantages of coding used in computers.
  • What is the name of the encoding used to represent characters entered from the keyboard?
  • Let's think about what can serve as an estimate of the amount of information?
  • Is it true that a worn-out book, as long as no pages have been torn out of it, carries exactly as much information for you as a new one?
  • A block of stone weighing three tons carries as much information for archaeologists as a good photograph of it in an archaeological magazine, doesn't it?
  • When a Moscow radio studio broadcasts the latest news, both a resident of the Moscow region and a resident of Novosibirsk receive the same information. But the flow of radio wave energy in Novosibirsk is much less than in Moscow.
  • Consequently, the signal power, as well as the weight of the carrier, cannot in any way serve as an estimate of the amount of information carried by the signal.
  • How then can we measure the amount of information?
  • Different approaches to defining and measuring information.
  • The meaningful (probabilistic) approach: the amount of information as a measure of the reduction of uncertainty of knowledge.

Let's summarize what has been said:
  • Suppose we have a coin that we throw onto a flat surface. One of two equally probable events will occur: the coin will end up in one of two positions, "heads" or "tails".
  • Events are equally probable if, as the number of trials increases, the counts of heads and tails gradually become closer.
  • Before the throw there is uncertainty in our knowledge (two events are possible); after the throw there is complete certainty.
  • The uncertainty of our knowledge has been halved, since one of two possible equally probable events was realized.
Reducing knowledge uncertainty
  • When throwing an equilateral tetrahedral pyramid, there are 4 equally probable events.
  • When throwing a six-sided die, there are 6 equally probable events.
  • A message that reduces the uncertainty of knowledge by half carries 1 bit of information.
  • A bit is the minimum unit of information.
  • 1 byte = 2^3 bits = 8 bits
  • 1 KB = 2^10 bytes = 1024 bytes
  • 1 MB = 2^10 KB = 1024 KB
  • 1 GB = 2^10 MB = 1024 MB
  • Ladder of units: byte → KB → MB → GB → TB (multiply by 1024 at each step up; divide by 1024 at each step down).
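The ladder of units can be expressed as constants; a minimal sketch (the constant names are illustrative):

```python
BIT = 1
BYTE = 8 * BIT
KB = 1024            # bytes in a kilobyte
MB = 1024 * KB
GB = 1024 * MB
TB = 1024 * GB

def to_bits(n_bytes: int) -> int:
    """Convert a byte count to bits."""
    return n_bytes * BYTE

print(to_bits(KB))   # bits in one kilobyte
```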
  • The amount of information i contained in the message that one of N equally probable events has occurred is determined by solving the exponential equation 2^i = N.
  • Task: In roulette, the total number of holes is 128. How much information do we receive in the visual message that the ball stopped in one of the holes?
  • Given: N = 128, i = ?
  • Solution: 2^i = N, so 2^i = 128; since 2^7 = 128, i = 7 bits.
  • Answer: i = 7 bits.
  • Number of possible events and amount of information
  • Task: A box contains 32 pencils, all of different colors. A red one was pulled out at random. How much information was obtained?
  • Solution.
  • Since drawing a pencil of any color from the 32 pencils in the box is equally probable, the number of possible events is 32.
  • N = 32, i = ?
  • N = 2^i, 32 = 2^5, i = 5 bits.
  • Answer: 5 bits
№ 1
  • The book has 512 pages. How much information does the message that the bookmark lies on one of the pages convey?
  • Solving problems in a notebook
№ 2
  • How much information does the message that one of the cells of a 4×4 square field is shaded contain?
  • Solving problems in a notebook
№ 3
  • How much information does the message that a six-sided die landed with the number 3 face up contain?
  • Solving problems in a notebook
  • What is the meaning of a content approach to measuring information?
  • What formula was studied?
  • Name the units of measurement of information you know, in ascending order.
  • How are units of measurement of information interrelated?
  • Consolidating the material
  • 1. You approached a traffic light when the light was red. After that the yellow light came on. How much information did you receive?
  • Solve orally
  • 2. You approached a traffic light when the light was yellow. After that the light turned green. How much information did you receive?
  • Solve orally
  • 3. "Are you getting off at the next stop?" - they asked the man on the bus. “No,” he replied. How much information does the answer contain?
  • Solve orally
  • 4. How much information does the message indicate that the program you need is on one of the eight floppy disks?
  • Solve orally
Homework
  • 1. Analyze the entries in the notebook.
  • 2. Solve 2 individual problems on cards.

How to measure information?

The question "How can information be measured?" is very difficult to answer. It depends on what is meant by information. And since information can be defined in different ways, the methods of measuring it may also differ.

A meaningful approach to measuring information

For a person, information is knowledge. Let's consider the question from this point of view.

Obtaining new information leads to an expansion of knowledge. If some message reduces the uncertainty of our knowledge, then we can say that such a message contains information.

It follows that a message is informative (i.e. contains non-zero information) if it adds to a person's knowledge. For example, tomorrow's weather forecast is an informative message, while a message about yesterday's weather is uninformative, because we already know it.

It is easy to see that the information content of the same message can be different for different people. For example, "2 × 2 = 4" is informative for a first-grader learning the multiplication table, but uninformative for a high-school student.

Informativeness of the message

But for a message to be informative, it must also be understandable.

To be understandable means to be logically connected to a person's previous knowledge. The statement "the value of a definite integral is equal to the difference between the values of an antiderivative of the integrand at the upper and lower limits" will most likely not add to a high-school student's knowledge, because it is not understandable to them. To understand this definition, one must complete elementary mathematics and know the beginnings of higher mathematics.

Acquiring knowledge should proceed from the simple to the complex. Then every new message will at the same time be understandable, and therefore will carry information for a person.

A message carries information for a person if the information it contains is new and understandable to him.

Unit of information

Obviously, distinguishing only two situations, "no information" and "there is information", is not enough to measure information. We need a unit of measurement; then we can determine which message contains more information and which contains less.

The unit of measurement of information was defined in a science called information theory. This unit is called “bit”. Its definition goes like this:

A message that reduces knowledge uncertainty by half carries 1 bit of information.

The uncertainty of knowledge about some event is the number of possible outcomes of the event.

Example:

After taking a pass/fail test or a graded test, a student is tormented by uncertainty: he does not know what result he received.

Finally the teacher announces the results, and he receives one of two possible information messages, "pass" or "fail"; after a graded test, one of four possible information messages: "2", "3", "4", or "5".

The message about the result of the pass/fail test halves the uncertainty of knowledge, since one of two possible messages was received. The message about the grade for the graded test reduces the uncertainty of knowledge fourfold, since one of four possible messages was received.

Example:

The bookcase has eight shelves. The book can be placed on any of them. How much information does the message contain about where the book is?

Asking questions:

- Is the book above the fourth shelf?

No.

- Is the book below the third shelf?

Yes.

- Is the book on the second shelf?

No.

Well now everything is clear! The book is on the first shelf!

Each answer reduced the uncertainty by half.

A total of three questions were asked, so 3 bits of information were obtained. And if it had been said right away that the book is on the first shelf, the same 3 bits of information would have been conveyed by that message.
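The shelf game maps directly onto binary search; a sketch that counts the yes/no questions (the shelf numbering and question wording are illustrative):

```python
def questions_needed(target: int, n_shelves: int = 8) -> int:
    """Count the yes/no questions needed to locate the shelf,
    halving the remaining range with each answer."""
    lo, hi = 1, n_shelves
    count = 0
    while lo < hi:
        mid = (lo + hi) // 2
        count += 1
        if target <= mid:    # "Is the book on shelf mid or below?" -> yes
            hi = mid
        else:                # -> no
            lo = mid + 1
    return count

print(questions_needed(1))   # the dialogue above: 3 questions, 3 bits
```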

Description of the presentation by individual slides:


Slide 3:

Uncertainty of knowledge and the amount of information. Another approach to measuring information is called the meaningful (content) approach. In this case the amount of information is associated with the content (meaning) of the message received by a person. Recall that from the "human" point of view, information is the knowledge we obtain from the outside world. The amount of information contained in a message should be the greater, the more it adds to our knowledge. How is the unit of measurement of information determined from this point of view? You already know that this unit is called the bit. The problem of measuring information is studied in information theory, founded by Claude Shannon. In information theory the bit is defined as follows:

Slide 4:

CONTENT APPROACH TO MEASURING INFORMATION. A message that one of two equally probable events has occurred (i.e. the uncertainty of knowledge has been halved) carries 1 bit of information. 8 colored balls in a basket are 8 equally probable events; the uncertainty of knowledge about which ball will be drawn from the basket is 8. A stricter definition of equiprobability: if you increase the number of coin tosses (100, 1000, 10000, etc.), the number of heads and the number of tails come ever closer to half the number of tosses. Therefore we can say: the uncertainty of knowledge about the result of some event (tossing a coin or a die, drawing lots, etc.) is the number of its possible results.

Slide 5:

The bookcase has eight shelves. The book can be placed on any of them. How much information does the message about where the book is contain? We ask questions: - Is the book above the fourth shelf? - No. - Is the book below the third shelf? - Yes. - Is the book on the second shelf? - No. - Well, now everything is clear! The book is on the first shelf! Each answer halved the uncertainty. A total of three questions were asked, which means 3 bits of information were obtained. And if it had been said right away that the book is on the first shelf, the same 3 bits of information would have been conveyed by this message.

Slide 6:

BINARY SEARCH METHOD. You need to guess a number from the range 1 to 8: 8 possible events → 3 questions → 3 bits of information. (Compare: "What grade did your friend get on the exam?" is four equally probable events.) Game using the binary search method. Rules: a number from a given range of integers must be guessed; the guesser asks questions that can only be answered "yes" or "no". If each answer cuts off half of the options (reduces the choice by a factor of 2), then it carries 1 bit of information, and the total amount of information (in bits) obtained in guessing the number equals the number of questions asked. Example for the hidden number 5: 1) "Is the number less than 5?" - No (remaining options: 5 6 7 8). 2) "Is the number less than 7?" - Yes (remaining: 5 6). 3) "Is the number 5?" - Yes.

Slide 7:

Now let's try to obtain a formula that calculates the amount of information contained in a message that one of many equally probable results of some event took place. Let N denote the number of possible results of the event, or, as we also called it, the uncertainty of knowledge. Let i denote the amount of information in a message about one of the N results. In the coin example: N = 2, i = 1 bit. In the example with grades: N = 4, i = 2 bits. In the example with the bookcase: N = 8, i = 3 bits. It is easy to see that the relationship between these quantities is expressed by the formula 2^i = N. Indeed: 2^1 = 2; 2^2 = 4; 2^3 = 8.

Slide 8:

We are already familiar with this formula from the basic computer science course, and we will meet it more than once. Its significance is so great that we called it the main formula of computer science. If the value of N is known and i is unknown, the formula becomes an equation for determining i; in mathematics it is called an exponential equation. Suppose the bookcase has not 8 but 16 shelves. To answer how much information is contained in the message about the location of the book, we solve the equation 2^i = 16. Since 16 = 2^4, i = 4 bits. The amount of information i contained in a message about one of N equally probable results of some event is determined by solving the exponential equation 2^i = N. If N is an integer power of two (4, 8, 16, 32, 64, etc.), the exponential equation is easy to solve in your head, since i is an integer. What, for example, is the amount of information in a message about the result of throwing a die, which has six faces, so that N = 6? One can guess that the solution of 2^i = 6 is a fractional number between 2 and 3, since 2^2 = 4 < 6 and 2^3 = 8 > 6. How can this number be found more precisely?

Slide 9:

EXPONENTIAL EQUATION 2^i = N. Given N, the table determines the amount of information i in a message that one of N equally probable events has occurred; given i, it determines the number of equally probable events N.

 N   i          N   i          N   i          N   i
 1   0.00000   17   4.08746   33   5.04439   49   5.61471
 2   1.00000   18   4.16993   34   5.08746   50   5.64386
 3   1.58496   19   4.24793   35   5.12928   51   5.67243
 4   2.00000   20   4.32193   36   5.16993   52   5.70044
 5   2.32193   21   4.39232   37   5.20945   53   5.72792
 6   2.58496   22   4.45943   38   5.24793   54   5.75489
 7   2.80735   23   4.52356   39   5.28540   55   5.78136
 8   3.00000   24   4.58496   40   5.32193   56   5.80735
 9   3.16993   25   4.64386   41   5.35755   57   5.83289
10   3.32193   26   4.70044   42   5.39232   58   5.85798
11   3.45943   27   4.75489   43   5.42626   59   5.88264
12   3.58496   28   4.80735   44   5.45943   60   5.90689
13   3.70044   29   4.85798   45   5.49185   61   5.93074
14   3.80735   30   4.90689   46   5.52356   62   5.95420
15   3.90689   31   4.95420   47   5.55459   63   5.97728
16   4.00000   32   5.00000   48   5.58496   64   6.00000
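The whole table can be regenerated in a few lines; a sketch matching the slide's five-decimal format:

```python
import math

# i = log2(N) for N = 1..64, printed to five decimal places as on the slide.
for n in range(1, 65):
    print(f"{n:2d}  {math.log2(n):.5f}")
```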






