Measures and amount of information
The term "information" was first used by ancient philosophers: the Latin informatio means clarification, presentation, awareness. However, academic debate continues over the most accurate and complete definition of the word. For example, Claude Shannon, who laid the foundations of information theory, held that information is the removed uncertainty in a subject's knowledge about something. The simplest definition of "information" is this: it is the degree of awareness about an object.
To determine the amount of information, one should first become familiar with the classification of information measures. There are three measures of information: syntactic, semantic and pragmatic. Let us consider each measure separately:
1. The syntactic measure works with data without regard to its meaning for the object. This measure deals with the type of medium, the way information is represented and encoded, and the speed of its transmission and processing.
In this case, the measure is the information volume: the amount of memory required to store data about the object. The information volume equals the number of binary digits with which the message in question is encoded, and it is measured in bits.
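The information volume of a fixed-length binary encoding can be sketched as follows; the function name and the choice of a 256-symbol alphabet are illustrative assumptions, not part of the original text:

```python
import math

def information_volume(message: str, alphabet_size: int) -> int:
    """Bits needed to store `message` with a fixed-length binary code
    over an alphabet of `alphabet_size` symbols."""
    bits_per_symbol = math.ceil(math.log2(alphabet_size))
    return len(message) * bits_per_symbol

# A 4-character message over a 256-symbol (8-bit) alphabet:
print(information_volume("data", 256))  # -> 32 bits
```

Each symbol of a 256-symbol alphabet needs ceil(log2(256)) = 8 binary digits, so a 4-character message occupies 32 bits.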
To determine the syntactic amount of information, we turn to the concept of entropy: a measure of the uncertainty of the state of a system, that is, of our knowledge about the state of its elements and of the system as a whole. The amount of information is then the change in the system's uncertainty, that is, the change (increase or decrease) in entropy.
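A minimal sketch of this idea, using Shannon's formula H = -Σ p·log2(p): the amount of information received is the drop in entropy between the states before and after the message. The specific probabilities below are illustrative assumptions.

```python
import math

def entropy(probabilities) -> float:
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Before the message: four equally likely states of the system.
h_before = entropy([0.25, 0.25, 0.25, 0.25])  # 2.0 bits of uncertainty
# After the message: the state is known with certainty.
h_after = entropy([1.0])                       # 0.0 bits
# The amount of information is the change in entropy:
print(h_before - h_after)  # -> 2.0 bits received
```

A message that fully resolves a choice among four equally likely states therefore carries 2 bits of information.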
2. The semantic measure determines the semantic content of data and relates the corresponding information parameters to the user's ability to interpret the message. This ability is described by the user's thesaurus: the body of knowledge about an object that a system or user possesses. The amount of semantic information is greatest when all of the data is understandable to the user or system, that is, can be processed with the available thesaurus; the semantic measure is therefore a relative concept.
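The dependence of semantic content on the receiver's thesaurus can be sketched as a toy overlap measure; the set-based model and the function name are illustrative assumptions, not a formal definition from the text:

```python
def semantic_content(message_terms: set, thesaurus: set) -> float:
    """Toy model: the share of a message's concepts that the receiver's
    thesaurus can interpret (0.0 = nothing understood, 1.0 = everything)."""
    if not message_terms:
        return 0.0
    return len(message_terms & thesaurus) / len(message_terms)

user_thesaurus = {"bit", "byte", "entropy"}
# Two of the three concepts in the message are in the user's thesaurus:
print(semantic_content({"entropy", "bit", "qubit"}, user_thesaurus))
```

The same message yields different semantic content for different receivers, which is exactly why the semantic measure is relative.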
3. The pragmatic measure evaluates the value of information for achieving a specific goal. This concept, too, is relative and depends directly on the ability of the system or user to apply a given amount of data to a particular problem area. It is therefore advisable to measure information, from the pragmatic point of view, in the same units as the objective function.
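One way to read "the same units as the objective function" is as the improvement in the objective that the information enables; the following sketch, with hypothetical profit figures, is an assumption for illustration only:

```python
def pragmatic_value(objective, decision_without_info, decision_with_info):
    """Value of information, measured in the units of the objective
    function: the improvement in outcome that the information enables."""
    return objective(decision_with_info) - objective(decision_without_info)

# Hypothetical objective: profit (in dollars) of each possible decision.
profit = {"ship_now": 100, "wait_for_forecast": 140}.get

# A weather forecast lets us pick the better decision; its pragmatic
# value is expressed in the objective's own units (dollars):
print(pragmatic_value(profit, "ship_now", "wait_for_forecast"))  # -> 40
```

Here the forecast is worth 40 dollars: exactly the gain in the objective function it makes possible.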
Qualitative characteristics of information include the following indicators:
- Representativeness - the correct selection and presentation of information so as to adequately reflect the characteristics of the object.
- Content - the ratio of the amount of semantic information in a message to the total volume of data processed.
- Completeness - the presence in the message of the minimal set of information necessary to achieve the goal.
- Availability - the ease with which a user or system can carry out the procedures for obtaining and converting the data.
- Relevance - the degree to which information retains its value from the moment it is received to the moment it is used.
- Timeliness - the arrival of information no later than the required time.
- Accuracy - the degree to which information corresponds to the actual state of the object.
- Reliability - the ability of information to reflect real objects with the specified accuracy.
- Stability - the property of information to withstand transformations of the source data over time while maintaining the specified accuracy.