Entropy and Quantity of Information in Technical Designations
DOI: https://doi.org/10.31649/1997-9266-2023-167-2-58-65

Keywords: entropy, information, technical notation, algebraic theory of entropy, entropy of classification, information algebra, theory of hints

Abstract
Conventional designations of integrated microcircuits are considered as an example of the classification and abbreviated naming (coding) of technical products, in order to answer the question: why are some designation systems said to be "more informative"? Do such notations really contain more information than other systems? These questions are closely related to the tasks of machine learning and the construction of the Semantic Web. Based on an algebraic, set-theoretic approach, the entropy characteristics of the classification of designations are examined, and it is shown that the entropy of such a coded designation is lower than that of an arbitrary system for recording technical characteristics. This is explained by the positional structure of the designation and, accordingly, by the lower cardinality of the sets that make up a specific designation. Using the approach of information algebra, it is confirmed that imposing an atomic structure, in the technical notation, on the sets to which the technical characteristics correspond does indeed match the mathematical definition of a more informative structure. Finally, the structure of the technical designation is analyzed within the mathematical theory of hints, and the possibility of obtaining additional information, for example, relationships between different groups of technical parameters, is indicated; such information is obtained by asking questions that refine the interpretation of existing answers. This follows from a property of hint entropy, which has two components, the Shannon entropy and the generalized Hartley measure: the former corresponds to probabilistic information about the true interpretation of an answer within a set, and the latter to relational information about the true answer concerning some type of integrated-circuit parameters.
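The two components of hint entropy mentioned above can be illustrated with a minimal sketch (not taken from the paper; the designation codes and probabilities below are hypothetical). A hint is modeled as a mass function: each focal set A of candidate interpretations of an answer carries a probability m(A). The Shannon component measures probabilistic uncertainty over the focal sets, while the generalized Hartley component weights log2 of each focal set's cardinality by its mass:

```python
from math import log2

# Hypothetical hint about one field of an IC designation: focal sets of
# candidate series codes, each with an assumed probability m(A).
hint = [
    ({"KR1533"}, 0.6),                   # precise interpretation
    ({"KR1533", "KR555"}, 0.3),          # ambiguous between two series
    ({"KR1533", "KR555", "K155"}, 0.1),  # still less specific
]

# Shannon component: probabilistic uncertainty about which focal set
# contains the true interpretation of the answer.
shannon = -sum(p * log2(p) for _, p in hint)

# Generalized Hartley component: set-based (relational) uncertainty,
# i.e., the mass-weighted log-cardinality of the focal sets.
hartley = sum(p * log2(len(a)) for a, p in hint)

print(f"Shannon component:  {shannon:.3f} bits")
print(f"Hartley component:  {hartley:.3f} bits")
print(f"Hint entropy total: {shannon + hartley:.3f} bits")
```

A clarifying question that rules out candidate series shrinks the focal sets, which lowers the Hartley component; in the limit of singleton focal sets only the Shannon component remains.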
Technical notation thus proves to be an effective example to which the mathematical theories considered here can be applied, and accordingly it can serve as an example of a code that, on the one hand, is understandable to a person and, on the other, can be used in machine information-processing systems.
License
This work is licensed under a Creative Commons Attribution 4.0 International License.