Index
What is an index?
in·dex /ˈinˌdeks/
noun
- 1. an alphabetical list of names, subjects, etc., with references to the places where they occur, typically found at the end of a book. "clear cross references supplemented by a thorough index"
- 2. an indicator, sign, or measure of something. "exam results may serve as an index of the teacher's effectiveness"
verb
- 1. record (names, subjects, etc.) in an index. "the list indexes theses under regional headings"
- 2. link the value of (prices, wages, or other payments) automatically to the value of a price index. "legislation indexing wages to prices"
Index (plural: indices) in maths is the power or exponent to which a number or variable is raised. For example, in 2⁴, 4 is the index of 2.
In words, we might say “2 to the power of 4” = 16.
- Index in a document: An index is an alphabetical list of topics or subjects in a document, usually found at the end, that includes the page numbers where each topic can be found. It's a navigational tool that helps readers quickly find information, especially in nonfiction books. For example, an index might include names, places, keywords, and major topics (see the first sketch after this list).
- Index in investing: An index is a standardized way to track the performance of a group of assets or stocks, such as a broad-based index or one that tracks a specific sector. Indexes are important in financial markets because they help investors measure performance, understand risk, and guide the development of financial products (see the second sketch after this list).
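As a rough sketch of the document sense: a back-of-book index is essentially a mapping from each term to the sorted page numbers where it occurs. The terms and page numbers below are invented for illustration:

```python
from collections import defaultdict

def build_index(occurrences):
    """Build a back-of-book index: term -> sorted list of unique page numbers."""
    pages_by_term = defaultdict(set)
    for term, page in occurrences:
        pages_by_term[term].add(page)
    # Alphabetize terms (case-insensitively) and sort each term's pages.
    return {term: sorted(pages)
            for term, pages in sorted(pages_by_term.items(), key=lambda kv: kv[0].lower())}

occurrences = [
    ("entropy", 12), ("entropy", 47), ("Shannon, Claude", 12),
    ("index", 1), ("index", 12), ("entropy", 12),
]

for term, pages in build_index(occurrences).items():
    print(f"{term}: {', '.join(map(str, pages))}")
```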
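And a rough sketch of the investing sense, assuming a simple capitalization-weighted construction (real index providers use more elaborate rules); the tickers, prices, and share counts are invented:

```python
def index_level(base_prices, current_prices, shares, base_level=100.0):
    """Capitalization-weighted index: total market cap scaled so the base period equals base_level."""
    base_cap = sum(base_prices[t] * shares[t] for t in shares)
    current_cap = sum(current_prices[t] * shares[t] for t in shares)
    divisor = base_cap / base_level  # fixes the base period at base_level points
    return current_cap / divisor

shares = {"AAA": 1_000, "BBB": 500}   # shares outstanding (hypothetical)
base   = {"AAA": 10.0, "BBB": 40.0}   # base-period prices
today  = {"AAA": 12.0, "BBB": 38.0}   # current prices

print(round(index_level(base, today, shares), 2))  # above 100 means the group gained value
```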
An index is a kind of set: a collection of things that together identify and contain something larger than the sum of their parts.
By integrating information synthesized by technologies like digital twins with cultural practices like storytelling (technoculture), we can more accurately index, describe, and measure the informational distance from our world to other actual worlds. From there, we can develop a systems-wide process of alter-globalization.
A better world is possible.
Information
Information is an abstract concept that refers to that which has the power to inform. At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or of its abstractions. Any natural process that is not completely random, and any observable pattern in any medium, can be said to convey some amount of information. Whereas digital signals and other data use discrete signs to convey information, other phenomena and artifacts such as analogue signals, poems, pictures, music or other sounds, and currents convey information in a more continuous form. Information is not knowledge itself, but the meaning that may be derived from a representation through interpretation.
The concept of information is relevant or connected to various concepts, including constraint, communication, control, data, form, education, knowledge, meaning, understanding, art, mental stimuli, pattern, perception, proposition, representation, and entropy.
Core Concepts: Information, Entropy, Communication, Probability, Possible Worlds, Meaning, Emergence, Knowledge, Worldbuilding
Related Concepts: Belief, Best Explanation, Bioinformatics, Big Data, Cause, Certainty, Chance, Coherence, Correspondence, Combinatorics, Consciousness, CPT Symmetry, Decoherence, Dynamic Systems, Differential Equations, Divided Line, Downward Causation, Emergent Dualism, ERR, God, Identity Theory, Infinite Regress, Intension/Extension, Intersubjectivism, Justification, Kissing Number, Knot Theory, Lagrangian, Lorenz Attractor, Materialism, Mental Causation, Multiple Realizability, Naturalism, Necessity, News, Noosphere, Postmodernism, Quantum Information, Realism, Riemann Hypothesis, Reductionism, Samsara, Schrödinger's Cat, Supervenience, Truth, Universals, Wick Rotation, Wisdom, World Line, World Sheet, World Model
Information theory is the scientific study of the quantification, storage, and communication of digital information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.
A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory and information-theoretic security.
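To make the coin/die comparison concrete: for equally likely outcomes, entropy is log2 of the number of outcomes, so a fair coin carries 1 bit of information and a fair die about 2.585 bits. A minimal sketch of the Shannon entropy formula H(X) = −Σ p(x) log2 p(x):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

coin = [1/2] * 2   # fair coin: two equally likely outcomes
die  = [1/6] * 6   # fair die: six equally likely outcomes

print(f"coin: {entropy(coin):.3f} bits")  # 1.000 bits
print(f"die:  {entropy(die):.3f} bits")   # ~2.585 bits
```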
Applications of fundamental topics of information theory include source coding/data compression (e.g. for ZIP files), and channel coding/error detection and correction (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, and the development of the Internet. The theory has also found applications in other areas, including statistical inference, cryptography, neurobiology, perception, linguistics, the evolution and function of molecular codes (bioinformatics), thermal physics, molecular dynamics, quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection, pattern recognition, anomaly detection, and even art creation. (via Wikipedia)
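As a small illustration of the data-compression connection: a low-entropy (highly repetitive) input compresses far better than a high-entropy (random) one. A sketch using Python's standard zlib module:

```python
import os
import zlib

low_entropy  = b"ab" * 5_000        # repetitive, highly predictable bytes
high_entropy = os.urandom(10_000)   # random bytes, essentially incompressible

for name, data in [("repetitive", low_entropy), ("random", high_entropy)]:
    compressed = zlib.compress(data, 9)  # maximum compression level
    print(f"{name}: {len(data)} bytes -> {len(compressed)} bytes")
```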