Notation system
from Wikipedia

In linguistics and semiotics, a notation system is a system of graphics or symbols, characters and abbreviated expressions, used (for example) in artistic and scientific disciplines to represent technical facts and quantities by convention.[1][2] A notation is therefore a collection of related symbols that are each given an arbitrary meaning, created to facilitate structured communication within a domain of knowledge or field of study.

Standard notations refer to general agreements in the way things are written or denoted. The term is generally used in technical and scientific areas of study like mathematics, physics, chemistry and biology, but can also be seen in areas like business, economics and music.

Written communication


Writing systems

  • Phonographic writing systems, by definition, use symbols to represent components of auditory language, i.e. speech, which in turn refers to things or ideas. The two main kinds of phonographic notational system are the alphabet and the syllabary. Some written languages are more consistent in their correlation of written symbols (or graphemes) with sound (or phonemes), and are therefore considered to have better phonemic orthography.
  • Ideographic writing, by definition, refers to things or ideas independently of their pronunciation in any language. Some ideographic systems are also pictograms that convey meaning through their pictorial resemblance to a physical object.

Linguistics

Biology and medicine

Chemistry
  • A chemical formula describes a chemical compound using element symbols and subscripts, e.g. H₂O for water or C₆H₁₂O₆ for glucose
  • SMILES is a notation for describing the structure of a molecule with a plain text string, e.g. N#N for nitrogen or CCO for ethanol
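To illustrate how a SMILES string encodes composition, the sketch below derives a molecular formula for the simplest case: unbranched, single-bonded chains of C, N, and O, filling each atom's remaining valence with implicit hydrogens. This is a toy illustration under those narrow assumptions (the function name is ours); full SMILES, with rings, branches, and bond orders, needs a cheminformatics library such as RDKit.

```python
# Toy SMILES reader: unbranched single-bond chains of C, N, O only.
VALENCE = {"C": 4, "N": 3, "O": 2}

def formula(smiles):
    atoms = list(smiles)          # in this subset, one letter per atom
    counts = {}
    for i, atom in enumerate(atoms):
        counts[atom] = counts.get(atom, 0) + 1
        # bonds to chain neighbours; leftover valence is implicit hydrogen
        bonds = (i > 0) + (i < len(atoms) - 1)
        counts["H"] = counts.get("H", 0) + VALENCE[atom] - bonds
    return counts

# formula("CCO") gives C2 H6 O1 (ethanol); formula("O") gives H2 O (water)
```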

Computing

Logic

A variety of symbols are used to express logical ideas; see the List of logic symbols.

Management

  • Time and motion study symbols such as therbligs

Mathematics

Physics

Typographical conventions
  • Infix notation, the common arithmetic and logical formula notation, such as "a + b − c".
  • Polish notation or "prefix notation", which places the operator before the operands (arguments), such as "+ a b".
  • Reverse Polish notation or "postfix notation", which places the operator after the operands, such as "a b +".
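The practical difference between these conventions shows up in evaluation: postfix notation needs no parentheses or precedence rules, only a stack. A minimal sketch (the function name is illustrative):

```python
def eval_rpn(expression):
    """Evaluate a space-separated reverse Polish (postfix) expression."""
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b}
    stack = []
    for token in expression.split():
        if token in ops:
            b, a = stack.pop(), stack.pop()  # operands precede the operator
            stack.append(ops[token](a, b))
        else:
            stack.append(float(token))
    return stack.pop()

# infix (3 + 4) * 2 becomes postfix "3 4 + 2 *"
```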

Sports and games

Graphical notations

Music
  • Musical notation permits a composer to express musical ideas in a musical composition, which can be read and interpreted during performance by a trained musician; there are many different ways to do this (hundreds have been proposed), although staff notation provides by far the most widely used system of modern musical symbols.

Dance and movement

Science
  • Feynman diagrams permit a graphical representation of a perturbative contribution to the transition amplitude or correlation function of a quantum mechanical or statistical field theory
  • Structural formulas are graphical representations of molecules
  • Venn diagrams show logical relations between a finite collection of sets.
  • Drakon-charts are a graphical representation of algorithms and procedural knowledge.
  • Unified Modeling Language is a standard notation for many types of diagrams
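The regions of a Venn diagram correspond directly to set operations, which can be checked with Python's built-in sets:

```python
A = {1, 2, 3}
B = {2, 3, 4}

assert A & B == {2, 3}        # intersection: the overlapping region
assert A | B == {1, 2, 3, 4}  # union: everything inside either circle
assert A - B == {1}           # difference: the part of A outside B
assert A <= A | B             # each set lies inside the union
```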

Other systems

from Grokipedia
A notation system is a system of signs or symbols used to represent information, especially in mathematics, science, and music. These systems provide a standardized means of encoding and decoding complex data, enabling precise communication, analysis, and transmission of knowledge across cultures and generations. By abstracting concepts into visual or written forms, notation systems bridge the gap between ephemeral ideas—such as sounds, movements, or quantities—and durable records that support collaboration and innovation. Common examples illustrate their versatility: in mathematics, symbols such as + for addition allow manipulation of abstract relationships. In music, staff notation employs lines, clefs, and note shapes to convey pitch, duration, and rhythm, evolving from ancient neumes to modern mensural systems that precisely capture rhythmic values. In chemistry, formulas such as H₂O denote atomic compositions and molecular structures, serving as a universal shorthand for substances and reactions. Beyond these, notation extends to fields such as dance (e.g., Labanotation for movement sequences) and cheminformatics (e.g., SMILES for molecular representations), highlighting its role in diverse human endeavors.

Core Concepts

Definition and Purpose

A notation system is a structured collection of graphics, symbols, characters, or abbreviated expressions, governed by formalized rules for encoding, representing, and decoding information within particular domains of knowledge or communication. These systems function as diagrammatic or symbolic tools that abstract complex ideas into manageable forms, allowing users to manipulate and interpret them systematically. Key characteristics include syntax, which defines the rules for combining symbols; semantics, which assigns meanings to those symbols in relation to objects or concepts; and pragmatics, which addresses the contextual use and interpretation by users. Notation systems can be universal, such as alphabetic scripts applicable across languages, or domain-specific, tailored to fields like logic or chemistry for precise technical representation.

The primary purpose of notation systems is to facilitate precise communication and shared understanding among users, reducing ambiguity and errors in conveying complex information. By providing a shared framework, they enable the encoding of ideas for transmission, analysis, and replication, evolving from oral traditions to durable written records that preserve knowledge across generations. This supports collaborative reasoning and problem-solving, as symbols can be manipulated according to syntactic rules to derive new insights without relying on verbal description alone.

Notation systems are essential for recording and transmitting ideas at scale, distilling intricate realities into compact forms. Historically, they shifted from concrete pictographs, which directly depicted objects for rudimentary counting and depiction, to abstract symbols representing sounds or relations, enhancing efficiency and generality in information processing. This evolution underscores their role in advancing human cognition, from localized preservation to global knowledge dissemination.

Historical Development

The earliest forms of notation systems emerged in prehistoric times through cave paintings and markings that served as proto-notations for counting and recording events, with symbols dating back nearly 40,000 years across European caves. Tally marks, such as those incised on bones and stones, also appeared around this period as rudimentary methods for tracking quantities and lunar cycles, laying the groundwork for systematic representation.

In ancient Mesopotamia, Sumerian cuneiform developed around 3200 BCE as one of the first true writing systems, primarily used for administrative records like economic transactions on clay tablets. Concurrently, Egyptian hieroglyphs arose circa 3100 BCE, combining ideographic elements representing concepts with phonetic signs to denote sounds, enabling both pictorial and linguistic expression in monumental and religious contexts.

During the classical eras, the Greek alphabet emerged around 800 BCE by adapting Phoenician script, introducing vowels and a fully phonetic system that profoundly influenced Western notation by facilitating precise representation of spoken language. In parallel, Chinese logographic scripts originated during the Shang dynasty circa 1200 BCE, prioritizing character-based symbols that convey meaning through visual forms rather than pronunciation, forming the basis of a non-alphabetic tradition.

From the medieval period onward, Hindu-Arabic numerals spread to Europe in the 12th century, notably through Leonardo Fibonacci's 1202 publication Liber Abaci, which demonstrated their efficiency for arithmetic over Roman numerals and accelerated commercial calculations. The invention of the printing press by Johannes Gutenberg around 1440 further standardized typographical notations by enabling mass production of uniform text, reducing variations in script forms and promoting consistency across printed materials.
In the 20th and 21st centuries, digital advancements transformed notation systems with the creation of ASCII in 1963, a seven-bit encoding standard that unified character representation for early computers, supporting basic text interchange in English and Western scripts. This evolved into Unicode, formalized in 1991 by the Unicode Consortium, providing a universal framework for encoding over a million characters from global writing systems to enable seamless multilingual digital text.
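The relationship between the two standards is easy to demonstrate in code: ASCII covers only 128 code points at one byte each, while characters beyond that range need a Unicode encoding such as UTF-8. A quick check in Python:

```python
# "A" sits inside ASCII's 7-bit range (code point 65)...
assert ord("A") == 65
assert "A".encode("ascii") == b"A"

# ...but SUBSCRIPT TWO (U+2082), as in H₂O, does not:
s = "H\u2082O"
assert ord("\u2082") == 0x2082
assert s.encode("utf-8") == b"H\xe2\x82\x82O"  # three UTF-8 bytes for one character
try:
    s.encode("ascii")
except UnicodeEncodeError:
    pass  # ASCII has no representation for it
```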

Text-Based Notations

General Writing Systems

General writing systems form the foundational frameworks for representing spoken languages through visual symbols, enabling the recording and transmission of information across cultures and time periods. These systems vary in how they map linguistic units—such as sounds, syllables, or morphemes—to graphic forms, influencing literacy, communication efficiency, and cultural adaptation.

The primary types of writing systems include alphabetic, abjad, abugida, syllabic, and logographic scripts. Alphabetic systems, such as the Latin alphabet used in English and many European languages, or the Cyrillic alphabet employed in Russian, typically consist of 20-30 symbols that represent individual phonemes, including both consonants and vowels, allowing for flexible recombination to form words. Abjads, like the Hebrew or Arabic scripts, focus primarily on consonants with 20-30 symbols, where vowels are often omitted or indicated by optional diacritics, relying on reader familiarity with word roots for disambiguation. Abugidas, exemplified by Devanagari used for Hindi and other Indic languages, feature consonants as base characters with an inherent vowel sound, modified by diacritics for other vowels, typically using around 30-50 primary symbols plus modifiers. Syllabaries, such as Japanese hiragana and katakana, assign distinct symbols to syllables (e.g., about 46 basic characters in hiragana), suiting languages with consistent syllable structures but increasing inventory size for complex phonologies. Logographic systems, like Chinese hanzi, use characters to represent words or morphemes, with comprehensive dictionaries cataloging approximately 50,000 characters, though daily literacy requires mastery of 2,000-3,000.

Design principles underlying these systems emphasize phonetic mapping to speech elements, efficiency in stroke count for ease of writing, and adaptability to linguistic features like morphology or tone.
Characters across diverse scripts average about three strokes, balancing recognizability and production speed while keeping redundancy to around 50%, which optimizes information density without excessive complexity. Evolutionarily, writing systems originated from ideographic representations of concepts, such as Sumerian pictograms around 3200 BCE, gradually incorporating phonographic elements to denote sounds via the rebus principle, leading to fully phonetic scripts like the Phoenician alphabet derived from Proto-Sinaitic around 1900 BCE.

Standardization efforts, such as the ISO 15924 international standard, assign four-letter codes to scripts (e.g., "Latn" for Latin, "Hans" for Han simplified) to facilitate digital encoding, multilingual processing, and bibliographic control, though challenges persist in handling script variations and hybrid systems in global computing environments. A notable tactile extension is braille, developed in 1824 by Louis Braille, a blind French educator, which uses a 6-dot cell configuration in a 2x3 grid to represent 63 distinct characters, adapting alphabetic principles for visually impaired users through raised dots readable by touch. These systems also support phonetic adaptations, such as romanization schemes for transcribing non-Latin scripts.
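The 63-character figure for braille follows from counting the nonempty subsets of a six-dot cell, which a short sketch confirms (the letter patterns shown are the standard assignments for a, b, and c):

```python
from itertools import combinations

DOTS = range(1, 7)  # dots 1-3 down the left column, 4-6 down the right

# every nonempty choice of raised dots is a distinct braille character
patterns = [dots for r in range(1, 7) for dots in combinations(DOTS, r)]
assert len(patterns) == 63  # 2**6 - 1

# standard assignments for the first letters of the alphabet
LETTERS = {"a": {1}, "b": {1, 2}, "c": {1, 4}}
```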

Typographical Conventions

Typographical conventions in notation systems govern the visual formatting of text to enhance clarity, consistency, and aesthetic appeal across written communication. These rules encompass punctuation, which originated in Hellenistic scholarship; for instance, the comma for short pauses, the period for sentence ends, and the colon for longer pauses were developed by Aristophanes of Byzantium around the 3rd century BCE to indicate rhythmic breaks in oral readings of poetry. Spacing between words and lines ensures readability, while capitalization typically denotes proper nouns, sentence beginnings, or emphasis in many alphabetic scripts. Italics serve to highlight terms, foreign words, or stress, providing subtle cues without altering the core symbols.

Typefaces play a crucial role in distinguishing notation styles, with serif fonts—characterized by small decorative strokes at letter ends—traditionally favored for printed texts due to their perceived elegance and guidance of the eye along lines, as seen in historical book printing since the Renaissance. In contrast, sans-serif fonts, lacking these strokes, offer a cleaner, more modern appearance and are preferred for digital displays to reduce visual strain on screens. Ligatures, such as the æ combining "a" and "e" in Latin-derived texts, bind characters for historical accuracy and space efficiency, originating from medieval scribal practices to represent diphthongs. Kerning adjusts inter-character spacing—particularly for pairs like "AV" or "To"—to achieve optical evenness, a technique refined in the metal type era and essential for precise symbol alignment in dense notations.

Standardization through encoding and style guides promotes uniformity in notation systems. Unicode, introduced in version 1.0 in October 1991, provides a universal framework for representing 159,801 characters across scripts as of version 17.0 (September 2025), enabling consistent digital rendering of notations including diacritics in non-Latin systems like accented vowels in French or tonal marks in Vietnamese.
Influential guides include The Chicago Manual of Style, first published in 1906 by the University of Chicago Press as a compilation of typographical rules for publishing, and the APA Publication Manual, debuting in 1952 to standardize notations in psychological and social sciences. In mathematical typesetting, conventions dictate italicization for variables (e.g., x for an unknown) to differentiate them from upright operators like "+" or "sin" for functions, ensuring unambiguous reading of expressions; these rules, formalized in standards like those from the International Union of Pure and Applied Chemistry (IUPAC), trace to 19th-century printing practices for scientific clarity. Knuth's TeX system, released in 1978, revolutionized these conventions by automating high-quality typesetting with programmable control over italics, upright forms, and spacing, becoming a cornerstone for academic notations in mathematics and computer science.
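The Unicode treatment of diacritics mentioned above can be inspected with Python's unicodedata module: an accented vowel may be stored precomposed as one code point or as a base letter plus a combining mark, and normalization converts between the two equivalent forms:

```python
import unicodedata

precomposed = "\u00e9"                    # é as a single code point
decomposed = unicodedata.normalize("NFD", precomposed)

assert precomposed != decomposed          # different code-point sequences
assert len(decomposed) == 2               # base "e" + combining mark
assert decomposed[1] == "\u0301"          # COMBINING ACUTE ACCENT
assert unicodedata.normalize("NFC", decomposed) == precomposed
```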

Notations in Linguistics

In linguistics, phonetic notations provide standardized ways to represent the sounds of languages independently of their orthographies. The International Phonetic Alphabet (IPA), developed by the International Phonetic Association beginning in 1886, serves as the primary system for transcribing speech sounds with precision. It consists of 107 letters for consonants and vowels, along with diacritics to modify them, enabling linguists to denote phonetic details such as aspiration or nasalization. For example, the symbol /θ/ represents the voiceless dental fricative in English words like "think." Extensions to the IPA, such as the extIPA symbols introduced in 1990 for disordered speech, further accommodate rare or atypical phonemes encountered in clinical linguistics, including articulatory distortions not covered by the core alphabet.

To address limitations in early digital text encoding, alternatives like X-SAMPA emerged as ASCII-compatible representations of IPA symbols. Developed by phonetician John C. Wells in 1995, X-SAMPA uses standard keyboard characters to approximate IPA notations, facilitating computational processing of phonetic data before widespread Unicode adoption. For instance, it renders the IPA's /ʃ/ (as in "ship") as S. This system unified earlier language-specific variants of SAMPA, promoting interoperability in speech technology and linguistic databases.

Grammatical notations capture the structural organization of language, particularly in morphology and syntax. Morpheme breakdowns dissect words into their minimal meaningful units, often using hyphens or brackets for clarity; for example, the English word "unhappiness" is analyzed as un- (negative prefix) + happy (root) + -ness (nominalizer). Syntactic tree diagrams, a staple of generative grammar since the mid-20th century, visually represent hierarchical phrase structures, with nodes indicating constituents like noun phrases (NP) branching from a sentence (S) node.
In computational linguistics, frameworks like Universal Dependencies, introduced in 2014, standardize dependency annotations across languages using typed relations (e.g., nsubj for nominal subject) to model grammatical structure in treebanks. Semantic notations employ feature analysis to decompose word meanings into binary or componential attributes, aiding cross-linguistic comparisons. In this approach, concepts are defined by bundles of features such as [+human, +animate] for "person" or [-human, -animate] for "table," highlighting contrasts in meaning. This method, rooted in structural semantics, explains phenomena like hyponymy and synonymy by identifying shared or differing features among related terms.
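A few of the Wells correspondences can be expressed as a simple lookup table; the sketch below covers only a handful of symbols and the function name is ours:

```python
# IPA -> X-SAMPA for a few symbols (per Wells's 1995 scheme)
IPA_TO_XSAMPA = {
    "\u0283": "S",   # ʃ  voiceless postalveolar fricative
    "\u03b8": "T",   # θ  voiceless dental fricative
    "\u014b": "N",   # ŋ  velar nasal
    "\u0259": "@",   # ə  schwa
    "\u026a": "I",   # ɪ  near-close front vowel
}

def to_xsampa(ipa):
    """Transliterate an IPA string symbol-by-symbol; unknown symbols pass through."""
    return "".join(IPA_TO_XSAMPA.get(ch, ch) for ch in ipa)

# IPA /ʃɪp/ ("ship") becomes X-SAMPA "SIp"
```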

Notations in Mathematics

In mathematics, notations serve as symbolic languages to express abstract concepts, operations, and relationships precisely, enabling rigorous proofs and computations. These symbols, evolved over centuries, facilitate communication of ideas from basic arithmetic to advanced theories, forming the foundation for notations in applied sciences.

Arithmetic and algebraic notations underpin fundamental operations and abstractions. Positional notation, which assigns place values to digits for efficient representation of numbers, originated in India around the 5th century CE and was introduced to Europe by Fibonacci in 1202, becoming widespread by the 15th century through printed texts. The plus sign (+) for addition emerged in 15th-century Germany, first appearing in Johannes Widmann's 1489 treatise Behende und hüpsche Rechenung auf allen Kauffmannschaft, derived from the Latin et meaning "and." The multiplication symbol × was introduced by William Oughtred in his 1631 work Clavis Mathematicae, replacing earlier juxtapositions or words. Variables, such as x denoting an unknown quantity, were pioneered by François Viète in 1591 using letters for parameters in equations, with René Descartes standardizing lowercase letters like x, y, z for unknowns in his 1637 La Géométrie.

Equations and functions employ notations to describe relations and transformations. The quadratic equation is written as ax² + bx + c = 0, with its general solution x = (−b ± √(b² − 4ac)) / (2a).
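The compactness of the symbolic form translates directly into code; a short sketch applying the quadratic formula (using complex square roots so negative discriminants also work):

```python
import cmath

def solve_quadratic(a, b, c):
    """Roots of ax^2 + bx + c = 0 via the quadratic formula."""
    disc = cmath.sqrt(b * b - 4 * a * c)
    return (-b + disc) / (2 * a), (-b - disc) / (2 * a)

# x^2 - 3x + 2 = 0 factors as (x - 1)(x - 2), so the roots are 2 and 1
```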