Notation system
In linguistics and semiotics, a notation system is a system of graphics or symbols, characters and abbreviated expressions, used (for example) in artistic and scientific disciplines to represent technical facts and quantities by convention.[1][2] A notation is therefore a collection of related symbols, each given an arbitrary meaning, created to facilitate structured communication within a domain of knowledge or field of study.
Standard notations refer to general agreements in the way things are written or denoted. The term is generally used in technical and scientific areas of study like mathematics, physics, chemistry and biology, but can also be seen in areas like business, economics and music.
Written communication
Writing systems
- Phonographic writing systems, by definition, use symbols to represent components of auditory language, i.e. speech, which in turn refers to things or ideas. The two main kinds of phonographic notational system are the alphabet and the syllabary. Some written languages are more consistent in their correlation of written symbols (or graphemes) with sound (or phonemes), and are therefore considered to have better phonemic orthography.
- Ideographic writing, by definition, refers to things or ideas independently of their pronunciation in any language. Some ideographic systems are also pictograms that convey meaning through their pictorial resemblance to a physical object.
Linguistics
- Various brackets, parentheses, slashes, and lines are used around words and letters in linguistics to distinguish written from spoken forms, etc. See International Phonetic Alphabet § Brackets and transcription delimiters.
Biology and medicine
- Nucleic acid notation
- Systems Biology Graphical Notation (SBGN)
- Sequence motif pattern-description notations
- Cytogenetic notation
- Energy Systems Language
Chemistry
- A chemical formula describes a chemical compound using element symbols and subscripts, e.g. H₂O for water or C₆H₁₂O₆ for glucose
- SMILES is a notation for describing the structure of a molecule with a plain text string, e.g. N#N for nitrogen or CCO for ethanol
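As an illustration of how a formula's subscripts encode composition, the following sketch parses a plain-ASCII formula into element counts (the function name `parse_formula` is our own, and parenthesized groups are deliberately not handled):

```python
import re

def parse_formula(formula: str) -> dict[str, int]:
    """Parse a plain-text molecular formula such as 'C6H12O6' into
    a mapping of element symbol -> atom count (no parentheses)."""
    counts: dict[str, int] = {}
    # An element symbol is an uppercase letter optionally followed by a
    # lowercase one; a missing subscript means a count of 1.
    for symbol, count in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[symbol] = counts.get(symbol, 0) + (int(count) if count else 1)
    return counts

print(parse_formula("H2O"))      # {'H': 2, 'O': 1}
print(parse_formula("C6H12O6"))  # {'C': 6, 'H': 12, 'O': 6}
```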
Computing
- BNF (Backus normal form, or Backus–Naur form) and EBNF (extended Backus–Naur form) are the two main notation techniques for context-free grammars.
- Drakon-charts are a graphical notation of algorithms and procedural knowledge.
- Hungarian notation is an identifier naming convention in computer programming that represents the type or intended use of a variable with a specific pattern within its name.
- Mathematical markup languages are computer notations for representing mathematical formulae.
- Various notations have been developed to specify regular expressions.
- The APL programming language provided a rich set of very concise new notations
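To make the compactness of regular-expression notation concrete, here is a small sketch using Python's re module; the pattern shown is a common identifier convention, not any particular language's official grammar:

```python
import re

# "[A-Za-z_][A-Za-z0-9_]*" reads: one letter or underscore, then any
# number of letters, digits, or underscores -- a full description of an
# infinite set of strings in a dozen characters.  \Z anchors the match
# at the end of the string so trailing junk is rejected.
identifier = re.compile(r"[A-Za-z_][A-Za-z0-9_]*\Z")

print(bool(identifier.match("total_count")))  # True
print(bool(identifier.match("2fast")))        # False (starts with a digit)
```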
Logic
A variety of symbols are used to express logical ideas; see the List of logic symbols.
Management
- Time and motion study symbols such as therbligs
Mathematics
- Mathematical notation is used to represent various kinds of mathematical ideas.
- Notation in probability and statistics
- Cartesian coordinate system, for representing position and other spatial concepts in analytic geometry
- Notation for differentiation, common representations of the derivative in calculus
- Big O notation, used for example in analysis to indicate that the less significant terms of an expression will be neglected
- Z notation, a formal notation for specifying objects using Zermelo–Fraenkel set theory and first-order predicate logic
- Ordinal notation
- Set-builder notation, a formal notation for defining sets in set theory
- Systems to represent very large numbers
- Conway chained arrow notation, an arrow system
- Knuth's up-arrow notation, an arrow system
- Steinhaus–Moser notation, a polygon-based system
- Schläfli symbol in geometry
- Numeral systems, notation for writing numbers, including
- Arabic numerals
- Roman numerals
- Scientific notation for expressing large and small numbers
- Engineering notation
- Sign-value notation, using signs or symbols to represent numbers
- Positional notation also known as place-value notation, in which each position is related to the next by a multiplier which is called the base of that numeral system
- Binary notation, a positional notation in base two
- Octal notation, a positional notation in base eight, used in some computers
- Decimal notation, a positional notation in base ten
- Hexadecimal notation, a positional notation in base sixteen, commonly used in computers
- Sexagesimal notation, an ancient numeral system in base sixty
- See also Table of mathematical symbols for general tokens and their definitions
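The positional systems above all follow one rule: each digit is weighted by a power of the base. A minimal sketch of that rule (the helper name `to_base` is ours):

```python
def to_base(n: int, base: int, digits: str = "0123456789ABCDEF") -> str:
    """Write a non-negative integer in positional notation, 2 <= base <= 16."""
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, base)   # peel off the least significant digit
        out.append(digits[r])
    return "".join(reversed(out))

print(to_base(255, 2))   # '11111111' (binary)
print(to_base(255, 8))   # '377'      (octal)
print(to_base(255, 16))  # 'FF'       (hexadecimal)
```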
Physics
- Bra–ket notation, or Dirac notation, is a standard notation for describing quantum states and linear operators in quantum mechanics.
- Tensor index notation is used when formulating physics (particularly continuum mechanics, electromagnetism, relativistic quantum mechanics and field theory, and general relativity) in the language of tensors.
Typographical conventions
- Infix notation, the common arithmetic and logical formula notation, such as "a + b − c".
- Polish notation or "prefix notation", which places the operator before the operands (arguments), such as "+ a b".
- Reverse Polish notation or "postfix notation", which places the operator after the operands, such as "a b +".
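Postfix notation needs no parentheses or precedence rules, which is why it evaluates naturally with a stack. A minimal sketch (the function name `eval_rpn` is ours):

```python
def eval_rpn(tokens: list[str]) -> float:
    """Evaluate a reverse Polish (postfix) expression given as tokens."""
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b,
           "/": lambda a, b: a / b}
    stack: list[float] = []
    for tok in tokens:
        if tok in ops:
            b = stack.pop()          # the second operand sits on top
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# Infix (3 + 4) * 2 becomes postfix "3 4 + 2 *":
print(eval_rpn("3 4 + 2 *".split()))  # 14.0
```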
Sports and games
- Baseball scorekeeping, to represent a game of baseball
- Aresti Catalogue, to represent aerobatic manoeuvres
- Chess notation, to represent moves in a game of chess
- Siteswap notation represents a juggling pattern as a sequence of numbers
- Singmaster notation, to represent Rubik's Cube moves
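Siteswap's numeric notation has a simple validity rule: throws collide unless every landing beat (position plus throw height, modulo the pattern length) is distinct. A sketch of that check (the function name is ours):

```python
def is_valid_siteswap(pattern: list[int]) -> bool:
    """True iff no two throws in the pattern land on the same beat."""
    n = len(pattern)
    landings = {(i + t) % n for i, t in enumerate(pattern)}
    return len(landings) == n

print(is_valid_siteswap([5, 3, 1]))  # True: the three-ball pattern "531"
print(is_valid_siteswap([5, 4, 3]))  # False: two throws land together
print(sum([5, 3, 1]) // 3)           # 3: the average of a valid pattern
                                     #    equals the number of balls
```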
Graphical notations
[edit]Music
- Musical notation permits a composer to express musical ideas in a musical composition, which can be read and interpreted during performance by a trained musician; there are many different ways to do this (hundreds have been proposed), although staff notation provides by far the most widely used system of modern musical symbols.
Dance and movement
- Benesh Movement Notation permits a graphical representation of human bodily movements
- Laban Movement Analysis or Labanotation permits a graphical representation of human bodily movements
- Eshkol-Wachman Movement Notation permits a graphical representation of bodily movements of other species in addition to humans, and indeed any kind of movement (e.g. aircraft aerobatics)
- Juggling diagrams represent juggling patterns
- Aresti aerobatic symbols provide a way to represent flight maneuvers in aerobatics
Science
- Feynman diagrams permit a graphical representation of a perturbative contribution to the transition amplitude or correlation function of a quantum mechanical or statistical field theory
- Structural formulas are graphical representations of molecules
- Venn diagrams show logical relations between a finite collection of sets.
- Drakon-charts are a graphical representation of algorithms and procedural knowledge.
- Unified Modeling Language is a standard notation for many types of diagrams
Other systems
- Whyte notation for classifying steam locomotives by wheel arrangement
References
- Crystal, David (2011). Dictionary of Linguistics and Phonetics. John Wiley & Sons. ISBN 9781444356755.
- "Notation". Merriam-Webster Dictionary. Encyclopædia Britannica. Retrieved 6 September 2013.
Further reading
- Nöth, Winfried (1995). Handbook of Semiotics. Indiana University Press. ISBN 9780253209597.
- Günther, Hartmut; Ludwig, Otto (1996). Writing and Its Use, Volume 2. Walter de Gruyter. p. 1559. ISBN 9783110147445.
Core Concepts
Definition and Purpose
A notation system is a structured collection of graphics, symbols, characters, or abbreviated expressions, governed by formalized rules for encoding, representing, and decoding information within particular domains of knowledge or communication.[9] These systems function as diagrammatic or symbolic tools that abstract complex ideas into manageable forms, allowing users to manipulate and interpret them systematically.[10] Key characteristics include syntax, which defines the rules for combining symbols; semantics, which assigns meanings to those symbols in relation to objects or concepts; and pragmatics, which addresses the contextual use and interpretation by users.[11] Notation systems can be universal, such as alphabetic scripts applicable across languages, or domain-specific, tailored to fields like logic or measurement for precise technical representation.[10] The primary purpose of notation systems is to facilitate precise communication and standardization among users, reducing ambiguity and errors in conveying complex information.[12] By providing a shared framework, they enable the encoding of ideas for transmission, analysis, and replication, evolving from oral traditions to durable written records that preserve knowledge across generations.[13] This standardization supports collaborative reasoning and problem-solving, as symbols can be manipulated according to syntactic rules to derive new insights without relying on verbal description alone.[9] Notation systems are essential for abstraction, allowing scalable recording and cross-cultural transmission of ideas by distilling intricate realities into compact forms.[10] Historically, they shifted from concrete pictographs, which directly depicted objects for rudimentary accounting and depiction, to abstract symbols representing sounds or relations, enhancing efficiency and generality in information processing.[13] This evolution underscores their role in advancing human cognition, from localized preservation 
to global knowledge dissemination.[12]
Historical Development
The earliest forms of notation systems emerged in prehistoric times through cave paintings and markings that served as proto-notations for counting, storytelling, and recording events, with symbols dating back nearly 40,000 years across European caves.[14] Tally marks, such as those incised on bones and stones, also appeared around this period as rudimentary methods for tracking quantities and lunar cycles, laying the groundwork for systematic representation.[15] In ancient Mesopotamia, Sumerian cuneiform developed around 3200 BCE as one of the first true writing systems, primarily used for administrative records like economic transactions on clay tablets.[13] Concurrently, Egyptian hieroglyphs arose circa 3100 BCE, combining ideographic elements representing concepts with phonetic signs to denote sounds, enabling both pictorial and linguistic expression in monumental and religious contexts.[16] During the classical eras, the Greek alphabet emerged around 800 BCE by adapting Phoenician script, introducing vowels and a fully phonetic system that profoundly influenced Western notation by facilitating precise representation of spoken language.[17] In parallel, Chinese logographic scripts originated during the Shang dynasty circa 1200 BCE, prioritizing character-based symbols that convey meaning through visual forms rather than phonetic transcription, forming the basis of a non-alphabetic tradition.[18] From the medieval period onward, the adoption of Arabic numerals—originally Hindu in origin—spread to Europe in the 12th century, notably through Leonardo Fibonacci's 1202 publication Liber Abaci, which demonstrated their efficiency for arithmetic over Roman numerals and accelerated commercial calculations.[19] The invention of the printing press by Johannes Gutenberg around 1440 further standardized typographical notations by enabling mass production of uniform text, reducing variations in script forms and promoting consistency across printed materials.[20] In the 20th 
and 21st centuries, digital advancements transformed notation systems with the creation of ASCII in 1963, a seven-bit encoding standard that unified character representation for early computers, supporting basic text interchange in English and Western scripts.[21] This evolved into Unicode, formalized in 1991 by the Unicode Consortium, providing a universal framework for encoding over a million characters from global writing systems to enable seamless multilingual digital text.[22]
Text-Based Notations
General Writing Systems
General writing systems form the foundational frameworks for representing spoken languages through visual symbols, enabling the recording and transmission of information across cultures and time periods. These systems vary in how they map linguistic units—such as sounds, syllables, or morphemes—to graphic forms, influencing literacy, communication efficiency, and cultural adaptation.[23] The primary types of writing systems include alphabetic, abjad, abugida, syllabary, and logographic scripts. Alphabetic systems, such as the Latin alphabet used in English and many Indo-European languages, or the Cyrillic alphabet employed in Russian, typically consist of 20-30 symbols that represent individual phonemes, including both consonants and vowels, allowing for flexible recombination to form words.[23] Abjads, like the Hebrew or Arabic scripts, focus primarily on consonants with 20-30 symbols, where vowels are often omitted or indicated by optional diacritics, relying on reader familiarity with word roots for disambiguation.[23] Abugidas, exemplified by Devanagari used for Hindi and other Indic languages, feature consonants as base characters with an inherent vowel sound, modified by diacritics for other vowels, typically using around 30-50 primary symbols plus modifiers.[23] Syllabaries, such as Japanese hiragana and katakana, assign distinct symbols to syllables (e.g., about 46 basic characters in hiragana), suiting languages with consistent syllable structures but increasing inventory size for complex phonologies.[23] Logographic systems, like Chinese hanzi, use characters to represent words or morphemes, with comprehensive dictionaries cataloging approximately 50,000 characters, though daily literacy requires mastery of 2,000-3,000.[24] Design principles underlying these systems emphasize phonetic mapping to spoken language elements, efficiency in stroke count for ease of writing, and adaptability to linguistic features like morphology or phonotactics. 
Characters across diverse scripts average about three strokes, balancing recognizability and production speed while minimizing redundancy to around 50%, which optimizes information density without excessive complexity.[25] Evolutionarily, writing systems originated from ideographic representations of concepts, such as Sumerian pictograms around 3200 BCE, gradually incorporating phonographic elements to denote sounds via the rebus principle, leading to fully phonetic scripts like the alphabet derived from Proto-Sinaitic around 1900 BCE.[13] Standardization efforts, such as the ISO 15924 international standard, assign four-letter codes to scripts (e.g., "Latn" for Latin, "Hans" for Han simplified) to facilitate digital encoding, multilingual processing, and bibliographic control, though challenges persist in handling script variations and hybrid systems in global computing environments.[26] A notable tactile extension is Braille, developed in 1824 by Louis Braille, a blind French educator, which uses a 6-dot cell configuration in a 2x3 grid to represent 63 distinct characters, adapting alphabetic principles for visually impaired users through raised dots readable by touch.[27] In linguistics, these systems support phonetic adaptations, such as romanization for transcribing non-Latin scripts.[28]
Typographical Conventions
Typographical conventions in notation systems govern the visual formatting of text to enhance clarity, consistency, and aesthetic appeal across written communication. These rules encompass punctuation marks, which originated in ancient Greek scholarship; for instance, the comma for short pauses, period for sentence ends, and colon for longer pauses were developed by Aristophanes of Byzantium around the 3rd century BCE to indicate rhythmic breaks in oral readings of poetry.[29] Spacing between words and lines ensures readability, while capitalization typically denotes proper nouns, sentence beginnings, or emphasis in many alphabetic scripts. Italics serve to highlight terms, foreign words, or stress, providing subtle cues without altering the core symbols. Typefaces play a crucial role in distinguishing notation styles, with serif fonts—characterized by small decorative strokes at letter ends—traditionally favored for printed texts due to their perceived elegance and guidance of the eye along lines, as seen in historical book printing since the 15th century.[30] In contrast, sans-serif fonts, lacking these strokes, offer a cleaner, more modern appearance and are preferred for digital displays to reduce visual strain on screens. Ligatures, such as the æ combining "a" and "e" in Latin-derived texts, bind characters for historical accuracy and space efficiency, originating from medieval scribal practices to represent diphthongs.[31] Kerning adjusts inter-character spacing—particularly for pairs like "AV" or "To"—to achieve optical evenness, a technique refined in the metal type era and essential for precise symbol alignment in dense notations.[32] Standardization through encoding and style guides promotes uniformity in notation systems.
Unicode, introduced in version 1.0 in October 1991, provides a universal framework for representing 159,801 characters across scripts as of version 17.0 (September 2025), enabling consistent digital rendering of notations including diacritics in non-Latin systems like accented vowels in French or tonal marks in Vietnamese.[33][34] Influential guides include the Chicago Manual of Style, first published in 1906 by the University of Chicago Press as a compilation of typographical rules for academic publishing, and the APA Publication Manual, debuting in 1952 to standardize notations in psychological and social sciences.[35][36] In mathematical typesetting, conventions dictate italicization for variables (e.g., x for an unknown) to differentiate them from upright operators like "+" or "sin" for functions, ensuring unambiguous parsing of expressions; these rules, formalized in standards like those from the International Union of Pure and Applied Chemistry (IUPAC), trace to 19th-century printing practices for scientific clarity.[37] Donald Knuth's TeX system, released in 1978, revolutionized these conventions by automating high-quality typesetting with programmable control over italics, upright forms, and spacing, becoming a cornerstone for academic notations in computing and mathematics.[38]
Notations in Linguistics
In linguistics, phonetic notations provide standardized ways to represent the sounds of languages independently of their orthographies. The International Phonetic Alphabet (IPA), developed by the International Phonetic Association in 1886, serves as the primary system for transcribing speech sounds with precision. It consists of 107 letters for consonants and vowels, along with diacritics to modify them, enabling linguists to denote phonetic details such as aspiration or nasalization. For example, the symbol /θ/ represents the voiceless dental fricative in English words like "think."[39] Extensions to the IPA, such as the extIPA symbols introduced in 1990 for disordered speech, further accommodate rare or atypical phonemes encountered in clinical linguistics, including articulatory distortions not covered by the core alphabet. To address limitations in early digital text encoding, alternatives like X-SAMPA emerged as ASCII-compatible representations of IPA symbols. Developed by phonetician John C. Wells in 1995, X-SAMPA uses standard keyboard characters to approximate IPA notations, facilitating computational processing of phonetic data before widespread Unicode adoption. For instance, it renders the IPA's /ʃ/ (as in "ship") as "S". This system unified earlier language-specific variants of SAMPA, promoting interoperability in speech technology and linguistic databases. Grammatical notations in linguistics capture the structural organization of language, particularly in morphology and syntax. Morpheme breakdowns dissect words into their minimal meaningful units, often using hyphens or brackets for clarity; for example, the English word "unhappiness" is analyzed as un- (negation) + happy (root) + -ness (nominalizer).
Syntactic tree diagrams, a staple of generative grammar since the mid-20th century, visually represent hierarchical phrase structures, with nodes indicating constituents like noun phrases (NP) branching from a sentence (S) node.[40] In computational linguistics, frameworks like Universal Dependencies, introduced in 2014, standardize dependency annotations across languages using typed relations (e.g., nsubj for nominal subject) to model syntax in treebanks. Semantic notations employ feature analysis to decompose word meanings into binary or componential attributes, aiding cross-linguistic comparisons. In this approach, concepts are defined by bundles of features such as [+human, +animate] for "person" or [-human, -animate] for "table," highlighting contrasts in lexical semantics. This method, rooted in structural semantics, explains phenomena like hyponymy and synonymy by identifying shared or differing features among related terms.
Notations in Mathematics
In mathematics, notations serve as symbolic languages to express abstract concepts, operations, and relationships precisely, enabling rigorous proofs and computations. These symbols, evolved over centuries, facilitate communication of ideas from basic arithmetic to advanced theories, forming the foundation for notations in applied sciences. Arithmetic and algebraic notations underpin fundamental operations and abstractions. Positional decimal notation, which assigns place values to digits for efficient representation of numbers, originated in India around the 5th century CE and was introduced to Europe by Fibonacci in 1202, becoming widespread by the 15th century through printed texts. The plus sign (+) for addition emerged in 15th-century Europe, first appearing in Johannes Widmann's 1489 treatise Behende und hüpsche Rechenung auf allen Kauffmannschaft, derived from the Latin et meaning "and."[41] The multiplication symbol × was introduced by William Oughtred in his 1631 work Clavis Mathematicae, replacing earlier juxtapositions or words.[41] Variables, such as x denoting an unknown quantity, were pioneered by François Viète in 1591 using letters for parameters in equations, with René Descartes standardizing lowercase letters like x, y, z for unknowns in his 1637 La Géométrie.[42][43] Equations and functions employ notations to describe relations and transformations. The quadratic equation is written as ax² + bx + c = 0, with its general solution derived systematically by Simon Stevin in 1594, building on earlier geometric methods.[44] Summation notation, as in ∑ᵢ aᵢ, uses the Greek capital sigma (Σ) to denote the sum of a sequence, introduced by Leonhard Euler in 1755 in Institutiones calculi differentialis.[41] Limits, expressed as lim x→a f(x) = L, formalize the approach of a function to a value, with the "lim" symbol first used by Karl Weierstrass in 1841 lectures, later refined with the arrow by G. H.
Hardy in 1908.[45] Geometric and calculus notations incorporate Greek letters and integrals for spatial and dynamic concepts. The symbol π represents the ratio of a circle's circumference to its diameter, approximately 3.14159, first approximated by Archimedes in 250 BCE using inscribed polygons in Measurement of a Circle. The integral sign ∫, as in ∫ f(x) dx, denotes the accumulation of area under a curve, introduced by Gottfried Wilhelm Leibniz in 1675 as an elongated S for "summa."[45] Vectors are denoted by symbols such as v⃗, with the arrow indicating direction and magnitude; this notation appeared in William Rowan Hamilton's 1853 work on quaternions and was popularized by J. Willard Gibbs in 1881 vector analysis. Set theory notations define collections and relations foundational to modern mathematics. The membership symbol ∈ indicates an element belongs to a set, introduced by Giuseppe Peano in 1889 as ε (from Latin est, "is") in Arithmetices principia, nova methodo exposita, later stylized as ∈.[46] Boolean algebra, a subset of set theory for logical operations, uses ∧ for conjunction (AND, or intersection) and ∨ for disjunction (OR, or union); these wedge and vee symbols were adopted in the early 20th century, with ∨ appearing in Bertrand Russell's logical work of 1908 and ∧ in Arend Heyting's 1930 intuitionistic logic, though Boolean operations trace to George Boole's 1854 An Investigation of the Laws of Thought. These notations enable precise manipulation of truth values and sets, as in A ∩ B for the intersection of sets A and B.
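The set and Boolean notations described above map directly onto operators in most programming languages; a short Python sketch, with the conventional symbols noted in comments:

```python
A = {1, 2, 3, 4}
B = {3, 4, 5}

print(A & B)   # intersection, written A ∩ B
print(A | B)   # union, written A ∪ B
print(3 in A)  # membership, written 3 ∈ A
# Set-builder notation {x ∈ A : x ∈ B} corresponds to a comprehension:
print({x for x in A if x in B} == A & B)  # True
```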
A notable development in calculus notation contrasts Leibniz's fractional form dy/dx for the derivative, introduced in 1675 to represent the ratio of infinitesimal changes, with Isaac Newton's "fluxions" (denoted by dots over variables) from the 1660s; Leibniz's intuitive notation prevailed due to its clarity for higher derivatives like d²y/dx².[45] In category theory, functors—mappings between categories preserving structure—are denoted by F: C → D, first formalized by Samuel Eilenberg and Saunders Mac Lane in their 1945 paper "General Theory of Natural Equivalences," establishing abstract algebraic topology. These mathematical notations influence physics by providing symbols like vectors for forces, though physical applications adapt them to quantities with units.
Notations in Physics
In physics, notations systematically represent quantities, constants, and fundamental laws, enabling precise description of natural phenomena. The International System of Units (SI), formally adopted by the 11th General Conference on Weights and Measures in 1960, standardizes units for physical measurements, such as the meter for length and kilogram for mass.[47] Common symbols include m for mass and v for velocity, as recommended by international standards for clarity in equations and derivations.[48] Fundamental constants like the speed of light in vacuum, denoted c, are defined exactly as 299792458 m/s, fixing the scale for relativistic effects and electromagnetic propagation.[49] Classical mechanics employs algebraic notations to capture motion and forces. Newton's second law, formulated in his 1687 Philosophiæ Naturalis Principia Mathematica, is conventionally written as F = ma, where F denotes net force, m mass, and a acceleration, linking cause (force) to effect (motion change).[50] Kinematic equations for constant acceleration, derived from this law, include forms like v = u + at, with u as initial velocity and t time, used to predict trajectories without forces.[51] Lagrangian mechanics, developed by Joseph-Louis Lagrange in 1788, introduces the function L = T − V, where T is kinetic energy and V potential energy; this scalar formulation simplifies solving complex systems via the Euler-Lagrange equations. Electromagnetism relies on vector and scalar notations to describe fields and charges.
Coulomb's law, experimentally established by Charles-Augustin de Coulomb in 1785, quantifies electrostatic force as F = k q₁q₂/r², with q₁ and q₂ as charges, r separation distance, and k the Coulomb constant (approximately 8.99 × 10⁹ N·m²/C²).[52] James Clerk Maxwell's 1865 equations unify electricity and magnetism in differential vector form, such as Gauss's law ∇·E = ρ/ε₀, where E is the electric field, ρ the charge density, and ε₀ the vacuum permittivity; this notation, refined by Oliver Heaviside, reveals electromagnetic waves propagating at speed c.[53] Relativity and quantum mechanics introduce abstract notations for energy, states, and particles. Albert Einstein's 1905 derivation of mass-energy equivalence yields E = mc², equating rest energy E to mass m times c squared, foundational to nuclear processes and cosmology.[54] In quantum theory, Paul Dirac's bra-ket notation, introduced in his 1930 The Principles of Quantum Mechanics, represents states as kets like |ψ⟩ (a column vector in Hilbert space) and observables via operators, facilitating computations of probabilities and superpositions.[55] Richard Feynman's 1948 innovation of diagrams provides a graphical notation for quantum field interactions, depicting particle paths as lines (e.g., electrons as solid lines, photons as wavy) with vertices for events, simplifying perturbative calculations in quantum electrodynamics.[56] Post-1998 discoveries of neutrino oscillations, confirming nonzero neutrino masses via experiments like Super-Kamiokande (atmospheric, 1998) and SNO (solar, 2001), employ the Pontecorvo–Maki–Nakagawa–Sakata (PMNS) matrix to notate flavor mixing.
This 3×3 unitary matrix U parametrizes transitions between flavor eigenstates (ν_e, ν_μ, ν_τ) and mass eigenstates (ν_1, ν_2, ν_3) using three mixing angles (θ_{12}, θ_{23}, θ_{13}), a CP-violating phase δ, and Majorana phases; the oscillation probability depends on mass-squared differences Δm²_{21} and |Δm²_{31}|, with current values θ_{12} ≈ 33°, θ_{23} ≈ 49°, θ_{13} ≈ 8.5°, and Δm²_{21} ≈ 7.5 × 10^{-5} eV².
Notations in Chemistry
In chemistry, notations provide a standardized way to represent elements, compounds, and reactions, facilitating communication and computation across the discipline. Elemental symbols, derived from the Latin or English names of elements, were systematically organized in Dmitri Mendeleev's 1869 periodic table, which arranged elements by atomic weight and properties, using one- or two-letter abbreviations such as H for hydrogen and O for oxygen.[57] These symbols are now standardized by the International Union of Pure and Applied Chemistry (IUPAC), founded in 1919, which assigns them alongside atomic numbers (Z) to denote nuclear charge, ensuring consistency in global scientific literature.[58] For example, carbon is denoted as C with Z = 6, allowing precise identification in formulas and equations. Molecular formulas describe the composition of compounds, with empirical formulas showing the simplest whole-number ratio of atoms, such as CH₂O for formaldehyde, while molecular formulas indicate the exact count, like C₆H₁₂O₆ for glucose.[59] Structural formulas expand on this by illustrating atom connectivity, often using line-angle representations for organic molecules where lines signify carbon-carbon bonds and vertices imply carbon atoms. Isotopes are notated with a superscript mass number preceding the symbol, as in ¹⁴C for carbon-14, distinguishing variants based on neutron count while maintaining the same Z.[60] Chemical reactions employ arrow notations standardized by IUPAC, where a single arrow (→) indicates the forward direction or yield of products, as in 2H₂ + O₂ → 2H₂O for water formation, and a double arrow (⇌) denotes reversible equilibrium.[61] Bonding notations, particularly Lewis structures introduced by Gilbert N. 
Lewis in 1916, use dots to represent valence electrons and lines for shared pairs, illustrating covalent bonds like the two dots between H and O in H₂O to show electron sharing.[62] For stereochemistry, wedge-dash conventions depict three-dimensional arrangements around chiral centers, with solid wedges indicating bonds projecting toward the viewer and dashed lines for those receding, as recommended by IUPAC for relative configuration in organic structures.[63] In computational chemistry, the Simplified Molecular Input Line Entry System (SMILES), developed by David Weininger in 1988, encodes molecules as text strings for database storage and simulation; for instance, CC(=O)O represents acetic acid by denoting atoms (C for carbon, O for oxygen) and bonds (= for double).[64] These notations collectively enable the precise documentation of chemical entities, supporting both theoretical analysis and experimental design.
Notations in Biology and Medicine
In biology and medicine, notations serve as standardized symbols and systems to describe living organisms, genetic material, physiological processes, and clinical conditions, facilitating precise communication among researchers and practitioners. Taxonomic nomenclature employs the binomial system, introduced by Carl Linnaeus in his 1753 work Species Plantarum, which assigns each species a two-part Latin name consisting of the genus and species, such as Homo sapiens for modern humans.[65] This system organizes organisms hierarchically into ranks including kingdom, phylum, class, order, family, genus, and species, promoting clarity in classification across biodiversity studies. The International Code of Zoological Nomenclature (ICZN), first published in 1905, governs animal naming rules, ensuring stability and universality in zoological taxonomy by regulating name usage, priority, and typification.[66] Genetic notations represent molecular structures and processes essential to heredity. The four nucleotide bases in DNA are denoted as adenine (A), thymine (T), cytosine (C), and guanine (G), a convention established in the 1953 elucidation of DNA's double-helix structure. In protein synthesis, codons, triplets of these bases, specify amino acids; for instance, the codon AUG codes for methionine and serves as the start signal in translation, as decoded in the 1960s through experiments mapping the genetic code.[67] Population genetics uses the Hardy-Weinberg equilibrium to model allele frequencies under non-evolutionary conditions, expressed as p² + 2pq + q² = 1, where p and q are the frequencies of two alleles at a locus (with p + q = 1), a principle independently formulated by G. H. Hardy and Wilhelm Weinberg in 1908.[68] Medical notations streamline documentation in clinical practice through abbreviations and codes. The symbol Rx, derived from the Latin recipe meaning "take," indicates a prescription and has been used since the 16th century to instruct pharmacists on medication preparation.
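The Hardy-Weinberg relation can be checked numerically; a minimal Python sketch, where `hardy_weinberg` is an illustrative helper name rather than any standard library function:

```python
# Hardy-Weinberg equilibrium: for allele frequencies p and q with p + q = 1,
# expected genotype frequencies are p^2 (AA), 2pq (Aa), and q^2 (aa),
# and these three frequencies sum to 1.
def hardy_weinberg(p):
    q = 1 - p
    return {"AA": p ** 2, "Aa": 2 * p * q, "aa": q ** 2}

freqs = hardy_weinberg(0.7)          # e.g. p = 0.7, so q = 0.3
print(freqs)                         # AA = 0.49, Aa = 0.42, aa = 0.09
print(sum(freqs.values()))           # sums to 1 (up to floating-point error)
```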
BP denotes blood pressure, a vital sign measured in millimeters of mercury (mmHg), essential for diagnosing hypertension and cardiovascular risks. The International Classification of Diseases, 11th Revision (ICD-11), adopted by the World Health Assembly in 2019 and effective from 2022, employs alphanumeric codes (e.g., 1A00 for cholera) to standardize disease diagnosis, morbidity statistics, and healthcare billing globally.[69] Physiological notations quantify and visualize bodily functions. The Snellen chart, developed by Dutch ophthalmologist Herman Snellen in 1862, assesses visual acuity using rows of letters or symbols of decreasing size, with readings like 20/20 indicating standard vision at 20 feet.[70] In cardiology, electrocardiogram (ECG) tracings label waveforms as P (atrial depolarization), QRS complex (ventricular depolarization), and T (ventricular repolarization), a notation introduced by Willem Einthoven in the early 1900s to interpret heart electrical activity.[71] Recent advancements have introduced specialized notations for emerging biotechnologies. The CRISPR-Cas9 system for gene editing, detailed in 2012, uses single-guide RNA (sgRNA) sequences, typically 20 nucleotides targeting a specific DNA locus followed by a protospacer adjacent motif (PAM, e.g., NGG for Cas9), to direct precise cuts, revolutionizing genomic engineering.[72] Post-2020 mRNA vaccine designs, such as those for SARS-CoV-2, incorporate notations for modified nucleosides like pseudouridine (Ψ) in sequences to enhance stability and reduce immune activation, as seen in formulations encoding the spike protein.[73]

Notations in Computing and Logic
In computing and logic, notations serve as standardized symbols and syntactic structures to express algorithms, data representations, and formal reasoning processes, enabling precise communication and execution in computational environments. These notations bridge theoretical foundations with practical implementation, allowing developers and logicians to model complex behaviors without ambiguity. Unlike purely mathematical notations, those in computing emphasize executability and machine interpretability, while logical notations focus on truth evaluation and inference rules. Programming notations encompass the syntax rules defining how instructions are written in high-level languages, as well as informal descriptions used in algorithm design. For instance, in Python, functions are defined using the def keyword followed by the function name and parameters in parentheses, as in def greet(name):, which binds a callable object to the name greet and executes the indented suite upon invocation.[74] Pseudocode, a language-agnostic notation for algorithm design, uses structured English-like statements to outline control flow without implementation details; a common construct is the if-then-else for conditional execution, such as IF condition THEN statement1 ELSE statement2 ENDIF, facilitating clarity in planning before coding.[75] These notations prioritize readability and portability across programming paradigms.
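The mapping from the pseudocode construct to Python's def syntax can be sketched as follows; the greeting logic extends the greet example from the text and is an illustrative assumption, not a fixed convention:

```python
# The pseudocode  IF condition THEN statement1 ELSE statement2 ENDIF
# maps directly onto Python's if/else inside a def-defined function.
def greet(name):
    if name:                       # IF condition THEN ...
        return f"Hello, {name}!"   # statement1
    else:                          # ELSE ...
        return "Hello, stranger!"  # statement2
                                   # ENDIF is implied by dedentation

print(greet("Ada"))   # Hello, Ada!
print(greet(""))      # Hello, stranger!
```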
Data notations in computing represent information in machine-readable formats, with binary (base-2) and hexadecimal (base-16) being fundamental for low-level operations. Binary notation encodes values using only 0 and 1, where each digit (bit) corresponds to a power of 2; for example, the decimal number 5 is 101 in binary, since 1·2² + 0·2¹ + 1·2⁰ = 5.[76] Hexadecimal condenses binary groups of four bits into the digits 0-9 and A-F, commonly prefixed by 0x; thus, 0xFF equals 255 in decimal (15·16¹ + 15·16⁰ = 255), aiding memory addressing and debugging due to its compact form.[77]
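Python's numeric literals and built-in conversion functions mirror these positional notations directly; a brief sketch:

```python
# Binary and hexadecimal literals encode the positional expansions above:
# 0b101 is 1*2**2 + 0*2**1 + 1*2**0 = 5, and 0xFF is 15*16 + 15 = 255.
assert 0b101 == 1 * 2**2 + 0 * 2**1 + 1 * 2**0 == 5
assert 0xFF == 15 * 16**1 + 15 * 16**0 == 255

# Conversions between bases via built-ins:
print(bin(5))         # '0b101'
print(hex(255))       # '0xff'
print(int("101", 2))  # 5
print(int("FF", 16))  # 255
```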
Logical notations formalize reasoning in computing, underpinning verification and automated theorem proving. Propositional logic uses symbols such as ∧ (conjunction, "and"), where p ∧ q is true only if both p and q are true, and ∨ (disjunction, "or"), where p ∨ q is true if at least one of them holds; truth tables enumerate all input combinations for an operator, as shown for conjunction:
| p | q | p ∧ q |
|---|---|---|
| T | T | T |
| T | F | F |
| F | T | F |
| F | F | F |
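The truth table can be generated programmatically; a minimal Python sketch that also tabulates disjunction alongside conjunction:

```python
from itertools import product

# Enumerate all four input combinations for p and q and evaluate
# conjunction (p ∧ q) and disjunction (p ∨ q) on each row.
table = [(p, q, p and q, p or q) for p, q in product([True, False], repeat=2)]
for p, q, conj, disj in table:
    print(p, q, conj, disj)

# Conjunction is true in exactly one row (both inputs true);
# disjunction is false in exactly one row (both inputs false).
assert sum(1 for _, _, c, _ in table if c) == 1
assert sum(1 for _, _, _, d in table if d) == 3
```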
The Unified Modeling Language (UML) depicts classes as boxes with compartments for the class name, attributes, and methods (ClassName: attributes; methods), standardized by the Object Management Group in 1997 to unify modeling practices.[79] Lambda calculus, introduced by Alonzo Church in 1936, employs λx.M to denote a function abstracting over the variable x with body M, forming the basis for functional programming languages like Lisp.[80] In quantum computing, Qiskit's notation from IBM (released 2017) uses Pythonic syntax for circuits, such as qc.h(0) for applying a Hadamard gate to qubit 0, integrating classical and quantum operations. Flowcharts, early precursors to modern notations, graphically describe processes via symbols like ovals for start/end and diamonds for decisions (e.g., "IF condition?"), predating structured programming but influencing pseudocode development in the mid-20th century.[81]
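Python's anonymous lambda expressions echo Church's λx.M notation; a minimal sketch, where identity and compose are illustrative names rather than standard functions:

```python
# λx.x — the identity function as a Python lambda.
identity = lambda x: x

# λf.λg.λx.f (g x) — function composition via nested (curried) lambdas.
compose = lambda f: lambda g: lambda x: f(g(x))

inc = lambda n: n + 1       # λn.n+1
double = lambda n: 2 * n    # λn.2n

print(identity(42))              # 42
print(compose(double)(inc)(3))   # double(inc(3)) = 8
```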
