Science
Science is a systematic discipline that builds and organises knowledge in the form of testable hypotheses and predictions about the universe.[1][2] Modern science is typically divided into two – or three – major branches:[3] the natural sciences, which study the physical world, and the social sciences, which study individuals and societies.[4][5] The formal sciences – logic, mathematics, and theoretical computer science – are typically regarded as a separate branch because they rely on deductive reasoning rather than the scientific method as their main methodology.[6][7][8][9] Meanwhile, applied sciences are disciplines that use scientific knowledge for practical purposes, such as engineering and medicine.[10][11][12]
The history of science spans the majority of the historical record, with the earliest identifiable predecessors to modern science dating to the Bronze Age in Egypt and Mesopotamia (c. 3000–1200 BCE). Their contributions to mathematics, astronomy, and medicine entered and shaped the Greek natural philosophy of classical antiquity and later medieval scholarship, whereby formal attempts were made to provide explanations of events in the physical world based on natural causes; while further advancements, including the introduction of the Hindu–Arabic numeral system, were made during the Golden Age of India and Islamic Golden Age.[13]: 12 [14][15][16][13]: 163–192 The recovery and assimilation of Greek works and Islamic inquiries into Western Europe during the Renaissance revived natural philosophy,[13]: 193–224, 225–253 [17] which was later transformed by the Scientific Revolution that began in the 16th century[18] as new ideas and discoveries departed from previous Greek conceptions and traditions.[13]: 357–368 [19] The scientific method soon played a greater role in the acquisition of knowledge, and in the 19th century, many of the institutional and professional features of science began to take shape,[20][21] along with the changing of "natural philosophy" to "natural science".[22]
New knowledge in science is advanced by research from scientists who are motivated by curiosity about the world and a desire to solve problems.[23][24] Contemporary scientific research is highly collaborative and is usually done by teams in academic and research institutions,[25] government agencies,[13]: 163–192 and companies.[26] The practical impact of their work has led to the emergence of science policies that seek to influence the scientific enterprise by prioritising the ethical and moral development of commercial products, armaments, health care, public infrastructure, and environmental protection.
Etymology
The word science has been used in Middle English since the 14th century in the sense of "the state of knowing". The word was borrowed from the Anglo-Norman language as the suffix -cience, which was borrowed from the Latin word scientia, meaning "knowledge, awareness, understanding", a noun derivative of sciens meaning "knowing", itself the present active participle of sciō, "to know".[27]
There are many hypotheses for science's ultimate word origin. According to Michiel de Vaan, Dutch linguist and Indo-Europeanist, sciō may have its origin in the Proto-Italic language as *skije- or *skijo- meaning "to know", which may originate from Proto-Indo-European language as *skh1-ie, *skh1-io meaning "to incise". The Lexikon der indogermanischen Verben proposed sciō is a back-formation of nescīre, meaning "to not know, be unfamiliar with", which may derive from Proto-Indo-European *sekH- in Latin secāre, or *skh2- from *sḱʰeh2(i)- meaning "to cut".[28]
In the past, science was a synonym for "knowledge" or "study", in keeping with its Latin origin. A person who conducted scientific research was called a "natural philosopher" or "man of science".[29] In 1834, William Whewell introduced the term scientist in a review of Mary Somerville's book On the Connexion of the Physical Sciences,[30] crediting it to "some ingenious gentleman" (possibly himself).[31]
History
Early history
Science has no single origin. Rather, scientific thinking emerged gradually over the course of tens of thousands of years,[32][33] taking different forms around the world, and few details are known about the very earliest developments. Women likely played a central role in prehistoric science,[34] as did religious rituals.[35] Some scholars use the term "protoscience" to label activities in the past that resemble modern science in some but not all features;[36][37][38] however, this label has also been criticised as denigrating,[39] or too suggestive of presentism, thinking about those activities only in relation to modern categories.[40]
Direct evidence for scientific processes becomes clearer with the advent of writing systems in the Bronze Age civilisations of Ancient Egypt and Mesopotamia (c. 3000–1200 BCE), creating the earliest written records in the history of science.[13]: 12–15 [14] Although the words and concepts of "science" and "nature" were not part of the conceptual landscape at the time, the ancient Egyptians and Mesopotamians made contributions that would later find a place in Greek and medieval science: mathematics, astronomy, and medicine.[41][13]: 12 From the 3rd millennium BCE, the ancient Egyptians developed a non-positional decimal numbering system,[42] solved practical problems using geometry,[43] and developed a calendar.[44] Their healing therapies involved drug treatments and the supernatural, such as prayers, incantations, and rituals.[13]: 9
The ancient Mesopotamians used knowledge about the properties of various natural chemicals for manufacturing pottery, faience, glass, soap, metals, lime plaster, and waterproofing.[45] They studied animal physiology, anatomy, behaviour, and astrology for divinatory purposes.[46] The Mesopotamians had an intense interest in medicine and the earliest medical prescriptions appeared in Sumerian during the Third Dynasty of Ur.[45][47] They seem to have studied scientific subjects which had practical or religious applications and had little interest in satisfying curiosity.[45]
Classical antiquity
In classical antiquity, there was no real ancient analogue of the modern scientist. Instead, well-educated, usually upper-class, and almost universally male individuals performed various investigations into nature whenever they could afford the time.[48] Before the invention or discovery of the concept of phusis or nature by the pre-Socratic philosophers, the same words tended to be used to describe the natural "way" in which a plant grows,[49] and the "way" in which, for example, one tribe worships a particular god. For this reason, it is claimed that these men were the first philosophers in the strict sense and the first to clearly distinguish "nature" and "convention".[50]
The early Greek philosophers of the Milesian school, which was founded by Thales of Miletus and later continued by his successors Anaximander and Anaximenes, were the first to attempt to explain natural phenomena without relying on the supernatural.[51] The Pythagoreans developed a complex number philosophy[52]: 467–468 and contributed significantly to the development of mathematical science.[52]: 465 The theory of atoms was developed by the Greek philosopher Leucippus and his student Democritus.[53][54] Later, Epicurus would develop a full natural cosmology based on atomism, and would adopt a "canon" (ruler, standard) which established physical criteria or standards of scientific truth.[55] The Greek doctor Hippocrates established the tradition of systematic medical science[56][57] and is known as "The Father of Medicine".[58]
A turning point in the history of early philosophical science was Socrates' example of applying philosophy to the study of human matters, including human nature, the nature of political communities, and human knowledge itself. The Socratic method as documented by Plato's dialogues is a dialectic method of hypothesis elimination: better hypotheses are found by steadily identifying and eliminating those that lead to contradictions. The Socratic method searches for general commonly held truths that shape beliefs and scrutinises them for consistency.[59] Socrates criticised the older type of study of physics as too purely speculative and lacking in self-criticism.[60]
In the 4th century BCE, Aristotle created a systematic programme of teleological philosophy.[61] In the 3rd century BCE, Greek astronomer Aristarchus of Samos was the first to propose a heliocentric model of the universe, with the Sun at the centre and all the planets orbiting it.[62] Aristarchus's model was widely rejected because it was believed to violate the laws of physics,[62] while Ptolemy's Almagest, which contains a geocentric description of the Solar System, was accepted through the early Renaissance instead.[63][64] The inventor and mathematician Archimedes of Syracuse made major contributions to the beginnings of calculus.[65] Pliny the Elder was a Roman writer and polymath, who wrote the seminal encyclopaedia Natural History.[66][67][68]
Positional notation for representing numbers likely emerged between the 3rd and 5th centuries CE along Indian trade routes. This numeral system made efficient arithmetic operations more accessible and would eventually become standard for mathematics worldwide.[69]
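The arithmetic advantage of positional notation can be sketched briefly (Python is used here purely for illustration): because each digit carries a place value, addition reduces to a single digit-by-digit pass with carries, regardless of how large the numbers are.

```python
# Digit-wise addition with carries, the kind of algorithm that
# positional notation makes possible. Digits are stored
# least-significant first; base 10 is assumed for illustration.

def add_positional(a, b, base=10):
    """Add two digit lists (least-significant digit first)."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        da = a[i] if i < len(a) else 0
        db = b[i] if i < len(b) else 0
        carry, digit = divmod(da + db + carry, base)
        result.append(digit)
    if carry:
        result.append(carry)
    return result

# 479 + 864 = 1343, worked one digit at a time
print(add_positional([9, 7, 4], [4, 6, 8]))  # [3, 4, 3, 1]
```

By contrast, a non-positional system such as Egyptian or Roman numerals has no uniform digit-by-digit procedure, which is one reason positional notation made efficient arithmetic so much more accessible.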
Middle Ages
Due to the collapse of the Western Roman Empire, the 5th century saw an intellectual decline, with knowledge of classical Greek conceptions of the world deteriorating in Western Europe.[13]: 194 Latin encyclopaedists of the period such as Isidore of Seville preserved the majority of general ancient knowledge.[70] In contrast, because the Byzantine Empire resisted attacks from invaders, they were able to preserve and improve prior learning.[13]: 159 John Philoponus, a Byzantine scholar in the 6th century, started to question Aristotle's teaching of physics, introducing the theory of impetus.[13]: 307, 311, 363, 402 His criticism served as an inspiration to medieval scholars and Galileo Galilei, who extensively cited his works ten centuries later.[13]: 307–308 [71]
During late antiquity and the Early Middle Ages, natural phenomena were mainly examined via the Aristotelian approach. The approach includes Aristotle's four causes: material, formal, moving, and final cause.[72] Many Greek classical texts were preserved by the Byzantine Empire and Arabic translations were made by Christians, mainly Nestorians and Miaphysites. Under the Abbasids, these Arabic translations were later improved and developed by Arabic scientists.[73] By the 6th and 7th centuries, the neighbouring Sasanian Empire established the medical Academy of Gondishapur, which was considered by Greek, Syriac, and Persian physicians as the most important medical hub of the ancient world.[74]
Islamic study of Aristotelianism flourished in the House of Wisdom established in the Abbasid capital of Baghdad, Iraq,[75] and continued[76] until the Mongol invasions in the 13th century. Ibn al-Haytham, better known as Alhazen, used controlled experiments in his optical study.[a][78][79] Avicenna's compilation of The Canon of Medicine, a medical encyclopaedia, is considered to be one of the most important publications in medicine and was used until the 18th century.[80]
By the 11th century most of Europe had become Christian,[13]: 204 and in 1088, the University of Bologna emerged as the first university in Europe.[81] As such, demand for Latin translation of ancient and scientific texts grew,[13]: 204 a major contributor to the Renaissance of the 12th century. Renaissance scholasticism in western Europe flourished, with experiments done by observing, describing, and classifying subjects in nature.[82] In the 13th century, medical teachers and students at Bologna began opening human bodies, leading to the first anatomy textbook based on human dissection by Mondino de Luzzi.[83]
Renaissance
New developments in optics played a role in the inception of the Renaissance, both by challenging long-held metaphysical ideas on perception and by contributing to the improvement and development of technology such as the camera obscura and the telescope. At the start of the Renaissance, Roger Bacon, Vitello, and John Peckham each built up a scholastic ontology upon a causal chain beginning with sensation, perception, and finally apperception of the individual and universal forms of Aristotle.[77]: Book I A model of vision later known as perspectivism was exploited and studied by the artists of the Renaissance. This theory uses only three of Aristotle's four causes: formal, material, and final.[84]
In the 16th century, Nicolaus Copernicus formulated a heliocentric model of the Solar System, stating that the planets revolve around the Sun, instead of the geocentric model where the planets and the Sun revolve around the Earth. This was based on a theorem that the orbital periods of the planets are longer as their orbs are farther from the centre of motion, which he found not to agree with Ptolemy's model.[85]
Johannes Kepler and others challenged the notion that the only function of the eye is perception, and shifted the main focus in optics from the eye to the propagation of light.[84][86] Kepler is best known, however, for improving Copernicus' heliocentric model through the discovery of Kepler's laws of planetary motion. Kepler did not reject Aristotelian metaphysics and described his work as a search for the Harmony of the Spheres.[87] Galileo made significant contributions to astronomy, physics and engineering. However, he was persecuted after Pope Urban VIII condemned him for writing about the heliocentric model.[88]
The printing press was widely used to publish scholarly arguments, including some that disagreed widely with contemporary ideas of nature.[89] Francis Bacon and René Descartes published philosophical arguments in favour of a new type of non-Aristotelian science. Bacon emphasised the importance of experiment over contemplation, questioned the Aristotelian concepts of formal and final cause, promoted the idea that science should study the laws of nature and the improvement of all human life.[90] Descartes emphasised individual thought and argued that mathematics rather than geometry should be used to study nature.[91]
Age of Enlightenment
At the start of the Age of Enlightenment, Isaac Newton laid the foundation of classical mechanics with his Philosophiæ Naturalis Principia Mathematica, which greatly influenced future physicists.[92] Gottfried Wilhelm Leibniz incorporated terms from Aristotelian physics, now used in a new non-teleological way. This implied a shift in the view of objects: objects were now considered as having no innate goals. Leibniz assumed that different types of things all work according to the same general laws of nature, with no special formal or final causes.[93]
During this time the declared purpose and value of science became producing wealth and inventions that would improve human lives, in the materialistic sense of having more food, clothing, and other things. In Bacon's words, "the real and legitimate goal of sciences is the endowment of human life with new inventions and riches", and he discouraged scientists from pursuing intangible philosophical or spiritual ideas, which he believed contributed little to human happiness beyond "the fume of subtle, sublime or pleasing [speculation]".[94]
Science during the Enlightenment was dominated by scientific societies and academies,[95] which had largely replaced universities as centres of scientific research and development. Societies and academies were the backbones of the maturation of the scientific profession. Another important development was the popularisation of science among an increasingly literate population.[96] Enlightenment philosophers turned to a few of their scientific predecessors – Galileo, Kepler, Boyle, and Newton principally – as the guides to every physical and social field of the day.[97][98]
The 18th century saw significant advancements in the practice of medicine[99] and physics;[100] the development of biological taxonomy by Carl Linnaeus;[101] a new understanding of magnetism and electricity;[102] and the maturation of chemistry as a discipline.[103] Ideas on human nature, society, and economics evolved during the Enlightenment. Hume and other Scottish Enlightenment thinkers developed a "science of man", which was expressed historically in works by authors including James Burnett, Adam Ferguson, John Millar and William Robertson, all of whom merged a scientific study of how humans behaved in ancient and primitive cultures with a strong awareness of the determining forces of modernity.[104] Modern sociology largely originated from this movement.[105] In 1776, Adam Smith published The Wealth of Nations, which is often considered the first work on modern economics.[106]
19th century
During the 19th century, many distinguishing characteristics of contemporary modern science began to take shape. These included the transformation of the life and physical sciences; the frequent use of precision instruments; the emergence of terms such as "biologist", "physicist", and "scientist"; an increased professionalisation of those studying nature; scientists gaining cultural authority over many dimensions of society; the industrialisation of numerous countries; the thriving of popular science writings; and the emergence of science journals.[107] During the late 19th century, psychology emerged as a separate discipline from philosophy when Wilhelm Wundt founded the first laboratory for psychological research in 1879.[108]
In 1858, Charles Darwin and Alfred Russel Wallace independently proposed the theory of evolution by natural selection, which explained how different plants and animals originated and evolved. Their theory was set out in detail in Darwin's book On the Origin of Species, published in 1859.[109] Separately, Gregor Mendel presented his paper, "Experiments on Plant Hybridisation" in 1865,[110] which outlined the principles of biological inheritance, serving as the basis for modern genetics.[111]
Early in the 19th century John Dalton suggested the modern atomic theory, based on Democritus's original idea of indivisible particles called atoms.[112] The laws of conservation of energy, conservation of momentum and conservation of mass suggested a highly stable universe where there could be little loss of resources. However, with the advent of the steam engine and the Industrial Revolution there was an increased understanding that not all forms of energy have the same energy quality – the ease of conversion to useful work or to another form of energy.[113] This realisation led to the development of the laws of thermodynamics, in which the free energy of the universe is seen as constantly declining: the entropy of a closed universe increases over time.[b]
The electromagnetic theory was established in the 19th century by the works of Hans Christian Ørsted, André-Marie Ampère, Michael Faraday, James Clerk Maxwell, Oliver Heaviside, and Heinrich Hertz. The new theory raised questions that could not easily be answered using Newton's framework. The discovery of X-rays inspired the discovery of radioactivity by Henri Becquerel and Marie Curie in 1896,[116] Marie Curie then became the first person to win two Nobel Prizes.[117] In the next year came the discovery of the first subatomic particle, the electron.[118]
20th century
In the first half of the century the development of antibiotics and artificial fertilisers improved human living standards globally.[119][120] Harmful environmental issues such as ozone depletion, ocean acidification, eutrophication, and climate change came to the public's attention and caused the onset of environmental studies.[121]
During this period scientific experimentation became increasingly larger in scale and funding.[122] The extensive technological innovation stimulated by World War I, World War II, and the Cold War led to competitions between global powers, such as the Space Race and nuclear arms race.[123][124] Substantial international collaborations were also made, despite armed conflicts.[125]
In the late 20th century active recruitment of women and elimination of sex discrimination greatly increased the number of women scientists, but large gender disparities remained in some fields.[126] The discovery of the cosmic microwave background in 1964[127] led to a rejection of the steady-state model of the universe in favour of the Big Bang theory of Georges Lemaître.[128]
The century saw fundamental changes within science disciplines. Evolution became a unified theory in the early 20th century when the modern synthesis reconciled Darwinian evolution with classical genetics.[129] Albert Einstein's theory of relativity and the development of quantum mechanics complemented classical mechanics, describing physics at extreme scales of length, time, and gravity.[130][131] Widespread use of integrated circuits in the last quarter of the 20th century combined with communications satellites led to a revolution in information technology and the rise of the global internet and mobile computing, including smartphones. The need for mass systematisation of long, intertwined causal chains and large amounts of data led to the rise of the fields of systems theory and computer-assisted scientific modelling.[132]
21st century
The Human Genome Project was completed in 2003 by identifying and mapping all of the genes of the human genome.[133] The first induced pluripotent human stem cells were made in 2006, allowing adult cells to be transformed into stem cells and turn into any cell type found in the body.[134] With the affirmation of the Higgs boson discovery in 2013, the last particle predicted by the Standard Model of particle physics was found.[135] In 2015, gravitational waves, predicted by general relativity a century before, were first observed.[136][137] In 2019, the international collaboration Event Horizon Telescope presented the first direct image of a black hole's accretion disc.[138]
Branches
Modern science is commonly divided into three major branches: natural science, social science, and formal science.[3] Each of these branches comprises various specialised yet overlapping scientific disciplines that often possess their own nomenclature and expertise.[139] Both natural and social sciences are empirical sciences,[140] as their knowledge is based on empirical observations and is capable of being tested for its validity by other researchers working under the same conditions.[141]
Natural
Natural science is the study of the physical world. It can be divided into two main branches: life science and physical science. These two branches may be further divided into more specialised disciplines. For example, physical science can be subdivided into physics, chemistry, astronomy, and earth science. Modern natural science is the successor to the natural philosophy that began in Ancient Greece. Galileo, Descartes, Bacon, and Newton debated the benefits of using approaches that were more mathematical and more experimental in a methodical way. Still, philosophical perspectives, conjectures, and presuppositions, often overlooked, remain necessary in natural science.[142] Systematic data collection, including discovery science, succeeded natural history, which emerged in the 16th century by describing and classifying plants, animals, minerals, and other biotic beings.[143] Today, "natural history" suggests observational descriptions aimed at popular audiences.[144]
Social
Social science is the study of human behaviour and the functioning of societies.[4][5] It has many disciplines that include, but are not limited to anthropology, economics, history, human geography, political science, psychology, and sociology.[4] In the social sciences, there are many competing theoretical perspectives, many of which are extended through competing research programmes such as the functionalists, conflict theorists, and interactionists in sociology.[4] Due to the limitations of conducting controlled experiments involving large groups of individuals or complex situations, social scientists may adopt other research methods such as the historical method, case studies, and cross-cultural studies. Moreover, if quantitative information is available, social scientists may rely on statistical approaches to better understand social relationships and processes.[4]
Formal
Formal science is an area of study that generates knowledge using formal systems.[145][146][147] A formal system is an abstract structure used for inferring theorems from axioms according to a set of rules.[148] It includes mathematics,[149][150] systems theory, and theoretical computer science. The formal sciences share similarities with the other two branches by relying on objective, careful, and systematic study of an area of knowledge. They are, however, different from the empirical sciences as they rely exclusively on deductive reasoning, without the need for empirical evidence, to verify their abstract concepts.[8][151][141] The formal sciences are therefore a priori disciplines and because of this, there is disagreement on whether they constitute a science.[6][152] Nevertheless, the formal sciences play an important role in the empirical sciences. Calculus, for example, was initially invented to understand motion in physics.[153] Natural and social sciences that rely heavily on mathematical applications include mathematical physics,[154] chemistry,[155] biology,[156] finance,[157] and economics.[158]
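The idea of inferring theorems from axioms according to a set of rules can be illustrated with a minimal sketch. This is a hypothetical toy system for illustration only, not any particular formal calculus: propositions are plain strings, and the single inference rule stands in for modus ponens (from P and "P implies Q", infer Q).

```python
# Toy formal system: axioms are given truths, and implications
# (premise, conclusion) are the rules. Repeatedly applying the
# rules closes the axiom set under inference, yielding theorems.

def derive(axioms, implications):
    """Return all propositions derivable from the axioms."""
    theorems = set(axioms)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in implications:
            if premise in theorems and conclusion not in theorems:
                theorems.add(conclusion)
                changed = True
    return theorems

axioms = {"A"}
implications = {("A", "B"), ("B", "C"), ("X", "Y")}
print(sorted(derive(axioms, implications)))  # ['A', 'B', 'C']
```

Note that "Y" is not derivable: its premise "X" is neither an axiom nor a theorem, which mirrors how a formal science validates claims purely by deduction rather than by empirical evidence.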
Applied
Applied science is the use of the scientific method and knowledge to attain practical goals and includes a broad range of disciplines such as engineering and medicine.[159][12] Engineering is the use of scientific principles to invent, design and build machines, structures and technologies.[160] Science may contribute to the development of new technologies.[161] Medicine is the practice of caring for patients by maintaining and restoring health through the prevention, diagnosis, and treatment of injury or disease.[162][163]
Basic
The applied sciences are often contrasted with the basic sciences, which are focused on advancing scientific theories and laws that explain and predict events in the natural world.[164][165]
Blue skies
Computational
[edit]Computational science applies computer simulations to science, enabling a better understanding of scientific problems than formal mathematics alone can achieve. The use of machine learning and artificial intelligence is becoming a central feature of computational contributions to science, for example in agent-based computational economics, random forests, topic modeling and various forms of prediction. However, machines alone rarely advance knowledge as they require human guidance and capacity to reason; and they can introduce bias against certain social groups or sometimes underperform against humans.[168][169]
Interdisciplinary
Interdisciplinary science involves the combination of two or more disciplines into one,[170] such as bioinformatics, a combination of biology and computer science[171] or cognitive sciences. The concept has existed since the ancient Greek period and it became popular again in the 20th century.[172]
Research
Scientific research can be labelled as either basic or applied research. Basic research is the search for knowledge and applied research is the search for solutions to practical problems using this knowledge. Most understanding comes from basic research, though sometimes applied research targets specific practical problems. This leads to technological advances that were not previously imaginable.[173]
Scientific method
Scientific research involves using the scientific method, which seeks to objectively explain the events of nature in a reproducible way.[174] Scientists usually take for granted a set of basic assumptions that are needed to justify the scientific method: there is an objective reality shared by all rational observers; this objective reality is governed by natural laws; and these laws can be discovered by means of systematic observation and experimentation.[2] Mathematics is essential in the formation of hypotheses, theories, and laws, because it is used extensively in quantitative modelling, observing, and collecting measurements.[175] Statistics is used to summarise and analyse data, which allows scientists to assess the reliability of experimental results.[176]
In the scientific method an explanatory thought experiment or hypothesis is put forward as an explanation using parsimony principles and is expected to seek consilience – fitting with other accepted facts related to an observation or scientific question.[177] This tentative explanation is used to make falsifiable predictions, which are typically posited before being tested by experimentation. Disproof of a prediction is evidence of progress.[174]: 4–5 [178] Experimentation is especially important in science to help establish causal relationships to avoid the correlation fallacy, though in some sciences such as astronomy or geology, a predicted observation might be more appropriate.[179]
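How statistics lets a falsifiable prediction be checked against data can be sketched with an illustrative example (not drawn from the cited sources). Hypothesis: a coin is fair (p = 0.5). Prediction: extreme head counts in 100 flips should be rare. An exact two-sided binomial p-value quantifies how surprising an observation would be if the hypothesis were true.

```python
# Exact two-sided binomial test: sum the probabilities of all
# outcomes at least as unlikely as the observed head count,
# assuming the hypothesised probability of heads.
from math import comb

def binomial_p_value(k, n, p=0.5):
    """p-value for observing k heads in n flips under P(heads)=p."""
    probs = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    return sum(q for q in probs if q <= probs[k])

# 62 heads in 100 flips is improbable under fairness (p-value ~0.02),
# so the fair-coin hypothesis would typically be rejected at the 5% level.
print(round(binomial_p_value(62, 100), 3))
```

A small p-value does not prove the coin is biased; it only means the data would be surprising under the hypothesis, which is exactly the sense in which a prediction can be disproved but never finally confirmed.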
When a hypothesis proves unsatisfactory it is modified or discarded. If the hypothesis survives testing, it may become adopted into the framework of a scientific theory, a validly reasoned, self-consistent model or framework for describing the behaviour of certain natural events. A theory typically describes the behaviour of much broader sets of observations than a hypothesis; commonly, a large number of hypotheses can be logically bound together by a single theory. Thus, a theory is a hypothesis explaining various other hypotheses. In that vein, theories are formulated according to most of the same scientific principles as hypotheses. Scientists may generate a model, an attempt to describe or depict an observation in terms of a logical, physical or mathematical representation, and to generate new hypotheses that can be tested by experimentation.[180]
While performing experiments to test hypotheses, scientists may have a preference for one outcome over another.[181][182] Eliminating the bias can be achieved through transparency, careful experimental design, and a thorough peer review process of the experimental results and conclusions.[183][184] After the results of an experiment are announced or published, it is normal practice for independent researchers to double-check how the research was performed, and to follow up by performing similar experiments to determine how dependable the results might be.[185] Taken in its entirety, the scientific method allows for highly creative problem solving while minimising the effects of subjective and confirmation bias.[186] Intersubjective verifiability, the ability to reach a consensus and reproduce results, is fundamental to the creation of all scientific knowledge.[187]
Literature
Scientific research is published in a range of literature.[188] Scientific journals communicate and document the results of research carried out in universities and various other research institutions, serving as an archival record of science. The first scientific journals, Journal des sçavans followed by Philosophical Transactions, began publication in 1665. Since that time the total number of active periodicals has steadily increased. In 1981, one estimate for the number of scientific and technical journals in publication was 11,500.[189]
Most scientific journals cover a single scientific field and publish the research within that field; the research is normally expressed in the form of a scientific paper. Science has become so pervasive in modern societies that it is considered necessary to communicate the achievements, news, and ambitions of scientists to a wider population.[190]
Challenges
The replication crisis is an ongoing methodological crisis that affects parts of the social and life sciences. In subsequent investigations, the results of many scientific studies have proven unrepeatable.[191] The crisis has long-standing roots; the phrase was coined in the early 2010s[192] as part of a growing awareness of the problem. The replication crisis represents an important body of research in metascience, which aims to improve the quality of all scientific research while reducing waste.[193]
An area of study or speculation that masquerades as science in an attempt to claim legitimacy that it would not otherwise be able to achieve is sometimes referred to as pseudoscience, fringe science, or junk science.[194][195] Physicist Richard Feynman coined the term "cargo cult science" for cases in which researchers believe they are doing science, and at a glance appear to be, but lack the honesty to allow their results to be rigorously evaluated.[196] Various types of commercial advertising, ranging from hype to fraud, may fall into these categories. Science has been described as "the most important tool" for separating valid claims from invalid ones.[197]
There can also be an element of political or ideological bias on all sides of scientific debates. Sometimes, research may be characterised as "bad science": well-intended research that is incorrect, obsolete, incomplete, or an over-simplified exposition of scientific ideas. The term scientific misconduct refers to situations in which researchers have intentionally misrepresented their published data or have purposely given credit for a discovery to the wrong person.[198]
Philosophy

There are different schools of thought in the philosophy of science. The most popular position is empiricism, which holds that knowledge is created by a process involving observation; scientific theories generalise observations.[199] Empiricism generally encompasses inductivism, a position that explains how general theories can be made from the finite amount of empirical evidence available. Many versions of empiricism exist, with the predominant ones being Bayesianism and the hypothetico-deductive method.[200][199]
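Bayesian confirmation can be made concrete with Bayes' theorem, under which the prior probability of a hypothesis H is updated in light of evidence E:

```latex
% Bayes' theorem: posterior probability of hypothesis H given evidence E
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
```

On this view, evidence confirms a hypothesis to the extent that the posterior P(H|E) exceeds the prior P(H), which occurs when the evidence is more probable under the hypothesis than it is overall.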
Empiricism has stood in contrast to rationalism, the position originally associated with Descartes, which holds that knowledge is created by the human intellect, not by observation.[201] Critical rationalism is a contrasting 20th-century approach to science, first defined by Austrian-British philosopher Karl Popper. Popper rejected the way that empiricism describes the connection between theory and observation. He claimed that theories are not generated by observation, but that observation is made in the light of theories, and that the only way theory A can be affected by observation is when theory A conflicts with an observation that a rival theory B survives.[202] Popper proposed replacing verifiability with falsifiability as the landmark of scientific theories, and replacing induction with falsification as the empirical method.[202] Popper further claimed that there is actually only one universal method, not specific to science: the negative method of criticism, trial and error,[203] covering all products of the human mind, including science, mathematics, philosophy, and art.[204]
Another approach, instrumentalism, emphasises the utility of theories as instruments for explaining and predicting phenomena. It views scientific theories as black boxes, with only their input (initial conditions) and output (predictions) being relevant. Consequences, theoretical entities, and logical structure are claimed to be things that should be ignored.[205] Close to instrumentalism is constructive empiricism, according to which the main criterion for the success of a scientific theory is whether what it says about observable entities is true.[206]
Thomas Kuhn argued that the process of observation and evaluation takes place within a paradigm, a logically consistent "portrait" of the world that is consistent with observations made from its framing. He characterised normal science as the process of observation and "puzzle solving", which takes place within a paradigm, whereas revolutionary science occurs when one paradigm overtakes another in a paradigm shift.[207] Each paradigm has its own distinct questions, aims, and interpretations. The choice between paradigms involves setting two or more "portraits" against the world and deciding which likeness is most promising. A paradigm shift occurs when a significant number of observational anomalies arise in the old paradigm and a new paradigm makes sense of them. That is, the choice of a new paradigm is based on observations, even though those observations are made against the background of the old paradigm. For Kuhn, acceptance or rejection of a paradigm is a social process as much as a logical process. Kuhn's position, however, is not one of relativism.[208]
Another approach often cited in debates of scientific scepticism against controversial movements like "creation science" is methodological naturalism. Naturalists maintain that a distinction should be drawn between the natural and the supernatural, and that science should be restricted to natural explanations.[209] Methodological naturalism maintains that science requires strict adherence to empirical study and independent verification.[210]
Community
The scientific community is a network of interacting scientists who conduct scientific research. The community consists of smaller groups working in scientific fields. Through peer review, and through discussion and debate within journals and conferences, scientists maintain the quality of research methodology and objectivity when interpreting results.[211]
Scientists
Scientists are individuals who conduct scientific research to advance knowledge in an area of interest.[212][213] Scientists may exhibit a strong curiosity about reality and a desire to apply scientific knowledge for the benefit of public health, nations, the environment, or industries; other motivations include recognition by peers and prestige.[citation needed] In modern times, many scientists study within specific areas of science in academic institutions, often obtaining advanced degrees in the process.[214] Many scientists pursue careers in various fields such as academia, industry, government, and nonprofit organisations.[215][216][217]
Science has historically been a male-dominated field, with notable exceptions. Women have faced considerable discrimination in science, much as they have in other areas of male-dominated societies. For example, women were frequently passed over for job opportunities and denied credit for their work.[218] The achievements of women in science have been attributed to the defiance of their traditional role as labourers within the domestic sphere.[219]
Learned societies
Learned societies for the communication and promotion of scientific thought and experimentation have existed since the Renaissance.[220] Many scientists belong to a learned society that promotes their respective scientific discipline, profession, or group of related disciplines.[221] Membership may be open to all, may require possession of scientific credentials, or may be conferred by election.[222] Most scientific societies are nonprofit organisations,[223] and many are professional associations. Their activities typically include holding regular conferences for the presentation and discussion of new research results and publishing or sponsoring academic journals in their discipline. Some societies act as professional bodies, regulating the activities of their members in the public interest or the collective interest of the membership.
The professionalisation of science, begun in the 19th century, was partly enabled by the creation of distinguished national academies of science such as the Italian Accademia dei Lincei in 1603,[224] the British Royal Society in 1660,[225] the French Academy of Sciences in 1666,[226] the American National Academy of Sciences in 1863,[227] the German Kaiser Wilhelm Society in 1911,[228] and the Chinese Academy of Sciences in 1949.[229] International scientific organisations, such as the International Science Council, are devoted to international cooperation for science advancement.[230]
Awards
Science awards are usually given to individuals or organisations that have made significant contributions to a discipline. They are often given by prestigious institutions; thus, it is considered a great honour for a scientist to receive them. Since the early Renaissance, scientists have often been awarded medals, money, and titles. The Nobel Prize, an award widely regarded as prestigious, is awarded annually to those who have achieved scientific advances in the fields of medicine, physics, and chemistry.[231]
Society
Funding and policies
Funding of science is often through a competitive process in which potential research projects are evaluated and only the most promising receive funding. Such processes, which are run by government, corporations, or foundations, allocate scarce funds. Total research funding in most developed countries is between 1.5% and 3% of GDP.[232] In the OECD, around two-thirds of research and development in scientific and technical fields is carried out by industry, and 20% and 10%, respectively, by universities and government. The government funding proportion in certain fields is higher, and it dominates research in social science and the humanities. In less developed nations, the government provides the bulk of the funds for their basic scientific research.[233]
Many governments have dedicated agencies to support scientific research, such as the National Science Foundation in the United States,[234] the National Scientific and Technical Research Council in Argentina,[235] Commonwealth Scientific and Industrial Research Organisation in Australia,[236] National Centre for Scientific Research in France,[237] the Max Planck Society in Germany,[238] and National Research Council in Spain.[239] In commercial research and development, all but the most research-orientated corporations focus more heavily on near-term commercialisation possibilities than research driven by curiosity.[240]
Science policy is concerned with policies that affect the conduct of the scientific enterprise, including research funding, often in pursuance of other national policy goals such as technological innovation to promote commercial product development, weapons development, health care, and environmental monitoring. Science policy sometimes refers to the act of applying scientific knowledge and consensus to the development of public policies. Since public policy is concerned with the well-being of citizens, the goal of science policy is to consider how science and technology can best serve the public.[241] Public policy can directly affect the funding of capital equipment and intellectual infrastructure for industrial research by providing tax incentives to those organisations that fund research.[190]
Education and awareness
Science education for the general public is embedded in the school curriculum, and is supplemented by online pedagogical content (for example, YouTube and Khan Academy), museums, and science magazines and blogs. Major organisations of scientists such as the American Association for the Advancement of Science (AAAS) consider the sciences to be a part of the liberal arts traditions of learning, along with philosophy and history.[242] Scientific literacy is chiefly concerned with an understanding of the scientific method, units and methods of measurement, empiricism, a basic understanding of statistics (correlations, qualitative versus quantitative observations, aggregate statistics), and a basic understanding of core scientific fields such as physics, chemistry, biology, ecology, geology, and computation. As a student advances into higher stages of formal education, the curriculum becomes more in-depth. Traditional subjects usually included in the curriculum are the natural and formal sciences, although recent movements include the social and applied sciences as well.[243]
The mass media face pressures that can prevent them from accurately depicting competing scientific claims in terms of their credibility within the scientific community as a whole. Determining how much weight to give different sides in a scientific debate may require considerable expertise regarding the matter.[244] Few journalists have real scientific knowledge, and even beat reporters who are knowledgeable about certain scientific issues may be ignorant about other scientific issues that they are suddenly asked to cover.[245][246]
Science magazines such as New Scientist, Science & Vie, and Scientific American cater to the needs of a much wider readership and provide a non-technical summary of popular areas of research, including notable discoveries and advances in certain fields of research.[247] The science fiction genre, primarily speculative fiction, can transmit the ideas and methods of science to the general public.[248] Recent efforts to intensify or develop links between science and non-scientific disciplines, such as literature or poetry, include the Creative Writing Science resource developed through the Royal Literary Fund.[249]
Anti-science attitudes
While the scientific method is broadly accepted in the scientific community, some segments of society reject certain scientific positions or are sceptical about science. Examples are the common notion that COVID-19 is not a major health threat to the US (held by 39% of Americans in August 2021)[250] or the belief that climate change is not a major threat to the US (held by about 40% of Americans in late 2019 and early 2020).[251] Psychologists have pointed to four factors driving rejection of scientific results:[252]
- Scientific authorities are sometimes seen as inexpert, untrustworthy, or biased.
- Some marginalised social groups hold anti-science attitudes, in part because these groups have often been exploited in unethical experiments.[253]
- Messages from scientists may contradict deeply held existing beliefs or morals.
- The delivery of a scientific message may not be appropriately targeted to a recipient's learning style.
Anti-science attitudes often seem to be caused by fear of rejection in social groups. For instance, climate change is perceived as a threat by only 22% of Americans on the right side of the political spectrum, but by 85% on the left.[254] That is, someone on the left who did not consider climate change a threat might face contempt and rejection within that social group. Indeed, people may deny a scientifically accepted fact rather than lose or jeopardise their social status.[255]
Politics
Attitudes towards science are often determined by political opinions and goals. Government, business and advocacy groups have been known to use legal and economic pressure to influence scientific researchers. Many factors can act as facets of the politicisation of science such as anti-intellectualism, perceived threats to religious beliefs, and fear for business interests.[257] Politicisation of science is usually accomplished when scientific information is presented in a way that emphasises the uncertainty associated with the scientific evidence.[258] Tactics such as shifting conversation, failing to acknowledge facts, and capitalising on doubt of scientific consensus have been used to gain more attention for views that have been undermined by scientific evidence.[259] Examples of issues that have involved the politicisation of science include the global warming controversy, health effects of pesticides, and health effects of tobacco.[259][260]
See also
Notes
- ^ Ibn al-Haytham's Book of Optics Book I, [6.54]. pages 372 and 408 disputed Claudius Ptolemy's extramission theory of vision; "Hence, the extramission of [visual] rays is superfluous and useless". — A. Mark Smith's translation of the Latin version of Ibn al-Haytham.[77]: Book I, [6.54]. pp. 372, 408
- ^ Whether the universe is closed or open, or the shape of the universe, is an open question. The 2nd law of thermodynamics,[113]: 9 [114] and the 3rd law of thermodynamics[115] imply the heat death of the universe if the universe is a closed system, but not necessarily for an expanding universe.
References
- ^ Wilson, E. O. (1999). "The natural sciences". Consilience: The Unity of Knowledge (Reprint ed.). New York: Vintage. pp. 49–71. ISBN 978-0-679-76867-8.
- ^ a b Heilbron, J. L.; et al. (2003). "Preface". The Oxford Companion to the History of Modern Science. New York: Oxford University Press. pp. vii–x. ISBN 978-0-19-511229-0.
...modern science is a discovery as well as an invention. It was a discovery that nature generally acts regularly enough to be described by laws and even by mathematics; and required invention to devise the techniques, abstractions, apparatus, and organization for exhibiting the regularities and securing their law-like descriptions.
- ^ a b Cohen, Eliel (2021). "The boundary lens: theorising academic activity". The University and its Boundaries: Thriving or Surviving in the 21st Century. New York: Routledge. pp. 14–41. ISBN 978-0-367-56298-4. Archived from the original on 5 May 2021. Retrieved 4 May 2021.
- ^ a b c d e Colander, David C.; Hunt, Elgin F. (2019). "Social science and its methods". Social Science: An Introduction to the Study of Society (17th ed.). New York: Routledge. pp. 1–22.
- ^ a b Nisbet, Robert A.; Greenfeld, Liah (16 October 2020). "Social Science". Encyclopædia Britannica. Archived from the original on 2 February 2022. Retrieved 9 May 2021.
- ^ a b Bishop, Alan (1991). "Environmental activities and mathematical culture". Mathematical Enculturation: A Cultural Perspective on Mathematics Education. Norwell, MA: Kluwer. pp. 20–59. ISBN 978-0-7923-1270-3. Retrieved 24 March 2018.
- ^ Bunge, Mario (1998). "The Scientific Approach". Philosophy of Science: Volume 1, From Problem to Theory. Vol. 1 (revised ed.). New York: Routledge. pp. 3–50. ISBN 978-0-7658-0413-6.
- ^ a b Fetzer, James H. (2013). "Computer reliability and public policy: Limits of knowledge of computer-based systems". Computers and Cognition: Why Minds are not Machines. Newcastle, United Kingdom: Kluwer. pp. 271–308. ISBN 978-1-4438-1946-6.
- ^ Nickles, Thomas (2013). "The Problem of Demarcation". Philosophy of Pseudoscience: Reconsidering the Demarcation Problem. The University of Chicago Press. p. 104.
- ^ Fischer, M. R.; Fabry, G (2014). "Thinking and acting scientifically: Indispensable basis of medical education". GMS Zeitschrift für Medizinische Ausbildung. 31 (2): Doc24. doi:10.3205/zma000916. PMC 4027809. PMID 24872859.
- ^ Sinclair, Marius (1993). "On the Differences between the Engineering and Scientific Methods". The International Journal of Engineering Education. Archived from the original on 15 November 2017. Retrieved 7 September 2018.
- ^ a b Bunge, M. (1966). "Technology as Applied Science". In Rapp, F. (ed.). Contributions to a Philosophy of Technology. Dordrecht: Springer. pp. 19–39. doi:10.1007/978-94-010-2182-1_2. ISBN 978-94-010-2184-5. S2CID 110332727.
- ^ a b c d e f g h i j k l m n Lindberg, David C. (2007). The beginnings of Western science: the European Scientific tradition in philosophical, religious, and institutional context (2nd ed.). University of Chicago Press. ISBN 978-0-226-48205-7.
- ^ a b Grant, Edward (2007). "Ancient Egypt to Plato". A History of Natural Philosophy: From the Ancient World to the Nineteenth Century. New York: Cambridge University Press. pp. 1–26. ISBN 978-0-521-68957-1.
- ^ Building Bridges Among the BRICs Archived 18 April 2023 at the Wayback Machine, p. 125, Robert Crane, Springer, 2014
- ^ Keay, John (2000). India: A history. Atlantic Monthly Press. p. 132. ISBN 978-0-87113-800-2.
The great era of all that is deemed classical in Indian literature, art and science was now dawning. It was this crescendo of creativity and scholarship, as much as ... political achievements of the Guptas, which would make their age so golden.
- ^ Sease, Virginia; Schmidt-Brabant, Manfrid. Thinkers, Saints, Heretics: Spiritual Paths of the Middle Ages. 2007. Pages 80–81 Archived 27 August 2024 at the Wayback Machine. Retrieved 6 October 2023
- ^ Principe, Lawrence M. (2011). "Introduction". Scientific Revolution: A Very Short Introduction. New York: Oxford University Press. pp. 1–3. ISBN 978-0-19-956741-6.
- ^ Grant, Edward (2007). "Transformation of medieval natural philosophy from the early period modern period to the end of the nineteenth century". A History of Natural Philosophy: From the Ancient World to the Nineteenth Century. New York: Cambridge University Press. pp. 274–322. ISBN 978-0-521-68957-1.
- ^ Cahan, David, ed. (2003). From Natural Philosophy to the Sciences: Writing the History of Nineteenth-Century Science. University of Chicago Press. ISBN 978-0-226-08928-7.
- ^ Lightman, Bernard (2011). "13. Science and the Public". In Shank, Michael; Numbers, Ronald; Harrison, Peter (eds.). Wrestling with Nature: From Omens to Science. University of Chicago Press. p. 367. ISBN 978-0-226-31783-0.
- ^ Harrison, Peter (2015). The Territories of Science and Religion. University of Chicago Press. pp. 164–165. ISBN 978-0-226-18451-7.
The changing character of those engaged in scientific endeavors was matched by a new nomenclature for their endeavors. The most conspicuous marker of this change was the replacement of "natural philosophy" by "natural science". In 1800 few had spoken of the "natural sciences" but by 1880 this expression had overtaken the traditional label "natural philosophy". The persistence of "natural philosophy" in the twentieth century is owing largely to historical references to a past practice (see figure 11). As should now be apparent, this was not simply the substitution of one term by another, but involved the jettisoning of a range of personal qualities relating to the conduct of philosophy and the living of the philosophical life.
- ^ MacRitchie, Finlay (2011). "Introduction". Scientific Research as a Career. New York: Routledge. pp. 1–6. ISBN 978-1-4398-6965-9. Archived from the original on 5 May 2021. Retrieved 5 May 2021.
- ^ Marder, Michael P. (2011). "Curiosity and research". Research Methods for Science. New York: Cambridge University Press. pp. 1–17. ISBN 978-0-521-14584-8. Archived from the original on 5 May 2021. Retrieved 5 May 2021.
- ^ de Ridder, Jeroen (2020). "How many scientists does it take to have knowledge?". In McCain, Kevin; Kampourakis, Kostas (eds.). What is Scientific Knowledge? An Introduction to Contemporary Epistemology of Science. New York: Routledge. pp. 3–17. ISBN 978-1-138-57016-0. Archived from the original on 5 May 2021. Retrieved 5 May 2021.
- ^ Szycher, Michael (2016). "Establishing your dream team". Commercialization Secrets for Scientists and Engineers. New York: Routledge. pp. 159–176. ISBN 978-1-138-40741-1. Archived from the original on 18 August 2021. Retrieved 5 May 2021.
- ^ "Science". Merriam-Webster Online Dictionary. Archived from the original on 1 September 2019. Retrieved 16 October 2011.
- ^ Vaan, Michiel de (2008). "sciō". Etymological Dictionary of Latin and the other Italic Languages. Indo-European Etymological Dictionary. p. 545. ISBN 978-90-04-16797-1.
- ^ Cahan, David (2003). From natural philosophy to the sciences: writing the history of nineteenth-century science. University of Chicago Press. pp. 3–15. ISBN 0-226-08927-4.
- ^ Ross, Sydney (1962). "Scientist: The story of a word". Annals of Science. 18 (2): 65–85. doi:10.1080/00033796200202722.
- ^ "scientist". Oxford English Dictionary (Online ed.). Oxford University Press. (Subscription or participating institution membership required.)
- ^ Carruthers, Peter (2 May 2002). Carruthers, Peter; Stich, Stephen; Siegal, Michael (eds.). "The roots of scientific reasoning: infancy, modularity and the art of tracking". The Cognitive Basis of Science. Cambridge University Press. pp. 73–96. doi:10.1017/cbo9780511613517.005. ISBN 978-0-521-81229-0.
- ^ Lombard, Marlize; Gärdenfors, Peter (2017). "Tracking the Evolution of Causal Cognition in Humans". Journal of Anthropological Sciences. 95 (95): 219–234. doi:10.4436/JASS.95006. ISSN 1827-4765. PMID 28489015.
- ^ Graeber, David; Wengrow, David (2021). The Dawn of Everything. p. 248.
- ^ Budd, Paul; Taylor, Timothy (1995). "The Faerie Smith Meets the Bronze Industry: Magic Versus Science in the Interpretation of Prehistoric Metal-Making". World Archaeology. 27 (1): 133–143. doi:10.1080/00438243.1995.9980297. JSTOR 124782.
- ^ Tuomela, Raimo (1987). "Science, Protoscience, and Pseudoscience". In Pitt, J. C.; Pera, M. (eds.). Rational Changes in Science. Boston Studies in the Philosophy of Science. Vol. 98. Dordrecht: Springer. pp. 83–101. doi:10.1007/978-94-009-3779-6_4. ISBN 978-94-010-8181-8.
- ^ Smith, Pamela H. (2009). "Science on the Move: Recent Trends in the History of Early Modern Science". Renaissance Quarterly. 62 (2): 345–375. doi:10.1086/599864. PMID 19750597. S2CID 43643053.
- ^ Fleck, Robert (March 2021). "Fundamental Themes in Physics from the History of Art". Physics in Perspective. 23 (1): 25–48. Bibcode:2021PhP....23...25F. doi:10.1007/s00016-020-00269-7. ISSN 1422-6944. S2CID 253597172.
- ^ Scott, Colin (2011). "The Case of James Bay Cree Knowledge Construction". In Harding, Sandra (ed.). Science for the West, Myth for the Rest?. The Postcolonial Science and Technology Studies Reader. Durham, NC: Duke University Press. pp. 175–197. doi:10.2307/j.ctv11g96cc.16. ISBN 978-0-8223-4936-5. JSTOR j.ctv11g96cc.16.
- ^ Dear, Peter (2012). "Historiography of Not-So-Recent Science". History of Science. 50 (2): 197–211. doi:10.1177/007327531205000203. S2CID 141599452.
- ^ Rochberg, Francesca (2011). "Ch.1 Natural Knowledge in Ancient Mesopotamia". In Shank, Michael; Numbers, Ronald; Harrison, Peter (eds.). Wrestling with Nature: From Omens to Science. University of Chicago Press. p. 9. ISBN 978-0-226-31783-0.
- ^ Krebs, Robert E. (2004). Groundbreaking Scientific Experiments, Inventions, and Discoveries of the Middle Ages and the Renaissance. Greenwood Publishing Group. p. 127. ISBN 978-0-313-32433-8.
- ^ Erlich, Ḥaggai; Gershoni, Israel (2000). The Nile: Histories, Cultures, Myths. Lynne Rienner. pp. 80–81. ISBN 978-1-55587-672-2. Retrieved 9 January 2020.
The Nile occupied an important position in Egyptian culture; it influenced the development of mathematics, geography, and the calendar; Egyptian geometry advanced due to the practice of land measurement "because the overflow of the Nile caused the boundary of each person's land to disappear."
- ^ "Telling Time in Ancient Egypt". The Met's Heilbrunn Timeline of Art History. February 2017. Archived from the original on 3 March 2022. Retrieved 27 May 2022.
- ^ a b c McIntosh, Jane R. (2005). Ancient Mesopotamia: New Perspectives. Santa Barbara, CA: ABC-CLIO. pp. 273–276. ISBN 978-1-57607-966-9. Retrieved 20 October 2020.
- ^ Aaboe, Asger (2 May 1974). "Scientific Astronomy in Antiquity". Philosophical Transactions of the Royal Society. 276 (1257): 21–42. Bibcode:1974RSPTA.276...21A. doi:10.1098/rsta.1974.0007. JSTOR 74272. S2CID 122508567.
- ^ Biggs, R. D. (2005). "Medicine, Surgery, and Public Health in Ancient Mesopotamia". Journal of Assyrian Academic Studies. 19 (1): 7–18.
- ^ Lehoux, Daryn (2011). "2. Natural Knowledge in the Classical World". In Shank, Michael; Numbers, Ronald; Harrison, Peter (eds.). Wrestling with Nature: From Omens to Science. University of Chicago Press. p. 39. ISBN 978-0-226-31783-0.
- ^ An account of the pre-Socratic use of the concept of φύσις may be found in Naddaf, Gerard (2006). The Greek Concept of Nature. SUNY Press, and in Ducarme, Frédéric; Couvet, Denis (2020). "What does 'nature' mean?" (PDF). Palgrave Communications. 6 (14) 14. Springer Nature. doi:10.1057/s41599-020-0390-y. Archived (PDF) from the original on 16 August 2023. Retrieved 16 August 2023. The word φύσις, while first used in connection with a plant in Homer, occurs early in Greek philosophy, and in several senses. Generally, these senses match rather well the current senses in which the English word nature is used, as confirmed by Guthrie, W. K. C. Presocratic Tradition from Parmenides to Democritus (volume 2 of his History of Greek Philosophy), Cambridge University Press, 1965.
- ^ Strauss, Leo; Gildin, Hilail (1989). "Progress or Return? The Contemporary Crisis in Western Education". An Introduction to Political Philosophy: Ten Essays by Leo Strauss. Wayne State University Press. p. 209. ISBN 978-0-8143-1902-4. Retrieved 30 May 2022.
- ^ O'Grady, Patricia F. (2016). Thales of Miletus: The Beginnings of Western Science and Philosophy. New York: Routledge. p. 245. ISBN 978-0-7546-0533-1. Retrieved 20 October 2020.
- ^ a b Burkert, Walter (1 June 1972). Lore and Science in Ancient Pythagoreanism. Cambridge, MA: Harvard University Press. ISBN 978-0-674-53918-1.
- ^ Pullman, Bernard (1998). The Atom in the History of Human Thought. Oxford University Press. pp. 31–33. Bibcode:1998ahht.book.....P. ISBN 978-0-19-515040-7. Retrieved 20 October 2020.
- ^ Cohen, Henri; Lefebvre, Claire, eds. (2017). Handbook of Categorization in Cognitive Science (2nd ed.). Amsterdam: Elsevier. p. 427. ISBN 978-0-08-101107-2. Retrieved 20 October 2020.
- ^ Lucretius (fl. 1st century BCE). De rerum natura.
- ^ Margotta, Roberto (1968). The Story of Medicine. New York: Golden Press. Retrieved 18 November 2020.
- ^ Touwaide, Alain (2005). Glick, Thomas F.; Livesey, Steven; Wallis, Faith (eds.). Medieval Science, Technology, and Medicine: An Encyclopedia. New York: Routledge. p. 224. ISBN 978-0-415-96930-7. Retrieved 20 October 2020.
- ^ Leff, Samuel; Leff, Vera (1956). From Witchcraft to World Health. London: Macmillan. Retrieved 23 August 2020.
- ^ "Plato, Apology". p. 17. Archived from the original on 29 January 2018. Retrieved 1 November 2017.
- ^ "Plato, Apology". p. 27. Archived from the original on 29 January 2018. Retrieved 1 November 2017.
- ^ Aristotle. Nicomachean Ethics (H. Rackham ed.). 1139b. Archived from the original on 17 March 2012. Retrieved 22 September 2010.
- ^ a b McClellan, James E. III; Dorn, Harold (2015). Science and Technology in World History: An Introduction. Baltimore: Johns Hopkins University Press. pp. 99–100. ISBN 978-1-4214-1776-9. Retrieved 20 October 2020.
- ^ Graßhoff, Gerd (1990). The History of Ptolemy's Star Catalogue. Studies in the History of Mathematics and Physical Sciences. Vol. 14. New York: Springer. doi:10.1007/978-1-4612-4468-4. ISBN 978-1-4612-8788-9.
- ^ Hoffmann, Susanne M. (2017). Hipparchs Himmelsglobus (in German). Wiesbaden: Springer Fachmedien Wiesbaden. Bibcode:2017hihi.book.....H. doi:10.1007/978-3-658-18683-8. ISBN 978-3-658-18682-1.
- ^ Edwards, C. H. Jr. (1979). The Historical Development of the Calculus. New York: Springer. p. 75. ISBN 978-0-387-94313-8. Retrieved 20 October 2020.
- ^ Lawson, Russell M. (2004). Science in the Ancient World: An Encyclopedia. Santa Barbara, CA: ABC-CLIO. pp. 190–191. ISBN 978-1-85109-539-1. Retrieved 20 October 2020.
- ^ Murphy, Trevor Morgan (2004). Pliny the Elder's Natural History: The Empire in the Encyclopedia. Oxford University Press. p. 1. ISBN 978-0-19-926288-5. Retrieved 20 October 2020.
- ^ Doody, Aude (2010). Pliny's Encyclopedia: The Reception of the Natural History. Cambridge University Press. p. 1. ISBN 978-1-139-48453-4. Retrieved 20 October 2020.
- ^ Conner, Clifford D. (2005). A People's History of Science: Miners, Midwives, and "Low Mechanicks". New York: Nation Books. pp. 72–74. ISBN 1-56025-748-2.
- ^ Grant, Edward (1996). The Foundations of Modern Science in the Middle Ages: Their Religious, Institutional and Intellectual Contexts. Cambridge Studies in the History of Science. Cambridge University Press. pp. 7–17. ISBN 978-0-521-56762-6. Retrieved 9 November 2018.
- ^ Wildberg, Christian (1 May 2018). "Philoponus". In Zalta, Edward N. (ed.). Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. Archived from the original on 22 August 2019. Retrieved 1 May 2018.
- ^ Falcon, Andrea (2019). "Aristotle on Causality". In Zalta, Edward (ed.). Stanford Encyclopedia of Philosophy (Spring 2019 ed.). Metaphysics Research Lab, Stanford University. Archived from the original on 9 October 2020. Retrieved 3 October 2020.
- ^ Grant, Edward (2007). "Islam and the eastward shift of Aristotelian natural philosophy". A History of Natural Philosophy: From the Ancient World to the Nineteenth Century. Cambridge University Press. pp. 62–67. ISBN 978-0-521-68957-1.
- ^ Fisher, W. B. (1968–1991). The Cambridge history of Iran. Cambridge University Press. ISBN 978-0-521-20093-6.
- ^ "Bayt al-Hikmah". Encyclopædia Britannica. Archived from the original on 4 November 2016. Retrieved 3 November 2016.
- ^ Hossein Nasr, Seyyed; Leaman, Oliver, eds. (2001). History of Islamic Philosophy. Routledge. pp. 165–167. ISBN 978-0-415-25934-7.
- ^ a b Smith, A. Mark (2001). Alhacen's Theory of Visual Perception: A Critical Edition, with English Translation and Commentary, of the First Three Books of Alhacen's De Aspectibus, the Medieval Latin Version of Ibn al-Haytham's Kitāb al-Manāẓir, 2 vols. Transactions of the American Philosophical Society. Vol. 91. Philadelphia: American Philosophical Society. ISBN 978-0-87169-914-5.
- ^ Toomer, G. J. (1964). "Reviewed work: Ibn al-Haythams Weg zur Physik, Matthias Schramm". Isis. 55 (4): 463–465. doi:10.1086/349914. JSTOR 228328. See p. 464: "Schramm sums up [Ibn Al-Haytham's] achievement in the development of scientific method.", p. 465: "Schramm has demonstrated ... beyond any dispute that Ibn al-Haytham is a major figure in the Islamic scientific tradition, particularly in the creation of experimental techniques." p. 465: "only when the influence of Ibn al-Haytham and others on the mainstream of later medieval physical writings has been seriously investigated can Schramm's claim that Ibn al-Haytham was the true founder of modern physics be evaluated."
- ^ Cohen, H. Floris (2010). "Greek nature knowledge transplanted: The Islamic world". How modern science came into the world. Four civilizations, one 17th-century breakthrough (2nd ed.). Amsterdam University Press. pp. 99–156. ISBN 978-90-8964-239-4.
- ^ Selin, Helaine, ed. (2006). Encyclopaedia of the History of Science, Technology, and Medicine in Non-Western Cultures. Springer. pp. 155–156. Bibcode:2008ehst.book.....S. ISBN 978-1-4020-4559-2.
- ^ Russell, Josiah C. (1959). "Gratian, Irnerius, and the Early Schools of Bologna". The Mississippi Quarterly. 12 (4): 168–188. JSTOR 26473232.
Perhaps even as early as 1088 (the date officially set for the founding of the University)
- ^ "St. Albertus Magnus". Encyclopædia Britannica. Archived from the original on 28 October 2017. Retrieved 27 October 2017.
- ^ Numbers, Ronald (2009). Galileo Goes to Jail and Other Myths about Science and Religion. Harvard University Press. p. 45. ISBN 978-0-674-03327-6. Archived from the original on 20 January 2021. Retrieved 27 March 2018.
- ^ a b Smith, A. Mark (1981). "Getting the Big Picture in Perspectivist Optics". Isis. 72 (4): 568–589. doi:10.1086/352843. JSTOR 231249. PMID 7040292. S2CID 27806323.
- ^ Goldstein, Bernard R. (2002). "Copernicus and the Origin of his Heliocentric System" (PDF). Journal for the History of Astronomy. 33 (3): 219–235. doi:10.1177/002182860203300301. S2CID 118351058. Archived from the original (PDF) on 12 April 2020. Retrieved 12 April 2020.
- ^ Cohen, H. Floris (2010). "Greek nature knowledge transplanted and more: Renaissance Europe". How modern science came into the world. Four civilizations, one 17th-century breakthrough (2nd ed.). Amsterdam University Press. pp. 99–156. ISBN 978-90-8964-239-4.
- ^ Koestler, Arthur (1990) [1959]. The Sleepwalkers: A History of Man's Changing Vision of the Universe. London: Penguin. p. 1. ISBN 0-14-019246-8.
- ^ van Helden, Al (1995). "Pope Urban VIII". The Galileo Project. Archived from the original on 11 November 2016. Retrieved 3 November 2016.
- ^ Gingerich, Owen (1975). "Copernicus and the Impact of Printing". Vistas in Astronomy. 17 (1): 201–218. Bibcode:1975VA.....17..201G. doi:10.1016/0083-6656(75)90061-6.
- ^ Zagorin, Perez (1998). Francis Bacon. Princeton University Press. p. 84. ISBN 978-0-691-00966-7.
- ^ Davis, Philip J.; Hersh, Reuben (1986). Descartes' Dream: The World According to Mathematics. Cambridge, MA: Harcourt Brace Jovanovich.
- ^ Gribbin, John (2002). Science: A History 1543–2001. Allen Lane. p. 241. ISBN 978-0-7139-9503-9.
Although it was just one of the many factors in the Enlightenment, the success of Newtonian physics in providing a mathematical description of an ordered world clearly played a big part in the flowering of this movement in the eighteenth century
- ^ "Gottfried Leibniz – Biography". Maths History. Archived from the original on 11 July 2017. Retrieved 2 March 2021.
- ^ Freudenthal, Gideon; McLaughlin, Peter (20 May 2009). The Social and Economic Roots of the Scientific Revolution: Texts by Boris Hessen and Henryk Grossmann. Springer. ISBN 978-1-4020-9604-4. Retrieved 25 July 2018.
- ^ Goddard Bergin, Thomas; Speake, Jennifer, eds. (1987). Encyclopedia of the Renaissance. Facts on File. ISBN 978-0-8160-1315-9.
- ^ van Horn Melton, James (2001). The Rise of the Public in Enlightenment Europe. Cambridge University Press. pp. 82–83. doi:10.1017/CBO9780511819421. ISBN 978-0-511-81942-1. Archived from the original on 20 January 2022. Retrieved 27 May 2022.
- ^ "The Scientific Revolution and the Enlightenment (1500–1780)" (PDF). Archived (PDF) from the original on 14 January 2024. Retrieved 29 January 2024.
- ^ "Scientific Revolution". Encyclopædia Britannica. Archived from the original on 18 May 2019. Retrieved 29 January 2024.
- ^ Madigan, M.; Martinko, J., eds. (2006). Brock Biology of Microorganisms (11th ed.). Prentice Hall. ISBN 978-0-13-144329-7.
- ^ Guicciardini, N. (1999). Reading the Principia: The Debate on Newton's Methods for Natural Philosophy from 1687 to 1736. New York: Cambridge University Press. ISBN 978-0-521-64066-4.
- ^ Calisher, CH (2007). "Taxonomy: what's in a name? Doesn't a rose by any other name smell as sweet?". Croatian Medical Journal. 48 (2): 268–270. PMC 2080517. PMID 17436393.
- ^ Darrigol, Olivier (2000). Electrodynamics from Ampère to Einstein. New York: Oxford University Press. ISBN 0-19-850594-9.
- ^ Olby, R. C.; Cantor, G. N.; Christie, J. R. R.; Hodge, M. J. S. (1990). Companion to the History of Modern Science. London: Routledge. p. 265.
- ^ Magnusson, Magnus (10 November 2003). "Review of James Buchan, Capital of the Mind: how Edinburgh Changed the World". New Statesman. Archived from the original on 6 June 2011. Retrieved 27 April 2014.
- ^ Swingewood, Alan (1970). "Origins of Sociology: The Case of the Scottish Enlightenment". The British Journal of Sociology. 21 (2): 164–180. doi:10.2307/588406. JSTOR 588406.
- ^ Fry, Michael (1992). Adam Smith's Legacy: His Place in the Development of Modern Economics. Paul Samuelson, Lawrence Klein, Franco Modigliani, James M. Buchanan, Maurice Allais, Theodore Schultz, Richard Stone, James Tobin, Wassily Leontief, Jan Tinbergen. Routledge. ISBN 978-0-415-06164-3.
- ^ Lightman, Bernard (2011). "13. Science and the Public". In Shank, Michael; Numbers, Ronald; Harrison, Peter (eds.). Wrestling with Nature: From Omens to Science. University of Chicago Press. p. 367. ISBN 978-0-226-31783-0.
- ^ Leahey, Thomas Hardy (2018). "The psychology of consciousness". A History of Psychology: From Antiquity to Modernity (8th ed.). New York: Routledge. pp. 219–253. ISBN 978-1-138-65242-2.
- ^ Padian, Kevin (2008). "Darwin's enduring legacy". Nature. 451 (7179): 632–634. Bibcode:2008Natur.451..632P. doi:10.1038/451632a. PMID 18256649.
- ^ Henig, Robin Marantz (2000). The monk in the garden: the lost and found genius of Gregor Mendel, the father of genetics. pp. 134–138.
- ^ Miko, Ilona (2008). "Gregor Mendel's principles of inheritance form the cornerstone of modern genetics. So just what are they?". Nature Education. 1 (1): 134. Archived from the original on 19 July 2019. Retrieved 9 May 2021.
- ^ Rocke, Alan J. (2005). "In Search of El Dorado: John Dalton and the Origins of the Atomic Theory". Social Research. 72 (1): 125–158. doi:10.1353/sor.2005.0003. JSTOR 40972005. S2CID 141350239.
- ^ a b Reichl, Linda (1980). A Modern Course in Statistical Physics. Edward Arnold. ISBN 0-7131-2789-9.
- ^ Rao, Y. V. C. (1997). Chemical Engineering Thermodynamics. Universities Press. p. 158. ISBN 978-81-7371-048-3.
- ^ Heidrich, M. (2016). "Bounded energy exchange as an alternative to the third law of thermodynamics". Annals of Physics. 373: 665–681. Bibcode:2016AnPhy.373..665H. doi:10.1016/j.aop.2016.07.031.
- ^ Mould, Richard F. (1995). A century of X-rays and radioactivity in medicine: with emphasis on photographic records of the early years (Reprint. with minor corr ed.). Bristol: Inst. of Physics Publ. p. 12. ISBN 978-0-7503-0224-1.
- ^ a b Estreicher, Tadeusz (1938). "Curie, Maria ze Skłodowskich". Polski słownik biograficzny, vol. 4 (in Polish). p. 113.
- ^ Thomson, J. J. (1897). "Cathode Rays". Philosophical Magazine. 44 (269): 293–316. doi:10.1080/14786449708621070.
- ^ Goyotte, Dolores (2017). "The Surgical Legacy of World War II. Part II: The age of antibiotics" (PDF). The Surgical Technologist. 109: 257–264. Archived (PDF) from the original on 5 May 2021. Retrieved 8 January 2021.
- ^ Erisman, Jan Willem; Sutton, M. A.; Galloway, J.; Klimont, Z.; Winiwarter, W. (October 2008). "How a century of ammonia synthesis changed the world". Nature Geoscience. 1 (10): 636–639. Bibcode:2008NatGe...1..636E. doi:10.1038/ngeo325. S2CID 94880859. Archived from the original on 23 July 2010. Retrieved 22 October 2010.
- ^ Emmett, Robert; Zelko, Frank (2014). Emmett, Rob; Zelko, Frank (eds.). "Minding the Gap: Working Across Disciplines in Environmental Studies". Environment & Society Portal. RCC Perspectives no. 2. doi:10.5282/rcc/6313. Archived from the original on 21 January 2022.
- ^ Furner, Jonathan (1 June 2003). "Little Book, Big Book: Before and After Little Science, Big Science: A Review Article, Part I". Journal of Librarianship and Information Science. 35 (2): 115–125. doi:10.1177/0961000603352006. S2CID 34844169.
- ^ Kraft, Chris; Schefter, James (2001). Flight: My Life in Mission Control. New York: Dutton. pp. 3–5. ISBN 0-525-94571-7.
- ^ Kahn, Herman (1962). Thinking about the Unthinkable. Horizon.
- ^ Shrum, Wesley (2007). Structures of scientific collaboration. Joel Genuth, Ivan Chompalov. Cambridge, MA: MIT Press. ISBN 978-0-262-28358-8.
- ^ Rosser, Sue V. (12 March 2012). Breaking into the Lab: Engineering Progress for Women in Science. New York University Press. p. 7. ISBN 978-0-8147-7645-2.
- ^ Penzias, A. A. (1979). "The origin of elements" (PDF). Science. 205 (4406). Nobel Foundation: 549–554. doi:10.1126/science.205.4406.549. PMID 17729659. Archived (PDF) from the original on 17 January 2011. Retrieved 4 October 2006.
- ^ Weinberg, S. (1972). Gravitation and Cosmology. John Wiley & Sons. pp. 464–495. ISBN 978-0-471-92567-5.
- ^ Futuyma, Douglas J.; Kirkpatrick, Mark (2017). "Chapter 1: Evolutionary Biology". Evolution (4th ed.). Sinauer. pp. 3–26. ISBN 978-1-60535-605-1.
- ^ Miller, Arthur I. (1981). Albert Einstein's special theory of relativity. Emergence (1905) and early interpretation (1905–1911). Reading: Addison–Wesley. ISBN 978-0-201-04679-3.
- ^ ter Haar, D. (1967). The Old Quantum Theory. Pergamon. p. 206. ISBN 978-0-08-012101-7.
- ^ von Bertalanffy, Ludwig (1972). "The History and Status of General Systems Theory". The Academy of Management Journal. 15 (4): 407–426. JSTOR 255139.
- ^ Naidoo, Nasheen; Pawitan, Yudi; Soong, Richie; Cooper, David N.; Ku, Chee-Seng (October 2011). "Human genetics and genomics a decade after the release of the draft sequence of the human genome". Human Genomics. 5 (6): 577–622. doi:10.1186/1479-7364-5-6-577. PMC 3525251. PMID 22155605.
- ^ Rashid, S. Tamir; Alexander, Graeme J. M. (March 2013). "Induced pluripotent stem cells: from Nobel Prizes to clinical applications". Journal of Hepatology. 58 (3): 625–629. doi:10.1016/j.jhep.2012.10.026. ISSN 1600-0641. PMID 23131523.
- ^ O'Luanaigh, C. (14 March 2013). "New results indicate that new particle is a Higgs boson" (Press release). CERN. Archived from the original on 20 October 2015. Retrieved 9 October 2013.
- ^ Abbott, B. P.; Abbott, R.; Abbott, T. D.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Afrough, M.; Agarwal, B.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allen, G.; Allocca, A.; Altin, P. A.; Amato, A.; Ananyeva, A.; Anderson, S. B.; Anderson, W. G.; Angelova, S. V.; et al. (2017). "Multi-messenger Observations of a Binary Neutron Star Merger". The Astrophysical Journal. 848 (2): L12. arXiv:1710.05833. Bibcode:2017ApJ...848L..12A. doi:10.3847/2041-8213/aa91c9. S2CID 217162243.
- ^ Cho, Adrian (2017). "Merging neutron stars generate gravitational waves and a celestial light show". Science. doi:10.1126/science.aar2149.
- ^ "Media Advisory: First Results from the Event Horizon Telescope to be Presented on April 10th". Event Horizon Telescope. 20 April 2019. Archived from the original on 20 April 2019. Retrieved 21 September 2021.
- ^ "Scientific Method: Relationships Among Scientific Paradigms". Seed Magazine. 7 March 2007. Archived from the original on 1 November 2016. Retrieved 4 November 2016.
- ^ Bunge, Mario Augusto (1998). Philosophy of Science: From Problem to Theory. Transaction. p. 24. ISBN 978-0-7658-0413-6.
- ^ a b Popper, Karl R. (2002a) [1959]. "A survey of some fundamental problems". The Logic of Scientific Discovery. New York: Routledge. pp. 3–26. ISBN 978-0-415-27844-7.
- ^ Gauch, Hugh G. Jr. (2003). "Science in perspective". Scientific Method in Practice. Cambridge University Press. pp. 21–73. ISBN 978-0-521-01708-4. Retrieved 3 September 2018.
- ^ Oglivie, Brian W. (2008). "Introduction". The Science of Describing: Natural History in Renaissance Europe (Paperback ed.). University of Chicago Press. pp. 1–24. ISBN 978-0-226-62088-6.
- ^ "Natural History". Princeton University WordNet. Archived from the original on 3 March 2012. Retrieved 21 October 2012.
- ^ "Formal Sciences: Washington and Lee University". Washington and Lee University. Archived from the original on 14 May 2021. Retrieved 14 May 2021.
A "formal science" is an area of study that uses formal systems to generate knowledge such as in Mathematics and Computer Science. Formal sciences are important subjects because all of quantitative science depends on them.
- ^ Löwe, Benedikt (2002). "The formal sciences: their scope, their foundations, and their unity". Synthese. 133 (1/2): 5–11. doi:10.1023/A:1020887832028. ISSN 0039-7857. S2CID 9272212.
- ^ Rucker, Rudy (2019). "Robots and souls". Infinity and the Mind: The Science and Philosophy of the Infinite (Reprint ed.). Princeton University Press. pp. 157–188. ISBN 978-0-691-19138-6. Archived from the original on 26 February 2021. Retrieved 11 May 2021.
- ^ "Formal system". Encyclopædia Britannica. Archived from the original on 29 April 2008. Retrieved 30 May 2022.
- ^ Tomalin, Marcus (2006). Linguistics and the Formal Sciences.
- ^ Löwe, Benedikt (2002). "The Formal Sciences: Their Scope, Their Foundations, and Their Unity". Synthese. 133 (1/2): 5–11. doi:10.1023/a:1020887832028. S2CID 9272212.
- ^ Bill, Thompson (2007). "2.4 Formal Science and Applied Mathematics". The Nature of Statistical Evidence. Lecture Notes in Statistics. Vol. 189. Springer. p. 15.
- ^ Bunge, Mario (1998). "The Scientific Approach". Philosophy of Science: Volume 1, From Problem to Theory. Vol. 1 (revised ed.). New York: Routledge. pp. 3–50. ISBN 978-0-7658-0413-6.
- ^ Mujumdar, Anshu Gupta; Singh, Tejinder (2016). "Cognitive science and the connection between physics and mathematics". In Aguirre, Anthony; Foster, Brendan (eds.). Trick or Truth?: The Mysterious Connection Between Physics and Mathematics. The Frontiers Collection. Switzerland: Springer. pp. 201–218. ISBN 978-3-319-27494-2.
- ^ "About the Journal". Journal of Mathematical Physics. Archived from the original on 3 October 2006. Retrieved 3 October 2006.
- ^ Restrepo, G. (2016). "Mathematical chemistry, a new discipline". In Scerri, E.; Fisher, G. (eds.). Essays in the philosophy of chemistry. New York: Oxford University Press. pp. 332–351. ISBN 978-0-19-049459-9. Archived from the original on 10 June 2021. Retrieved 31 May 2022.
- ^ "What is mathematical biology". Centre for Mathematical Biology, University of Bath. Archived from the original on 23 September 2018. Retrieved 7 June 2018.
- ^ Johnson, Tim (1 September 2009). "What is financial mathematics?". +Plus Magazine. Archived from the original on 8 April 2022. Retrieved 1 March 2021.
- ^ Varian, Hal (1997). "What Use Is Economic Theory?". In D'Autume, A.; Cartelier, J. (eds.). Is Economics Becoming a Hard Science?. Edward Elgar. Pre-publication. Archived 25 June 2006 at the Wayback Machine. Retrieved 1 April 2008.
- ^ Abraham, Reem Rachel (2004). "Clinically oriented physiology teaching: strategy for developing critical-thinking skills in undergraduate medical students". Advances in Physiology Education. 28 (3): 102–104. doi:10.1152/advan.00001.2004. PMID 15319191. S2CID 21610124.
- ^ "Engineering". Cambridge Dictionary. Cambridge University Press. Archived from the original on 19 August 2019. Retrieved 25 March 2021.
- ^ Brooks, Harvey (1 September 1994). "The relationship between science and technology" (PDF). Research Policy. Special Issue in Honor of Nathan Rosenberg. 23 (5): 477–486. doi:10.1016/0048-7333(94)01001-3. ISSN 0048-7333. Archived (PDF) from the original on 30 December 2022. Retrieved 14 October 2022.
- ^ Firth, John (2020). "Science in medicine: when, how, and what". Oxford textbook of medicine. Oxford University Press. ISBN 978-0-19-874669-0.
- ^ Saunders, J. (June 2000). "The practice of clinical medicine as an art and as a science". Med Humanit. 26 (1): 18–22. doi:10.1136/mh.26.1.18. PMC 1071282. PMID 12484313. S2CID 73306806.
- ^ Davis, Bernard D. (March 2000). "Limited scope of science". Microbiology and Molecular Biology Reviews. 64 (1): 1–12. doi:10.1128/MMBR.64.1.1-12.2000. PMC 98983. PMID 10704471 & "Technology" in Davis, Bernard (March 2000). "The scientist's world". Microbiology and Molecular Biology Reviews. 64 (1): 1–12. doi:10.1128/MMBR.64.1.1-12.2000. PMC 98983. PMID 10704471.
- ^ McCormick, James (2001). "Scientific medicine—fact or fiction? The contribution of science to medicine". Occasional Paper (Royal College of General Practitioners) (80): 3–6. PMC 2560978. PMID 19790950.
- ^ Bell, David (2005). Science, Technology and Culture. McGraw-Hill International. p. 33. ISBN 978-0-335-21326-9.
- ^ Henderson, Mark (19 September 2005). "Politics clouds blue-sky science". The Times. Archived from the original on 1 March 2007. Retrieved 26 April 2010.
- ^ Breznau, Nate (2022). "Integrating Computer Prediction Methods in Social Science: A Comment on Hofman et al. (2021)". Social Science Computer Review. 40 (3): 844–853. doi:10.1177/08944393211049776. S2CID 248334446. Archived from the original on 29 April 2024. Retrieved 16 August 2023.
- ^ Hofman, Jake M.; Watts, Duncan J.; Athey, Susan; Garip, Filiz; Griffiths, Thomas L.; Kleinberg, Jon; Margetts, Helen; Mullainathan, Sendhil; Salganik, Matthew J.; Vazire, Simine; Vespignani, Alessandro (July 2021). "Integrating explanation and prediction in computational social science". Nature. 595 (7866): 181–188. Bibcode:2021Natur.595..181H. doi:10.1038/s41586-021-03659-0. ISSN 1476-4687. PMID 34194044. S2CID 235697917. Archived from the original on 25 September 2021. Retrieved 25 September 2021.
- ^ Nissani, M. (1995). "Fruits, Salads, and Smoothies: A Working definition of Interdisciplinarity". The Journal of Educational Thought. 29 (2): 121–128. doi:10.55016/ojs/jet.v29i2.52385. JSTOR 23767672.
- ^ Moody, G. (2004). Digital Code of Life: How Bioinformatics is Revolutionizing Science, Medicine, and Business. John Wiley & Sons. p. vii. ISBN 978-0-471-32788-2.
- ^ Ausburg, Tanya (2006). Becoming Interdisciplinary: An Introduction to Interdisciplinary Studies (2nd ed.). New York: Kendall/Hunt Publishing.
- ^ Dawkins, Richard (10 May 2006). "To Live at All Is Miracle Enough". RichardDawkins.net. Archived from the original on 19 January 2012. Retrieved 5 February 2012.
- ^ a b di Francia, Giuliano Toraldo (1976). "The method of physics". The Investigation of the Physical World. Cambridge University Press. pp. 1–52. ISBN 978-0-521-29925-1.
The amazing point is that for the first time since the discovery of mathematics, a method has been introduced, the results of which have an intersubjective value!
- ^ Popper, Karl R. (2002e) [1959]. "The problem of the empirical basis". The Logic of Scientific Discovery. New York: Routledge. pp. 3–26. ISBN 978-0-415-27844-7.
- ^ Diggle, Peter J.; Chetwynd, Amanda G. (2011). Statistics and Scientific Method: An Introduction for Students and Researchers. Oxford University Press. pp. 1–2. ISBN 978-0-19-954318-2.
- ^ Wilson, Edward (1999). Consilience: The Unity of Knowledge. New York: Vintage. ISBN 978-0-679-76867-8.
- ^ Fara, Patricia (2009). "Decisions". Science: A Four Thousand Year History. Oxford University Press. p. 408. ISBN 978-0-19-922689-4.
- ^ Aldrich, John (1995). "Correlations Genuine and Spurious in Pearson and Yule". Statistical Science. 10 (4): 364–376. doi:10.1214/ss/1177009870. JSTOR 2246135.
- ^ Nola, Robert; Irzik, Gürol (2005). Philosophy, science, education and culture. Science & technology education library. Vol. 28. Springer. pp. 207–230. ISBN 978-1-4020-3769-6.
- ^ van Gelder, Tim (1999). ""Heads I win, tails you lose": A Foray Into the Psychology of Philosophy" (PDF). University of Melbourne. Archived from the original (PDF) on 9 April 2008. Retrieved 28 March 2008.
- ^ Pease, Craig (6 September 2006). "Chapter 23. Deliberate bias: Conflict creates bad science". Science for Business, Law and Journalism. Vermont Law School. Archived from the original on 19 June 2010.
- ^ Shatz, David (2004). Peer Review: A Critical Inquiry. Rowman & Littlefield. ISBN 978-0-7425-1434-8.
- ^ Krimsky, Sheldon (2003). Science in the Private Interest: Has the Lure of Profits Corrupted the Virtue of Biomedical Research. Rowman & Littlefield. ISBN 978-0-7425-1479-9.
- ^ Bulger, Ruth Ellen; Heitman, Elizabeth; Reiser, Stanley Joel (2002). The Ethical Dimensions of the Biological and Health Sciences (2nd ed.). Cambridge University Press. ISBN 978-0-521-00886-0.
- ^ Backer, Patricia Ryaby (29 October 2004). "What is the scientific method?". San Jose State University. Archived from the original on 8 April 2008. Retrieved 28 March 2008.
- ^ Ziman, John (1978c). "Common observation". Reliable knowledge: An exploration of the grounds for belief in science. Cambridge University Press. pp. 42–76. ISBN 978-0-521-22087-3.
- ^ Ziman, J. M. (1980). "The proliferation of scientific literature: a natural process". Science. 208 (4442): 369–371. Bibcode:1980Sci...208..369Z. doi:10.1126/science.7367863. PMID 7367863.
- ^ Subramanyam, Krishna; Subramanyam, Bhadriraju (1981). Scientific and Technical Information Resources. CRC Press. ISBN 978-0-8247-8297-9.
- ^ a b Bush, Vannevar (July 1945). "Science the Endless Frontier". National Science Foundation. Archived from the original on 7 November 2016. Retrieved 4 November 2016.
- ^ Schooler, J. W. (2014). "Metascience could rescue the 'replication crisis'". Nature. 515 (7525): 9. Bibcode:2014Natur.515....9S. doi:10.1038/515009a. PMID 25373639.
- ^ Pashler, Harold; Wagenmakers, Eric Jan (2012). "Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence?". Perspectives on Psychological Science. 7 (6): 528–530. doi:10.1177/1745691612465253. PMID 26168108. S2CID 26361121.
- ^ Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N. (2 October 2015). "Meta-research: Evaluation and Improvement of Research Methods and Practices". PLOS Biology. 13 (10): e1002264. doi:10.1371/journal.pbio.1002264. ISSN 1545-7885. PMC 4592065. PMID 26431313.
- ^ Hansson, Sven Ove (3 September 2008). "Science and Pseudoscience". In Zalta, Edward N. (ed.). Stanford Encyclopedia of Philosophy. Section 2: The "science" of pseudoscience. Archived from the original on 29 October 2021. Retrieved 28 May 2022.
- ^ Shermer, Michael (1997). Why people believe weird things: pseudoscience, superstition, and other confusions of our time. New York: W. H. Freeman & Co. p. 17. ISBN 978-0-7167-3090-3.
- ^ Feynman, Richard (1974). "Cargo Cult Science". Center for Theoretical Neuroscience. Columbia University. Archived from the original on 4 March 2005. Retrieved 4 November 2016.
- ^ Novella, Steven (2018). The Skeptics' Guide to the Universe: How to Know What's Really Real in a World Increasingly Full of Fake. Hodder & Stoughton. p. 162. ISBN 978-1-4736-9641-9.
- ^ "Coping with fraud" (PDF). The COPE Report 1999: 11–18. Archived from the original (PDF) on 28 September 2007. Retrieved 21 July 2011.
It is 10 years, to the month, since Stephen Lock ... Reproduced with kind permission of the Editor, The Lancet.
- ^ a b Godfrey-Smith, Peter (2003c). "Induction and confirmation". Theory and Reality: An Introduction to the Philosophy of Science. University of Chicago. pp. 39–56. ISBN 978-0-226-30062-7.
- ^ Godfrey-Smith, Peter (2003o). "Empiricism, naturalism, and scientific realism?". Theory and Reality: An Introduction to the Philosophy of Science. University of Chicago. pp. 219–232. ISBN 978-0-226-30062-7.
- ^ Godfrey-Smith, Peter (2003b). "Logic plus empiricism". Theory and Reality: An Introduction to the Philosophy of Science. University of Chicago. pp. 19–38. ISBN 978-0-226-30062-7.
- ^ a b Godfrey-Smith, Peter (2003d). "Popper: Conjecture and refutation". Theory and Reality: An Introduction to the Philosophy of Science. University of Chicago. pp. 57–74. ISBN 978-0-226-30062-7.
- ^ Godfrey-Smith, Peter (2003g). "Lakatos, Laudan, Feyerabend, and frameworks". Theory and Reality: An Introduction to the Philosophy of Science. University of Chicago. pp. 102–121. ISBN 978-0-226-30062-7.
- ^ Popper, Karl (1972). Objective Knowledge.
- ^ Newton-Smith, W. H. (1994). The Rationality of Science. London: Routledge. p. 30. ISBN 978-0-7100-0913-5.
- ^ Votsis, I. (2004). The Epistemological Status of Scientific Theories: An Investigation of the Structural Realist Account (PhD thesis). University of London, London School of Economics. p. 39.
- ^ Bird, Alexander (2013). "Thomas Kuhn". In Zalta, Edward N. (ed.). Stanford Encyclopedia of Philosophy. Archived from the original on 15 July 2020. Retrieved 26 October 2015.
- ^ Kuhn, Thomas S. (1970). The Structure of Scientific Revolutions (2nd ed.). University of Chicago Press. p. 206. ISBN 978-0-226-45804-5. Archived from the original on 19 October 2021. Retrieved 30 May 2022.
- ^ Godfrey-Smith, Peter (2003). "Naturalistic philosophy in theory and practice". Theory and Reality: An Introduction to the Philosophy of Science. University of Chicago. pp. 149–162. ISBN 978-0-226-30062-7.
- ^ Brugger, E. Christian (2004). "Casebeer, William D. Natural Ethical Facts: Evolution, Connectionism, and Moral Cognition". The Review of Metaphysics. 58 (2).
- ^ Kornfeld, W.; Hewitt, C. E. (1981). "The Scientific Community Metaphor" (PDF). IEEE Transactions on Systems, Man, and Cybernetics. 11 (1): 24–33. doi:10.1109/TSMC.1981.4308575. hdl:1721.1/5693. S2CID 1322857. Archived (PDF) from the original on 8 April 2016. Retrieved 26 May 2022.
- ^ "Eusocial climbers" (PDF). E. O. Wilson Foundation. Archived (PDF) from the original on 27 April 2019. Retrieved 3 September 2018.
But he's not a scientist, he's never done scientific research. My definition of a scientist is that you can complete the following sentence: 'he or she has shown that...'," Wilson says.
- ^ "Our definition of a scientist". Science Council. Archived from the original on 23 August 2019. Retrieved 7 September 2018.
A scientist is someone who systematically gathers and uses research and evidence, making a hypothesis and testing it, to gain and share understanding and knowledge.
- ^ Cyranoski, David; Gilbert, Natasha; Ledford, Heidi; Nayar, Anjali; Yahia, Mohammed (2011). "Education: The PhD factory". Nature. 472 (7343): 276–279. Bibcode:2011Natur.472..276C. doi:10.1038/472276a. PMID 21512548.
- ^ Kwok, Roberta (2017). "Flexible working: Science in the gig economy". Nature. 550 (7677): 419–421. doi:10.1038/nj7677-549a.
- ^ Woolston, Chris (2017). "Many junior scientists need to take a hard look at their job prospects". Nature. 550 (7677): 549–552. doi:10.1038/nj7677-549a.
- ^ Lee, Adrian; Dennis, Carina; Campbell, Phillip (2017). "Graduate survey: A love–hurt relationship". Nature. 550 (7677): 549–552. doi:10.1038/nj7677-549a.
- ^ Whaley, Leigh Ann (2003). Women's History as Scientists. Santa Barbara, CA: ABC-CLIO.
- ^ Spanier, Bonnie (1995). "From Molecules to Brains, Normal Science Supports Sexist Beliefs about Difference". Im/partial Science: Gender Identity in Molecular Biology. Indiana University Press. ISBN 978-0-253-20968-9.
- ^ Parrott, Jim (9 August 2007). "Chronicle for Societies Founded from 1323 to 1599". Scholarly Societies Project. Archived from the original on 6 January 2014. Retrieved 11 September 2007.
- ^ "The Environmental Studies Association of Canada – What is a Learned Society?". Archived from the original on 29 May 2013. Retrieved 10 May 2013.
- ^ "Learned societies & academies". Archived from the original on 3 June 2014. Retrieved 10 May 2013.
- ^ "Learned Societies, the key to realising an open access future?". Impact of Social Sciences. London School of Economics. 24 June 2019. Archived from the original on 5 February 2023. Retrieved 22 January 2023.
- ^ "Accademia Nazionale dei Lincei" (in Italian). 2006. Archived from the original on 28 February 2010. Retrieved 11 September 2007.
- ^ "Prince of Wales opens Royal Society's refurbished building". The Royal Society. 7 July 2004. Archived from the original on 9 April 2015. Retrieved 7 December 2009.
- ^ Meynell, G. G. "The French Academy of Sciences, 1666–91: A reassessment of the French Académie royale des sciences under Colbert (1666–83) and Louvois (1683–91)". Archived from the original on 18 January 2012. Retrieved 13 October 2011.
- ^ "Founding of the National Academy of Sciences". .nationalacademies.org. Archived from the original on 3 February 2013. Retrieved 12 March 2012.
- ^ "The founding of the Kaiser Wilhelm Society (1911)". Max-Planck-Gesellschaft. Archived from the original on 2 March 2022. Retrieved 30 May 2022.
- ^ "Introduction". Chinese Academy of Sciences. Archived from the original on 31 March 2022. Retrieved 31 May 2022.
- ^ "Two main Science Councils merge to address complex global challenges". UNESCO. 5 July 2018. Archived from the original on 12 July 2021. Retrieved 21 October 2018.
- ^ Stockton, Nick (7 October 2014). "How did the Nobel Prize become the biggest award on Earth?". Wired. Archived from the original on 19 June 2019. Retrieved 3 September 2018.
- ^ "Main Science and Technology Indicators – 2008-1" (PDF). OECD. Archived from the original (PDF) on 15 February 2010.
- ^ OECD Science, Technology and Industry Scoreboard 2015: Innovation for growth and society. OECD. 2015. p. 156. doi:10.1787/sti_scoreboard-2015-en. ISBN 978-92-64-23978-4. Archived from the original on 25 May 2022. Retrieved 28 May 2022 – via oecd-ilibrary.org.
- ^ Kevles, Daniel (1977). "The National Science Foundation and the Debate over Postwar Research Policy, 1942–1945". Isis. 68 (241): 4–26. doi:10.1086/351711. PMID 320157. S2CID 32956693.
- ^ "Argentina, National Scientific and Technological Research Council (CONICET)". International Science Council. Archived from the original on 16 May 2022. Retrieved 31 May 2022.
- ^ Innis, Michelle (17 May 2016). "Australia to Lay Off Leading Scientist on Sea Levels". The New York Times. ISSN 0362-4331. Archived from the original on 7 May 2021. Retrieved 31 May 2022.
- ^ "Le CNRS recherche 10.000 passionnés du blob". Le Figaro (in French). 20 October 2021. Archived from the original on 27 April 2022. Retrieved 31 May 2022.
- ^ Bredow, Rafaela von (18 December 2021). "How a Prestigious Scientific Organization Came Under Suspicion of Treating Women Unequally". Der Spiegel. ISSN 2195-1349. Archived from the original on 29 May 2022. Retrieved 31 May 2022.
- ^ "En espera de una "revolucionaria" noticia sobre Sagitario A*, el agujero negro supermasivo en el corazón de nuestra galaxia". ELMUNDO (in Spanish). 12 May 2022. Archived from the original on 13 May 2022. Retrieved 31 May 2022.
- ^ Fletcher, Anthony C.; Bourne, Philip E. (27 September 2012). "Ten Simple Rules To Commercialize Scientific Research". PLOS Computational Biology. 8 (9) e1002712. Bibcode:2012PLSCB...8E2712F. doi:10.1371/journal.pcbi.1002712. ISSN 1553-734X. PMC 3459878. PMID 23028299.
- ^ Marburger, John Harmen III (10 February 2015). Science policy up close. Crease, Robert P. Cambridge, MA: Harvard University Press. ISBN 978-0-674-41709-0.
- ^ Gauch, Hugh G. (2012). Scientific Method in Brief. New York: Cambridge University Press. pp. 7–10. ISBN 978-1-107-66672-6.
- ^ Benneworth, Paul; Jongbloed, Ben W. (31 July 2009). "Who matters to universities? A stakeholder perspective on humanities, arts and social sciences valorisation" (PDF). Higher Education. 59 (5): 567–588. doi:10.1007/s10734-009-9265-2. ISSN 0018-1560. Archived (PDF) from the original on 24 October 2023. Retrieved 16 August 2023.
- ^ Dickson, David (11 October 2004). "Science journalism must keep a critical edge". Science and Development Network. Archived from the original on 21 June 2010.
- ^ Mooney, Chris (November–December 2004). "Blinded By Science, How 'Balanced' Coverage Lets the Scientific Fringe Hijack Reality". Columbia Journalism Review. Vol. 43, no. 4. Archived from the original on 17 January 2010. Retrieved 20 February 2008.
- ^ McIlwaine, S.; Nguyen, D. A. (2005). "Are Journalism Students Equipped to Write About Science?". Australian Studies in Journalism. 14: 41–60. Archived from the original on 1 August 2008. Retrieved 20 February 2008.
- ^ Webb, Sarah (December 2013). "Popular science: Get the word out". Nature. 504 (7478): 177–179. doi:10.1038/nj7478-177a. PMID 24312943.
- ^ Wilde, Fran (21 January 2016). "How Do You Like Your Science Fiction? Ten Authors Weigh In On 'Hard' vs. 'Soft' SF". Tor.com. Archived from the original on 4 April 2019. Retrieved 4 April 2019.
- ^ Petrucci, Mario. "Creative Writing – Science". Archived from the original on 6 January 2009. Retrieved 27 April 2008.
- ^ Tyson, Alec; Funk, Cary; Kennedy, Brian; Johnson, Courtney (15 September 2021). "Majority in U.S. Says Public Health Benefits of COVID-19 Restrictions Worth the Costs, Even as Large Shares Also See Downsides". Pew Research Center Science & Society. Archived from the original on 9 August 2022. Retrieved 4 August 2022.
- ^ Kennedy, Brian (16 April 2020). "U.S. concern about climate change is rising, but mainly among Democrats". Pew Research Center. Archived from the original on 3 August 2022. Retrieved 4 August 2022.
- ^ Philipp-Muller, Aviva; Lee, Spike W. S.; Petty, Richard E. (26 July 2022). "Why are people antiscience, and what can we do about it?". Proceedings of the National Academy of Sciences. 119 (30) e2120755119. Bibcode:2022PNAS..11920755P. doi:10.1073/pnas.2120755119. ISSN 0027-8424. PMC 9335320. PMID 35858405.
- ^ Gauchat, Gordon William (2008). "A Test of Three Theories of Anti-Science Attitudes". Sociological Focus. 41 (4): 337–357. doi:10.1080/00380237.2008.10571338. S2CID 144645723.
- ^ Poushter, Jacob; Fagan, Moira; Gubbala, Sneha (31 August 2022). "Climate Change Remains Top Global Threat Across 19-Country Survey". Pew Research Center's Global Attitudes Project. Archived from the original on 31 August 2022. Retrieved 5 September 2022.
- ^ McRaney, David (2022). How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion. New York: Portfolio/Penguin. ISBN 978-0-593-19029-6.
- ^ McGreal, Chris (26 October 2021). "Revealed: 60% of Americans say oil firms are to blame for the climate crisis". The Guardian. Archived from the original on 26 October 2021. Source: Guardian/Vice/CCN/YouGov poll; margin of error ±4%.
- ^ Goldberg, Jeanne (2017). "The Politicization of Scientific Issues: Looking through Galileo's Lens or through the Imaginary Looking Glass". Skeptical Inquirer. 41 (5): 34–39. Archived from the original on 16 August 2018. Retrieved 16 August 2018.
- ^ Bolsen, Toby; Druckman, James N. (2015). "Counteracting the Politicization of Science". Journal of Communication. 65: 746.
- ^ a b Freudenberg, William F.; Gramling, Robert; Davidson, Debra J. (2008). "Scientific Certainty Argumentation Methods (SCAMs): Science and the Politics of Doubt" (PDF). Sociological Inquiry. 78 (1): 2–38. doi:10.1111/j.1475-682X.2008.00219.x. Archived (PDF) from the original on 26 November 2020. Retrieved 12 April 2020.
- ^ van der Linden, Sander; Leiserowitz, Anthony; Rosenthal, Seth; Maibach, Edward (2017). "Inoculating the Public against Misinformation about Climate Change" (PDF). Global Challenges. 1 (2): 1. Bibcode:2017GloCh...100008V. doi:10.1002/gch2.201600008. PMC 6607159. PMID 31565263. Archived (PDF) from the original on 4 April 2020. Retrieved 25 August 2019.
Science

Definition and Fundamentals
Core Definition and Distinctions
Science constitutes the disciplined pursuit of understanding the natural world through empirical observation, experimentation, and the development of falsifiable hypotheses that yield testable predictions. This process emphasizes reproducibility, where independent investigators can verify results under controlled conditions, and relies on inductive and deductive reasoning to generalize from specific data to broader principles. Explanations in science are constrained to mechanisms observable and measurable within the physical universe, excluding supernatural or metaphysical claims that cannot be empirically assessed.[9][10] A defining feature of science is its commitment to falsifiability, as articulated by philosopher Karl Popper in the mid-20th century: for a theory to qualify as scientific, it must entail observable consequences that could potentially refute it, rather than merely accommodating all outcomes through unfalsifiable adjustments. This criterion distinguishes science from pseudoscience, which often presents claims resembling scientific inquiry—such as astrology or certain alternative medicine practices—but evades rigorous disconfirmation by shifting explanations post hoc or prioritizing confirmatory evidence over potential refutations. Scientific progress advances via iterative cycles of hypothesis testing, where surviving scrutiny strengthens theories, whereas pseudoscientific assertions typically resist empirical challenge and lack predictive power.[11][12][13] Science further differentiates from philosophy and religion by its methodological naturalism and evidential standards: while philosophy explores conceptual foundations and ethics through logical argumentation, and religion posits truths via revelation or faith, science demands material evidence and quantitative validation, rendering it agnostic toward untestable propositions like ultimate origins or moral absolutes. 
This demarcation ensures science's self-correcting nature, as evidenced by historical paradigm shifts like the replacement of geocentric models with heliocentrism through telescopic observations contradicting prior doctrines. Yet, science's scope remains provisional; theories represent the best current approximations, subject to revision with new data, underscoring its distinction from dogmatic systems that claim infallibility.[14][9]

Etymology and Historical Terminology
The English word science derives from the Latin scientia, signifying "knowledge" or "a knowing," which stems from the verb scire, meaning "to know" or "to discern."[15] This root traces back to Proto-Indo-European origins, though its precise etymology remains uncertain, with scientia originally encompassing any assured or systematic understanding, including moral, theological, and practical domains rather than exclusively empirical inquiry.[15] The term entered Old French as science around the 12th century, denoting learning or a corpus of human knowledge, before being adopted into Middle English by the mid-14th century to describe the "state of knowing" or accumulated expertise acquired through study.[15] Historically, scientia served as the Latin translation of the Greek epistēmē, a concept central to philosophers like Plato and Aristotle, who used it to denote justified true belief or demonstrative knowledge distinct from mere opinion (doxa) or practical skill (technē).[16] In antiquity and the medieval period, what modern usage terms "science" was broadly classified under philosophia naturalis (natural philosophy), encompassing inquiries into nature's causes via reason and observation, as articulated by thinkers from Aristotle's Physica to Islamic scholars like Avicenna, who integrated Greek epistēmē with empirical methods in works like Kitab al-Shifa.[17] By the Renaissance, terms like physica or "physic" persisted in English for natural studies, reflecting Aristotelian divisions, while "natural history" described descriptive compilations of phenomena, as in Pliny the Elder's Naturalis Historia (77 CE).[18] The narrowing of "science" to its contemporary sense—systematic, empirical investigation of the physical world—occurred gradually during the Scientific Revolution, with fuller specialization by 1725 in English usage, coinciding with the exclusion of non-empirical fields like theology.[15] Practitioners were initially termed "natural philosophers" or 
"cultivators of science," but in 1833–1834, Cambridge philosopher William Whewell proposed "scientist" as a neutral descriptor analogous to "artist," replacing gendered or class-laden alternatives amid the professionalization of disciplines like chemistry and biology.[19] This shift reflected broader terminological evolution, where Greek-derived suffixes like -logia (e.g., biologia coined in 1802 by Gottfried Reinhold Treviranus) proliferated to denote specialized empirical studies, distinguishing them from speculative philosophy.[20] Earlier, medieval Latin texts often used scientia experimentalis for knowledge gained through trial, as in Roger Bacon's 13th-century advocacy for verification over authority, prefiguring modern distinctions.[21]

Historical Development
Ancient and Pre-Classical Origins
The earliest recorded precursors to scientific inquiry appeared in the agricultural civilizations of Mesopotamia and ancient Egypt around 3500–3000 BCE, where empirical observations supported practical needs like flood prediction, land measurement, and celestial tracking for calendars. In Mesopotamia, Sumerian development of cuneiform writing circa 3200 BCE enabled scribes to document systematic records of economic transactions, astronomical events, and basic computations, marking the transition from ad hoc knowledge to codified data.[22] These efforts prioritized utility over abstract theory, with mathematics focused on solving real-world problems such as dividing fields or calculating interest, reflecting causal reasoning grounded in observable patterns rather than speculative metaphysics.[23] Babylonian advancements in mathematics and astronomy, building on Sumerian foundations, flourished from approximately 2000 BCE to 539 BCE, utilizing a sexagesimal numeral system that persists in modern time and angle measurements. 
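As a modern illustration (not part of the historical record), the base-60 place-value idea behind Babylonian arithmetic, and behind today's division of hours and degrees into 60 parts, can be sketched in a few lines; the function name and list output format here are our own conveniences, not anything attested on the tablets:

```python
def to_sexagesimal(n: int) -> list[int]:
    """Split a nonnegative integer into base-60 digits, most significant first."""
    digits = []
    while n > 0:
        digits.append(n % 60)   # remainder is the current base-60 digit
        n //= 60                # shift right by one sexagesimal place
    return digits[::-1] or [0]  # reverse; an empty result means the input was 0

# 4000 seconds split into hours, minutes, seconds -- the same base-60
# grouping still used for time and angle measurements.
print(to_sexagesimal(4000))  # [1, 6, 40]
```

The same remainder-and-divide procedure works for any base; 60's many divisors (2, 3, 4, 5, 6, 10, 12, ...) are one conjectured reason the Babylonians favored it, since common fractions terminate neatly.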
Tablets from this period demonstrate proficiency in quadratic equations, geometric series, and approximations of irrational numbers like square roots, with Plimpton 322 (circa 1800 BCE) listing Pythagorean triples—pairs of integers satisfying a² + b² = c²—indicating empirical derivation of ratios through proportional reasoning rather than axiomatic proof.[24] Astronomical records, including clay tablets detailing planetary positions and lunar eclipses from as early as 1800 BCE, employed predictive algorithms based on accumulated observations, achieving accuracies sufficient for agricultural and astrological forecasting without reliance on uniform circular motion models later adopted in Greece.[25] In ancient Egypt, scientific practices similarly emphasized empirical application, with mathematics documented in papyri like the Rhind Mathematical Papyrus (circa 1650 BCE) addressing problems in arithmetic, geometry, and volume calculations essential for pyramid construction and Nile inundation surveys.
Egyptian geometers used a seked unit (run-to-rise ratio) for slope determination, solving linear equations implicitly to achieve precise alignments, as seen in the Great Pyramid of Giza (circa 2580–2560 BCE), whose base approximates a square with sides varying by less than 20 cm over 230 meters.[26] Medical knowledge, preserved in texts such as the Ebers Papyrus (circa 1550 BCE), cataloged over 700 remedies derived from trial-and-error observations of herbal effects, surgical techniques, and anatomical descriptions, prioritizing observable symptoms and outcomes over humoral theories.[27] Egyptian astronomy established a 365-day civil calendar by circa 3000 BCE, aligning solar years with Nile cycles through star observations like the heliacal rising of Sothis, demonstrating causal links between celestial periodicity and terrestrial agriculture.[28] These Mesopotamian and Egyptian contributions laid foundational techniques in quantification and pattern recognition; though intertwined with religious divination—such as Babylonian omen texts interpreting celestial events—their reliance on verifiable data and repeatable methods prefigured later scientific empiricism, distinct from purely mythical explanations prevalent in prehistoric oral traditions.[29] Early Chinese records from the Shang Dynasty (circa 1600–1046 BCE) similarly show oracle bone inscriptions tracking eclipses and calendars, but Near Eastern systems provided the most extensive preserved evidence of proto-scientific systematization before Hellenistic synthesis.[22]

Classical Antiquity and Hellenistic Advances
In Classical Antiquity, particularly from the 6th century BCE onward in Ionian Greece, thinkers began seeking naturalistic explanations for phenomena, marking a departure from mythological accounts. Thales of Miletus (c. 624–546 BCE), often regarded as the first philosopher, proposed water as the fundamental substance underlying all matter and reportedly predicted a solar eclipse in 585 BCE using geometric reasoning derived from Babylonian observations.[30] His successors, Anaximander and Anaximenes, extended this by positing the apeiron (boundless) and air as primary principles, respectively, emphasizing empirical observation and rational speculation over divine intervention.[31] Pythagoras (c. 570–495 BCE) and his school advanced mathematics as a means to uncover cosmic order, discovering the Pythagorean theorem for right triangles and linking numerical ratios to musical harmonies, which influenced later views of the universe as mathematically structured.[32] Democritus (c. 460–370 BCE) introduced atomism, theorizing that the universe consists of indivisible particles (atomos) moving in a void, a mechanistic model anticipating modern atomic theory, though it lacked experimental verification at the time.[33] Hippocrates of Kos (c. 
460–370 BCE) founded the basis of Western medicine by emphasizing clinical observation, prognosis, and natural causes of disease over supernatural ones, compiling case histories and articulating the humoral theory—positing imbalances in blood, phlegm, yellow bile, and black bile as disease origins—which guided diagnostics for centuries.[31] Aristotle (384–322 BCE) systematized knowledge across disciplines, classifying over 500 animal species based on empirical dissections and observations, developing syllogistic logic as a tool for deduction, and formulating theories of motion and causality (material, formal, efficient, final causes) that dominated natural philosophy until the Scientific Revolution.[34] The Hellenistic period, following Alexander the Great's conquests (323–31 BCE), saw scientific inquiry flourish in cosmopolitan centers like Alexandria's Mouseion, supported by royal patronage and the Library of Alexandria, which amassed vast collections for scholars.[35] Euclid (fl. c. 300 BCE) codified geometry in his Elements, presenting 13 books of theorems derived from five axioms and postulates, establishing deductive proof as the standard for mathematical rigor and influencing fields from engineering to astronomy.[31] Archimedes of Syracuse (c. 287–212 BCE) pioneered hydrostatics with his principle of buoyancy—stating that a submerged body displaces fluid equal to its weight—applied in devices like the screw pump for irrigation, and approximated π between 3 10/71 and 3 1/7 using polygonal methods, while devising levers capable of moving the Earth in principle.[36] In astronomy, Aristarchus of Samos (c. 310–230 BCE) proposed a heliocentric model with the Earth rotating daily and orbiting the Sun, estimating relative sizes but facing rejection due to inconsistencies with geocentric observations; Eratosthenes (c. 
276–194 BCE) calculated Earth's circumference at approximately 252,000 stadia (about 39,000–46,000 km, close to the modern 40,075 km) via angle measurements from shadows in Alexandria and Syene.[37] Ptolemy (c. 100–170 CE), synthesizing Hellenistic traditions, detailed a geocentric system in the Almagest using epicycles and deferents to model planetary retrograde motion with trigonometric tables, achieving predictive accuracy for eclipses and conjunctions that endured until Copernicus.[38] Advances in medicine included Herophilus (c. 335–280 BCE) and Erasistratus (c. 304–250 BCE) performing human dissections in Alexandria, identifying nerves, the brain's role in intelligence, and distinguishing arteries from veins, though vivisections on criminals raised ethical concerns, and the practice was later suppressed under Roman influence.[39]

Medieval Period and Non-Western Contributions
In Western Europe following the fall of the Roman Empire around 476 CE, scientific knowledge from antiquity was largely preserved rather than advanced, with monastic institutions serving as key repositories for copying classical texts in Latin. Figures such as Isidore of Seville (c. 560–636 CE) compiled encyclopedic works like Etymologies, synthesizing Greco-Roman learning on natural history and astronomy, while the Venerable Bede (c. 673–735 CE) contributed to computus, refining calendar calculations for Easter dating based on empirical observations of lunar cycles.[40] Despite narratives of stagnation, medieval scholars developed practical technologies, including mechanical clocks by the late 13th century and eyeglasses around 1286 CE, alongside early empirical approaches in agriculture and medicine through trial-and-error herbalism in monastic gardens.[41] Universities emerging from the 12th century, such as Bologna (1088 CE) and Paris (c. 1150 CE), fostered scholasticism, integrating Aristotelian logic with theology, though emphasis on authority over experimentation limited novel discoveries.[42] Parallel to these efforts, the Islamic world during the Golden Age (c. 8th–13th centuries) drove significant advancements by translating and expanding upon Greek, Persian, and Indian texts in centers like Baghdad's House of Wisdom, established under Caliph al-Ma'mun (r. 813–833 CE). Muhammad ibn Musa al-Khwarizmi (c. 780–850 CE) systematized algebra in Al-Kitab al-Mukhtasar fi Hisab al-Jabr wal-Muqabala (c. 820 CE), introducing methods for solving linear and quadratic equations that influenced later European mathematics.[43] In optics, Ibn al-Haytham (965–1040 CE) pioneered the scientific method through experimentation in Kitab al-Manazir (c. 1011–1021 CE), disproving emission theories of vision and describing refraction and the camera obscura, laying groundwork for perspective in art and physics.[44] Medical compendia like Ibn Sina's Canon of Medicine (c. 
1025 CE) integrated pharmacology, anatomy, and clinical trials, remaining a standard text in Europe until the 17th century.[45] These works, often building causally on preserved empirical data rather than pure speculation, were later translated into Latin via Toledo and Sicily in the 12th century, facilitating Europe's recovery of classical knowledge.[46] In medieval India, mathematical and astronomical traditions persisted from earlier Siddhanta texts, with scholars like Bhaskara II (1114–1185 CE) advancing calculus precursors in Lilavati (c. 1150 CE), including solutions to indeterminate equations and early concepts of Rolle's theorem through geometric proofs.[47] Indian astronomers refined heliocentric elements and trigonometric functions, as in the Siddhanta Shiromani, calculating planetary positions with sine tables accurate to within arcminutes, influencing Persian and Islamic computations.[48] During China's Song Dynasty (960–1279 CE), technological innovations emphasized practical engineering over theoretical abstraction, with gunpowder formulas refined for military use by the 10th century, enabling bombs, rockets, and cannons documented in texts like the Wujing Zongyao (1044 CE).[49] The magnetic compass evolved into a reliable navigational tool by the 11th century, using lodestone needles in water bowls for maritime expansion, while movable-type printing (c. 1040 CE by Bi Sheng) accelerated knowledge dissemination.[50] These developments, driven by state-sponsored empiricism in civil and naval projects, contrasted with Europe's feudal fragmentation.[51]

Scientific Revolution and Early Modern Era
The Scientific Revolution, occurring primarily between the mid-16th and late 17th centuries, represented a profound transformation in natural philosophy, shifting emphasis from qualitative Aristotelian explanations and reliance on ancient authorities to quantitative analysis, mathematical modeling, and direct empirical observation of natural phenomena.[52] This era's advancements were driven by innovations in instrumentation, such as the telescope, and a growing commitment to experimentation, laying the groundwork for modern physics and astronomy. Key developments challenged the Ptolemaic geocentric system, which posited Earth as the unmoving center of the universe surrounded by celestial spheres, in favor of evidence-based alternatives.[53] Nicolaus Copernicus initiated this shift with the 1543 publication of De revolutionibus orbium coelestium, proposing a heliocentric model in which the Sun occupied the center, with Earth and other planets orbiting it in circular paths, thereby simplifying celestial mechanics compared to the epicycle-laden geocentric framework.[54] Although Copernicus retained some circular orbits and deferred full endorsement to avoid controversy, his work provided a conceptual foundation that subsequent observers built upon through precise measurements.[54] Tycho Brahe's meticulous naked-eye observations from 1576 to 1601, including comet trajectories that pierced supposedly solid celestial spheres, supplied the data needed to refine these ideas, though Brahe himself favored a geo-heliocentric hybrid.[54] Johannes Kepler, using Brahe's data after 1601, formulated three empirical laws of planetary motion: first, orbits are ellipses with the Sun at one focus (1609); second, a line from a planet to the Sun sweeps equal areas in equal times, implying varying speeds (1609); and third, the square of a planet's orbital period is proportional to the cube of its semi-major axis (1619).[55] These laws discarded uniform circular motion, aligning theory 
with observation and enabling predictions of planetary positions with unprecedented accuracy.[55] Galileo Galilei advanced this empirical turn by improving the telescope in 1609, observing Jupiter's four moons (thus demonstrating orbiting bodies beyond Earth), the phases of Venus (consistent only with heliocentrism), and lunar craters, which refuted the Aristotelian doctrine of perfect, unchanging heavens.[53] His 1632 Dialogue Concerning the Two Chief World Systems publicly defended Copernicanism, leading to a 1633 Inquisition trial where he was convicted of heresy for asserting heliocentrism as fact rather than hypothesis, resulting in house arrest until his death in 1642.[56] Galileo's kinematic studies, including falling bodies and projectile motion, emphasized mathematics as the language of nature, prefiguring unified physical laws.[57] In biology and medicine, William Harvey demonstrated in 1628, through vivisections and quantitative measurements of blood volume, that blood circulates continuously as a closed loop pumped by the heart, overturning Galen's ancient model of ebb-and-flow tides and establishing circulation as a mechanical process verifiable by experiment.[58] Harvey's work quantified cardiac output, estimating the heart pumps about two ounces per beat, multiplying to over 500 ounces daily—far exceeding bodily blood volume—thus proving unidirectional flow.[58] Isaac Newton's Philosophiæ Naturalis Principia Mathematica (1687) synthesized these threads into a comprehensive mechanical framework, articulating three laws of motion—inertia, F = ma, and action-reaction—and the law of universal gravitation, positing that every mass attracts every other with a force proportional to the product of their masses and inversely proportional to the square of the distance between them.[59] By deriving Kepler's laws from these principles, Newton demonstrated celestial and terrestrial mechanics as governed by the same quantifiable rules, applicable from falling apples to orbiting planets, without invoking occult
qualities.[59] This causal unification, rooted in mathematical deduction from observed effects, marked a pinnacle of the era's method.[60] Methodologically, Francis Bacon's Novum Organum (1620) advocated inductive reasoning from systematic observations and experiments to generalize laws, critiquing deductive syllogisms and "idols" of the mind—biases like unexamined traditions—that distort inquiry.[61] Bacon's tables of presence, absence, and degrees aimed to eliminate variables incrementally, promoting collaborative, cumulative knowledge over isolated speculation.[61] In chemistry, Robert Boyle's corpuscular theory viewed matter as composed of minute, shape- and size-varying particles in motion, whose interactions explain properties like gas pressure; his 1662 experiments established Boyle's law (PV constant at fixed temperature), distinguishing chemical experimentation from alchemical mysticism.[62] Institutionalization accelerated progress: the Royal Society of London, founded November 28, 1660, and chartered in 1662 by Charles II, fostered empirical verification through weekly meetings, publications like Philosophical Transactions (from 1665), and rejection of untested claims, embodying Baconian ideals of organized inquiry.[63] Similar academies emerged in Paris (1666), promoting standardized methods amid Europe's intellectual networks. These developments, while facing resistance from entrenched scholasticism, propelled science toward predictive power and falsifiability, influencing the Early Modern Era's broader Enlightenment rationalism.[63]

19th-Century Industrialization of Science
The 19th century witnessed the transformation of science from an avocation of elite amateurs into a structured profession integrated with industrial and academic institutions. This shift, often termed the professionalization of science, involved the creation of dedicated research facilities, formalized training programs, and career paths dependent on institutional support rather than private patronage. Key drivers included the demands of the Industrial Revolution for technological advancements and the emulation of rigorous organizational models from emerging nation-states. By mid-century, scientific output surged, with specialized journals proliferating to disseminate findings rapidly.[64][65] Pioneering laboratories exemplified this industrialization. Justus von Liebig established a model teaching and research laboratory in chemistry at the University of Giessen in 1824, training over 1,000 students in standardized experimental methods that emphasized quantitative analysis and reproducibility. This approach influenced global chemical education and contributed to industrial applications, such as synthetic dyes and fertilizers, fostering a feedback loop between academic research and manufacturing. In Britain, the Royal Institution, founded in 1799 but expanded under Humphry Davy and Michael Faraday, hosted systematic investigations into electromagnetism, while the Cavendish Laboratory at Cambridge opened in 1874 to advance experimental physics.[66][67][68] Institutional frameworks solidified scientific practice. The term "scientist" was introduced by philosopher William Whewell in 1833 to denote full-time investigators, reflecting the era's recognition of science as a distinct vocation. 
Universities adopted the German Humboldtian ideal of research-oriented education, with the PhD degree standardizing advanced training; Johns Hopkins University in the United States, established in 1876, exemplified this by prioritizing graduate research over undergraduate instruction. Scientific societies expanded, such as the American Association for the Advancement of Science founded in 1848, which coordinated efforts and lobbied for funding. Government and industry investment grew, with Britain's Patent Office recording over 10,000 patents annually by the 1880s, many rooted in scientific principles.[64][69][63] This era's industrialization accelerated discoveries but introduced tensions, including competition for resources and the alignment of research agendas with economic priorities. Empirical methodologies refined through repeated experimentation yielded breakthroughs in thermodynamics and spectroscopy, underpinning the Second Industrial Revolution from the 1870s. However, reliance on institutional funding raised questions about independence, as private enterprises like the Pennsylvania Railroad established in-house labs by 1875 to pursue proprietary innovations. Overall, these developments scaled scientific production, making it a cornerstone of modern technological progress.[70][71][72]

20th-Century Theoretical and Experimental Revolutions
The 20th century marked transformative shifts in scientific understanding, primarily through revolutions in physics that redefined space, time, matter, and energy, with subsequent experimental validations enabling technological applications like nuclear power and semiconductors. Theoretical advancements began with Max Planck's 1900 quantum hypothesis, which posited that electromagnetic radiation is emitted and absorbed in discrete packets of energy called quanta to resolve discrepancies in blackbody radiation spectra.[73] Albert Einstein's 1905 special theory of relativity challenged classical notions by establishing that the laws of physics are the same for all non-accelerating observers and that the speed of light is constant, leading to consequences such as time dilation and mass-energy equivalence (E=mc²).[74] Einstein extended this in 1915 with general relativity, describing gravity as the curvature of spacetime caused by mass and energy, later confirmed by observations like the 1919 solar eclipse deflection of starlight.[75] Quantum mechanics emerged as a comprehensive framework in the 1920s, building on Planck's quanta and Einstein's 1905 explanation of the photoelectric effect, where light behaves as particles (photons) to eject electrons from metals. 
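The quantitative relations named in this paragraph can be stated compactly in standard modern notation (a summary, not a derivation; the work function φ in the photoelectric relation is standard physics usage rather than something stated in the text above):

```latex
% Planck (1900): a radiation quantum of frequency \nu carries energy
E = h\nu
% Einstein (1905), special relativity: mass--energy equivalence
E = mc^2
% Einstein (1905), photoelectric effect: maximum kinetic energy of an
% electron ejected from a metal with work function \phi
E_{\mathrm{kin}}^{\max} = h\nu - \phi
```

The photoelectric relation illustrates the quantum hypothesis directly: below the threshold frequency ν = φ/h no electrons are ejected regardless of the light's intensity, which classical wave theory could not explain.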
Niels Bohr's 1913 atomic model incorporated quantized electron orbits to explain hydrogen's spectral lines, bridging classical and quantum ideas by postulating stationary states and photon emission during transitions.[75] Werner Heisenberg's 1927 uncertainty principle formalized the inherent limits on simultaneously measuring a particle's position and momentum, underscoring the probabilistic nature of quantum phenomena rather than deterministic trajectories.[76] These developments culminated in matrix mechanics (Heisenberg, 1925) and wave mechanics (Schrödinger, 1926), unifying into a theory predicting atomic and subatomic behaviors with unprecedented accuracy, though interpretations like Copenhagen emphasized observer-dependent outcomes.[76] Experimental breakthroughs validated these theories and spurred further revolutions. Otto Hahn and Fritz Strassmann's 1938 discovery of nuclear fission, where uranium nuclei split upon neutron bombardment to release energy and lighter elements like barium, built on quantum insights into nuclear stability and enabled chain reactions harnessed in the 1940s Manhattan Project.[77] In biology, James Watson and Francis Crick's 1953 double-helix model of DNA elucidated genetic information storage via base pairing, integrating chemical structure with heredity and paving the way for molecular biology.[78] Geosciences underwent a paradigm shift with plate tectonics, accepted in the late 1960s after evidence from seafloor spreading, magnetic striping, and earthquake distributions showed continents drift on lithospheric plates driven by mantle convection.[79] In cosmology, Edwin Hubble's 1929 observation of galactic redshifts supported an expanding universe, bolstering Georges Lemaître's 1927 primeval atom hypothesis (later Big Bang), with decisive 1965 detection of cosmic microwave background radiation by Penzias and Wilson providing relic heat from the early universe.[80] These revolutions, grounded in empirical verification, expanded 
science's explanatory power while revealing fundamental limits, such as quantum indeterminacy and relativistic invariance.

Post-1945 Expansion and Contemporary Frontiers
The end of World War II marked a pivotal shift in the scale and organization of scientific endeavor, driven by recognition of science's wartime contributions such as the Manhattan Project and radar advancements. In 1945, Vannevar Bush's report Science, the Endless Frontier argued for sustained federal investment in basic research to maintain national security and economic prosperity, influencing the establishment of the National Science Foundation (NSF) in 1950 with an initial budget of $3.5 million, which grew to support thousands of grants annually by the 1960s.[81][82] Federal R&D funding in the United States, negligible before the war, expanded to encompass over 50% of basic research by the late 20th century, fostering national laboratories like Los Alamos and Argonne, and international collaborations such as CERN founded in 1954 for particle physics exploration.[83][84] This era saw "big science" emerge, characterized by large-scale, capital-intensive projects requiring interdisciplinary teams and substantial resources. The launch of Sputnik by the Soviet Union in 1957 prompted a surge in U.S. 
funding, leading to NASA's creation in 1958 and the Apollo program's achievement of the Moon landing in 1969, which involved over 400,000 personnel and advanced rocketry, materials science, and computing.[85] In biology, the 1953 elucidation of DNA's double helix structure by Watson, Crick, Franklin, and Wilkins laid foundations for molecular biology, culminating in the Human Genome Project's completion in 2003, which sequenced the human genome at a cost of $2.7 billion using international consortia.[86] Computing advanced from the 1947 invention of the transistor at Bell Labs to integrated circuits and, in 1969, to ARPANET, the precursor of the internet, enabling data-driven research across disciplines.[87] Particle physics progressed through accelerators like the Stanford Linear Accelerator (operational 1966) and Fermilab (1967), confirming the Standard Model's quarks and gluons by the 1970s, with the Higgs boson discovery at CERN's Large Hadron Collider in 2012 validating mass-generation mechanisms.[88] Biomedical fields expanded with penicillin's mass production post-war and recombinant DNA techniques in the 1970s, leading to biotechnology industries valued at trillions by the 2020s.[89] Contemporary frontiers encompass quantum technologies, where Google's 2019 demonstration of quantum supremacy highlighted computational potentials beyond classical limits, though scalability remains challenged by decoherence.[90] Gene editing via CRISPR-Cas9, developed in 2012, achieved FDA approval for sickle cell treatment in 2023, enabling precise genomic modifications but raising ethical concerns over germline edits.[90] In cosmology, the James Webb Space Telescope's 2021 deployment revealed early universe galaxies, probing dark matter and dark energy, which together comprise 95% of the cosmos, while fusion experiments like the National Ignition Facility's 2022 net energy gain advance sustainable power prospects.[91] Artificial intelligence, powered by deep learning frameworks since the 2010s, 
drives applications in protein folding predictions via AlphaFold (2020) and autonomous systems, yet faces scrutiny over energy demands and alignment with human values.[92] Climate research, amid debates over modeling reliability and policy influences, utilizes satellite data for tracking phenomena like Arctic ice melt, with partisan divides evident in surveys showing 90% Democratic versus 30% Republican acceptance of anthropogenic warming in the U.S.[90] These pursuits, supported by global R&D expenditures exceeding $2 trillion annually by 2020, underscore science's institutionalization but highlight tensions between empirical rigor and institutional biases in funding allocation.[93]

Scientific Method and Epistemology
Principles of Empirical Inquiry
Empirical inquiry constitutes the foundational approach in science for deriving knowledge from direct observation, measurement, and verifiable evidence rather than untested assumptions or authority. This method insists that claims about natural phenomena must be grounded in data accessible through the senses or precise instrumentation, enabling independent replication and scrutiny.[5][94] As articulated in guidelines from the National Academy of Sciences, it involves posing hypotheses that are empirically testable and designing studies capable of ruling out alternative explanations through controlled evidence collection.[95] Central to empirical inquiry is the principle of systematic observation, which requires recording phenomena according to explicit protocols that specify what data to gather, from where, and in what manner to minimize variability and ensure comparability.[96] This approach counters subjective interpretation by emphasizing quantifiable metrics—such as lengths measured to 0.1 mm precision or temperatures logged via calibrated thermometers—over anecdotal reports.[97] Repeatability serves as a cornerstone, mandating that observations yield consistent results when repeated by different investigators under identical conditions, as demonstrated in foundational experiments like Galileo's 1609 telescopic observations of Jupiter's moons, which multiple astronomers verified shortly thereafter.[98] Another key principle is objectivity through methodological controls, which seeks to isolate causal factors by varying one element while holding others constant, thereby attributing effects to specific variables rather than confounding influences.[99] For instance, in testing gravitational acceleration, dropping objects of varying masses in a vacuum eliminates air resistance as a variable, yielding a uniform 9.8 m/s² value across trials.[100] Empirical inquiry thus privileges causal realism by demanding evidence of mechanisms observable in the natural world, 
rejecting explanations reliant solely on theoretical constructs without supporting data.[101] This rigor has enabled self-correction in science, as erroneous claims—like the 18th-century phlogiston theory of combustion—succumb to contradictory empirical findings, such as Lavoisier's 1770s quantitative gas measurements revealing oxygen's role.[100] Empirical principles also incorporate skepticism toward unverified generalizations, favoring inductive reasoning that builds from specific instances to tentative laws only after extensive data accumulation.[102] Quantitative and qualitative data collection methods, such as randomized sampling in surveys yielding statistically significant p-values below 0.05, further ensure robustness against sampling errors.[103] While institutional biases in data interpretation can arise—particularly in fields influenced by prevailing ideologies—adherence to these principles, including peer review and data transparency, provides mechanisms for detection and rectification, as seen in the retraction of over 10,000 papers annually due to evidential shortcomings reported by databases like Retraction Watch since 2010.[104]

Hypothetico-Deductive Framework and Falsification
The hypothetico-deductive framework describes scientific inquiry as a process beginning with the formulation of a testable hypothesis derived from a broader theory, followed by the logical deduction of specific, observable predictions that the hypothesis entails under given conditions.[105] These predictions are then subjected to empirical testing through controlled experiments or systematic observations; if the outcomes match the predictions, the hypothesis gains tentative corroboration, whereas discrepancies lead to its rejection or modification.[106] This approach contrasts with strict inductivism, which relies on accumulating confirmatory instances to generalize theories, by emphasizing deduction from general principles to particular testable claims.[107] Central to this framework is the principle of falsification, articulated by philosopher Karl Popper in his 1934 work Logik der Forschung (published in English as The Logic of Scientific Discovery in 1959), which posits that a hypothesis or theory qualifies as scientific only if it is empirically falsifiable—meaning it prohibits certain outcomes and risks refutation by potential evidence.[11] Popper argued that confirmation through repeated positive instances cannot conclusively verify universal theories, as an infinite number of confirmations remain logically possible without proving the theory true, but a single contradictory observation suffices to falsify it, thereby demarcating science from non-scientific pursuits like metaphysics or pseudoscience.[11] For instance, Einstein's general theory of relativity advanced falsifiable predictions about light deflection during the 1919 solar eclipse, which, if unmet, would have refuted it; the observed confirmation thus provided strong but provisional support rather than irrefutable proof.[11] Falsification underscores an asymmetric logic in hypothesis testing: while failed predictions decisively undermine a theory (barring ad hoc adjustments to auxiliary 
assumptions), successful predictions merely fail to disprove it, aligning with causal realism by prioritizing mechanisms that could refute rather than affirm causal claims.[108] Popper's criterion, however, has faced critiques for oversimplifying scientific practice; the Duhem-Quine thesis holds that no experiment isolates a single hypothesis, as tests invariably involve background assumptions, allowing researchers to preserve favored theories by tweaking auxiliaries rather than abandoning the core idea.[109] Empirical studies of scientific history, such as Thomas Kuhn's analysis in The Structure of Scientific Revolutions (1962), reveal that paradigms persist amid anomalies until cumulative evidence prompts shifts, not strict single falsifications, suggesting falsification functions more as an ideal regulative principle than a literal historical descriptor.[110] Despite these limitations, the framework promotes rigorous empirical scrutiny, reducing reliance on untested authority and fostering progress through bold, refutable conjectures.[111]

Experimentation, Observation, and Verification
Experimentation in science involves the deliberate manipulation of one or more independent variables under rigorously controlled conditions to determine their causal effects on dependent variables, thereby isolating specific mechanisms from confounding influences.[112] Controlled experiments typically incorporate randomization in assigning subjects or units to treatment and control groups to minimize selection biases and ensure comparability.[113] For instance, in clinical trials, double-blinding prevents experimenter and participant expectations from skewing outcomes, as demonstrated in randomized controlled trials evaluating pharmaceutical efficacy.[114] Observation complements experimentation by systematically collecting data on phenomena without direct intervention, often through precise instrumentation such as telescopes for astronomical events or sensors in environmental monitoring.[115] This method relies on predefined protocols to record measurements objectively, reducing subjective interpretation; for example, satellite observations of Earth's climate have provided longitudinal datasets on temperature anomalies since the 1970s.[116] In fields like astronomy or ecology, where manipulation is infeasible, repeated observations across diverse conditions serve to build empirical patterns amenable to statistical analysis. Verification entails subjecting experimental or observational results to independent replication, statistical scrutiny, and cross-validation to confirm reliability and rule out artifacts. 
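Such statistical scrutiny can be sketched minimally in code. The sketch below uses a permutation test to obtain a two-sided p-value and a normal-approximation 95% confidence interval for a group mean; the treatment and control measurements are invented solely for illustration.

```python
# Minimal sketch of statistical verification: a permutation test for a
# difference in group means plus a normal-approximation confidence interval.
# The data below are hypothetical, not from any real trial.
import random
from statistics import mean, stdev, NormalDist

def perm_test_pvalue(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test: fraction of random relabelings whose
    absolute mean difference is at least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = list(a) + list(b)
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            extreme += 1
    return extreme / n_iter

def ci95(sample):
    """95% normal-approximation confidence interval for the sample mean."""
    z = NormalDist().inv_cdf(0.975)  # about 1.96
    half = z * stdev(sample) / len(sample) ** 0.5
    return mean(sample) - half, mean(sample) + half

treatment = [5.1, 4.9, 5.6, 5.8, 5.4, 5.2]  # hypothetical measurements
control = [4.6, 4.4, 4.9, 4.7, 4.5, 4.8]
p = perm_test_pvalue(treatment, control)
lo, hi = ci95(treatment)
```

A permutation test is used here instead of a parametric t-test because it makes no distributional assumptions: if group labels were arbitrary, shuffling them should rarely reproduce the observed separation.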
Key techniques include calculating p-values to assess the probability of obtaining results at least as extreme as those observed if chance alone were operating—conventionally set below 0.05 for significance—and employing confidence intervals to quantify estimate precision.[117] However, empirical evidence reveals systemic challenges: in psychology, a 2015 large-scale replication attempt succeeded in only about 36% of cases, highlighting issues like underpowered studies and selective reporting.[118] Similarly, biomedical research shows replication failure rates exceeding 50% in some domains, underscoring the need for preregistration of protocols and open data sharing to mitigate p-hacking and publication bias.[119] These verification shortcomings, often exacerbated by incentives favoring novel over replicable findings, necessitate causal realism in interpreting single-study claims, prioritizing mechanisms grounded in first principles over correlative associations alone.

Limitations, Errors, and Sources of Bias
The scientific method, while robust for empirical inquiry, is inherently limited in scope, applicable primarily to phenomena that are observable, testable, and falsifiable, thereby excluding questions of metaphysics, ethics, or the foundational presuppositions of science itself, such as the uniformity of nature or the reliability of induction.[98][120] These constraints arise because the method relies on repeatable experiments and observations, which cannot definitively prove universal generalizations or address non-empirical domains like aesthetic or normative judgments.[121] Furthermore, the problem of induction—highlighted by philosophers like David Hume—persists, as finite observations cannot logically guarantee future outcomes, rendering scientific laws probabilistic rather than certain.[98] Errors in scientific practice often stem from statistical and methodological pitfalls, including Type I errors (false positives) and Type II errors (false negatives) in hypothesis testing, exacerbated by practices like p-hacking or selective reporting.[122] The replication crisis exemplifies these issues, with empirical attempts to reproduce findings in fields like psychology yielding success rates around 40%, and surveys indicating that nearly three-quarters of biomedical researchers acknowledge a reproducibility problem as of 2024.[123][124] Such failures arise not only from measurement inaccuracies but also from underpowered studies and inadequate controls, leading to inflated effect sizes that erode cumulative knowledge when non-replicable results accumulate in the literature.[125] Sources of bias further undermine reliability, with confirmation bias prompting researchers to favor evidence aligning with preconceptions while discounting disconfirming data, a tendency documented in experimental settings where participants selectively sample supportive information.[126][127] Publication bias compounds this by disproportionately favoring positive or statistically 
significant results, as evidenced by meta-analyses showing suppressed null findings in preclinical research, which distorts meta-analytic conclusions and wastes resources on futile pursuits.[128][125] Additional vectors include selection bias in sampling, where non-representative populations skew generalizability, and funding influences, where sponsor interests—such as in pharmaceutical trials—correlate with favorable outcomes, though empirical reviews confirm modest rather than overwhelming effects from such pressures.[129][130] Institutional factors, including "publish or perish" incentives, amplify these biases by prioritizing novel over rigorous findings, particularly in environments where ideological conformity in academia may subtly favor hypotheses aligning with prevailing worldviews, though direct causal evidence for systemic distortion remains contested and requires scrutiny beyond self-reported surveys.[124][131] Despite self-corrective mechanisms like peer review, these limitations necessitate preregistration, replication mandates, and transparent reporting to mitigate distortions.[132]

Branches of Science
Natural Sciences
Natural sciences constitute the core disciplines investigating the physical universe and living systems through empirical observation, experimentation, and quantitative analysis to uncover invariant laws and causal mechanisms. These fields prioritize testable hypotheses, reproducible results, and predictive models derived from data, distinguishing them from formal sciences like mathematics, which abstractly manipulate logical structures without reference to physical reality, and from social sciences, which grapple with human actions influenced by subjective factors and less controllable variables.[133][134] The primary branches include physical sciences—physics, which delineates fundamental particles, forces, and spacetime dynamics as in the standard model encompassing quarks, leptons, and gauge bosons; chemistry, detailing atomic and molecular interactions via quantum mechanics and thermodynamics, exemplified by the periodic table organizing 118 elements by electron configuration; and astronomy, mapping cosmic structures from solar systems to galaxy clusters using spectroscopy and general relativity. Earth sciences integrate geology, probing tectonic plate movements at rates of 2-10 cm per year, oceanography, analyzing currents driving global heat distribution, and atmospheric science, modeling weather patterns through fluid dynamics equations.[135][136][137] Life sciences, centered on biology, dissect organismal processes from cellular metabolism—where ATP hydrolysis powers reactions with a free energy change of -30.5 kJ/mol—to evolutionary adaptations, as evidenced by fossil records spanning 3.5 billion years and genetic sequences revealing 99% human-chimpanzee DNA similarity. Interdisciplinary extensions like biochemistry link chemical kinetics to enzymatic catalysis rates exceeding 10^6 s^-1, while ecology quantifies population dynamics via Lotka-Volterra equations predicting predator-prey oscillations. 
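The Lotka-Volterra system cited above can be integrated numerically. The sketch below uses a simple Euler step with illustrative parameter values, not data from any real population.

```python
# Euler-step sketch of the Lotka-Volterra predator-prey model:
#   dx/dt = alpha*x - beta*x*y   (prey x)
#   dy/dt = delta*x*y - gamma*y  (predators y)
# All parameter values are illustrative, not fitted to real populations.
def lotka_volterra(prey, pred, alpha=1.0, beta=0.1, delta=0.075, gamma=1.5,
                   dt=0.001, steps=20_000):
    """Integrate the coupled equations and return the (prey, pred) trajectory."""
    history = [(prey, pred)]
    for _ in range(steps):
        dprey = (alpha * prey - beta * prey * pred) * dt
        dpred = (delta * prey * pred - gamma * pred) * dt
        prey, pred = prey + dprey, pred + dpred
        history.append((prey, pred))
    return history

traj = lotka_volterra(prey=10.0, pred=5.0)
```

With these parameters the prey population overshoots its equilibrium value (gamma/delta = 20) and the predator population lags behind it, producing the coupled oscillations the equations predict.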
These pursuits have engineered breakthroughs, including semiconductor physics enabling microchips carrying billions of transistors since the invention of the integrated circuit in the 1960s, and CRISPR gene editing, achieving targeted DNA cuts with 90% efficiency in lab settings by 2012.[133] Contemporary frontiers probe unification of gravity with quantum field theory, biological origins via abiogenesis hypotheses tested in Miller-Urey simulations yielding amino acids under primordial conditions, and planetary habitability through exoplanet detections numbering over 5,000 by NASA's Kepler and TESS missions as of 2023. Despite institutional biases potentially skewing interpretations in areas like environmental modeling, natural sciences advance via rigorous skepticism and data confrontation, fostering technologies from mRNA vaccines deployed in 2020 with efficacy rates above 90% against specific pathogens to fusion energy pursuits achieving net gain in 2022 inertial confinement experiments.[138][139]

Formal Sciences
Formal sciences comprise disciplines that analyze abstract structures and formal systems using deductive methods and logical inference, independent of empirical observation or the physical world. These fields establish truths through axiomatic foundations and proofs, yielding apodictic certainty rather than probabilistic conclusions typical of empirical inquiry. Key examples include mathematics, which explores quantities, structures, and patterns; logic, which examines principles of valid reasoning; theoretical computer science, focusing on computation, algorithms, and automata; and statistics, which formalizes methods for data inference and probability. Systems theory and decision theory also fall within this domain, modeling abstract relationships and choices under uncertainty.[140][141] In contrast to natural sciences, which test hypotheses against observable phenomena through experimentation, formal sciences operate a priori: their validity stems from internal consistency within the system, not external validation. For instance, a mathematical theorem holds regardless of real-world applicability, as long as it follows from accepted axioms like those in Euclidean geometry or Peano arithmetic. This distinction traces to philosophical roots, with formal methods enabling rigorous argumentation since antiquity—Aristotle's syllogistic logic in the 4th century BCE systematized deduction—but modern formalization accelerated in the 19th century with George Boole's 1847 work on algebraic logic and Gottlob Frege's 1879 Begriffsschrift, which introduced predicate logic. 
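As a toy illustration of this kind of a priori validity (a brute-force check, not any historical system), a propositional inference can be verified by exhausting all truth assignments: an argument is valid exactly when no assignment makes every premise true and the conclusion false.

```python
# Toy semantic-entailment checker over propositional truth tables,
# illustrating deductive validity independent of empirical observation.
from itertools import product

def entails(premises, conclusion, names):
    """True iff every truth assignment satisfying all premises
    also satisfies the conclusion (semantic entailment)."""
    for values in product([False, True], repeat=len(names)):
        env = dict(zip(names, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # found a countermodel: premises true, conclusion false
    return True

# Modus ponens: from P and (P -> Q), infer Q.
premises = [lambda e: e["P"], lambda e: (not e["P"]) or e["Q"]]
valid = entails(premises, lambda e: e["Q"], ["P", "Q"])  # True
```

By contrast, the fallacy of affirming the consequent (from Q and P -> Q, infer P) fails this check, since the assignment P false, Q true is a countermodel.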
These developments addressed foundational crises, such as paradoxes in set theory identified by Bertrand Russell in 1901, prompting axiomatic reforms by David Hilbert and others in the early 20th century.[142][143] Formal sciences also serve as foundational tools for other branches: mathematics underpins physical models in physics, as in differential equations describing motion since Isaac Newton's 1687 Principia; statistics enables hypothesis testing in biology, with methods like the chi-squared test formalized by Karl Pearson in 1900; and theoretical computer science informs algorithms in data analysis across disciplines. Computational complexity theory, a formal subfield, charts the limits of efficient solvability for problems like the traveling salesman problem, informing optimization in engineering. While debates persist on classifying formal disciplines as "sciences"—given their non-empirical nature—they integrate via hybrid applications, such as probabilistic models bridging statistics and empirical data. Formal sciences thus provide frameworks resistant to observational biases, prioritizing logical rigor over contingent evidence.[140][144]

Social and Behavioral Sciences
The social and behavioral sciences investigate human actions, societal patterns, and institutional mechanisms through systematic observation and analysis. These fields encompass psychology, which examines individual mental processes and behaviors; sociology, which analyzes group interactions and social structures; economics, which models resource distribution and decision-making under scarcity; political science, which studies power dynamics and governance systems; and anthropology, which documents cultural practices and human adaptation.[145][146] Methodologies in these disciplines blend quantitative techniques, such as randomized experiments, regression analysis, and large-scale surveys, with qualitative approaches like ethnographic fieldwork and case studies.[147][148] Economists frequently employ mathematical modeling, as in supply and demand equilibrium, to predict market outcomes based on incentives and constraints.[145] Challenges persist due to the complexity of human systems, including confounding variables, ethical limits on manipulation, and low statistical power in studies. 
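The supply-and-demand equilibrium modeling mentioned above reduces, in the linear case, to solving two equations for the price at which quantity demanded equals quantity supplied; the coefficients below are invented purely for illustration.

```python
# Linear market-clearing sketch: demand Qd = a - b*P, supply Qs = c + d*P.
# Setting Qd = Qs gives the clearing price P* = (a - c) / (b + d).
# Coefficient values are illustrative, not estimated from any real market.
def equilibrium(a, b, c, d):
    """Return the market-clearing price and quantity for a linear model."""
    price = (a - c) / (b + d)
    quantity = a - b * price  # equals c + d * price at equilibrium
    return price, quantity

p_star, q_star = equilibrium(a=100, b=2, c=10, d=3)  # P* = 18.0, Q* = 64.0
```

The comparative-statics logic follows directly: raising demand (larger a) or restricting supply (smaller c) raises the clearing price, which is how such models predict market outcomes from incentives and constraints.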
The replication crisis exemplifies these issues: a 2015 project replicated only 39% of 100 psychological experiments, while economics saw 61% success across 18 studies.[149] Sociology has been slower to address such concerns compared to psychology and economics.[150] Ideological imbalances among researchers compound these problems, with surveys showing 76% of social scientists in top universities identifying as left-wing and 16% as far-left, far exceeding conservative representation.[151] This homogeneity, at ratios exceeding 10:1 in some subfields, can skew hypothesis selection toward preferred narratives and hinder scrutiny of dissenting evidence.[152] Many theories in interpretive branches like cultural anthropology resist strict falsification, allowing flexible reinterpretations of data that evade decisive refutation, thus blurring boundaries with non-empirical speculation.[153] Despite limitations, advancements in causal inference tools, such as instrumental variables and natural experiments, have strengthened claims in economics and political science.[154] Such partisan divides, as in public perceptions of climate issues, highlight the domains these sciences address, where empirical data often reveal deep worldview cleavages.[155]

Applied and Engineering Sciences
Applied sciences involve the practical application of knowledge derived from natural and formal sciences to achieve tangible outcomes, such as developing technologies, improving processes, or solving societal challenges. This contrasts with basic research, which seeks to expand fundamental understanding without immediate utility, by emphasizing empirical validation in real-world contexts to produce usable products or methods.[156][157] Engineering sciences represent a specialized extension of applied sciences, focusing on the systematic design, analysis, construction, and optimization of structures, machines, and systems under constraints like cost, safety, and reliability. While applied sciences may prioritize adapting scientific principles to specific problems, engineering integrates these with iterative prototyping, mathematical modeling, and regulatory compliance to ensure scalable functionality, distinguishing it through its emphasis on creation and deployment rather than mere application.[158][159] Major fields within engineering sciences include:
- Civil engineering: Designs infrastructure such as bridges, roads, and water systems, addressing load-bearing capacities and environmental durability.[160]
- Mechanical engineering: Develops machinery and thermal systems, applying thermodynamics and mechanics to engines and robotics.[160]
- Electrical engineering: Focuses on power generation, electronics, and signal processing, underpinning devices from circuits to renewable grids.[160]
- Chemical engineering: Scales chemical processes for manufacturing fuels, pharmaceuticals, and materials, optimizing reaction efficiency and safety.[160]
- Biomedical engineering: Merges biology with engineering to create medical devices like prosthetics and imaging tools, enhancing diagnostics and treatments.[161]
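As one concrete instance of the mathematical modeling these fields rely on, the midspan sag of a simply supported beam under a uniform load follows the standard closed-form result delta = 5wL^4 / (384EI); the beam dimensions and load below are illustrative, not taken from any real design.

```python
# Textbook beam-deflection check: midspan deflection of a simply supported
# beam under uniform load w (N/m) is delta = 5*w*L**4 / (384*E*I).
# The numbers below are illustrative, not from a real design code.
def max_deflection(w, length, e_mod, inertia):
    """Return midspan deflection (m) for span length (m), Young's modulus
    e_mod (Pa), and second moment of area inertia (m^4)."""
    return 5 * w * length**4 / (384 * e_mod * inertia)

# Hypothetical steel beam: E = 200 GPa, I = 8e-6 m^4, 4 m span, 5 kN/m load.
delta = max_deflection(w=5_000, length=4.0, e_mod=200e9, inertia=8e-6)
# delta ≈ 0.0104 m, i.e. roughly 10 mm of sag at midspan
```

Checking such a result against a serviceability limit (often a fraction of the span, e.g. L/360) is the kind of constraint-driven calculation that distinguishes engineering design from pure application of theory.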
Research Practices and Institutions
Methodologies and Tools
Scientific methodologies in research typically adhere to an iterative empirical process involving observation of phenomena, formulation of testable hypotheses, data collection through experimentation or systematic measurement, statistical analysis, and drawing conclusions that may lead to new hypotheses.[167] This framework, often termed the scientific method, emphasizes reproducibility and falsifiability to distinguish robust findings from conjecture.[168] Experimental methodologies dominate fields amenable to control, such as physics and chemistry, where independent variables are manipulated while holding others constant to infer causality, as in randomized controlled trials that allocate subjects to treatment or control groups via random assignment to minimize selection bias.[169] Observational methodologies, conversely, rely on natural variation without intervention, prevalent in astronomy or epidemiology, where techniques like cohort studies track groups over time to identify associations, though they cannot conclusively prove causation due to confounding factors.[170] Quantitative methodologies predominate in hypothesis-driven research, employing numerical data analyzed via inferential statistics to generalize from samples to populations, including t-tests for comparing means and analysis of variance (ANOVA) for multiple groups.[171] Regression analysis models relationships between variables, such as linear regression fitting a straight line to predict outcomes like y = β0 + β1x + ε, where β coefficients quantify effect sizes.[172] Qualitative methodologies complement these by exploring contexts through interviews, thematic analysis, or grounded theory, generating hypotheses from patterns in non-numerical data, though they require triangulation with quantitative evidence to enhance validity.[173] Computational methodologies, including simulations and machine learning algorithms, model complex systems; for instance, Monte Carlo methods use random sampling to 
approximate probabilities in scenarios intractable analytically, as in particle physics for estimating event rates.[174] Research tools span physical instruments, software, and analytical frameworks tailored to disciplinary needs. Laboratory instruments like spectrophotometers measure light absorption to quantify molecular concentrations with precision up to parts per million, enabling biochemical assays since their development in the early 20th century.[175] Advanced imaging tools, such as scanning electron microscopes, provide high-resolution surface topography by scanning electron beams, achieving magnifications over 100,000x for nanomaterial characterization.[175] In fieldwork, sensors and data loggers, including GPS-enabled devices and environmental probes, automate collection of variables like temperature or seismic activity at high temporal resolution.[176] Software tools facilitate data handling and modeling; Python libraries like NumPy and SciPy perform matrix operations and optimization, while R excels in statistical computing for tasks like generalized linear models.[171] Bayesian statistical tools, implemented in packages such as Stan, incorporate prior knowledge to update posteriors via Markov chain Monte Carlo sampling, offering advantages over frequentist methods in handling uncertainty with small datasets.[177] Survey instruments, including validated questionnaires with Likert scales, standardize self-reported data in social sciences, ensuring reliability through pilot testing and Cronbach's alpha for internal consistency, typically targeting values above 0.7.[178] These tools, when calibrated and validated, underpin verifiable results, though improper use—such as ignoring multiple testing corrections in hypothesis evaluations—can inflate false positives.[171]

Peer Review, Replication, and Publication
Peer review serves as a quality control mechanism in scientific publishing, wherein independent experts in the relevant field assess submitted manuscripts for methodological soundness, originality, validity of conclusions, and overall contribution to knowledge prior to acceptance in journals.[179] The process typically involves editors selecting 2-3 reviewers, who provide confidential recommendations, though it does not formally "approve" truth but rather filters for plausibility and rigor.[180] Common variants include single-anonymized review, where reviewers know authors' identities but not vice versa, which remains predominant due to tradition; double-anonymized, concealing both parties to mitigate bias; and open review, revealing identities to promote accountability but risking reprisal.[181][182] Despite its centrality, peer review exhibits systemic limitations, including failure to consistently detect fraud, errors, or misconduct—such as data fabrication in high-profile retractions—and susceptibility to biases favoring established researchers or trendy topics over substantive merit.[183][180] Reviewers identify issues in data, methods, and results more effectively than textual plagiarism, yet the process remains overburdened, with delays averaging months and rejection rates exceeding 70% in top journals, exacerbating the "publish or perish" incentive structure that prioritizes quantity over verification.[184][185] Empirical evaluations indicate peer review enhances manuscript quality modestly but does not eliminate flawed publications, as evidenced by post-publication retractions and critiques highlighting its role in perpetuating echo chambers rather than ensuring causal validity.[186][187] Replication constitutes an independent re-execution of experiments or studies to confirm original findings, distinguishing robust effects from artifacts of chance, error, or bias, and forms a cornerstone of empirical validation in science by testing generalizability 
across contexts.[188][189] However, replication rates remain alarmingly low: in psychology, only 39% of 100 prominent studies replicated in a 2015 large-scale effort, while economics saw 61% success across 18 studies, and analogous failures in medicine undermine clinical reliability.[149][123] This "replication crisis," spanning disciplines since the 2010s, stems from underpowered original studies, selective reporting, and institutional disincentives—replications garner fewer citations and career rewards than novel claims—yielding inflated effect sizes in initial reports.[190][8] Recent reforms, including preregistration and transparency mandates, have elevated replication success to nearly 90% in compliant psychology studies, underscoring that methodological safeguards can mitigate but not erase incentive-driven distortions.[191]
Publication practices amplify these issues through biases like publication bias, where non-significant results face rejection, skewing the literature toward positive findings, and p-hacking, involving post-hoc data dredging or analysis flexibility to achieve statistical significance (p < 0.05).[192][193] Econometric analyses of submissions reveal bunching at p = 0.05 thresholds, indicating manipulation, with p-hacked results cited disproportionately despite lower replicability, eroding meta-analytic reliability and public trust.[194][195] Journals' emphasis on impact factors incentivizes sensationalism over incremental replication, though emerging open-access models and preprints bypass traditional gates, enabling faster scrutiny but risking unvetted dissemination.[123] Collectively, these elements highlight that while peer review and publication facilitate dissemination, true advancement demands rigorous replication, often sidelined by career pressures favoring apparent novelty.[7]
Funding Mechanisms and Organizational Structures
Scientific research funding derives from multiple sources, with government grants forming the primary mechanism for basic research, while industry investments dominate applied and development activities. In the United States, the federal government accounted for 41% of basic research funding in recent assessments, channeled through agencies such as the National Science Foundation (NSF) and National Institutes of Health (NIH), which disbursed billions annually via competitive grants and contracts.[196][197] Globally, total R&D expenditures reached approximately $2.5 trillion in 2022, with the business sector performing the majority—around 70-80%—driven by profit motives, whereas governments funded about 10-20% of performed R&D in countries like the US and UK.[198][199] Philanthropic foundations, such as the Gates Foundation, supplement these by targeting specific fields like global health, though their allocations can prioritize donor agendas over broad scientific inquiry.[200] The US federal R&D budget for fiscal year 2024 included proposals for $181.4 billion in requested investments across agencies, supporting both intramural and extramural research through mechanisms like research grants (R series), cooperative agreements (U series), and small business innovation research (SBIR) contracts.[201][202] These funds often flow to external performers via peer-reviewed proposals, but allocation decisions reflect policy priorities, such as national security or health crises, potentially skewing toward applied outcomes over fundamental discovery.[203] Industry funding, comprising the bulk of global R&D at nearly $940 billion in the US alone for 2023, incentivizes proprietary research with commercial potential, as seen in pharmaceutical and technology sectors.[204] Organizational structures in scientific research encompass universities, government laboratories, and private institutes, each with distinct governance models influencing productivity and focus. 
Universities, often structured hierarchically with principal investigators (PIs) leading labs under departmental oversight, emphasize academic freedom and tenure systems but face administrative burdens from grant cycles.[205][206] National laboratories, such as those under the US Department of Energy, operate as federally funded research and development centers (FFRDCs) with mission-driven mandates, employing matrix organizations that integrate disciplinary teams for large-scale projects like particle physics.[207] Private entities, including corporate R&D divisions and non-profits like the Howard Hughes Medical Institute, adopt flatter or project-based structures to accelerate innovation, though profit imperatives can limit data sharing compared to public institutions.[208] International collaborations, such as CERN's consortium model involving member states, pool resources through intergovernmental agreements, fostering specialized facilities beyond single-nation capacities.[209] These structures and funding paths interact dynamically; for instance, university researchers rely heavily on federal grants (75% of some institutions' totals), creating dependencies that may align inquiries with agency priorities rather than unfettered curiosity.[210] Critics note that funder influence—whether governmental policy alignment or corporate interests—shapes research trajectories, underscoring the need for diversified support to mitigate directional biases.[200][207]
Global Collaboration and Competition
International scientific collaboration in fields like particle physics and space exploration leverages shared infrastructure and diverse expertise to address challenges beyond national capacities. The European Organization for Nuclear Research (CERN), founded in 1954 by 12 European countries and now comprising 23 member states, exemplifies this through projects such as the Large Hadron Collider (LHC), operational since 2008, which confirmed the Higgs boson particle on July 4, 2012, via data from over 10,000 scientists worldwide.[211] Similarly, the International Space Station (ISS), assembled in orbit starting in 1998 and continuously inhabited since November 2, 2000, unites agencies from the United States (NASA), Russia (Roscosmos), Europe (ESA), Japan (JAXA), and Canada (CSA), enabling experiments in microgravity that have yielded over 3,000 investigations advancing materials science and biology.[212] These efforts distribute costs—CERN's annual budget exceeds 1.2 billion Swiss francs—and foster knowledge exchange, though they require navigating differing regulatory frameworks and intellectual property agreements.[213] Despite collaborative successes, geopolitical competition propels scientific advancement by incentivizing rapid innovation and resource allocation. 
The U.S.-Soviet space race from 1957, triggered by Sputnik 1's launch on October 4, culminated in the Apollo 11 moon landing on July 20, 1969, spurring technologies like integrated circuits and weather satellites that benefited civilian applications.[214] In contemporary terms, U.S.-China rivalry manifests in space ambitions, with NASA's Artemis program targeting lunar south pole landings by 2026 contrasting China's plans for a lunar research station by 2030, alongside competitions in artificial intelligence and quantum computing.[215] This dynamic is underscored by global R&D expenditures: in 2023, the United States invested $823 billion, narrowly surpassing China's $780 billion, while the top eight economies accounted for 82% of the world's $2.5 trillion total, highlighting concentrated efforts amid export controls, such as U.S. restrictions on advanced semiconductors to China implemented in October 2022.[216][199] Tensions between collaboration and competition introduce challenges, including data-sharing restrictions and funding dependencies exacerbated by conflicts like the 2022 Russia-Ukraine war, which strained ISS operations despite continued joint missions.[217] Benefits of cooperation—enhanced research capacity, reduced duplication, and breakthroughs from interdisciplinary input—are empirically linked to higher citation impacts for internationally co-authored papers, yet barriers such as language differences, cultural variances, and national security concerns persist, often requiring bilateral agreements to mitigate.[218] In fields like climate modeling, initiatives such as the Intergovernmental Panel on Climate Change (IPCC), involving thousands of scientists from 195 countries since 1988, demonstrate collaboration's role in synthesizing evidence, though competitive national priorities can skew participation or interpretations. 
Overall, while competition accelerates targeted progress, sustained global collaboration remains essential for existential challenges like pandemics, where frameworks like COVAX facilitated vaccine distribution but faced inequities in access.[219]
Philosophy of Science
Ontology and Epistemological Foundations
The ontology of science posits an objective reality independent of human perception, comprising entities and processes with inherent causal structures that scientific inquiry aims to uncover. This view aligns with scientific realism, which asserts that mature and successful scientific theories provide approximately true descriptions of both observable and unobservable aspects of the world, such as subatomic particles or gravitational fields.[220] The no-miracles argument supports this position: the predictive and explanatory successes of theories like quantum mechanics or general relativity would be extraordinarily improbable if they did not correspond to actual features of reality, rather than mere calculational devices.[221] In contrast, instrumentalism treats theories primarily as tools for organizing observations and generating predictions, denying commitment to the literal existence of theoretical entities, a stance historically associated with logical positivism but critiqued for undermining the depth of scientific explanation.[222] Epistemologically, science rests on empiricism, where knowledge claims derive from sensory experience, systematic observation, and controlled experimentation, rather than pure deduction or intuition. 
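The empirical revision of knowledge claims described above can be sketched as a minimal Bayesian update, in which an experimental observation shifts credence in a hypothesis. The prior and likelihood values below are invented purely for illustration:

```python
# Minimal illustration of updating credence in a hypothesis H after an
# experimental result E, via Bayes' rule. All numbers are invented.

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) from the prior P(H) and the two likelihoods."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1.0 - prior)
    return numerator / evidence

# Start agnostic (prior 0.5); observe a result the hypothesis predicts
# strongly (0.9) but rival accounts predict only weakly (0.2).
posterior = bayes_update(0.5, 0.9, 0.2)  # credence rises to about 0.82
# An independent replication of the same result raises credence further,
# though never to certainty.
posterior_after_replication = bayes_update(posterior, 0.9, 0.2)
```

Repeated observation strengthens but never proves the hypothesis: the posterior approaches, without reaching, 1.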
This foundation traces to figures like Francis Bacon, who in 1620 advocated inductive methods to build generalizations from particulars, emphasizing repeatable evidence over speculative metaphysics.[223] Yet, science integrates rationalist elements, particularly in formal sciences like mathematics, where a priori reasoning establishes theorems independent of empirical input, as seen in Euclidean geometry's axioms yielding deductive proofs.[224] The interplay resolves in a hypothetico-deductive framework: hypotheses are rationally formulated, testable predictions are logically derived from them, and these predictions are then empirically assessed, with confirmation strengthening but never proving theories due to the problem of induction highlighted by David Hume in 1748, which notes that past regularities do not logically guarantee future ones.[225] Central to scientific epistemology is falsifiability, as articulated by Karl Popper in his 1934 work Logik der Forschung, where theories gain credibility not through verification but by surviving rigorous attempts at refutation through experiment.[226] This criterion demarcates scientific claims from non-scientific ones, prioritizing causal mechanisms testable against reality over unfalsifiable assertions. Bayesian approaches further refine this by quantifying evidence through probability updates based on data likelihoods relative to priors, enabling cumulative progress despite underdetermination—where multiple theories fit observations equally well—resolved pragmatically by simplicity and predictive power.[227] Critiques from constructivist quarters, prevalent in some academic circles, portray knowledge as socially negotiated rather than discovered, but such views falter against the objective convergence of results across diverse investigators, as evidenced by replicated findings in physics from independent labs worldwide.[228]
Key Paradigms and Shifts
Thomas Kuhn defined scientific paradigms as the shared constellation of theories, methods, exemplars, and values that a scientific community accepts, guiding "normal science" where practitioners extend and refine the paradigm by solving puzzles it defines as legitimate.[229] Accumulating anomalies—empirical results incompatible with the paradigm—can precipitate a crisis, potentially culminating in a paradigm shift, wherein a rival framework gains acceptance through revolutionary change rather than incremental accumulation.[230] Kuhn posited that such shifts involve incommensurability, where old and new paradigms resist direct rational comparison due to differing conceptual frameworks, resembling perceptual gestalt changes more than objective progress toward truth.[229] This model, outlined in Kuhn's The Structure of Scientific Revolutions (1962), has faced criticism for relativism and overemphasis on extrarational factors like community sociology, potentially undermining the role of empirical evidence and logical argumentation in theory choice.[229] Karl Popper rejected Kuhn's revolutionary discontinuities, arguing instead for progress via falsification: theories advance by surviving rigorous tests that could refute them, with paradigm-like commitments tested incrementally rather than overthrown wholesale.[231] Empirical history suggests shifts often correlate with superior predictive power and explanatory scope, as new paradigms resolve anomalies while accommodating prior successes, though Kuhn's framework highlights how entrenched assumptions can delay acceptance despite evidential warrant.[232] A foundational paradigm shift occurred during the Scientific Revolution with the transition from Ptolemaic geocentrism—reliant on epicycles and equants to fit observations to an Earth-centered cosmos—to heliocentrism, initiated by Copernicus's De revolutionibus orbium coelestium (1543), which posited circular orbits around the Sun for simplicity and aesthetic appeal, 
though initially lacking dynamical explanation.[233] Galileo's 1610 telescopic discoveries of Jupiter's satellites and Venus's phases provided empirical support, undermining geocentric uniqueness, while Kepler's laws of planetary motion (1609, 1619) introduced elliptical orbits derived from Tycho Brahe's precise data (1576–1601). Newton's Philosophiæ Naturalis Principia Mathematica (1687) effected closure by deriving Kepler's laws from a universal inverse-square law of gravitation, unifying terrestrial and celestial mechanics under empirical laws verifiable by pendulum experiments and comet trajectories. In biology, Charles Darwin's On the Origin of Species (1859) instigated a shift from typological and creationist views—positing fixed species designed by divine agency—to descent with modification via natural selection, mechanistically explaining adaptive diversity through variation, heredity, overproduction, and differential survival, supported by geological uniformitarianism (Lyell, 1830–1833) and Malthusian population pressures (1798).[234] This paradigm integrated fossil records showing transitional forms (e.g., Archaeopteryx, discovered 1861) and biogeographical patterns, though Mendel's genetic mechanisms (1865, rediscovered 1900) later refined inheritance against blending inheritance assumptions. Twentieth-century physics witnessed dual shifts: Einstein's special relativity (1905) resolved the null result of the Michelson-Morley experiment (1887) by abolishing absolute space and time, predicting E=mc² verified in particle accelerators from 1932 onward; general relativity (1915) extended this gravitationally, forecasting light deflection confirmed during the 1919 solar eclipse. 
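The Newtonian closure described above, deriving Kepler's laws from the inverse-square law, can be illustrated for the circular-orbit case by a standard textbook calculation (not drawn from the cited sources): equating gravitational attraction to the required centripetal force,

```latex
\frac{G M m}{r^{2}} = \frac{m v^{2}}{r},
\qquad v = \frac{2\pi r}{T}
\quad\Longrightarrow\quad
T^{2} = \frac{4\pi^{2}}{G M}\, r^{3},
```

so the square of the orbital period scales with the cube of the orbital radius, matching Kepler's third law as found empirically for the planets.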
Concurrently, quantum mechanics supplanted classical determinism, with Planck's quantum hypothesis (1900) explaining blackbody radiation, Bohr's atomic model (1913) fitting spectral lines, and wave-particle duality formalized in Schrödinger's equation (1926) and Heisenberg's matrix mechanics (1925), accommodating anomalies like the photoelectric effect (Einstein, 1905, verified Millikan 1916). These paradigms persist due to unprecedented predictive accuracy, such as quantum electrodynamics' g-factor predictions matching experiment to 12 decimal places by 1986.[233]
Other shifts include Lavoisier's oxygen paradigm in chemistry (1777 treatise), displacing phlogiston by quantifying combustion weights and identifying elements via precise measurements, and Pasteur's germ theory (1860s swan-neck flask experiments), establishing microbes as causal agents of fermentation and disease, validated by Koch's postulates (1884) and reduced postoperative infections via antisepsis (Lister, 1867).[233] Despite Kuhnian crises, acceptance hinged on replicable experiments and quantitative consilience, underscoring causal mechanisms over narrative persuasion.[235]
Demarcation from Pseudoscience
The demarcation problem in philosophy of science concerns the challenge of establishing criteria to reliably distinguish scientific theories and practices from pseudoscience, non-science, or metaphysics.[236] This issue gained prominence in the 20th century amid efforts to clarify the rational foundations of knowledge following the logical positivist movement, which sought verifiable empirical content as a boundary but faced limitations in application.[237] Pseudoscience, by contrast, mimics scientific form—employing terminology, experiments, or claims of evidence—while systematically evading rigorous empirical scrutiny, often through unfalsifiable assertions or selective confirmation.[237] Karl Popper proposed falsifiability as a primary criterion in the 1930s, arguing that scientific theories must make bold predictions capable of being empirically tested and potentially refuted; theories that are immune to disconfirmation, such as those accommodating any outcome via ad hoc modifications, belong to pseudoscience.[238] For instance, Albert Einstein's general theory of relativity qualified as scientific because it risked falsification through observable predictions like the 1919 solar eclipse deflection of starlight, whereas Sigmund Freud's psychoanalysis and astrology failed this test by interpreting diverse behaviors or events as confirmatory regardless of specifics.[11] Popper's approach emphasized that science advances through conjecture and refutation, prioritizing error-elimination over inductive confirmation, which pseudosciences often prioritize to sustain core dogmas.[239] This criterion, while influential, drew critiques for overlooking auxiliary hypotheses that complicate outright falsification, as noted by Imre Lakatos in his framework of progressive versus degenerative research programs, where the latter resemble pseudoscience by protecting falsified predictions through endless adjustments.[237] Subsequent philosophers like Thomas Kuhn and Paul 
Feyerabend challenged strict demarcation, with Kuhn viewing scientific boundaries as paradigm-dependent and Feyerabend rejecting methodological rules altogether, suggesting "anything goes" in scientific practice.[236] Nonetheless, practical indicators persist: scientific claims demand reproducibility by independent researchers, quantitative precision testable against controls, and integration with broader empirical knowledge, whereas pseudosciences like homeopathy—positing "water memory" effects from extreme dilutions—resist replication under standardized conditions and ignore null results from rigorous trials.[240][241] Another marker is evidential indifference; pseudosciences often dismiss contradictory data as artifacts or conspiracies, lacking the self-correcting mechanisms of peer-reviewed science, such as statistical hypothesis testing with predefined significance thresholds (e.g., p < 0.05).[242] In contemporary assessments, demarcation functions less as a binary than a spectrum, informed by social processes like communal scrutiny and error-correction norms, yet core to scientific integrity is causal accountability: theories must yield novel, risky predictions explaining phenomena via mechanisms grounded in observable regularities, not vague correlations or unfalsifiable essences.[243] For example, evolutionary biology demarcates from creationism by generating testable phylogenies and genetic forecasts, such as predicting transitional fossils or molecular clocks, while intelligent design retreats to irreducible complexity claims unamenable to disproof.[12] This emphasis on empirical vulnerability underscores science's provisional yet robust status, contrasting pseudoscience's stasis amid accumulating anomalies.[244]
Critiques of Scientism and Reductionism
Critiques of scientism contend that it constitutes an ideological overextension of scientific methods beyond empirical domains, asserting science as the exclusive source of knowledge while dismissing philosophical, ethical, and interpretive inquiries. Philosopher Austin L. Hughes described scientism as a folly that seeks to supplant philosophy with science, arguing that scientific claims about reality's ultimate nature require philosophical justification, rendering scientism self-undermining.[245] Similarly, evolutionary biologist critiques highlight how scientism's blind faith in "settled science" has historically justified authoritarian policies, such as eugenics programs in the early 20th century, by conflating empirical findings with moral imperatives.[246] Michael Polanyi emphasized the role of tacit knowledge—unarticulated skills and intuitions essential to scientific practice—that eludes formal scientific codification, as elaborated in his 1958 work Personal Knowledge, undermining scientism's claim to completeness.[247] Further objections note scientism's inability to address normative questions, such as ethical values or aesthetic judgments, which resist empirical verification; for instance, the assertion that "only scientific knowledge counts" is itself a non-scientific philosophical stance, leading to performative contradiction.[248] Karl Popper and Thomas Kuhn illustrated science's provisional nature through falsifiability and paradigm shifts, respectively, challenging scientism's portrayal of science as cumulatively authoritative across all domains.[249] In social sciences, Friedrich Hayek critiqued the "pretence of knowledge" in 1974, arguing that complex human systems defy predictive modeling akin to physics due to dispersed, subjective knowledge, as seen in failed central planning experiments like Soviet economics.[246] Reductionism, the methodological commitment to explaining phenomena by decomposing them into fundamental components, encounters 
limitations in accounting for emergent properties arising from system interactions that surpass part-wise predictions. In molecular biology, complex gene regulatory networks exhibit nonlinear dynamics where outcomes cannot be deduced from isolated molecular behaviors, as evidenced by unpredictable cellular responses in genetic perturbation studies.[250] Physical systems involving many-body interactions, such as protein folding or turbulence, resist computational reduction due to exponential complexity, rendering "strong" reductionism practically infeasible despite theoretical appeals.[251] Philosophers like Thomas Nagel argued in his 1974 essay "What Is It Like to Be a Bat?" that subjective consciousness defies reductive explanation in physical terms, as qualia involve irreducible first-person perspectives not capturable by third-person scientific descriptions.[252] Emergentism posits that higher-level properties, such as liquidity in water molecules or ant colony behaviors, arise from part interactions without being predictable or explainable solely by part properties, supported by observations in chaos theory where small initial variations yield macro-scale divergences.[253] These critiques do not reject analytical decomposition but advocate methodological pluralism, integrating holistic approaches to capture causal realities overlooked by pure reduction, as in ecological systems where species interactions produce ecosystem stability irreducible to individual genetics.[254]
Science in Society
Education, Literacy, and Public Engagement
Science education typically emphasizes foundational concepts in physics, chemistry, biology, and earth sciences, often integrating the scientific method as a core framework for inquiry-based learning.[255] In the United States, national assessments like the National Assessment of Educational Progress (NAEP) track student performance; in 2024, the average eighth-grade science score was 4 points lower than in 2019, reflecting stagnation or decline since 2009.[256] Internationally, the Programme for International Student Assessment (PISA) 2022 results placed the U.S. average science literacy score higher than those of 56 education systems but lower than those of 9 others, indicating middling global standing.[257] Similarly, the Trends in International Mathematics and Science Study (TIMSS) 2019 showed U.S. fourth-graders scoring 539 in science, above the international centerpoint of 500, though subsequent data reveal declines, particularly among lower-performing students.[258]
Scientific literacy among adults remains limited, with surveys highlighting gaps in understanding core principles. A 2019 Pew Research Center study found that 39% of Americans answered 9 to 11 out of 11 basic science questions correctly, qualifying as high knowledge, while many struggled with concepts like genetics and probability.[259] The 2020 Wellcome Global Monitor reported that only 23% of Americans claimed to know "a lot" about science, underscoring broader deficiencies.[260] A 2021 Cleveland Museum of Natural History survey indicated that 85% of Americans desire more science knowledge, yet 44% feel they are falling behind, pointing to self-perceived inadequacies.[261] Public engagement efforts include science museums, outreach programs, and media initiatives aimed at bridging these gaps.
Institutions like museums host exhibits and events to foster hands-on interaction, with studies showing such activities can enhance understanding and interest, though attendance varies and impact on deep literacy is debated.[262] Scientists increasingly use social media and public dialogues to communicate findings, as emphasized in calls for broader involvement to counter misinformation and build trust.[263] However, partisan divides complicate engagement; for instance, surveys reveal stark differences in beliefs about topics like global warming, with Democrats far more likely than Republicans to affirm its occurrence and attribute responsibility to industry, reflecting how ideological filters influence science reception.[259] Challenges persist due to entrenched misconceptions, inadequate curricula, and external biases. Students often enter education with alternative conceptions—such as viewing forces as properties rather than interactions—that resist correction without targeted strategies like conceptual change teaching.[264] Declines in performance correlate with disruptions like the COVID-19 pandemic but also stem from systemic issues, including curricula prioritizing rote memorization over critical inquiry.[265] Ideological influences in academia and media, often favoring certain narratives over empirical scrutiny, exacerbate low literacy, making publics vulnerable to pseudoscience and politicized claims.[266] Effective engagement requires addressing these by emphasizing evidence-based reasoning and transparency about source biases to cultivate informed skepticism.[267]
Ethical and Moral Dimensions
Scientific inquiry, as a method for understanding natural phenomena through empirical observation and testable hypotheses, is inherently value-neutral in its core methodology. However, the conduct of research and its applications frequently intersect with moral considerations, particularly regarding harm to participants, societal risks, and the allocation of benefits. Ethical frameworks have evolved primarily in response to historical abuses, emphasizing principles such as informed consent, minimization of harm, and equitable distribution of research outcomes.[268] These dimensions underscore the tension between pursuing knowledge and preventing unintended consequences, with regulations often lagging behind technological advances.[269] In human subjects research, foundational ethical standards emerged from post-World War II reckonings with atrocities. The Nuremberg Code of 1947, arising from the Doctors' Trial at the Nuremberg Military Tribunals, established ten principles, including the absolute requirement for voluntary, informed consent and the necessity for experiments to yield results unprocurable by other means while avoiding unnecessary suffering.[270] This code directly addressed Nazi medical experiments on prisoners, which involved non-consensual procedures causing severe harm or death for data on hypothermia, high-altitude effects, and infectious diseases.[271] Building on this, the World Medical Association's Declaration of Helsinki in 1964 extended ethical guidelines to clinical research, mandating that protocols prioritize participant welfare over scientific interests and require independent ethical review.[272] Persistent violations highlighted the need for domestic reforms. The U.S. 
Public Health Service's Tuskegee Syphilis Study (1932–1972) withheld penicillin treatment from 399 African American men with syphilis after 1947, deceiving them into believing they received care while observing disease progression, resulting in at least 28 deaths and infections in spouses and children.[273] Public exposure in 1972 prompted the 1979 Belmont Report, which codified three core principles—respect for persons (encompassing autonomy and consent), beneficence (maximizing benefits while minimizing harms), and justice (fair subject selection and benefit distribution)—forming the basis for U.S. federal regulations like 45 CFR 46.[268] Animal experimentation raises distinct moral questions about speciesism and sentience, with practices dating to ancient vivisections but intensifying in the 19th century amid physiological advances. Regulations, such as the U.K.'s Cruelty to Animals Act of 1876 and the U.S. Animal Welfare Act of 1966, impose oversight, while the 3Rs framework (replacement, reduction, refinement) proposed by Russell and Burch in 1959 seeks to minimize animal use without forgoing necessary data.[274] Critics argue that alternatives like in vitro models or computational simulations remain underdeveloped, yet empirical evidence shows animal models have been indispensable for vaccines (e.g., polio) and drug safety testing, though over-reliance persists due to incomplete human-animal physiological analogies.[275] Dual-use research exemplifies moral trade-offs where benign intentions enable misuse. Defined as studies with knowledge, products, or technologies reasonably anticipated for both beneficial and harmful applications, examples include 2011 experiments enhancing H5N1 avian flu transmissibility among mammals, debated for pandemic risk versus preparedness gains.[276] U.S. policy since 2012 requires oversight for 15 agents and seven experimental categories posing biosecurity threats.[277] Similarly, J. 
Robert Oppenheimer's leadership of the Manhattan Project (1942–1945) yielded the atomic bomb, deployed on Hiroshima and Nagasaki in 1945, killing over 200,000 civilians; Oppenheimer later expressed remorse, quoting the Bhagavad Gita—"Now I am become Death, the destroyer of worlds"—and opposed the hydrogen bomb, illustrating scientists' post-hoc ethical burdens amid wartime imperatives.[278] Contemporary biotechnology amplifies these dilemmas, as seen in He Jiankui's 2018 editing of human embryos using CRISPR-Cas9 to disable the CCR5 gene for HIV resistance, resulting in the birth of twin girls. Condemned for bypassing germline editing bans, lacking safety data, and risking off-target mutations, He was sentenced to three years in prison by Chinese authorities in 2019, prompting global calls for moratoriums despite potential therapeutic merits.[279] Such cases reveal causal realities: unchecked innovation can confer heritable changes with unknown long-term effects, challenging the moral neutrality of pure research while underscoring the need for rigorous, precedent-based ethical deliberation over ideological impositions.[280]
Economic and Policy Influences
Scientific research funding derives primarily from public and private sectors, with governments compensating for market failures in basic research, where private returns are diffuse and long-term. In 2022, the U.S. federal government accounted for 40% of basic research funding, compared to 37% from businesses, while the business sector dominated applied research at 75% of total R&D performance.[281][204] Globally, the government share of R&D funding varies: about 10% in the U.S. and 8% in China, with higher shares in parts of Europe, such as 20% in the UK.[199] Public investments demonstrate high economic returns, driving productivity and growth. NIH grants yield $2.30–$2.46 per dollar in economic activity, while broader federal non-defense R&D contributes 140–210% returns to business-sector total factor productivity, accounting for one-fifth of such gains.[282][283] NSF-supported research similarly returns 150–300% on investment, fostering jobs and innovation spillovers.[284] Policy-induced cuts, such as a 20% reduction in federal R&D, could subtract over $700 billion from U.S. GDP over 10 years relative to sustained levels.[285] Policies like grant allocations, tax credits, and fiscal instruments direct research priorities and amplify corporate innovation.
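The return figures cited here can be read as simple output multipliers (dollars of economic activity per dollar spent). A minimal sketch of that arithmetic, using illustrative function names of our own rather than anything from the cited studies:

```python
def economic_activity(outlay_usd: float, multiplier: float) -> float:
    """Total economic activity implied by a research outlay and an
    output multiplier (dollars of activity per dollar spent)."""
    return outlay_usd * multiplier

def roi_to_multiplier(roi_percent: float) -> float:
    """A stated return on investment of r% corresponds to an activity
    multiplier of 1 + r/100 (principal plus return)."""
    return 1 + roi_percent / 100

# The NIH-style range cited above: $2.30-$2.46 of activity per grant dollar.
low = economic_activity(1_000_000, 2.30)   # ~$2.30M activity per $1M granted
high = economic_activity(1_000_000, 2.46)  # ~$2.46M activity per $1M granted

# The 150-300% NSF-style returns correspond to 2.5x-4.0x multipliers.
nsf_low, nsf_high = roi_to_multiplier(150), roi_to_multiplier(300)
```

These multipliers only restate the cited point estimates; they are summaries of empirical studies, not a forecasting model.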
Government R&D spending enhances firm-level technological progress via subsidies and procurement, though it risks favoring short-term economic activity over transformative outcomes.[286][287] Intellectual property policies, including patents, incentivize private funding—over 40% of university patents with private assignees link to industry sponsors—but skew efforts toward patentable domains like drugs and devices, sidelining unpatentable basic inquiries.[288][289] Industry funding introduces selection biases, as sponsors prioritize proprietary-aligned topics, potentially distorting scientific agendas away from public goods.[290][291] Heightened competition for limited grants further pressures researchers toward incremental, low-risk projects, undermining novelty despite formal emphasis on high-impact work.[292] Internationally, policies fuel competition; China's R&D surge outpaces OECD stagnation, with government support rising for energy and defense, reshaping global innovation flows.[293] U.S. policies, historically emphasizing federal basic research, sustain leadership but face erosion from declining shares amid rising private applied focus.[294]
Cultural and Political Interactions
Governments exert significant influence over scientific research through funding allocations, which constituted approximately 40% of basic research expenditures in the United States in 2022.[281] In the U.S., Democratic administrations have historically increased federal science funding more than Republican ones, reflecting partisan priorities in policy areas like environmental and health research.[295][296] This funding dynamic can steer research toward politically favored topics, such as climate initiatives under Democratic leadership, while conservative skepticism toward government intervention often correlates with resistance to expansive regulatory science.[297]
Extreme historical cases illustrate the risks of ideological override. In the Soviet Union from the 1930s to the 1960s, Trofim Lysenko's rejection of Mendelian genetics in favor of the inheritance of environmentally acquired traits—aligned with Marxist ideology—led to disastrous agricultural policies, contributing to famines that killed millions.[298][299] Lysenkoism suppressed dissenting geneticists, including the execution or imprisonment of figures like Nikolai Vavilov, demonstrating how political doctrine can eclipse empirical evidence and cause widespread harm.[299]
In contemporary democracies, political interference manifests in subtler forms, such as selective suppression of agency findings. Across U.S. administrations from Bush to Trump, over 300 documented instances occurred in which political appointees altered or delayed scientific reports on topics like environmental protection and public health, often to align with policy agendas.[300] Organizations tracking these events, like the Union of Concerned Scientists, highlight patterns but reflect institutional perspectives that may emphasize regulatory science over market-oriented critiques.[300] Cultural interactions often arise from tensions between scientific consensus and traditional beliefs.
The theory of evolution by natural selection has sparked enduring conflicts, particularly in the U.S., where creationist views rooted in religious literalism challenge public school curricula; the 1925 Scopes Trial exemplified early legal battles, with ongoing debates leading to "intelligent design" proposals as alternatives.[301] These disputes underscore a broader paradigm clash, as science relies on testable mechanisms while supernatural explanations invoke unobservable causation, fostering mutual incompatibility in educational and societal spheres.[302] Modern politicization exacerbates divides, notably in climate science, where public trust correlates strongly with ideology: a 2022 Pew poll found 78% of Democrats viewing global warming as a major threat versus 23% of Republicans, with similar gaps in attributing responsibility to human activity or industry.[303] This partisan asymmetry stems partly from academia's left-leaning composition, where surveys indicate overwhelming progressive orientations among researchers, potentially prioritizing hypotheses aligned with environmental activism over contrarian analyses of data uncertainties or economic trade-offs.[152][304] Such biases, documented in fields beyond the natural sciences, can manifest in peer review and funding decisions, undermining claims of institutional neutrality and fueling public skepticism from conservative viewpoints that perceive science as co-opted for policy advocacy.[152]
Controversies and Challenges
Replication Crisis and Reproducibility Issues
The replication crisis denotes the systematic inability to reproduce a significant portion of published scientific findings, particularly in psychology, biomedical research, and the social sciences, challenging the reliability of empirical claims central to scientific knowledge.[305] Large-scale replication efforts since the early 2010s have revealed reproducibility rates often below 50%, with original studies typically reporting strong statistical significance while replications yield weaker or null effects.[6] This issue stems from methodological flaws and systemic pressures rather than isolated errors, as evidenced by coordinated projects involving independent researchers adhering closely to original protocols.[306] In psychology, the Open Science Collaboration's 2015 project replicated 100 experiments from three high-impact journals published in 2008, achieving significant results in only 36% of cases compared to 97% in the originals; replicated effect sizes averaged half the original magnitude.[306][307] Similar failures occurred in other domains: Amgen researchers in 2012 confirmed just 11% (6 out of 53) of landmark preclinical cancer studies, often due to discrepancies in data handling and statistical reporting despite direct methodological emulation.[308] Bayer reported in 2011 that published findings could be fully reproduced in only 18-25% of targeted studies across physiology and oncology.[309] Economics showed higher rates at 61% in a 2018 multi-lab effort, yet still highlighted variability tied to original effect strength rather than replication rigor.[149] These patterns indicate domain-specific severity, with "soft" sciences like psychology exhibiting lower reproducibility due to higher variability in human subjects and smaller sample sizes.[8] Primary causes include publication bias, where journals preferentially accept positive results, inflating the apparent prevalence of true effects; low statistical power from underpowered studies (often below 50% to detect true effects);
and questionable research practices such as p-hacking—selective analysis until p-values fall below 0.05—and HARKing (hypothesizing after results are known).[310][311] Incentives exacerbate these problems: academic "publish or perish" cultures reward novel, significant findings over rigorous replication, with tenure and funding tied to high-impact publications that rarely prioritize null outcomes.[190] Systemic biases in peer review and institutional evaluation further discourage transparency, as raw data sharing was historically rare, enabling post-hoc adjustments undetected by reviewers.[312] Responses have emphasized procedural reforms, including pre-registration of hypotheses, methods, and analysis plans on platforms like the Open Science Framework to curb flexibility in data interpretation and distinguish confirmatory from exploratory work.[313][314] Mandates for open data, code, and materials in journals, alongside incentives like badges for reproducible practices, have increased adoption; for instance, the Reproducibility Project's follow-ups showed pre-registered replications yielding more consistent estimates.[315][7] Multi-lab collaborations and larger sample sizes via consortia have boosted power, though challenges persist: adoption remains uneven, especially in resource-constrained fields, and pre-registration does not fully eliminate bias if not rigorously enforced.[316] Despite progress, the crisis underscores that reproducibility demands cultural shifts beyond tools, prioritizing verification over novelty to restore empirical foundations.[317]
Fraud, Misconduct, and Incentives
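A minimal simulation (our own illustration, not drawn from the cited studies) shows why questionable practices such as optional stopping (testing as data accumulate and stopping at the first significant result) inflate false-positive rates well above the nominal 5%, even when no true effect exists:

```python
import math
import random

def significant(sample, critical=1.96):
    """Two-sided z-test of mean 0 for unit-variance data at alpha = 0.05."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)
    return abs(z) > critical

def run_study(rng, n=100, peek_every=None):
    """Simulate one null study (true effect = 0, data ~ N(0, 1)).
    With peek_every set, the 'researcher' tests repeatedly as data
    accumulate and stops at the first significant result, a simple
    form of p-hacking; otherwise the test is run once at n."""
    data = []
    checkpoints = range(peek_every, n + 1, peek_every) if peek_every else [n]
    for target in checkpoints:
        while len(data) < target:
            data.append(rng.gauss(0, 1))
        if significant(data):
            return True
    return False

rng = random.Random(42)
trials = 2000
honest = sum(run_study(rng) for _ in range(trials)) / trials
hacked = sum(run_study(rng, peek_every=10) for _ in range(trials)) / trials
# honest stays near the nominal 5% rate; repeated peeking inflates the
# false-positive rate severalfold despite every null hypothesis being true
```

Pre-registration counters exactly this flexibility: fixing the sample size and analysis in advance removes the opportunity to stop at a favorable checkpoint.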
Scientific misconduct encompasses fabrication, falsification, plagiarism, and other practices that undermine research integrity, with surveys indicating varying prevalence rates. A meta-analysis of 21 surveys estimated that approximately 1.97% of scientists admit to falsifying or fabricating data, while broader questionable research practices (QRPs) such as selective reporting or failing to disclose conflicts are more common, affecting up to one in three researchers in some studies.[318][319] Self-reported misconduct rates among NSF fellows stood at 3.7%, with 11.9% aware of colleagues engaging in it, though underreporting due to career risks likely understates true figures.[320] Retractions provide a proxy for detected misconduct, with 67.4% of cases from 1996 to 2015 attributed to fraud, suspected fraud, duplicate publication, or plagiarism, rather than honest error.[321] Biomedical retractions have quadrupled over the past two decades, reaching over 5,500 in 2022, with misconduct driving nearly 67% of them; this surge reflects improved detection but also escalating fraud, including organized "paper mills" producing fabricated papers for sale.[322][323][324] Global networks, often resilient and profit-driven, have industrialized fraud, infiltrating journals and exploiting open-access models, with hotspots in countries like China, the US, and India showing elevated retraction rates tied to misconduct.[325][326] Institutional incentives exacerbate these issues through the "publish or perish" paradigm, where career progression, tenure, and funding hinge predominantly on publication quantity and impact factors rather than methodological rigor or replicability.[327] This pressure favors novel, positive results over null findings or incremental work, incentivizing QRPs like p-hacking or data dredging, as grants and promotions reward high-output productivity metrics over truth-seeking verification.[328] Modeling studies demonstrate that such systems can sustain 
fraudulent equilibria, in which misconduct thrives because honest replication yields fewer publications, eroding collective trustworthiness unless incentives shift toward quality assurance.[329] In fields like biomedicine, where federal funding ties to preliminary promising data, the rush for breakthroughs amplifies risks, as seen in retracted high-profile claims based on manipulated images or datasets.[330] Mitigation efforts include enhanced statistical training, preregistration of studies, and incentives for replication, but systemic reforms lag, as academic hierarchies prioritize prestige over accountability.[331] While outright fraud remains a minority, the cumulative effect of incentivized corner-cutting distorts scientific knowledge, particularly in policy-influencing areas like medicine and climate science, where undetected biases compound errors.[332] Retraction databases and AI detection tools have improved vigilance, yet the incentive structure's causal role in fostering misconduct underscores the need to reorient reward systems toward verifiable outputs over mere publication counts.[333]
Ideological Biases and Politicization
Scientific communities exhibit a pronounced left-leaning ideological skew, with surveys indicating that 55% of American Association for the Advancement of Science members identified as Democrats in 2009, compared to only 6% as Republicans.[334] This disparity extends to political donations, where scientists contributing to federal candidates overwhelmingly favor Democrats over Republicans, reflecting broader polarization within academia.[335] Such homogeneity raises concerns about groupthink and selective hypothesis testing, particularly in fields intersecting with policy, as empirical research shows that ideological alignment influences research evaluations and peer review outcomes.[336] Politicization manifests in policy-relevant sciences like climate change, where public acceptance correlates strongly with political affiliation; for instance, Democrats are far more likely than Republicans to affirm anthropogenic global warming and attribute responsibility to fossil fuel industries.[337] Nonetheless, the scientific consensus among climate scientists holds that human activities are the primary driver of recent global warming, with over 97% of actively publishing experts agreeing based on multiple peer-reviewed assessments.[338][339] This partisan divide has intensified, with Republican trust in scientists declining sharply from 87% in 2019 to around 35% by 2023, amid controversies over COVID-19 policies and origins research.[340] In social sciences, ideological biases affect study design and interpretation, as evidenced by the backlash against James Damore's 2017 Google memo, which cited peer-reviewed evidence on sex differences in vocational interests—supported by meta-analyses showing greater male variability and interest disparities—yet prompted his dismissal and widespread condemnation despite endorsements from psychologists affirming the underlying science.[341] Academic institutions' systemic left-wing orientation, documented in faculty surveys across 
disciplines, contributes to underrepresentation of conservative viewpoints, potentially skewing funding priorities and suppressing dissenting research, such as early dismissals of the COVID-19 lab-leak hypothesis as conspiratorial despite later acknowledgments of its plausibility by agencies like the U.S. Department of Energy.[335] Mainstream media and academic outlets, often aligned with progressive narratives, amplify this by framing ideological nonconformity as misinformation, eroding public trust across political spectra; Pew data reveals Democrats viewing scientists as honest at 80% versus 52% for Republicans in 2024.[342] While self-selection into science may partly explain the skew—attraction to empirical rigor over ideology—causal evidence from donation patterns and policy advocacy indicates that this imbalance incentivizes conformity, hindering objective inquiry in contested domains like gender differences and environmental modeling.[343]
Anti-Science Movements and Public Skepticism
Anti-science movements encompass organized efforts to reject or undermine established scientific findings, often rooted in ideological, religious, or economic motivations. Historical examples include 19th-century opposition to smallpox vaccination in England and the United States, where critics argued against mandatory inoculation on grounds of personal liberty and safety concerns despite empirical evidence of efficacy.[344] In the Soviet Union, Lysenkoism promoted pseudoscientific agricultural theories under Stalin, leading to famines and the suppression of genetics research, illustrating how political ideology can eclipse evidence-based biology.[345] Modern instances feature anti-vaccination campaigns, amplified during the COVID-19 pandemic, with groups questioning vaccine safety and efficacy amid reports of rare adverse events and policy mandates.[346] These movements frequently cite isolated fraud cases, such as Andrew Wakefield's retracted 1998 study linking MMR vaccine to autism, to fuel broader distrust, though subsequent large-scale studies affirm vaccine safety.[347] Public skepticism toward science manifests in declining confidence in institutions, particularly along partisan lines. 
A 2024 Pew Research Center survey found 76% of Americans express confidence in scientists acting in the public's interest, yet trust has eroded since the early 2000s, with Republicans showing steeper declines—only 66% held a great deal or fair amount of confidence in 2023 compared to 87% of Democrats.[342] [340] This divergence, evident since the 1990s, correlates with politicized issues like climate change, where 2021 surveys revealed 90% of Democrats affirming global warming's occurrence versus 60% of Republicans, with even larger gaps on attributing responsibility to industry.[348] Factors include perceived ideological biases in academia and media, where left-leaning consensus on topics like gender differences or environmental policy marginalizes dissenting empirical research, fostering perceptions of science as partisan advocacy rather than neutral inquiry.[297] The replication crisis exacerbates skepticism by highlighting systemic flaws in scientific practice. Failures to reproduce landmark findings in psychology and medicine—estimated at 50% non-replicability in some fields—undermine claims of robustness, as seen in the Open Science Collaboration's 2015 replication of only 36% of 100 psychological studies.[7] Public awareness remains low, but exposure erodes trust, particularly when non-replicable results influence policy, such as in behavioral economics or nutrition guidelines.[349] Incentives favoring novel, positive results over mundane replications, coupled with publication biases, contribute causally to this crisis, prompting calls for preregistration and open data to restore credibility.[350] Legitimate skepticism, distinct from irrational denial, arises from such evidence of overhyping tentative findings, as in early COVID-19 mask efficacy debates where initial uncertainty gave way to evolving consensus amid conflicting trials.[351] Broader drivers of distrust include disinformation amplified by social media and regulatory capture, where industry 
funding influences outcomes, as alleged in pharmaceutical trials.[352] Conservative skepticism often stems from opposition to government overreach via science-backed regulations, viewing them as pretext for control rather than evidence-driven policy.[297] Conversely, mainstream portrayals sometimes conflate critique of specific consensuses—like overreliance on observational data in epidemiology—with wholesale anti-science, ignoring causal inference challenges. Efforts to counter these movements, which had produced over 420 anti-vaccine and anti-fluoride bills in U.S. statehouses by 2025, risk deepening divides by prioritizing enforcement over transparent engagement.[353] Restoring trust demands addressing root causes like perverse incentives and politicization, rather than dismissing skeptics as uninformed.[354]
Achievements, Impacts, and Future Directions
Major Discoveries and Technological Outcomes
Isaac Newton's Philosophiæ Naturalis Principia Mathematica, published in 1687, formulated the three laws of motion and the law of universal gravitation, providing the mechanical foundation for engineering advancements that powered the Industrial Revolution, including steam engines and railway systems that transformed global transportation and manufacturing by the mid-19th century.[355] These principles enabled precise calculations for projectile motion and orbital mechanics, directly contributing to the development of rocketry and space exploration technologies, such as Robert Goddard's liquid-fueled rockets in 1926 and subsequent NASA missions.[356] In electromagnetism, James Clerk Maxwell's equations, unified in 1865, described the behavior of electric and magnetic fields, laying the groundwork for wireless communication technologies including radio transmission pioneered by Guglielmo Marconi in 1895 and modern telecommunications infrastructure.[356] Discoveries in quantum mechanics, beginning with Max Planck's quantization of energy in 1900 and Albert Einstein's explanation of the photoelectric effect in 1905, enabled the invention of the transistor by Bell Labs researchers in 1947, which revolutionized computing and led to the integrated circuits powering personal computers and smartphones from the 1970s onward.[357] Biological breakthroughs include Charles Darwin's theory of evolution by natural selection, outlined in On the Origin of Species in 1859, which informed selective breeding practices and modern genetics, culminating in the agricultural Green Revolution that increased crop yields through hybrid varieties developed in the 20th century.[356] The elucidation of DNA's double-helix structure by James Watson and Francis Crick in 1953 facilitated recombinant DNA technology in the 1970s, enabling biotechnology industries producing insulin and vaccines, with further advancements like CRISPR-Cas9 gene editing, discovered in 2012, yielding
FDA-approved therapies for sickle cell disease in 2023.[358][359] Medical discoveries such as Louis Pasteur's germ theory in the 1860s demonstrated microorganisms as disease causes, leading to sterilization techniques and the antibiotic era initiated by Alexander Fleming's penicillin discovery in 1928, which has saved millions of lives by reducing infection mortality rates from over 50% pre-1940s to under 1% for treatable bacterial infections today.[356][360] Edward Jenner's smallpox vaccine in 1796 exemplified immunology principles, contributing to the eradication of the disease in 1980 and informing the mRNA vaccine platforms used in COVID-19 responses starting in 2020, which achieved over 95% efficacy in trials.[356][360] Nuclear physics progressed with the discovery of fission by Otto Hahn and Fritz Strassmann in 1938, enabling atomic energy production, as demonstrated by the first controlled chain reaction in 1942 under Enrico Fermi, which powered nuclear reactors supplying about 10% of global electricity by 2023 while also yielding applications in medicine like isotope-based cancer treatments.[138] These outcomes underscore science's causal role in technological progress, where empirical validations of natural laws have iteratively driven innovations enhancing human productivity, health, and exploration.[361]
Societal Benefits and Unintended Consequences
Scientific advancements have substantially increased global life expectancy, which rose from about 32 years for newborns in 1900 to 71 years by 2021, driven largely by medical innovations such as vaccines and antibiotics, improved sanitation, and nutritional science.[362][363] These gains stem from empirical reductions in infant mortality and infectious diseases, with modern medicine enabling survival rates past age 65 for most in developed nations.[364] Economically, scientific research fuels productivity and growth; U.S. public R&D investments in 2018 directly and indirectly supported 1.6 million jobs, $126 billion in labor income, and $197 billion in added economic output.[365] Econometric models project that halving nondefense public R&D spending would diminish long-term U.S. GDP by 7.6%, underscoring causal links between innovation and aggregate output via spillovers to private sector technology adoption.[366] Public perceptions align with these metrics, as 73% of American adults in a 2019 survey attributed a net positive societal effect to science.[367] Despite these advantages, scientific progress has produced unintended negative outcomes through misapplication or overlooked externalities. 
The Industrial Revolution's reliance on scientific principles in steam power and chemistry accelerated environmental harm, including coal-induced air pollution that caused urban smog and water contamination from factory effluents, depleting resources and altering ecosystems on a global scale.[368][369] Such industrialization contributed to ongoing issues like greenhouse gas emissions, with 2017 estimates valuing annual damages from industrial emissions at €277–433 billion in health and ecological costs across Europe.[370] Fundamental physics research on nuclear fission, pursued in the 1930s for atomic insights, enabled atomic weapons development, culminating in the 1945 Hiroshima and Nagasaki bombings that killed over 200,000 people and introduced proliferation risks persisting into the 21st century.[371][372] Pioneers like Leo Szilard, who conceptualized chain reactions in 1933, later opposed weaponization, highlighting how curiosity-driven inquiry can yield destructive applications absent deliberate safeguards.[372] These cases illustrate causal chains where scientific knowledge, once disseminated, escapes original intent, amplifying risks in policy and military domains.[373]
Quantitative Measures of Progress
Scientific progress can be quantified through metrics such as the volume of peer-reviewed publications, research and development (R&D) expenditures, citation rates, and enhancements in computational capabilities that enable complex simulations and data analysis. These indicators primarily capture inputs to and outputs from the scientific enterprise, though they do not directly measure the accumulation of verified knowledge, which is harder to quantify due to factors like paradigm shifts and error correction. For instance, the exponential growth in publications suggests heightened productivity, but it may also reflect incentives for quantity over quality in academic evaluation systems.[374][375] The number of scientific publications has grown exponentially, with an average annual rate of approximately 4-5.6% since the mid-20th century, corresponding to a doubling time of 13-17 years. Between 2012 and 2022, global publication totals increased by 59%, driven largely by expansions in China and the United States, the two largest producers. In the life sciences, doubling times have been as short as 10 years in recent decades, with journals publishing more papers per issue—from an average of 74 in 1999 to 99.6 by 2018. Citation volumes have similarly expanded at about 7% annually since the 19th century, indicating broader dissemination and building upon prior work. However, this surge raises concerns about dilution of impact, as not all outputs contribute equally to foundational advances.[374][375][376] Global R&D spending, a key input metric, reached nearly $2.5 trillion in 2022 (adjusted for purchasing power), having tripled in real terms since 1990. Projections for 2024 estimate $2.53 trillion, reflecting an 8.3% increase from prior forecasts amid post-pandemic recovery. As a percentage of GDP, R&D intensity varies by country—around 2.8% in the U.S. 
and higher in Israel (5%)—but has remained relatively stable in advanced economies while growing in emerging ones like China, which overtook the U.S. in total spending by 2017. These investments correlate with output metrics but face scrutiny for inefficiencies, such as diminishing returns in crowded fields.[377][378][198] Advancements in computational power, governed by Moore's Law until recently, have exponentially boosted scientific capabilities by doubling transistor density roughly every two years from 1965 onward, reducing costs and enabling genome sequencing, climate modeling, and particle simulations that were infeasible decades prior. This has facilitated data-intensive discoveries, such as the Human Genome Project's completion in 2003, which required processing petabytes of data. Although Moore's Law has slowed since the 2010s due to physical limits, its legacy underscores how hardware scaling has amplified progress across disciplines, though software and algorithmic innovations now drive further gains. Nobel Prizes in science categories (Physics, Chemistry, Physiology or Medicine) have been awarded annually at a steady rate of 2-3 per field since 1901 (with wartime interruptions), totaling over 200 laureates this century, but their fixed volume limits their utility as a growth metric compared to expanding publication and funding trends.[379][380][381]
| Metric | Historical Trend | Key Data Point | Source |
|---|---|---|---|
| Publications | 4-5.6% annual growth | Doubling every 13-17 years; +59% (2012-2022) | [374] [376] |
| R&D Spending | Tripled since 1990 | $2.5T global (2022) | [377] [198] |
| Citations | ~7% annual growth | Since 19th century | [382] |
| Transistors (Moore's Law) | Doubled every ~2 years | From 1965-2010s | [379] |
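The growth rates and doubling times in the table are linked by the standard compound-growth identity t = ln 2 / ln(1 + r). A quick check (our own arithmetic, not taken from the cited sources):

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years for a quantity growing at a fixed annual rate to double:
    solve (1 + r)^t = 2 for t."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# Publication growth of 4%-5.6% per year implies doubling in roughly
# 12.7-17.7 years, consistent with the 13-17 year range in the table.
slow = doubling_time(0.04)    # ~17.7 years
fast = doubling_time(0.056)   # ~12.7 years

# Citation growth of ~7% per year implies doubling about every 10 years.
citations = doubling_time(0.07)
```

The same identity recovers Moore's Law phrasing in reverse: a two-year doubling time corresponds to roughly 41% annual growth in transistor density.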
