Electronic engineering

from Wikipedia

Electronic engineering is a sub-discipline of electrical engineering that emerged in the early 20th century and is distinguished by the additional use of active components such as semiconductor devices to amplify and control electric current flow. Previously electrical engineering only used passive devices such as mechanical switches, resistors, inductors, and capacitors.

It covers fields such as analog electronics, digital electronics, consumer electronics, embedded systems and power electronics. It is also involved in many related fields, for example solid-state physics, radio engineering, telecommunications, control systems, signal processing, systems engineering, computer engineering, instrumentation engineering, electric power control, photonics and robotics.

The Institute of Electrical and Electronics Engineers (IEEE) is one of the most important professional bodies for electronics engineers in the US; the equivalent body in the UK is the Institution of Engineering and Technology (IET). The International Electrotechnical Commission (IEC) publishes electrical standards including those for electronics engineering.

History and development


Electronics engineering as a profession emerged following Karl Ferdinand Braun's development of the crystal detector, the first semiconductor device, in 1874, the identification of the electron in 1897, and the subsequent invention of the vacuum tube, which could amplify and rectify small electrical signals and inaugurated the field of electronics.[1][2] Practical applications started with the invention of the diode by John Ambrose Fleming and the triode by Lee De Forest in the early 1900s, which made the detection of small electrical voltages such as radio signals from a radio antenna possible with a non-mechanical device. The growth of electronics was rapid. By the early 1920s, commercial radio broadcasting and communications were becoming widespread, and electronic amplifiers were being used in such diverse applications as long-distance telephony and the music recording industry.

The discipline was further enhanced by the large amount of electronic systems development during World War II in areas such as radar and sonar, and by the subsequent peace-time consumer revolution following the invention of the transistor by William Shockley, John Bardeen and Walter Brattain.

Specialist areas


Electronics engineering has many subfields. This section describes some of the most popular.

Electronic signal processing deals with the analysis and manipulation of signals. Signals can be either analog, in which case the signal varies continuously according to the information, or digital, in which case the signal varies according to a series of discrete values representing the information.

For analog signals, signal processing may involve the amplification and filtering of audio signals for audio equipment and the modulation and demodulation of radio-frequency signals for telecommunications. For digital signals, signal processing may involve compression, error checking, and error detection and correction.
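As a toy illustration of digital signal processing (not from the article), the sketch below implements a 3-tap moving-average filter, one of the simplest digital low-pass filters used to smooth a sampled signal before further processing. The function name and sample values are illustrative assumptions.

```python
# A 3-tap moving-average filter: each output sample is the average of
# the current input sample and its predecessors (fewer at the start).

def moving_average(samples, taps=3):
    """Smooth a digital signal; early samples average over what exists."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - taps + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]   # alternating "noisy" signal
smooth = moving_average(noisy)           # values pulled toward the mean
```

Averaging over a sliding window attenuates rapid alternation while passing slow trends, which is exactly the low-pass behavior described above.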

Telecommunications engineering deals with the transmission of information across a medium such as a co-axial cable, an optical fiber, or free space. Transmissions across free space require information to be encoded in a carrier wave in order to be transmitted, a process known as modulation. Popular analog modulation techniques include amplitude modulation and frequency modulation.
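A minimal sketch (not from the article) of amplitude modulation: the message m(t) is encoded onto a carrier of frequency fc as s(t) = (1 + k·m(t))·cos(2πfc·t). The sample rates, frequencies, and modulation index k below are illustrative assumptions.

```python
import math

def am_modulate(message, fc, fs, k=0.5):
    """Amplitude-modulate a message sampled at rate fs (Hz) onto a
    carrier of frequency fc (Hz) with modulation index k."""
    return [(1 + k * m) * math.cos(2 * math.pi * fc * n / fs)
            for n, m in enumerate(message)]

# A 10 Hz tone sampled at 1 kHz, carried on a 100 Hz carrier:
msg = [math.sin(2 * math.pi * 10 * n / 1000) for n in range(1000)]
s = am_modulate(msg, fc=100, fs=1000)
```

Because |m(t)| ≤ 1 and k = 0.5, the envelope of the modulated wave stays within ±1.5, which is how the receiver recovers the message by envelope detection.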

Once the transmission characteristics of a system are determined, telecommunication engineers design the transmitters and receivers needed for such systems. These two are sometimes combined to form a two-way communication device known as a transceiver. A key consideration in the design of transmitters is their power consumption as this is closely related to their signal strength. If the signal strength of a transmitter is insufficient the signal's information will be corrupted by noise.
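The relationship between signal strength and noise mentioned above is usually quantified as a signal-to-noise ratio (SNR) in decibels; below roughly 0 dB the noise power exceeds the signal power and the information is easily corrupted. A minimal sketch (not from the article), with illustrative power values:

```python
import math

def snr_db(signal_power_w, noise_power_w):
    """Signal-to-noise ratio in decibels from powers in watts."""
    return 10 * math.log10(signal_power_w / noise_power_w)

strong = snr_db(1e-3, 1e-9)   # 1 mW signal over 1 nW noise -> 60 dB
weak = snr_db(1e-9, 1e-9)     # signal power equal to noise -> 0 dB
```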

Aviation-electronics engineering and aviation-telecommunications engineering are concerned with aerospace applications. Aviation-telecommunication engineers include specialists who work on airborne avionics in the aircraft or on ground equipment. Specialists in this field mainly need knowledge of computers, networking, IT, and sensors. These courses are offered at institutions such as civil aviation technology colleges.[3][4]

Control engineering has a wide range of electronic applications from the flight and propulsion systems of commercial airplanes to the cruise control present in many modern cars. It also plays an important role in industrial automation. Control engineers often use feedback when designing control systems.

Instrumentation engineering deals with the design of devices to measure physical quantities such as pressure, flow, and temperature. The design of such instrumentation requires a good understanding of electronics engineering and physics; for example, radar guns use the Doppler effect to measure the speed of oncoming vehicles. Similarly, thermocouples use the Peltier–Seebeck effect to measure the temperature difference between two points.
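The radar-gun example above can be made concrete: a radar gun infers vehicle speed from the Doppler shift of the reflected wave, Δf = 2·v·f0/c (the factor of 2 arises because the wave travels to the target and back). A minimal sketch, with an assumed 24 GHz traffic radar and an illustrative measured shift:

```python
C = 299_792_458.0  # speed of light, m/s

def speed_from_doppler(delta_f_hz, f0_hz):
    """Target speed (m/s) from the Doppler shift of a radar at f0."""
    return delta_f_hz * C / (2 * f0_hz)

# A 24 GHz radar observing a ~4.45 kHz shift: roughly 27.8 m/s (~100 km/h)
v = speed_from_doppler(4450.0, 24e9)
```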

Often instrumentation is not used by itself, but instead as the sensors of larger electrical systems. For example, a thermocouple might be used to help ensure a furnace's temperature remains constant. For this reason, instrumentation engineering is often viewed as the counterpart of control engineering.[5]

Computer engineering deals with the design of computers and computer systems. This may involve the design of new computer hardware, the design of PDAs, or the use of computers to control an industrial plant. Development of embedded systems—systems made for specific tasks (e.g., mobile phones)—is also included in this field, as are microcontrollers and their applications. Computer engineers may also work on a system's software. However, the design of complex software systems is often the domain of software engineering, which is usually considered a separate discipline.

VLSI design engineering deals with the fabrication of integrated circuits (ICs) and various electronic components; VLSI stands for very-large-scale integration. In designing an integrated circuit, electronics engineers first construct circuit schematics that specify the electrical components and describe the interconnections between them. When completed, VLSI engineers convert the schematics into actual layouts, which map the layers of the various conductor and semiconductor materials needed to construct the circuit.

Education and training


Electronics is a subfield within the wider electrical engineering academic subject. In electronics engineering, ceramics are materials used to create electronic components: they are used for the creation of connectors, elements for encapsulation, multilayer capacitors, resistors, and sensors.[6]

Electronics engineers typically possess an academic degree with a major in electronics engineering. The length of study for such a degree is usually three or four years, and the completed degree may be designated as a Bachelor of Engineering, Bachelor of Science, Bachelor of Applied Science, or Bachelor of Technology depending upon the university. During a bachelor's degree, students usually complete a capstone course at the end of their degree. The capstone project involves designing and completing a real-world project using knowledge from previous courses.[7][8] Many UK universities also offer Master of Engineering (MEng) degrees at the graduate level.

Some electronics engineers also choose to pursue a postgraduate degree such as a Master of Science, a Doctor of Philosophy in Engineering, or an Engineering Doctorate. The master's degree is being introduced in some European and American universities as a first degree, and the differentiation of an engineer with graduate and postgraduate studies is often difficult; in these cases, experience is taken into account. The master's degree may consist of research, coursework, or a mixture of the two. The Doctor of Philosophy consists of a significant research component and is often viewed as the entry point to academia.

In most countries, a bachelor's degree in engineering represents the first step towards certification and the degree program itself is certified by a professional body. Certification allows engineers to legally sign off on plans for projects affecting public safety.[9] After completing a certified degree program, the engineer must satisfy a range of requirements, including work experience requirements, before being certified. Once certified the engineer is designated the title of Professional Engineer (in the United States, Canada, and South Africa), Chartered Engineer or Incorporated Engineer (in the United Kingdom, Ireland, India, and Zimbabwe), Chartered Professional Engineer (in Australia and New Zealand) or European Engineer (in much of the European Union).

A degree in electronics generally includes units covering physics, chemistry, mathematics, project management and specific topics in electrical engineering. Initially, such topics cover most, if not all, of the subfields of electronics engineering. Students then choose to specialize in one or more subfields towards the end of the degree.

Fundamental to the discipline are the sciences of physics and mathematics as these help to obtain both a qualitative and quantitative description of how such systems will work. Today, most engineering work involves the use of computers and it is commonplace to use computer-aided design and simulation software programs when designing electronic systems. Although most electronic engineers will understand basic circuit theory, the theories employed by engineers generally depend upon the work they do. For example, quantum mechanics and solid-state physics might be relevant to an engineer working on VLSI but are largely irrelevant to engineers working with embedded systems.

Apart from electromagnetics and network theory, other items in the syllabus are particular to electronic engineering courses. Electrical engineering courses have other specialisms such as machines, power generation, and distribution. This list does not include the extensive engineering mathematics curriculum that is a prerequisite to a degree.[10][11]

Various universities have updated their electrical and electronics programs to include renewable energy courses. These courses have been created as the world shifts toward greater energy efficiency.[12][13]

Labs


Labs are essential to electronics engineering, providing students with hands-on experience that reinforces their other electronics classes. Lab activities may involve:

Breadboarding: Building basic circuits to learn component symbols, involving LEDs, diodes, and resistors.[14]

Microcontrollers: Programming hardware devices such as Arduino boards to control other components.[15][16]

Soldering: Placing components on a printed circuit board and securing them using solder.[17]

Renewable energy labs may involve:[18]

Photovoltaic Energy: Using panel simulators to learn the properties of solar energy conversion.

Wind Power: Applying aerodynamics, rotor dynamics, and power generation characteristics to design and enhance wind energy systems.

Water Energy: Simulating water flow through turbines to better understand the use of water for energy generation.

Smart Grids: Utilizing smart technologies to advance electrical power systems, involving simulation and hardware of grids fed by renewable energy sources such as solar photovoltaics and wind turbines.

Supporting knowledge areas


The huge breadth of electronics engineering has led to the use of a large number of specialized supporting knowledge areas.

Elements of vector calculus: divergence and curl; Gauss' and Stokes' theorems, Maxwell's equations: differential and integral forms. Wave equation, Poynting vector. Plane waves: propagation through various media; reflection and refraction; phase and group velocity; skin depth. Transmission lines: characteristic impedance; impedance transformation; Smith chart; impedance matching; pulse excitation. Waveguides: modes in rectangular waveguides; boundary conditions; cut-off frequencies; dispersion relations. Antennas: Dipole antennas; antenna arrays; radiation pattern; reciprocity theorem, antenna gain.[19][20]

Network graphs: matrices associated with graphs; incidence, fundamental cut-set, and fundamental circuit matrices. Solution methods: nodal and mesh analysis. Network theorems: superposition, Thevenin's and Norton's theorems, maximum power transfer, Wye-Delta transformation.[21] Steady-state sinusoidal analysis using phasors. Linear constant-coefficient differential equations; time-domain analysis of simple RLC circuits. Solution of network equations using the Laplace transform: frequency-domain analysis of RLC circuits. 2-port network parameters: driving-point and transfer functions. State equations for networks.[22]
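Nodal analysis, mentioned above, writes Kirchhoff's current law at each node as a linear system G·V = I and solves for the node voltages. A minimal sketch (not from the article) for an assumed two-node circuit: a 1 A source feeds node 1, R1 = 2 Ω runs from node 1 to ground, R2 = 4 Ω connects nodes 1 and 2, and R3 = 4 Ω runs from node 2 to ground; Cramer's rule solves the 2×2 system.

```python
def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve [[a11,a12],[a21,a22]] @ [x1,x2] = [b1,b2] by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

R1, R2, R3, Is = 2.0, 4.0, 4.0, 1.0
# KCL at node 1: V1/R1 + (V1 - V2)/R2 = Is
# KCL at node 2: (V2 - V1)/R2 + V2/R3 = 0
v1, v2 = solve_2x2(1/R1 + 1/R2, -1/R2,
                   -1/R2, 1/R2 + 1/R3,
                   Is, 0.0)
# v1 = 1.6 V, v2 = 0.8 V; substituting back satisfies KCL at both nodes.
```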

Electronic devices: Energy bands in silicon, intrinsic and extrinsic silicon. Carrier transport in silicon: diffusion current, drift current, mobility, resistivity. Generation and recombination of carriers. p-n junction diode, Zener diode, tunnel diode, BJT, JFET, MOS capacitor, MOSFET, LED, p-i-n and avalanche photo diode, LASERs. Device technology: integrated circuit fabrication process, oxidation, diffusion, ion implantation, photolithography, n-tub, p-tub and twin-tub CMOS process.[23][24]

Analog circuits: Equivalent circuits (large and small-signal) of diodes, BJT, JFETs, and MOSFETs. Simple diode circuits, clipping, clamping, rectifier. Biasing and bias stability of transistor and FET amplifiers. Amplifiers: single-and multi-stage, differential, operational, feedback and power. Analysis of amplifiers; frequency response of amplifiers. Simple op-amp circuits. Filters. Sinusoidal oscillators; criterion for oscillation; single-transistor and op-amp configurations. Function generators and wave-shaping circuits, Power supplies.[25]

Digital circuits: Boolean functions (NOT, AND, OR, XOR, ...). Logic gates and digital IC families (DTL, TTL, ECL, MOS, CMOS). Combinational circuits: arithmetic circuits, code converters, multiplexers, and decoders. Sequential circuits: latches and flip-flops, counters, and shift registers. Sample-and-hold circuits, ADCs, DACs. Semiconductor memories. Microprocessor 8086: architecture, programming, memory, and I/O interfacing.[26][27]
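As a small illustration of the combinational arithmetic circuits listed above (not from the article), the sketch below builds a 1-bit full adder from Boolean gate functions; the gate helpers are illustrative, not a particular IC family.

```python
def xor(a, b):  return a ^ b
def and_(a, b): return a & b
def or_(a, b):  return a | b

def full_adder(a, b, cin):
    """Return (sum, carry_out) for one-bit inputs a, b and carry-in."""
    s = xor(xor(a, b), cin)                       # sum bit
    cout = or_(and_(a, b), and_(cin, xor(a, b)))  # carry-out bit
    return s, cout

# Exhaustive truth table over all 8 input combinations:
table = {(a, b, c): full_adder(a, b, c)
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}
```

For every input, the pair (sum, carry) encodes a + b + cin in binary, i.e. sum + 2·carry equals the integer total, which is the defining property of the circuit.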

Signals and systems: Definitions and properties of the Laplace transform, continuous-time and discrete-time Fourier series, continuous-time and discrete-time Fourier transform, z-transform. Sampling theorems. Linear Time-Invariant (LTI) systems: definitions and properties; causality, stability, impulse response, convolution, poles and zeros, frequency response, group delay and phase delay. Signal transmission through LTI systems. Random signals and noise: probability, random variables, probability density function, autocorrelation, power spectral density, and the analogy between vectors and functions.[28][29]
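Convolution, central to the LTI material above, gives the system output as y[n] = Σₖ x[k]·h[n−k], where h is the impulse response. A minimal sketch (not from the article), with an assumed impulse response:

```python
def convolve(x, h):
    """Direct discrete convolution of finite sequences x and h."""
    y = [0.0] * (len(x) + len(h) - 1)
    for k, xk in enumerate(x):
        for m, hm in enumerate(h):
            y[k + m] += xk * hm
    return y

h = [1.0, 0.5, 0.25]             # assumed impulse response
y = convolve([1.0], h)           # a unit impulse returns h itself
step = convolve([1.0] * 4, h)    # step input: running sums of h
```

That a unit impulse reproduces h is the defining property of the impulse response, and the step response shows the output settling to the sum of h's samples.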

Electronic control systems


Basic control system components; block diagrammatic description, reduction of block diagrams — Mason's rule. Open loop and closed loop (negative unity feedback) systems and stability analysis of these systems. Signal flow graphs and their use in determining transfer functions of systems; transient and steady-state analysis of LTI control systems and frequency response. Analysis of steady-state disturbance rejection and noise sensitivity.

Tools and techniques for LTI control system analysis and design: root loci, Routh–Hurwitz stability criterion, Bode and Nyquist plots. Control system compensators: elements of lead and lag compensation, elements of proportional–integral–derivative (PID) control. Discretization of continuous-time systems using zero-order hold and ADCs for digital controller implementation. Limitations of digital controllers: aliasing. State variable representation and solution of the state equation of LTI control systems. Linearization of nonlinear dynamical systems with state-space realizations in both frequency and time domains. Fundamental concepts of controllability and observability for MIMO LTI systems. State-space realizations: observable and controllable canonical forms. Ackermann's formula for state-feedback pole placement. Design of full-order and reduced-order estimators.[30][31]
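For a third-order characteristic polynomial s³ + a₂s² + a₁s + a₀, the Routh–Hurwitz criterion listed above reduces to a simple test: all coefficients positive and a₂·a₁ > a₀. A minimal sketch (not from the article), with illustrative polynomials:

```python
def cubic_is_stable(a2, a1, a0):
    """Routh-Hurwitz test for s^3 + a2*s^2 + a1*s + a0: all roots lie
    in the left half-plane iff all coefficients are positive and
    a2*a1 > a0 (no sign changes in the first Routh column)."""
    return a2 > 0 and a1 > 0 and a0 > 0 and a2 * a1 > a0

stable = cubic_is_stable(3.0, 3.0, 1.0)     # (s+1)^3: all roots at -1
unstable = cubic_is_stable(1.0, 1.0, 5.0)   # fails a2*a1 > a0
```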

Communications


Analog communication systems: amplitude and angle modulation and demodulation systems, spectral analysis of these operations, superheterodyne receivers, and noise conditions.

Digital communication systems: pulse-code modulation (PCM), differential pulse-code modulation (DPCM), delta modulation (DM), digital modulation – amplitude, phase- and frequency-shift keying schemes (ASK, PSK, FSK), matched-filter receivers, bandwidth consideration and probability of error calculations for these schemes, GSM, TDMA.[32][33]
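The probability-of-error calculations mentioned above have a closed form for coherent BPSK over an AWGN channel with a matched-filter receiver: Pₑ = Q(√(2·Eb/N0)) = ½·erfc(√(Eb/N0)). A minimal sketch (not from the article):

```python
import math

def bpsk_ber(ebn0_db):
    """Bit-error rate of coherent BPSK at a given Eb/N0 in dB."""
    ebn0 = 10 ** (ebn0_db / 10)           # dB -> linear ratio
    return 0.5 * math.erfc(math.sqrt(ebn0))

ber_0db = bpsk_ber(0.0)    # about 7.9e-2: noise power equals bit energy
ber_10db = bpsk_ber(10.0)  # a few errors per million bits
```

The steep drop between 0 dB and 10 dB illustrates why transmitter power budgets matter: a modest increase in Eb/N0 reduces the error rate by several orders of magnitude.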

Professional bodies


Professional bodies of note for electrical engineers include the USA's Institute of Electrical and Electronics Engineers (IEEE) and the UK's Institution of Engineering and Technology (IET). Members of the Institution of Engineering and Technology (MIET) are recognized professionally in Europe as electrical and computer engineers. The IEEE claims to produce 30 percent of the world's literature in electrical and electronics engineering, has over 430,000 members, and holds more than 450 IEEE-sponsored or cosponsored conferences worldwide each year. Senior membership of the IEEE is a recognised professional designation in the United States.

Project engineering


For most engineers not involved at the cutting edge of system design and development, technical work accounts for only a fraction of the work they do. A lot of time is also spent on tasks such as discussing proposals with clients, preparing budgets and determining project schedules. Many senior engineers manage a team of technicians or other engineers and for this reason, project management skills are important. Most engineering projects involve some form of documentation and strong written communication skills are therefore very important.

The workplaces of electronics engineers are just as varied as the types of work they do. Electronics engineers may be found in the pristine laboratory environment of a fabrication plant, the offices of a consulting firm or in a research laboratory. During their working life, electronics engineers may find themselves supervising a wide range of individuals including scientists, electricians, programmers, and other engineers.

Obsolescence of technical skills is a serious concern for electronics engineers. Membership and participation in technical societies, regular reviews of periodicals in the field, and a habit of continued learning are therefore essential to maintaining proficiency, which is even more crucial in the field of consumer electronics.[34]

Technical skills


Technical skills such as circuit design and circuit testing are exercised in software such as LTspice and Eagle.[35] LTspice is used for simulating and examining electronic circuits.[36] Eagle is used to view and design printed circuit boards.[37]

from Grokipedia
Electronic engineering is a sub-discipline of electrical engineering that focuses on the research, design, development, testing, and application of electronic circuits, devices, components, and systems, often involving smaller-scale electronics for applications in communications, consumer products, and related fields. Unlike broader electrical engineering, which emphasizes power generation and distribution, electronic engineering centers on the behavior and control of electrons in circuits and semiconductors to create functional systems. Practitioners, known as electronic engineers, typically hold a bachelor's degree and work in a range of industries, including healthcare, where they solve complex problems using principles from physics and mathematics. The roots of electronic engineering lie in the broader field of electrical engineering, which emerged in the late 19th century with inventions such as the electric generator and motor, but the discipline distinctly formed in the early 20th century through advancements in radio technology. A pivotal development was the invention of the diode in 1904 by John Ambrose Fleming, followed by the triode in 1906 by Lee De Forest, which enabled signal amplification and laid the foundation for analog electronics and radio. The field transformed dramatically in 1947 with the invention of the transistor at Bell Laboratories by John Bardeen, Walter Brattain, and William Shockley, which replaced bulky vacuum tubes with compact solid-state devices, enabling miniaturization and higher reliability in electronic systems. This breakthrough earned the trio the 1956 Nobel Prize in Physics and spurred the growth of the semiconductor industry. Subsequent innovations, such as Jack Kilby's 1958 demonstration of the first integrated circuit at Texas Instruments—a monolithic chip containing a transistor, resistors, a capacitor, and other components—further accelerated progress, leading to modern microprocessors and very-large-scale integration (VLSI). Electronic engineering encompasses key subfields including analog and digital circuit design, microelectronics and semiconductors, embedded systems, signal processing, telecommunications, control systems, and RF/microwave engineering, where engineers develop technologies from smartphones and medical imaging devices to satellite communications and autonomous vehicles.
These areas rely on tools like computer-aided design (CAD) software, simulation models, and fabrication techniques to prototype and optimize systems for efficiency, performance, and cost. The profession's importance stems from its role in driving innovation across sectors; for instance, electronic engineers contribute to renewable energy systems, artificial intelligence hardware, and 5G networks, with the U.S. Bureau of Labor Statistics projecting 7% job growth from 2024 to 2034, faster than the average for all occupations, due to increasing demand for electronic systems in emerging technologies. Professional organizations like the Institute of Electrical and Electronics Engineers (IEEE) support the field through standards, education, and research, ensuring ongoing advancements in areas like nanotechnology and quantum electronics.

Overview

Definition and Scope

Electronic engineering is a discipline within electrical engineering that concentrates on the design, development, fabrication, and application of electronic circuits, devices, and systems, with a particular emphasis on active components such as transistors, diodes, and integrated circuits. This field integrates principles from physics and mathematics to create hardware solutions that process, transmit, and control electrical signals at low to moderate power levels. The scope of electronic engineering spans from individual electronic components to complex integrated systems, including areas like signal processing, embedded systems, and communication interfaces, while typically excluding large-scale power generation, transmission, and distribution, which fall under electrical engineering. It focuses on the behavior and manipulation of electrons in devices such as semiconductors and vacuum tubes, enabling innovations across computing, communications, and instrumentation. In contrast to electrical engineering's emphasis on high-voltage and high-power systems for energy infrastructure, electronic engineering prioritizes precision in low-power, signal-oriented applications to achieve functionality in compact and efficient designs. The term "electronics" originated in the early 20th century, derived from "electron," a term coined in 1891 to describe the fundamental particle, and evolved to describe the study of electron behavior in vacuums, gases, and semiconductors by 1910. This etymology reflects the field's foundational reliance on understanding electron flow and interaction, distinguishing it from broader electrical phenomena.

Importance and Applications

Electronic engineering plays a pivotal role in modern society by underpinning essential technologies that enhance connectivity and automation. It enables advancements in areas such as medical devices and consumer products, fundamentally shaping daily life and global infrastructure. For instance, the semiconductor industry, a cornerstone of electronic engineering, reached a global market size of $533 billion in 2023, expanding to $681 billion in 2024, driving innovations that power everything from personal devices to large-scale data centers. In the United States, the electronics manufacturing sector contributes significantly to the economy, adding $853 billion to GDP while supporting over 5.2 million jobs as of 2024. Key applications of electronic engineering span diverse industries, demonstrating its versatility and impact. In consumer electronics, it facilitates the design of smartphones and televisions, integrating complex circuits for seamless user experiences. The automotive sector relies on electronic systems like electronic control units (ECUs) and sensors for advanced driver-assistance features and propulsion. In aerospace, avionics systems ensure reliable navigation and communication in aircraft. Healthcare benefits from electronic engineering through imaging technologies such as MRI machines, which provide non-invasive diagnostics. Additionally, in renewable energy, electronic interfaces optimize power conversion and grid integration for solar and wind systems. Economically, electronic engineering fuels innovation hubs like Silicon Valley, where research and development in integrated circuits and systems propel technological leadership. It sustains a robust workforce, with approximately 287,900 electrical and electronics engineers employed in the U.S. as of 2024, according to Bureau of Labor Statistics data. This field not only generates high-wage jobs but also stimulates related sectors, contributing to overall economic growth through exports and productivity gains.
Despite its successes, electronic engineering addresses ongoing challenges such as miniaturization, which allows for compact devices but increases vulnerability to environmental stresses like vibrations and temperature extremes, impacting long-term reliability. Ensuring reliability in harsh environments, such as automotive or aerospace applications, requires robust design practices to prevent failures. Furthermore, seamless integration with software demands interdisciplinary approaches to handle complexity in embedded systems and real-time processing.

History

Early Developments

The foundations of electronic engineering trace back to key discoveries in electromagnetism during the 19th century. In 1831, Michael Faraday demonstrated electromagnetic induction by showing that a changing magnetic field could induce an electric current in a nearby circuit, a principle that became essential for generating and harnessing electrical power. This experimental breakthrough provided the empirical basis for later theoretical advancements. Building on Faraday's work, James Clerk Maxwell developed a set of equations in the 1860s that unified electricity and magnetism into a coherent electromagnetic theory, predicting the existence of electromagnetic waves and laying the groundwork for understanding electron behavior in fields. These equations, published in their definitive form in 1873, established the mathematical framework for all subsequent electronic phenomena, including signal propagation. Early inventions in the late 19th and early 20th centuries transformed these theoretical insights into practical devices. In 1883, Thomas Edison observed the Edison effect, in which heated filaments in an evacuated bulb emitted electrons to an adjacent electrode, marking the first documented thermionic emission and the precursor to vacuum-tube technology. This phenomenon enabled the development of electronic control devices. In 1904, John Ambrose Fleming patented the two-electrode vacuum tube, or diode, which rectified alternating current into direct current by allowing electron flow in one direction only, serving as the first electronic valve used for radio detection. Just two years later, in 1906, Lee De Forest invented the audion, or triode, by adding a control grid to the structure, enabling voltage-controlled amplification of weak signals and oscillation for generating radio frequencies. The triode's ability to amplify electrical signals revolutionized communication systems by making long-distance transmission feasible. Significant milestones in electronic applications emerged alongside these inventions. Alexander Graham Bell's telephone in 1876 demonstrated the transmission of voice over wires using electromagnetic principles, establishing telephony as a cornerstone of electronic communication.
In 1895, Guglielmo Marconi achieved the first wireless transmission of radio signals over a distance of about 2 kilometers using spark-gap apparatus, pioneering radio engineering by adapting electromagnetic wave theory to practical signalling without wires. These developments highlighted the potential of electromagnetic waves for information transfer. Institutional advancements supported the field's growth in the early 20th century. The Massachusetts Institute of Technology (MIT) introduced the first dedicated electrical engineering degree program in the United States in 1882, evolving into a formal Department of Electrical Engineering by 1902, which trained the initial generation of engineers in electromagnetic theory and its applications. In 1925, Bell Telephone Laboratories was formed as a joint venture between AT&T and Western Electric, consolidating research efforts to advance telephony and radio technologies through dedicated scientific investigation. These institutions fostered systematic research and development, bridging academic theory with industrial application.

20th-Century Advancements

The invention of the transistor marked a pivotal shift in electronic engineering from bulky vacuum tubes to compact solid-state devices. In December 1947, researchers John Bardeen and Walter Brattain at Bell Laboratories demonstrated the first point-contact transistor, a semiconductor device capable of amplifying electrical signals, under the direction of William Shockley. This breakthrough relied on the principles of semiconductor physics, where doped materials control electron flow to enable amplification and switching functions. In 1948, Shockley developed the more practical junction transistor, which used p-n junctions for improved reliability and manufacturability, laying the foundation for modern electronics. The development of integrated circuits (ICs) further revolutionized the field by allowing multiple transistors and components to be fabricated on a single chip. In September 1958, Jack Kilby at Texas Instruments created the first IC prototype, a monolithic device integrating resistors, capacitors, and transistors on germanium, addressing the "tyranny of numbers" in wiring discrete components. Building on this, Jean Hoerni at Fairchild Semiconductor introduced the planar process in 1959, enabling silicon-based ICs with diffused interconnections protected by an oxide layer, which facilitated mass production and scalability. These innovations culminated in Gordon Moore's 1965 observation, known as Moore's law, that the number of transistors on an IC would roughly double every 18 to 24 months, driving exponential growth in computing power while costs declined. Key applications during the 1960s exemplified the practical impact of these advancements. The Apollo Guidance Computer, developed in the 1960s by MIT and Raytheon for NASA, utilized ICs to provide real-time navigation and control for lunar missions, featuring about 5,600 ICs in its compact design despite operating with limited memory of 74 kilobytes total. This system's reliability under harsh conditions accelerated IC adoption in aerospace.
In the realm of personal computing, the Intel 4004, released in 1971, integrated 2,300 transistors on a single chip to perform arithmetic and logic operations, enabling the first programmable calculators and paving the way for desktop computers. Standardization efforts also advanced rapidly, with the formation of the Institute of Electrical and Electronics Engineers (IEEE) in 1963 through the merger of the American Institute of Electrical Engineers and the Institute of Radio Engineers, fostering collaboration on technical standards. Early IEEE standards, such as those for circuit testing established in the 1960s, ensured interoperability and safety in electronic systems, supporting the proliferation of transistor-based technologies across industries.

Modern Innovations

The digital era in electronic engineering has been profoundly shaped by very-large-scale integration (VLSI) scaling, which began accelerating in the 1980s with advancements in photolithography and metal-oxide-semiconductor (MOS) technologies, enabling the integration of millions of transistors onto single chips and driving the miniaturization of computing systems. By the 1990s, innovations like logic synthesis and state-machine compilers in electronic design automation (EDA) tools further streamlined VLSI development, reducing design times and costs for complex circuits. This scaling, guided by Moore's law, continued into the 21st century, facilitating the proliferation of portable devices and embedded systems that underpin modern electronics. A pivotal milestone was the introduction of the iPhone in 2007, which revolutionized smartphone design by integrating capacitive screens, accelerometers, and system-on-chip (SoC) architectures, setting new standards for user interfaces and power efficiency. This innovation spurred a global mobile boom, with shipments exceeding 1.4 billion units annually by the mid-2010s, transforming electronic engineering toward energy-efficient, multifunctional devices that combine analog and digital components. Complementing this, the Internet of Things (IoT) proliferated post-2010, driven by low-power wireless protocols, connecting 14.4 billion devices in 2022 and enabling smart homes, industrial automation, and sensor networks. Projections indicate IoT connections will reach 39 billion by 2030, emphasizing scalable, secure embedded systems in electronic design. Recent advancements include the global deployment of 5G networks starting in 2019, which by 2025 supported over 2.25 billion connections worldwide, offering peak speeds up to 20 Gbps and low latency under 1 ms to enable real-time applications in communications engineering.
In AI hardware, NVIDIA's GPU innovations of the 2010s, such as the Fermi architecture in 2010 and the introduction of tensor cores in the Volta series by 2017, optimized parallel processing for deep learning, accelerating AI model training by orders of magnitude and establishing GPUs as essential for data centers. Flexible electronics advanced notably with organic light-emitting diode (OLED) displays in the 2010s; Samsung's 4.5-inch flexible prototype in 2010 paved the way for rollable and foldable screens, enhancing portability and durability in consumer devices through flexible substrate innovations. These milestones reflect interdisciplinary integration, such as 5G's role in IoT ecosystems. Global shifts have repositioned semiconductor production, with Taiwan Semiconductor Manufacturing Company (TSMC), founded in 1987 as the world's first pure-play foundry, emerging as a dominant hub by fabricating over 50% of advanced chips globally through process nodes down to 3 nm. In Europe, the Horizon Europe program (2021-2027), succeeding Horizon 2020, allocated €95.5 billion for research and innovation, funding electronics R&D in areas such as sustainable semiconductors and quantum technologies to bolster regional competitiveness. However, challenges arose from the 2020-2022 global chip shortage, triggered by pandemic-induced demand surges for consumer and automotive chips alongside supply constraints from factory shutdowns, which significantly increased prices and delayed production across industries. Post-2020, sustainability initiatives gained momentum, with efforts such as the EU's Circular Electronics Initiative promoting recyclable materials and energy-efficient designs, aiming to reduce e-waste, which reached 62 million tonnes in 2022 and is projected to reach 82 million tonnes by 2030. Major manufacturers have advanced these goals through zero-waste manufacturing targets and renewable-energy adoption in fabrication.

Subfields

Analog Electronics

Analog electronics encompasses the design and application of circuits that process continuous-time signals, contrasting with discrete digital methods by maintaining signal continuity through linear operations on varying voltages or currents. These systems rely on components that amplify, filter, and modulate analog waveforms, enabling applications where natural phenomena, such as sound waves or sensor outputs, are represented as smooth, time-varying electrical quantities. Fundamental to this field is the use of active devices to achieve precise control over signal characteristics without introducing the quantization errors inherent in digital processing. A cornerstone component in analog design is the operational amplifier (op-amp), a high-gain differential amplifier that forms the basis for numerous signal-processing functions. Ideal op-amps are modeled with infinite open-loop voltage gain ($A \to \infty$), infinite input impedance (preventing loading of the signal source), zero output impedance (allowing ideal voltage driving), and infinite bandwidth (ensuring flat frequency response across all frequencies). These idealized traits simplify analysis and design, assuming no offset voltage or bias currents in the model. In practice, real op-amps approximate these characteristics closely enough for most applications, with bipolar junction transistors (BJTs) often serving as the internal amplifying elements. Op-amps are configured in basic amplifier topologies to perform amplification tailored to specific needs. The inverting amplifier connects the input signal to the inverting terminal through an input resistor $R_{in}$, with a feedback resistor $R_f$ from output to inverting input, yielding an output voltage of $V_{out} = -\frac{R_f}{R_{in}} V_{in}$ and inverting the signal polarity. Conversely, the non-inverting amplifier applies the input to the non-inverting terminal, with feedback to the inverting input, producing $V_{out} = \left(1 + \frac{R_f}{R_{in}}\right) V_{in}$ while preserving phase.
These configurations provide voltage gains from unity to hundreds, depending on resistor ratios, and are essential for scaling weak signals to usable levels. Key concepts in analog electronics include amplification, which boosts signal amplitude while ideally preserving waveform shape, and filtering, which selectively attenuates frequency components to shape the signal spectrum. Low-pass filters, often implemented with RC networks in active configurations using op-amps, allow low frequencies to pass while attenuating higher ones; a first-order active low-pass filter has a transfer function $H(s) = \frac{1}{1 + sRC}$, with cutoff frequency $f_c = \frac{1}{2\pi RC}$. High-pass filters, employing capacitors in series with the signal path, block low frequencies and pass high ones, as in $H(s) = \frac{sRC}{1 + sRC}$. More complex RLC circuits extend these to second-order responses for sharper roll-offs. Modulation techniques further manipulate signals for transmission: amplitude modulation (AM) varies the carrier amplitude proportionally to the message signal, while frequency modulation (FM) alters the carrier frequency, offering improved noise immunity in radio systems. Applications of analog electronics are prominent in audio systems, where op-amp-based amplifiers and filters process acoustic signals for reproduction, ensuring fidelity from microphone to speakers. Sensor interfaces similarly employ analog circuits to condition low-level outputs from devices such as thermocouples or strain gauges, amplifying and filtering them to mitigate environmental interference before further processing. Frequency-response analysis via Bode plots visualizes these behaviors, plotting magnitude and phase in decibels and degrees against logarithmic frequency to reveal gain flatness, corner points, and stability margins, all critical for designing filters that maintain performance across operational bands. Design considerations in analog electronics emphasize noise, linearity, and bandwidth limitations to ensure reliable performance.
Noise, arising from thermal agitation in resistors or shot effects in semiconductors, is minimized through techniques such as low-noise op-amp selection, shielding, and grounding strategies that reduce interference. Linearity ensures the output faithfully scales with the input without harmonic distortion, quantified by metrics such as total harmonic distortion (THD) below 0.1% in high-fidelity applications. Bandwidth is constrained by the op-amp's gain-bandwidth product (typically 1-100 MHz), dictating trade-offs where higher gain reduces the usable frequency range, necessitating careful component selection for specific operational demands.
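The gain and cutoff formulas above can be sketched numerically. The snippet below is a minimal illustration (the component values are hypothetical examples, not from the text):

```python
import math

def inverting_gain(r_f, r_in):
    """Closed-loop gain of an ideal inverting op-amp stage: -Rf/Rin."""
    return -r_f / r_in

def non_inverting_gain(r_f, r_in):
    """Closed-loop gain of an ideal non-inverting stage: 1 + Rf/Rin."""
    return 1 + r_f / r_in

def rc_cutoff_hz(r, c):
    """-3 dB cutoff of a first-order RC filter: fc = 1/(2*pi*R*C)."""
    return 1 / (2 * math.pi * r * c)

# Example: 100 kOhm feedback resistor, 10 kOhm input resistor
print(inverting_gain(100e3, 10e3))      # -10.0
print(non_inverting_gain(100e3, 10e3))  # 11.0
# 1 kOhm with 159 nF gives a cutoff near 1 kHz
print(round(rc_cutoff_hz(1e3, 159e-9)))  # 1001
```

Swapping resistor ratios directly rescales the gain, mirroring the design trade-offs described above.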

Digital Electronics

Digital electronics is a subfield of electronic engineering that focuses on circuits and systems processing discrete binary signals, typically represented as 0 (low voltage) and 1 (high voltage), to perform logical operations. These systems form the foundation of modern computers and digital devices, enabling reliable information processing through deterministic logic rather than continuous variations. The core building blocks are logic gates, which implement basic Boolean functions. The AND gate outputs 1 only if all inputs are 1, as defined by its truth table:
A  B  |  A AND B
0  0  |  0
0  1  |  0
1  0  |  0
1  1  |  1
The OR gate outputs 1 if at least one input is 1, the NOT gate inverts its single input, and the NAND gate (NOT AND) outputs the inverse of the AND function, serving as a universal gate capable of implementing any logic function alone. Boolean algebra provides the mathematical framework for designing and optimizing digital circuits, allowing expressions to be simplified to minimize the number of gates required. Developed by George Boole in the 19th century, its application to electrical switching circuits was pioneered by Claude Shannon in 1938, who demonstrated that Boolean operations could directly map to relay and switch configurations, revolutionizing digital design. Key simplification tools include De Morgan's theorems, which state that the complement of a sum equals the product of complements ($\overline{A + B} = \bar{A} \cdot \bar{B}$) and the complement of a product equals the sum of complements ($\overline{A \cdot B} = \bar{A} + \bar{B}$); these enable transformations between AND/OR and NAND/NOR implementations without altering functionality. For example, applying De Morgan's theorem converts a complex expression like $\overline{(A + B) \cdot C}$ to $\bar{A} \cdot \bar{B} + \bar{C}$, reducing hardware complexity in practice. Digital circuits are classified as combinational or sequential based on their dependence on input history. Combinational logic produces outputs solely from current inputs via gates, with no memory, ensuring immediate response but limited to static functions such as adders. In contrast, sequential logic incorporates memory elements to store states, making outputs dependent on both current inputs and prior states, which enables dynamic behaviors such as counting or decision-making.
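De Morgan's theorems and the worked example above can be verified exhaustively over all input combinations, a common sanity check in logic design. A minimal sketch:

```python
from itertools import product

def de_morgan_holds():
    """Exhaustively check both De Morgan theorems over all binary inputs."""
    for a, b in product([0, 1], repeat=2):
        # not (A or B) == (not A) and (not B)
        assert (not (a or b)) == ((not a) and (not b))
        # not (A and B) == (not A) or (not B)
        assert (not (a and b)) == ((not a) or (not b))
    return True

def example_equivalent():
    """Check the text's example: not((A or B) and C) == (not A and not B) or not C."""
    for a, b, c in product([0, 1], repeat=3):
        lhs = not ((a or b) and c)
        rhs = ((not a) and (not b)) or (not c)
        if lhs != rhs:
            return False
    return True

print(de_morgan_holds())     # True
print(example_equivalent())  # True
```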
Clocking synchronizes these state changes using a periodic signal, typically a square wave, where transitions (edges) trigger updates in edge-triggered designs; the clock period must exceed the circuit's maximum propagation delay, the time for a signal change to propagate through the logic, to prevent errors from timing violations. Propagation delay arises from gate switching times, often on the order of nanoseconds in modern CMOS technology, and is critical for determining maximum operating frequencies. Sequential circuits rely on flip-flops as fundamental memory units, first realized as the Eccles-Jordan trigger circuit in 1918, which bistably stores one bit by latching between two stable states. The SR (Set-Reset) flip-flop uses Set and Reset inputs to change states but suffers from ambiguity when both are active; the JK flip-flop resolves this by toggling on J=K=1, with its characteristic table specifying inputs for state transitions (e.g., to hold a state, J=0, K=0). The D (Data) flip-flop simplifies this to a single input whose value is captured on the clock edge, widely used for registers due to its predictability. Counters, built from cascaded flip-flops, increment or decrement binary values on clock pulses, as in a 4-bit ripple counter where each flip-flop clocks the next, though such asynchronous designs accumulate propagation delays across stages. State machines model sequential behavior via finite states and transitions, represented in state diagrams showing inputs and outputs per state; timing diagrams illustrate signal evolution over clock cycles, highlighting setup and hold times to ensure reliable latching. At a higher level, microprocessors integrate sequential and combinational elements into a central processing unit (CPU) following the von Neumann architecture, outlined in John von Neumann's 1945 report, which proposed a single memory for both instructions and data, accessed sequentially. The arithmetic logic unit (ALU) performs operations such as addition and bitwise logic on binary operands, using combinational circuits for parallel computation.
Registers, arrays of D flip-flops, temporarily store data, addresses, and instructions; key examples include the program counter (PC) for instruction fetching and the accumulator for ALU results. This architecture enables the fetch-decode-execute cycle, in which the control unit orchestrates operations, forming the basis for general-purpose computing in devices from embedded systems to supercomputers.
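The ripple-counter behavior described above, where each stage toggles on the previous stage's falling edge, can be simulated in a few lines. This is an illustrative sketch of the counting logic, not a gate-level model:

```python
def ripple_count(n_bits, n_pulses):
    """Simulate an n-bit ripple counter driven by n_pulses clock pulses."""
    bits = [0] * n_bits  # bits[0] is the least significant bit
    for _ in range(n_pulses):
        # each stage toggles; the carry ripples only while stages fall 1 -> 0
        for i in range(n_bits):
            bits[i] ^= 1
            if bits[i] == 1:  # no falling edge, so the carry stops here
                break
    return sum(b << i for i, b in enumerate(bits))

print(ripple_count(4, 5))   # 5
print(ripple_count(4, 20))  # 4  (wraps modulo 16)
```

The wraparound after 2^n pulses mirrors the modulo behavior of a real binary counter.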

Power Electronics

Power electronics is a subfield of electronic engineering that focuses on the efficient conversion and control of electrical power using solid-state devices, particularly in applications requiring high power levels, typically from tens of watts to megawatts. This discipline enables the transformation of electrical energy between different forms, such as DC to AC or between voltage levels, while minimizing losses and ensuring reliable operation. Key advancements in power electronics have been driven by the development of switching devices that operate at high voltages and currents, allowing for compact and efficient systems compared with traditional mechanical or electromechanical alternatives. Central to power electronics are power semiconductor devices such as thyristors, insulated-gate bipolar transistors (IGBTs), and metal-oxide-semiconductor field-effect transistors (MOSFETs), which serve as high-speed switches to control power flow. Thyristors, including silicon-controlled rectifiers, provide robust latching behavior for high-voltage applications but require external commutation for turn-off. IGBTs combine the high input impedance of MOSFETs with the low on-state losses of bipolar transistors, making them ideal for medium- to high-power switching up to several kilohertz. Power MOSFETs excel in high-frequency operation due to their fast switching speeds and low gate-drive requirements, though they are limited to lower voltages. These devices are often controlled using pulse-width modulation (PWM) techniques, which vary the duty cycle of switching pulses to regulate output voltage and current while reducing harmonic distortion. Seminal PWM methods, such as sinusoidal PWM, have been foundational since the 1970s for achieving precise control in converters. Power electronic converters form the core building blocks for energy transformation, including DC-DC converters such as buck and boost topologies, AC-DC rectifiers, and DC-AC inverters.
Buck converters step down DC voltage by controlling the switch to store and release energy in an inductor, while boost converters achieve voltage step-up through similar inductive energy transfer. AC-DC rectifiers convert AC to DC, often using diode bridges or active switches for improved power factor, and inverters synthesize AC waveforms from DC sources via PWM-modulated switching. The efficiency of these converters is quantified as $\eta = P_{out} / P_{in}$, where $P_{out}$ is the output power and $P_{in}$ is the input power, with modern designs achieving over 95% through minimized conduction and switching losses. Applications of power electronics span motor drives, renewable energy systems, and electric vehicle (EV) infrastructure. In motor drives, variable-frequency inverters using IGBTs or MOSFETs enable precise speed control for induction and permanent magnet motors in industrial automation and traction systems. For renewable energy, inverters convert DC from solar photovoltaic (PV) panels to grid-compatible AC, incorporating maximum power point tracking to optimize energy harvest under varying irradiance. EV chargers rely on AC-DC rectifiers and DC-DC converters to deliver high-power fast charging, supporting bidirectional power flow for grid integration. Effective thermal management is essential in power electronics to dissipate the heat generated during operation, preventing device failure and maintaining efficiency. Heat sinks, often finned aluminum structures, provide convective cooling by increasing the surface area for heat transfer to ambient air or liquids. Switching losses, a major source of dissipation, arise from the energy expended during device transitions and can be approximated as $P = f C V^2$, where $f$ is the switching frequency, $C$ is the device capacitance (such as output or gate capacitance), and $V$ is the voltage swing. Advanced cooling techniques, including liquid immersion and thermal interface materials, further enhance reliability in high-density modules.
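The three formulas above — the ideal buck relation, the efficiency ratio, and the capacitive switching-loss estimate — are simple enough to compute directly. A minimal sketch with hypothetical example values:

```python
def buck_output_voltage(v_in, duty):
    """Ideal buck converter in continuous conduction: Vout = D * Vin."""
    return duty * v_in

def efficiency(p_out, p_in):
    """Converter efficiency: eta = Pout / Pin."""
    return p_out / p_in

def switching_loss(f_sw, c_device, v_swing):
    """Capacitive switching-loss estimate from the text: P = f * C * V^2."""
    return f_sw * c_device * v_swing ** 2

# 48 V input stepped down at 25% duty cycle
print(buck_output_voltage(48.0, 0.25))    # 12.0
# 95 W delivered from 100 W drawn -> 95% efficient
print(efficiency(95.0, 100.0))            # 0.95
# 100 kHz switching, 1 nF device capacitance, 48 V swing (watts)
print(switching_loss(100e3, 1e-9, 48.0))
```

Note how the loss estimate grows linearly with frequency, one reason high-frequency designs favor low-capacitance devices.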

Control Systems

Control systems in electronic engineering involve the analysis and design of systems that regulate the behavior of dynamic processes through feedback mechanisms to achieve desired performance criteria such as stability, accuracy, and responsiveness. These systems are central to electronic engineering because they enable the automation and precise control of electrical and electromechanical devices by processing inputs and generating corrective outputs. Fundamental to this field is the use of mathematical models to predict and optimize system behavior, ensuring reliable operation in real-world applications. Feedback principles form the cornerstone of control engineering, distinguishing between open-loop and closed-loop configurations. In an open-loop system, the control action is independent of the output, relying solely on predefined inputs without measurement or correction, which makes it simpler but susceptible to disturbances and parameter variations. In contrast, a closed-loop system incorporates feedback by comparing the actual output to a reference input via a sensor, adjusting the control signal to minimize errors and enhance robustness against uncertainties. This feedback loop improves accuracy and stability, though it introduces potential issues such as oscillation if not properly designed. Stability analysis is critical in closed-loop systems to ensure bounded outputs for bounded inputs, often assessed using the Routh-Hurwitz criterion. This algebraic method examines the coefficients of the characteristic polynomial derived from the system's transfer function to determine whether all roots have negative real parts, indicating asymptotic stability. For a polynomial $p(s) = a_n s^n + a_{n-1} s^{n-1} + \dots + a_0$, the criterion constructs a Routh array in which stability requires no sign changes in the first column. Developed independently by Edward Routh in 1877 and Adolf Hurwitz in 1895, it provides a necessary and sufficient condition for stability without solving for the roots explicitly. Transfer functions provide a frequency-domain representation of linear systems, facilitating analysis through block diagrams and signal-flow graphs.
A transfer function $G(s) = \frac{Y(s)}{U(s)}$ relates the Laplace transform of the output $Y(s)$ to that of the input $U(s)$, assuming zero initial conditions, and is derived by applying the Laplace transform to the system's differential equations. Block diagrams visually decompose the system into interconnected components, such as integrators and gains, allowing simplification and systematic analysis. For instance, in a second-order system such as a mass-spring-damper, the transfer function is $G(s) = \frac{1}{ms^2 + cs + k}$, where $m$, $c$, and $k$ represent mass, damping, and stiffness. PID controllers are widely adopted for their simplicity and effectiveness in regulating systems by combining proportional, integral, and derivative actions. The proportional term $K_p e(t)$ responds to the current error $e(t)$, providing fast correction but risking steady-state offset; the integral term $K_i \int e(t) \, dt$ eliminates offset by accumulating past errors; and the derivative term $K_d \frac{de(t)}{dt}$ anticipates future errors for damping. Tuning these gains, often via the Ziegler-Nichols method, involves setting the integral and derivative gains to zero, increasing the proportional gain to the ultimate gain $K_u$ at the onset of sustained oscillation with period $P_u$, then applying rules such as $K_p = 0.6 K_u$, $K_i = 2 K_p / P_u$, and $K_d = K_p P_u / 8$ for PID control. This heuristic, proposed in 1942, balances performance and stability across diverse systems. Control systems commonly model linear time-invariant (LTI) systems, where linearity ensures superposition (responses to scaled or summed inputs are correspondingly scaled or summed) and time-invariance means shifting inputs shifts outputs identically. These properties simplify analysis using convolution or frequency responses.
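The Ziegler-Nichols tuning rules above translate directly into code. The snippet below computes PID gains from a hypothetical ultimate gain and period:

```python
def ziegler_nichols_pid(k_u, p_u):
    """Classic Ziegler-Nichols PID gains from the ultimate gain Ku and
    oscillation period Pu: Kp = 0.6*Ku, Ki = 2*Kp/Pu, Kd = Kp*Pu/8."""
    k_p = 0.6 * k_u
    k_i = 2 * k_p / p_u
    k_d = k_p * p_u / 8
    return k_p, k_i, k_d

# Example: a loop that begins sustained oscillation at Ku = 10 with Pu = 2 s
kp, ki, kd = ziegler_nichols_pid(10.0, 2.0)
print(kp, ki, kd)  # 6.0 6.0 1.5
```

In practice these gains are a starting point; designers typically refine them against overshoot and settling-time requirements.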
For stability assessment in LTI systems, the root locus method plots the migration of closed-loop poles as a gain parameter varies, revealing regions of instability via pole-zero configurations. Introduced by Walter R. Evans in 1948, it starts from the open-loop poles and ends at the zeros, with rules for asymptotes and departure angles aiding design. Complementing this, the Nyquist stability criterion evaluates encirclements of the critical point (-1, 0) in the complex plane by the open-loop frequency response $G(j\omega)H(j\omega)$, relating the number of clockwise encirclements to the number of right-half-plane poles for instability prediction. Named after Harry Nyquist's 1932 work, it quantifies gain and phase margins from Bode or Nyquist plots. Applications of control systems span robotics, process control, and automotive systems, leveraging feedback for precision and reliability. In robotics, closed-loop control using PID algorithms enables trajectory tracking and force regulation, as in industrial arms for assembly tasks where sensors provide position feedback to correct deviations. Process control employs LTI models and stability analyses such as the Nyquist criterion to maintain variables such as temperature in chemical plants, ensuring efficient operation via feedback loops. In automotive systems, electronic stability control uses root-locus-designed controllers to adjust braking and throttle based on yaw-rate sensors, preventing skids and enhancing vehicle handling.
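The Routh-Hurwitz test described earlier can be sketched programmatically. This is a simplified illustration that assumes no zero appears in the first column (the special cases need extra handling):

```python
def routh_first_column(coeffs):
    """Build the Routh array for p(s) with coefficients listed from highest
    to lowest power, and return its first column. Assumes no first-column
    entry is zero (the degenerate cases require special rules)."""
    n = len(coeffs)
    rows = [list(coeffs[0::2]), list(coeffs[1::2])]
    while len(rows[1]) < len(rows[0]):  # pad so both rows have equal length
        rows[1].append(0.0)
    for _ in range(n - 2):
        prev, cur = rows[-2], rows[-1]
        new = [(cur[0] * prev[j + 1] - prev[0] * cur[j + 1]) / cur[0]
               for j in range(len(cur) - 1)]
        new.append(0.0)
        rows.append(new)
    return [r[0] for r in rows[:n]]

def is_stable(coeffs):
    """Stable iff the first column has no sign changes (all positive here)."""
    col = routh_first_column([float(c) for c in coeffs])
    return all(c > 0 for c in col)

# s^3 + 2s^2 + 3s + 1: first column 1, 2, 2.5, 1 -> stable
print(is_stable([1, 2, 3, 1]))  # True
# s^3 + s^2 + 2s + 8: first column 1, 1, -6, 8 -> sign changes, unstable
print(is_stable([1, 1, 2, 8]))  # False
```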

Communications Engineering

Communications engineering, a core subfield of electronic engineering, encompasses the design, analysis, and implementation of systems for transmitting and receiving signals over various media, ensuring reliable information exchange in applications ranging from broadcasting to data networking. This discipline integrates electronic circuits, hardware, and electromagnetic principles to modulate, propagate, and demodulate signals while mitigating noise and interference. Key challenges include optimizing bandwidth usage, minimizing signal degradation during transmission, and adapting to diverse environmental conditions. Central to communications engineering are modulation techniques that encode digital data onto carrier signals for efficient transmission. Amplitude-shift keying (ASK) varies the amplitude of the carrier to represent binary data, offering simplicity but susceptibility to noise. Frequency-shift keying (FSK) shifts the carrier frequency between discrete values for '0' and '1' bits, providing better noise immunity at the cost of increased bandwidth. Phase-shift keying (PSK), particularly binary PSK (BPSK) and quadrature PSK (QPSK), modulates the phase of the carrier, enabling higher data rates and robustness in noisy channels. These techniques are implemented using analog circuits such as mixers and oscillators to generate modulated signals. Multiplexing methods allow multiple signals to share a single channel, enhancing spectral efficiency. Time-division multiplexing (TDM) allocates distinct time slots to each signal, commonly used in digital telephony for synchronized interleaving. Frequency-division multiplexing (FDM) separates signals into non-overlapping frequency bands, foundational for analog radio and TV broadcasting. Code-division multiple access (CDMA) employs unique orthogonal codes to spread signals across the full bandwidth, enabling simultaneous transmission with interference rejection via correlation, as seen in early cellular networks. Transmission media in communications systems are categorized as wired or wireless, each with distinct electronic characteristics.
Wired media include coaxial cables, which use a central conductor surrounded by a shield to minimize interference and support high-frequency signals up to several GHz for cable TV and broadband access. Fiber-optic cables transmit signals as light pulses through glass cores, offering ultra-high bandwidths exceeding 100 Gbps over long distances with negligible electromagnetic interference, ideal for backbone networks. In wireless media, radio-frequency (RF) transmission involves electromagnetic waves traveling through free space, affected by path loss, multipath fading, and atmospheric absorption. Antennas are critical for RF systems, with gain measuring the amplification of radiated power in a preferred direction relative to an isotropic radiator, and directivity quantifying the concentration of energy in that direction, typically expressed in dBi. At the protocol level, communications engineering focuses on the physical layer of the Open Systems Interconnection (OSI) model, which handles bit-level transmission over the medium, including signal encoding, synchronization, and modulation schemes. Error-correction mechanisms, such as Hamming codes, add parity bits to detect and correct single-bit errors in transmitted data, enhancing reliability in noisy channels such as wireless links. These codes operate by calculating syndrome bits to identify error positions, and are widely applied in memory and communication hardware. Modern communications systems leverage these principles in standardized wireless technologies. The IEEE 802.11 family, known as Wi-Fi, defines physical and medium-access layers for local area networks operating in the 2.4 GHz, 5 GHz, and 6 GHz bands, supporting data rates up to 9.6 Gbps in the 802.11ax (Wi-Fi 6) amendment through orthogonal frequency-division multiplexing (OFDM). Bluetooth, standardized as IEEE 802.15.1, enables short-range personal area networks at 2.4 GHz with low power consumption, facilitating device pairing and data transfer in applications such as wearables and audio streaming.
Satellite communications extend coverage globally using geostationary or low-Earth-orbit platforms, where electronic transponders amplify and frequency-convert uplink signals for downlink to ground stations, supporting internet access and broadcasting with link budgets that account for high path losses.
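The syndrome-based Hamming correction described above can be demonstrated with the standard Hamming(7,4) code, which places parity bits at positions 1, 2, and 4. A minimal sketch:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a Hamming(7,4) codeword
    laid out as [p1, p2, d1, p3, d2, d3, d4] with even parity."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parity checks; a nonzero syndrome is the 1-based
    position of a single-bit error, which is then flipped."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # checks positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

word = hamming74_encode([1, 0, 1, 1])
corrupted = word[:]
corrupted[4] ^= 1                  # flip one bit "in transit"
print(hamming74_correct(corrupted) == word)  # True
```

Flipping any single codeword bit yields a syndrome that points exactly at the damaged position, which is what makes the scheme attractive for memory and link hardware.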

Optoelectronics and Photonics

Optoelectronics encompasses electronic devices and systems that interact with light through the conversion between electrical and optical signals, primarily via semiconductors. This discipline leverages the photoelectric effect, in which photons incident on a material eject electrons, enabling light detection in devices such as photodiodes. The photoelectric effect, theoretically explained by Albert Einstein in 1905, underpins the operation of many optoelectronic components by quantifying the energy threshold for electron emission as $E = h\nu - \phi$, where $h$ is Planck's constant, $\nu$ is the photon frequency, and $\phi$ is the work function. Wave-particle duality further governs behavior in these systems, allowing photons to exhibit both particle-like absorption and wave-like propagation, essential for phenomena such as interference in optical waveguides. In fiber optics, light propagation adheres to electromagnetic principles, with typical attenuation in silica fibers around 0.2 dB/km at 1550 nm due to intrinsic material absorption and Rayleigh scattering, limiting signal distance without amplification. Key devices in optoelectronics include light-emitting diodes (LEDs), lasers, and photodiodes, each exploiting electron-photon interactions for signal generation or detection. LEDs produce light through electroluminescence, in which electron-hole recombination in a p-n junction releases photons, with external quantum efficiency $\eta = \frac{\text{photons out}}{\text{electrons in}}$ often exceeding 50% in modern GaN-based blue LEDs for efficient visible emission. Semiconductor lasers, such as vertical-cavity surface-emitting lasers (VCSELs), achieve coherent light output via stimulated emission in a resonant cavity, enabling low-threshold operation below 1 mA and circular beam profiles ideal for integration.
Photodiodes, conversely, convert incident photons to electrical current through the photoelectric effect, with responsivities up to 0.8 A/W in near-infrared variants, where quantum efficiency $\eta = \frac{\text{electrons generated}}{\text{photons absorbed}}$ approaches unity in optimized PIN structures. Applications of optoelectronics span high-speed data transmission, visual displays, and precision sensing. In optical communications, dense wavelength-division multiplexing (DWDM) utilizes multiple laser channels spaced at 0.8 nm intervals on a single fiber, supporting aggregate capacities over 100 Tbps in long-haul networks by leveraging erbium-doped fiber amplifiers to counteract attenuation. For displays, liquid crystal displays (LCDs) modulate a polarized backlight through nematic liquid crystals aligned by electric fields, achieving high resolution but requiring backlighting for contrast ratios up to 1000:1, while organic light-emitting diode (OLED) displays provide self-emissive pixels with deep blacks and viewing angles over 170 degrees thanks to direct electroluminescence in organic thin films. In sensing, light detection and ranging (LiDAR) systems employ pulsed lasers and avalanche photodiodes to measure time-of-flight, providing 3D mapping with resolutions down to centimeters over kilometer ranges, critical for autonomous vehicles and environmental monitoring. Integration advances have led to electro-optic modulators and photonic integrated circuits (PICs) that compactly combine optoelectronic functions. Electro-optic modulators exploit the Pockels effect in materials such as lithium niobate, where an applied voltage induces refractive-index changes to phase-shift light at speeds over 100 GHz, enabling data modulation rates beyond 400 Gbps per channel.
Photonic integrated circuits, fabricated on platforms such as silicon or indium phosphide, monolithically integrate lasers, modulators, and detectors on a single chip, reducing size by orders of magnitude while achieving insertion losses below 3 dB and supporting scalable photonic networks for data centers.
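The photon-energy relation and the quantum-efficiency figures above connect directly: a photodiode's responsivity in A/W is $R = \eta q \lambda / (h c)$. A minimal numerical sketch (wavelengths chosen as illustrative examples):

```python
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
Q = 1.602e-19   # electron charge, C

def photon_energy_ev(wavelength_m):
    """Photon energy E = h*c/lambda, converted to electronvolts."""
    return H * C / wavelength_m / Q

def responsivity(quantum_efficiency, wavelength_m):
    """Photodiode responsivity R = eta * q * lambda / (h * c), in A/W."""
    return quantum_efficiency * Q * wavelength_m / (H * C)

# 1550 nm telecom photon carries about 0.8 eV
print(round(photon_energy_ev(1550e-9), 2))   # 0.8
# A unity-quantum-efficiency diode at 1000 nm: about 0.81 A/W
print(round(responsivity(1.0, 1000e-9), 2))  # 0.81
```

Longer wavelengths carry less energy per photon, so the same photocurrent corresponds to higher responsivity, consistent with the 0.8 A/W near-infrared figure quoted above.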

Foundational Knowledge

Circuit Theory

Circuit theory forms the cornerstone of electronic engineering by providing the mathematical tools to model, analyze, and design electrical circuits under the lumped-element approximation, where components such as resistors, capacitors, and inductors are treated as discrete elements without considering distributed effects. This approach enables engineers to predict circuit behavior using algebraic and differential equations derived from physical principles. The foundational laws of circuit theory are Ohm's law and Kirchhoff's laws. Ohm's law states that the voltage $V$ across a conductor is directly proportional to the current $I$ flowing through it and the resistance $R$ of the conductor, expressed as $V = IR$. This relationship was empirically established by Georg Simon Ohm in his 1827 publication Die galvanische Kette, mathematisch bearbeitet. Kirchhoff's current law (KCL) asserts that the algebraic sum of currents entering and leaving a node is zero, $\sum I = 0$, reflecting conservation of charge. Kirchhoff's voltage law (KVL) states that the algebraic sum of voltages around any closed loop is zero, $\sum V = 0$, based on conservation of energy. These laws were formulated by Gustav Robert Kirchhoff in 1845 through his work on electrical networks. Analysis methods for DC circuits include nodal and mesh analysis, which systematically apply Kirchhoff's laws to solve for voltages and currents. Nodal analysis involves writing KCL equations at each non-reference node to determine node voltages, offering efficiency for circuits with fewer nodes than meshes. Mesh analysis applies KVL to independent loops, or meshes, to find loop currents, and is particularly useful in planar circuits. These techniques were developed as extensions of Kirchhoff's laws in the late 19th and early 20th centuries to handle complex networks.
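Nodal analysis reduces a resistive circuit to a set of simultaneous KCL equations. The sketch below solves a hypothetical two-node resistor ladder by hand with Cramer's rule (the component values are illustrative):

```python
def solve_2x2(a11, a12, b1, a21, a22, b2):
    """Cramer's rule for a 2x2 linear system (two KCL node equations)."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - a12 * b2) / det, (a11 * b2 - b1 * a21) / det

# Ladder: 10 V source -> R1 = 1k -> node V1; R2 = 2k from V1 to ground;
# R3 = 1k from V1 to node V2; R4 = 2k from V2 to ground.
# KCL at V1: (V1 - 10)/R1 + V1/R2 + (V1 - V2)/R3 = 0
# KCL at V2: (V2 - V1)/R3 + V2/R4 = 0
g1, g2, g3, g4 = 1 / 1e3, 1 / 2e3, 1 / 1e3, 1 / 2e3  # conductances
v1, v2 = solve_2x2(g1 + g2 + g3, -g3, 10 * g1,
                   -g3, g3 + g4, 0.0)
print(round(v1, 3), round(v2, 3))  # 5.455 3.636
```

Writing the equations in terms of conductances makes the coefficient matrix symmetric, a pattern that generalizes to circuits of any size.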
Equivalent circuit theorems simplify analysis: Thévenin's theorem replaces any linear network seen from two terminals with a voltage source $V_{th}$ in series with an impedance $Z_{th}$, where $V_{th}$ is the open-circuit voltage and $Z_{th}$ is the equivalent impedance with sources deactivated. This was proposed by Léon Charles Thévenin in 1883. Norton's theorem equivalently uses a current source $I_n$ in parallel with $Z_{th}$, where $I_n$ is the short-circuit current; it was independently derived by Edward Lawry Norton in 1926. Transient response analysis examines how circuits respond to sudden changes, such as switching, in first- and second-order systems. In RC and RL circuits, the time constant $\tau = RC$ or $\tau = L/R$ characterizes the exponential decay or growth toward steady state, with the response typically reaching 63% of its final value after one $\tau$. For second-order RLC circuits, the response depends on damping: underdamped cases exhibit oscillations at the natural frequency $\omega_0 = 1/\sqrt{LC}$.
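The 63% rule for first-order transients follows from the charging equation $v(t) = V_{final}(1 - e^{-t/\tau})$, which can be checked numerically. A minimal sketch with hypothetical component values:

```python
import math

def rc_step_response(v_final, t, tau):
    """Capacitor voltage during charging: v(t) = V_final * (1 - exp(-t/tau))."""
    return v_final * (1 - math.exp(-t / tau))

R, C = 10e3, 100e-6   # 10 kOhm, 100 uF
tau = R * C           # time constant = 1.0 s
# After one time constant the capacitor reaches ~63.2% of its final value
fraction = rc_step_response(5.0, tau, tau) / 5.0
print(round(fraction, 3))  # 0.632
```

After five time constants the response exceeds 99%, which is why $5\tau$ is a common rule of thumb for "settled."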