Website

A website (also written as a web site) is any web page whose content is identified by a common domain name and is published on at least one web server. Websites are typically dedicated to a particular topic or purpose, such as news, education, commerce, entertainment, or social media. Hyperlinking between web pages guides the navigation of the site, which often starts with a home page. The most-visited sites are Google, YouTube, and Facebook.[1][2]
All publicly accessible websites collectively constitute the World Wide Web. There are also private websites that can only be accessed on a private network, such as a company's internal website for its employees. Users can access websites on a range of devices, including desktops, laptops, tablets, and smartphones. The app used on these devices is called a web browser.
Background
The World Wide Web (WWW) was created in 1989 by the British CERN computer scientist Tim Berners-Lee.[3][4] On 30 April 1993, CERN announced that the World Wide Web would be free to use for anyone, contributing to the immense growth of the Web.[5] Before the introduction of the Hypertext Transfer Protocol (HTTP), other protocols such as File Transfer Protocol and the gopher protocol were used to retrieve individual files from a server. These protocols offer a simple directory structure in which the user navigates and where they choose files to download. Documents were most often presented as plain text files without formatting or were encoded in word processor formats.
History
While "web site" was the original spelling (sometimes capitalized "Web site", since "Web" is a proper noun when referring to the World Wide Web), this variant has become rarely used, and "website" has become the standard spelling. All major style guides, such as The Chicago Manual of Style[6] and the AP Stylebook,[7] have reflected this change.
In February 2009, Netcraft, an Internet monitoring company that has tracked Web growth since 1995, reported that there were 215,675,903 websites with domain names and content on them in 2009, compared to just 19,732 websites in August 1995.[8] The Web reached 1 billion websites in September 2014, a milestone confirmed by Netcraft in its October 2014 Web Server Survey and first announced by Internet Live Stats, as attested by a tweet from Tim Berners-Lee, the inventor of the World Wide Web. The number of websites then declined below 1 billion, owing to monthly fluctuations in the count of inactive websites, before climbing back above 1 billion by March 2016 and continuing to grow since.[9] The Netcraft Web Server Survey reported 1,295,973,827 websites in January 2020 and, in April 2021, 1,212,139,815 sites across 10,939,637 web-facing computers and 264,469,666 unique domains.[10] An estimated 85 percent of all websites are inactive.[11]
Static website
A static website is one that has Web pages stored on the server in the format that is sent to a client Web browser. It is primarily coded in Hypertext Markup Language (HTML); Cascading Style Sheets (CSS) are used to control appearance beyond basic HTML. Images are commonly used to create the desired appearance and as part of the main content. Audio or video might also be considered "static" content if it plays automatically or is generally non-interactive. This type of website usually displays the same information to all visitors. Similar to handing out a printed brochure to customers or clients, a static website will generally provide consistent, standard information for an extended period of time. Although the website owner may make updates periodically, editing the text, photos, and other content is a manual process and may require basic website design skills and software. Simple or marketing-oriented websites, such as a classic website, a five-page website, or a brochure website, are often static websites, because they present pre-defined, static information to the user. This may include information about a company and its products and services through text, photos, animations, audio/video, and navigation menus.
Static websites may still use server-side includes (SSI) as an editing convenience, such as sharing a common menu bar across many pages. As the site's behavior to the reader is still static, this is not considered a dynamic site.
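Includes of this kind can also be expanded at build time. The sketch below, a minimal Python illustration with hypothetical fragment and page content, replaces an SSI-style directive with a shared menu fragment to produce the final static page:

```python
import re

# A shared fragment, keyed by filename (hypothetical content).
FRAGMENTS = {
    "menu.html": '<nav><a href="/">Home</a> <a href="/about.html">About</a></nav>'
}

# A page source using an SSI-style directive.
PAGE = """<html><body>
<!--#include file="menu.html" -->
<h1>About Us</h1>
</body></html>"""

def expand_includes(source: str, fragments: dict) -> str:
    """Replace each SSI-style include directive with the named fragment."""
    pattern = re.compile(r'<!--#include file="([^"]+)" -->')
    return pattern.sub(lambda m: fragments[m.group(1)], source)

print(expand_includes(PAGE, FRAGMENTS))
```

Because the expansion happens before the files are uploaded, every visitor still receives identical, pre-built pages, which is what keeps the site static.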
Dynamic website
A dynamic website is one that changes or customizes itself frequently and automatically. Server-side dynamic pages are generated "on the fly" by computer code that produces the HTML (CSS files control appearance and are thus static). A wide range of software systems, such as CGI, Java servlets, JavaServer Pages (JSP), Active Server Pages, and ColdFusion (CFML), is available to generate dynamic Web systems and dynamic sites. Various Web application frameworks and Web template systems are available for general-purpose programming languages like Perl, PHP, Python, and Ruby to make it faster and easier to create complex dynamic websites.
A site can display the current state of a dialogue between users, monitor a changing situation, or provide information in some way personalized to the requirements of the individual user. For example, when the front page of a news site is requested, the code running on the webserver might combine stored HTML fragments with news stories retrieved from a database or another website via RSS to produce a page that includes the latest information. Dynamic sites can be interactive by using HTML forms, storing and reading back browser cookies, or by creating a series of pages that reflect the previous history of clicks. Another example of dynamic content is when a retail website with a database of media products allows a user to input a search request, e.g. for the keyword Beatles. In response, the content of the Web page will change from how it looked before, displaying a list of Beatles products like CDs, DVDs, and books. Dynamic HTML uses JavaScript code to instruct the Web browser how to interactively modify the page contents. One way to simulate a certain type of dynamic website while avoiding the performance loss of initiating the dynamic engine on a per-user or per-connection basis is to periodically automatically regenerate a large series of static pages.
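The news-site example can be sketched in miniature. In this minimal Python illustration, the layout string and the fetch_latest_stories stand-in are hypothetical, taking the place of stored HTML fragments and a database or RSS query:

```python
# Stored layout fragment (in practice this might come from disk or a template engine).
LAYOUT = "<html><body><h1>{title}</h1><ul>{items}</ul></body></html>"

def fetch_latest_stories():
    """Stand-in for rows returned by a database query or an RSS feed."""
    return ["Markets rally", "Election results", "Weather warning"]

def render_front_page() -> str:
    """Assemble the page on the fly from a layout fragment and fresh data."""
    items = "".join(f"<li>{headline}</li>" for headline in fetch_latest_stories())
    return LAYOUT.format(title="Latest News", items=items)

print(render_front_page())
```

Each request re-runs render_front_page, so the delivered HTML always reflects the latest data, which is the defining property of a server-side dynamic page.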
Multimedia and interactive content
Early websites had only text, and soon after, images. Web browser plug-ins were then used to add audio, video, and interactivity (such as for a rich Web application that mirrors the complexity of a desktop application like a word processor). Examples of such plug-ins are Microsoft Silverlight, Adobe Flash Player, Adobe Shockwave Player, and Java SE. HTML5 includes provisions for audio and video without plug-ins. JavaScript is also built into most modern web browsers, and allows website creators to send code to the web browser that instructs it how to interactively modify page content and communicate with the web server if needed. The browser's internal representation of the content is known as the Document Object Model (DOM).
WebGL (Web Graphics Library) is a modern JavaScript API for rendering interactive 3D graphics without the use of plug-ins. It allows interactive content such as 3D animations, visualizations, and video explainers to be presented to users in an intuitive way.[12]
A 2010s trend in website design known as "responsive design" improves the viewing experience by adapting a site's layout to the visitor's device or screen size, giving a rich user experience across desktop and mobile platforms.[13]
Types
Websites can be divided into two broad categories—static and interactive. Interactive sites are part of the Web 2.0 community of sites and allow for interactivity between the site owner and site visitors or users. Static sites serve or capture information but do not allow engagement with the audience or users directly. Some websites are informational or produced by enthusiasts or for personal use or entertainment. Many websites do aim to make money using one or more business models, including:
- Posting interesting content and selling contextual advertising either through direct sales or through an advertising network.
- E-commerce: products or services are purchased directly through the website
- Advertising products or services available at a brick-and-mortar business
- Freemium: basic content is available for free, but premium content requires a payment (e.g., WordPress, an open-source platform for building blogs and websites).
- Some websites require user registration or subscription to access the content. Examples of subscription websites include many business sites, news websites, academic journal websites, gaming websites, file-sharing websites, message boards, Web-based email, social networking websites, websites providing real-time stock market data, as well as sites providing various other services.
See also
References
[edit]- ^ "Top Websites Ranking". Similarweb.
- ^ "Top websites". Semrush.
- ^ "Tim Berners-Lee". W3C. Archived from the original on 27 September 2021. Retrieved 17 November 2021.
- ^ "home of the first website". info.cern.ch. Archived from the original on 10 June 2017. Retrieved 30 August 2008.
- ^ Cailliau, Robert. "A Little History of the World Wide Web". W3C. Archived from the original on 6 May 2013. Retrieved 16 February 2007.
- ^ "Internet, Web, and Other Post-Watergate Concerns". The Chicago Manual of Style. University of Chicago. Archived from the original on 20 February 2010. Retrieved 18 September 2010.
- ^ AP Stylebook [@APStylebook] (16 April 2010). "Responding to reader input, we are changing Web site to website. This appears on Stylebook Online today and in the 2010 book next month" (Tweet). Retrieved 18 March 2019 – via Twitter.
- ^ "Web Server Survey". Netcraft. Archived from the original on 20 August 2011. Retrieved 13 March 2017.
- ^ "Total number of Websites". Internet Live Stats. Archived from the original on 20 July 2017. Retrieved 14 April 2015.
- ^ "Web Server Survey". Netcraft News. Archived from the original on 24 July 2018. Retrieved 17 May 2021.
- ^ Deon (26 May 2020). "How Many Websites Are There Around the World? [2021]". Siteefy. Archived from the original on 17 May 2021. Retrieved 17 May 2021.
- ^ "OpenGL ES for the Web". khronos.org. 19 July 2011. Archived from the original on 15 December 2009. Retrieved 1 April 2019.
- ^ Pete LePage. "Responsive Web Design Basics - Web". Google Developers. Archived from the original on 5 March 2017. Retrieved 13 March 2017.
External links
Website

Overview
Definition and Purpose
A website is a collection of interconnected web pages and related content, including multimedia elements such as images, videos, and interactive features, that share a common domain name and are hosted on at least one web server for access over the internet or a private network.[1] This structure allows users to navigate between pages via hyperlinks, forming a cohesive digital presence managed by an individual, organization, or entity.[6] Websites originated from Tim Berners-Lee's 1989 proposal for an information management system at CERN, which laid the groundwork for sharing and linking documents across distributed environments.[7]

Websites serve diverse purposes, primarily facilitating information dissemination, commerce, and communication on a global scale. Informational websites, such as news platforms like BBC News, provide timely updates and educational resources to inform the public.[4] Commercial websites, exemplified by e-commerce sites like Amazon, enable online transactions, product browsing, and customer engagement to drive sales and business growth.[4] Educational websites, such as those from universities or platforms like Coursera, deliver structured learning materials, courses, and research access to support academic and professional development.[8] Entertainment websites, including streaming services like Netflix, offer multimedia content for leisure and audience interaction.[9] Personal blogging sites, such as those powered by WordPress, allow individuals to share opinions, experiences, and creative work with a broad audience.[10]

As of 2025, there are approximately 1.2 billion websites worldwide, reflecting the medium's vast expansion, though only about 16% remain active with regular updates.[11] Traffic is heavily concentrated among leading platforms, with Google receiving over 98 billion monthly visits and YouTube following as the second most-visited site, underscoring their dominant roles in search, video sharing, and user engagement.[12]

Key Components
A website's core elements begin with web pages, which provide the fundamental structure for content presentation using Hypertext Markup Language (HTML). HTML defines the semantic structure of documents, including headings, paragraphs, lists, and embedded media, enabling browsers to render text, images, and interactive components in a standardized format.[13] Hyperlinks, implemented via HTML's <a> element, facilitate navigation between web pages or external resources by specifying a destination URI, allowing users to traverse interconnected content seamlessly.[14]
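The role of the <a> element can be observed with Python's standard html.parser module, which walks an HTML document and collects each hyperlink's destination; the sample markup here is hypothetical:

```python
from html.parser import HTMLParser

SAMPLE = ('<p>See the <a href="https://example.com/docs">docs</a> '
          'or the <a href="/about">about page</a>.</p>')

class LinkCollector(HTMLParser):
    """Collect the destination (href) of every <a> element encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkCollector()
parser.feed(SAMPLE)
print(parser.links)  # → ['https://example.com/docs', '/about']
```

This is essentially what a crawler or a browser's link index does: the href values are the URIs that make pages traversable.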
Domain names serve as human-readable addresses for websites, resolved to IP addresses through the Domain Name System (DNS), a hierarchical distributed database that maps names like "example.com" to numerical locations via recursive queries from root servers to authoritative name servers.[15]
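The resolution step can be observed with Python's standard socket module, which asks the operating system's resolver to map a name to addresses; "localhost" is used here so the sketch needs no network access, but a real lookup such as socket.getaddrinfo("example.com", 443) works the same way:

```python
import socket

def resolve(hostname: str) -> list[str]:
    """Return the distinct IP addresses a hostname resolves to."""
    infos = socket.getaddrinfo(hostname, None)
    # Each entry is (family, type, proto, canonname, sockaddr);
    # the address string is the first field of sockaddr.
    return sorted({info[4][0] for info in infos})

print(resolve("localhost"))  # typically ['127.0.0.1'], ['::1'], or both
```

A browser performs this lookup (directly or via the OS) before opening a connection to the web server at the returned address.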
Web servers host website files and respond to client requests by delivering content over the Hypertext Transfer Protocol (HTTP) or its secure variant (HTTPS), which encapsulates messages in a request-response cycle to transfer resources like HTML documents and associated media.[16]
These elements interconnect within the client-server model, where a user's web browser (client) initiates an HTTP request to a server upon entering a URL, prompting the server to process and return the corresponding response, typically an HTML page with embedded assets.
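The request-response cycle can be demonstrated end to end with Python's standard library; in this minimal sketch, the handler class and page content are hypothetical, and the server and client run in one process purely for illustration:

```python
import threading
from http.client import HTTPConnection
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    """A minimal web server: answers every GET with a small HTML document."""
    def do_GET(self):
        body = b"<html><body><h1>Hello</h1></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), HelloHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client half of the cycle: send a GET request for "/", read the response.
conn = HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/")
response = conn.getresponse()
status = response.status
page = response.read().decode()
server.shutdown()

print(status)  # → 200
print(page)
```

The browser plays the client role above: it sends the request for a path, receives the status line, headers, and HTML body, and then renders the body.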
Uniform Resource Locators (URLs) structure this interaction by providing a standardized syntax for locating resources, comprising a scheme (e.g., "https"), authority (host and port), path, query parameters, and fragment, as defined in the generic URI syntax, enabling precise addressing and retrieval across the web.[17]
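Python's standard urllib.parse module decomposes a URL into exactly these components; the sample URL is hypothetical:

```python
from urllib.parse import urlsplit

# Split a URL into the components named in the generic URI syntax.
parts = urlsplit("https://shop.example.com:8443/products/books?sort=price&page=2#reviews")

print(parts.scheme)    # → https
print(parts.netloc)    # → shop.example.com:8443  (the authority: host and port)
print(parts.hostname)  # → shop.example.com
print(parts.port)      # → 8443
print(parts.path)      # → /products/books
print(parts.query)     # → sort=price&page=2
print(parts.fragment)  # → reviews
```

The scheme selects the protocol, the authority identifies the server, and the path, query, and fragment tell the server (and browser) which resource, variant, and in-page location to present.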
Websites rely on various file types to deliver content: static files include HTML for markup, CSS for styling presentation, and image formats like JPEG or PNG for visual elements, which remain unchanged regardless of user context; in contrast, dynamic scripts, such as JavaScript files, execute on the client side to generate or modify content interactively based on runtime conditions.[13]
History
Origins and Early Development
In March 1989, British computer scientist Tim Berners-Lee, while working at CERN, submitted a proposal for a global hypertext system to facilitate information sharing among scientists in a large, international research organization facing high staff turnover and information silos.[18] The proposal outlined a distributed network of nodes and links to manage documents, projects, and personal data without relying on rigid hierarchies or centralized databases, integrating with existing tools like email and file systems.[18] This concept, initially called the "Mesh," evolved into the World Wide Web, with Berners-Lee advocating for a prototype developed by a small team over six to twelve months.[19]

Between 1990 and 1991, Berners-Lee led the development of the foundational technologies, including the Hypertext Transfer Protocol (HTTP) for data exchange, Hypertext Markup Language (HTML) for structuring content, and the first web browser and server software.[19] The inaugural website, hosted on Berners-Lee's NeXT computer at CERN, went live on August 6, 1991, at the URL http://info.cern.ch; it served as an informational page describing the World Wide Web project itself and provided instructions for accessing and contributing to it.[19] This site marked the practical realization of the hypertext system, enabling basic navigation through linked documents primarily for CERN's research community.[3]

A pivotal milestone occurred on April 30, 1993, when CERN released the World Wide Web software—encompassing the line-mode browser, basic server, and common code library—into the public domain, relinquishing all intellectual property rights to encourage unrestricted use, modification, and distribution.[20] This open release accelerated adoption beyond CERN.
Concurrently, the Mosaic browser, developed by Marc Andreessen and Eric Bina at the National Center for Supercomputing Applications (NCSA) in 1993, introduced a graphical user interface that integrated text and images seamlessly, making the web more accessible and visually engaging compared to prior text-only browsers.[21]

Despite these advances, the early web faced significant constraints that limited its reach and capabilities. It remained largely confined to academic and research institutions, with usage dominated by scientists in fields like high-energy physics due to restricted internet access and the absence of commercial infrastructure.[22] Bandwidth limitations from slow dial-up modems and network bottlenecks restricted content to predominantly text-based formats, as incorporating images or other media was inefficient and time-consuming, often resulting in prolonged load times even for simple pages.[22] These technical hurdles, combined with a small initial user base, positioned the web as an experimental tool rather than a widespread platform in its formative years.[23]

Growth and Milestones
The growth of websites accelerated dramatically in the 1990s, transforming the World Wide Web from an academic tool into a mainstream phenomenon. The release of Netscape Navigator in December 1994 played a pivotal role in popularizing web browsing by providing an intuitive graphical interface that made websites accessible to non-technical users, leading to a surge in web adoption.[24] This momentum fueled the dot-com bubble from 1995 to 2000, a period of explosive investment in internet-based businesses, exemplified by the launches of Amazon.com on July 16, 1995, as an online bookstore, and eBay (initially AuctionWeb) in September 1995, as a peer-to-peer auction platform.[25][26][27] The number of websites grew from approximately 23,500 in 1995 to over 17 million by 2000, reflecting the rapid commercialization and expansion of online presence.[28]

In the 2000s and 2010s, the advent of Web 2.0, a term coined by Tim O'Reilly in 2004, marked a shift toward interactive platforms that emphasized user-generated content and collaboration, fundamentally altering website dynamics.[29] Key examples include Wikipedia, launched on January 15, 2001, which allowed volunteers worldwide to collaboratively edit and expand an open encyclopedia, and Facebook, founded on February 4, 2004, which enabled users to share personal updates, photos, and connections on a social networking site.[30] The introduction of the iPhone on June 29, 2007, further catalyzed growth by making mobile web access seamless and intuitive, driving smartphone ownership in the U.S. from 4% of the mobile market in 2007 to over 50% by 2012 and boosting global mobile internet traffic exponentially.[31][32]

By 2016, the total number of websites had surpassed 1 billion, and this expansion continued, with Netcraft's October 2025 survey reporting 1.35 billion sites, underscoring the web's enduring scale despite fluctuations in active usage.[33][34] Parallel to this expansion, the terminology for websites evolved in the 2000s, with the two-word "web site" giving way to the one-word "website" as the preferred spelling in major style guides, reflecting the term's maturation into everyday language.[35] For instance, while early usage favored the hyphenated or separated form, publications increasingly adopted "website" by the mid-2000s, with the Associated Press Stylebook officially endorsing it in 2010 to align with common practice.[36]

Website Architecture
Static Websites
A static website is one where the content is pre-generated and remains unchanged regardless of user interactions, consisting primarily of fixed HTML, CSS, and JavaScript files served directly from a web server to the client's browser without any server-side processing or database involvement.[37] The mechanics involve building the site at development time, where markup languages like Markdown or templates are converted into static HTML pages, which are then uploaded to a hosting server; subsequent updates require manual editing of source files, rebuilding the site, and re-uploading the changes.[38] This approach ensures that every visitor receives identical content for a given page, relying on client-side JavaScript for any limited interactivity, such as animations or form validations.[39]

One key advantage of static websites is their superior loading speed, as there is no need for real-time content generation or database queries, resulting in reduced latency and better performance on content delivery networks (CDNs).[39] They also offer lower hosting costs, since they can be deployed on inexpensive file-based servers or services like AWS S3 without requiring complex backend infrastructure.[40] Additionally, static sites provide enhanced security, with fewer vulnerabilities exposed due to the absence of server-side scripting languages or dynamic data handling that could be exploited.[41]

However, a primary disadvantage is the limited scalability for content-heavy sites needing frequent updates, as changes involve rebuilding and redeploying the entire site, which can be time-consuming for non-technical users.[42] They are less suitable for applications requiring user-specific personalization or real-time data, potentially leading to higher maintenance efforts for evolving content.[43]

To streamline development, static site generators (SSGs) automate the build process by combining content files, templates, and data into static HTML output, improving efficiency over manual file creation.[38] Popular tools include Jekyll, an open-source SSG written in Ruby that converts plain text files into fully formed websites, particularly integrated with GitHub Pages for free hosting.[44] Another widely adopted option is Hugo, a Go-based generator renowned for its exceptional build speed, capable of rendering large sites with thousands of pages in seconds, making it ideal for blogs and documentation.[45] These tools enable developers to manage content via version control systems like Git, facilitating collaborative workflows while maintaining the static nature of the output.[39]

Static websites are commonly employed for personal portfolios, where designers or developers showcase fixed work samples and bios, such as the portfolio of web designer Mike Matas, which highlights creative projects without dynamic elements.[46] They also suit brochure-style sites for small businesses or organizations, presenting unchanging information like services, contact details, and company overviews, exemplified by simple informational pages for local consultancies.[47]

Dynamic Websites
Dynamic websites generate content in real time based on user inputs, data from external sources, or database queries, enabling interactive and personalized experiences that evolve with each visit. Unlike pre-built static pages, dynamic sites construct responses on the fly, often combining server-side processing with client-side enhancements to deliver tailored outputs. This architecture supports features like user authentication, search functionalities, and content updates without requiring manual file modifications.[48][49]

The core mechanics involve server-side scripting languages such as PHP, which executes code on the web server to handle requests and generate HTML, or Node.js, a JavaScript runtime that enables asynchronous, event-driven processing for efficient handling of multiple connections. These scripts typically integrate with relational databases like MySQL to store, retrieve, and manipulate data—such as user profiles or product inventories—ensuring content is fetched dynamically during runtime. On the client side, JavaScript frameworks like React facilitate responsive interfaces by updating the Document Object Model (DOM) in response to user events, allowing seamless interactions without full page reloads. This hybrid approach—server-side for data-heavy operations and client-side for UI fluidity—powers the adaptability of modern web applications.[50][51][52]

One key advantage of dynamic websites is their ability to provide personalization, where content adapts to individual user preferences, location, or behavior, fostering higher engagement on platforms like social media sites such as Twitter (now X), which generates real-time feeds based on user follows and interactions. Scalability is another benefit, particularly for e-commerce platforms like Shopify, which handle varying traffic loads by dynamically pulling inventory and processing transactions from databases, supporting business growth without static limitations.
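The server-side mechanics described above can be sketched without any particular framework. In this minimal Python illustration, in-memory dictionaries stand in for database tables, and a render function generates a personalized page at request time; all names and data here are hypothetical:

```python
# Stand-ins for database tables of user profiles and products.
USERS = {
    "alice": {"name": "Alice", "interests": ["books"]},
    "bob": {"name": "Bob", "interests": ["music"]},
}
PRODUCTS = {
    "books": ["Novel A", "Biography B"],
    "music": ["Album C"],
}

def render_home(user_id: str) -> str:
    """Generate a personalized page on the fly from stored data."""
    user = USERS[user_id]
    picks = [item for topic in user["interests"] for item in PRODUCTS[topic]]
    items = "".join(f"<li>{p}</li>" for p in picks)
    return (f"<html><body><h1>Welcome, {user['name']}</h1>"
            f"<ul>{items}</ul></body></html>")

print(render_home("alice"))  # Alice sees book recommendations
print(render_home("bob"))    # Bob sees music recommendations
```

Because the HTML is produced per request from current data, two users visiting the same URL receive different pages, which no pre-built static file could provide.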
However, these sites introduce higher development complexity due to the need for robust backend infrastructure and ongoing maintenance, often requiring specialized skills to integrate scripting, databases, and security measures. Additionally, they pose greater security risks, as server-side scripts and database connections create potential vulnerabilities to attacks like SQL injection if not properly safeguarded.[53][54][55]

Content management systems (CMS) simplify the creation and maintenance of dynamic websites by abstracting much of the underlying scripting and database interactions into user-friendly interfaces. WordPress, first launched on May 27, 2003, exemplifies this by using PHP for server-side rendering and MySQL for data storage, allowing non-technical users to publish, edit, and organize content through a dashboard while enabling plugins for advanced dynamic features like e-commerce or forums. By 2025, WordPress has become the dominant CMS, powering 43.4% of all websites on the internet, underscoring its role in democratizing dynamic web development and supporting diverse applications from blogs to enterprise sites.[56][57]

Content and Features
Multimedia Integration
The integration of multimedia into websites began in the early 1990s with the introduction of inline images via the HTML <img> tag, enabled by the NCSA Mosaic browser in 1993, which allowed images to display directly within text rather than as separate files.[58] This marked a shift from text-only pages to visually enriched content, though support was initially limited to formats like GIF and JPEG. By the late 1990s, plugins such as Adobe Flash dominated for richer media like animations and video, filling gaps in native browser capabilities, but these required user installation and raised security concerns.[59]
The advent of HTML5 in the late 2000s revolutionized multimedia embedding by introducing native <audio> and <video> elements, which eliminated the need for plugins and enabled direct browser playback. These tags support key formats including MP4 (using H.264 codec for video and AAC for audio) and WebM (with VP8 or VP9 video and Vorbis or Opus audio), chosen for their balance of quality, compression, and open-source availability to promote interoperability across browsers. For images, the srcset attribute in HTML5 allows responsive delivery by specifying multiple image sources based on device resolution or viewport size, optimizing loading for mobile and high-density displays without JavaScript.
Accessibility standards, as outlined in the Web Content Accessibility Guidelines (WCAG) 2.1 by the W3C, mandate features like alt attributes for images to provide textual descriptions for screen readers, and <track> elements for video and audio to include timed captions or subtitles.[60] These ensure non-text media is perceivable to users with disabilities, such as closed captions for deaf individuals or audio descriptions for the visually impaired.[61]
A prominent example of multimedia integration is YouTube, launched in 2005, which pioneered user-generated video streaming using progressive download and later adaptive bitrate streaming to handle varying network conditions. However, challenges persist, including bandwidth optimization—addressed through techniques like video compression and content delivery networks (CDNs) to reduce load times on low-speed connections—and copyright issues, where embedding third-party media requires licensing to avoid infringement under laws like the Digital Millennium Copyright Act (DMCA).
Interactivity and User Engagement
Interactivity on websites enables users to engage actively with content through dynamic responses to inputs, transforming passive viewing into participatory experiences. This is achieved primarily through client-side scripting and server communication protocols that update the page without full reloads, fostering immersion and personalization.[62]

JavaScript serves as the foundational language for interactivity by manipulating the Document Object Model (DOM), which represents the webpage's structure as a tree of nodes accessible via APIs. Developers use methods like querySelector and addEventListener to select elements, modify their content or attributes, and handle events such as clicks or key presses, allowing real-time changes to the user interface. HTML forms complement this by providing structured input controls, including text fields, checkboxes, and buttons, which capture user data for submission via the <form> element, often validated client-side with JavaScript to enhance usability. For seamless updates, Asynchronous JavaScript and XML (AJAX) facilitates background HTTP requests to servers, exchanging data—typically in JSON format—without interrupting the user's view, as seen in auto-complete search features.[62]
Real-time interactivity extends further with WebSockets, a protocol establishing persistent, bidirectional connections between browser and server, enabling low-latency exchanges for applications like live chat or collaborative editing. Unlike polling methods, WebSockets reduce overhead by maintaining an open channel, supporting features in tools such as online multiplayer games or instant messaging platforms.[63]
Advanced elements elevate engagement through visual and spatial interactions. CSS transitions animate property changes, such as opacity or position, over specified durations and easing functions, creating smooth effects like hover fades or slide-ins that guide user attention without JavaScript overhead.[64] For immersive experiences, WebGL leverages the browser's graphics hardware to render 3D graphics directly in HTML5 canvases, powering interactive visualizations like virtual tours or data models in scientific websites.[65]
Examples of these technologies in action include gamified sites that incorporate progress bars, badges, and quizzes—such as Duolingo's language learning platform, which uses JavaScript-driven challenges and animations to motivate repeated visits—and collaborative tools like Google Docs, where WebSockets synchronize edits across users in real time.[66][67] Such implementations boost user retention; studies show that higher interactivity levels, through elements like polls and comment sections, increase site stickiness by enhancing perceived satisfaction and emotional involvement.[68][69]
The rise of single-page applications (SPAs), built with frameworks like React or Vue.js, further amplifies engagement by loading a single HTML shell and dynamically updating content via AJAX or WebSockets, mimicking native app fluidity and reducing navigation friction to improve session lengths and conversion rates.[70]
Classifications
By Purpose and Audience
Websites can be classified by their primary purpose, which determines the type of content, functionality, and user interaction they offer. Informational websites aim to deliver factual, educational, or reference material to educate or inform users, such as encyclopedias, news portals, or directories that aggregate data like product prices or health resources.[4] For instance, sites like Wikipedia serve as comprehensive encyclopedias, while WebMD targets users with health information. Commercial websites focus on promoting and selling products or services to generate revenue, often through e-commerce platforms or marketplaces. Examples include online retailers like Amazon, which facilitate direct purchases, and broader marketplaces such as eBay that connect buyers and sellers.[4] These sites typically integrate shopping carts, payment gateways, and marketing tools to drive transactions. Governmental websites provide public services, policy information, and administrative tools, often under country-specific domains like .gov, to support citizen engagement and compliance. Portals such as Data.gov enable access to e-services like public data access or procurement, bridging government-to-citizen (G2C) and government-to-business (G2B) interactions.[4] Non-profit websites advance advocacy, fundraising, or community causes without profit motives, featuring donation tools and awareness campaigns; platforms like the World Wildlife Fund (WWF) website exemplify this by supporting conservation efforts through global campaigns.[71] Classifications also extend to target audiences, influencing design and content tailoring. 
Business-to-business (B2B) websites cater to corporate users with tools for partnerships, such as supplier directories or industry forums, contrasting with business-to-consumer (B2C) sites like Amazon that prioritize user-friendly shopping for individuals.[4] Audience scope further divides sites into global versus localized variants: global platforms reach broad, international users through standardized content, while localized ones adapt via multilingual interfaces and cultural relevance to serve regional needs.[72] Educational platforms like Khan Academy exemplify audience-specific design for learners worldwide, offering interactive lessons in multiple languages, and social networks such as Facebook target diverse general audiences with personalized feeds. A key trend in website design is the shift toward user-centric approaches that accommodate diverse audiences, including those with disabilities, through inclusive practices like alternative text for images and keyboard navigation. The Web Accessibility Initiative (WAI) emphasizes guidelines such as WCAG 2.2 to ensure equitable access, reflecting broader adoption of maturity models for organizational compliance.[73] This evolution prioritizes usability across demographics, enhancing engagement for global and specialized users alike.[74]
By Technology and Functionality
Websites can be classified by their underlying technology stacks, which determine how content is generated, delivered, and interacted with, as well as by their core operational functionalities that leverage specific technical capabilities. This categorization highlights the diversity in how websites handle rendering, data processing, and user interactions, influencing performance, scalability, and maintainability. One primary technological distinction is between client-side rendered (CSR) websites and server-side rendered (SSR) websites. In CSR approaches, the browser handles most of the rendering using technologies like vanilla JavaScript or frameworks such as React, where the server delivers a minimal HTML shell and JavaScript bundles that dynamically generate the page content upon user interaction. This enables rich, interactive experiences but can lead to slower initial load times on low-bandwidth connections. In contrast, SSR websites, often built with server technologies like ASP.NET or PHP, generate complete HTML pages on the server before sending them to the client, prioritizing faster initial rendering and better search engine optimization, though they may require more server resources for dynamic updates. Hybrid models, such as the Jamstack architecture, combine static site generation with client-side dynamism; sites are pre-built into static files served via a content delivery network (CDN), while APIs handle dynamic elements like user authentication, reducing server load and enhancing security through decoupled front-end and back-end components.
Functionality types further delineate websites based on how technology supports specific operations. E-commerce websites integrate payment gateways and shopping carts using secure protocols like HTTPS and APIs from providers such as Stripe or PayPal, enabling real-time transaction processing and inventory management through backend relational databases queried with SQL.
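The SSR approach can be sketched as a server-side function that assembles a complete HTML page before it is sent to the client; the product data shape and helper names here are invented for illustration.

```javascript
// Server-side rendering sketch: build a full HTML page from data,
// so the browser receives complete markup rather than a JS shell.
function renderProductPage(product) {
  return [
    "<!DOCTYPE html>",
    "<html><head><title>" + escapeHtml(product.name) + "</title></head>",
    "<body><h1>" + escapeHtml(product.name) + "</h1>",
    "<p>Price: $" + product.price.toFixed(2) + "</p>",
    "</body></html>",
  ].join("\n");
}

function escapeHtml(text) {
  // Prevent markup injection when interpolating user-supplied strings.
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}
```

Because the markup arrives fully formed, the first paint and crawler indexing do not depend on client-side script execution, which is the SEO and initial-load advantage the paragraph above describes.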
Blogs typically employ RSS feeds for content syndication, generated server-side with tools like WordPress, allowing automated distribution of updates to subscribers and aggregators while supporting lightweight client-side enhancements for reading experiences. Portals aggregate content from multiple sources using technologies like XML parsing and JavaScript for real-time feeds, often relying on server-side scripting to curate personalized dashboards, as seen in platforms like Yahoo or enterprise intranets. These functionalities are enabled by the underlying tech stack, ensuring seamless data flow and user interaction without overlapping into user-centric purposes.
Architectural examples illustrate these classifications in practice. Progressive enhancement builds websites starting with core functionality accessible via basic HTML and CSS, then layering JavaScript for advanced features, ensuring compatibility across devices and browsers by prioritizing content delivery over scripted behaviors. Single-page applications (SPAs), a client-side dominant architecture, load a single HTML page and update content dynamically via AJAX or Fetch API calls, reducing page reloads for fluid navigation, as exemplified by Gmail's interface. Multi-page applications (MPAs), conversely, rely on server-side navigation between distinct HTML pages, supporting complex state management in e-commerce flows but potentially increasing latency.
Modern Developments
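Server-side feed generation of the kind blogs use for syndication can be sketched as follows; the post shape and field choices are illustrative, not a complete RSS 2.0 implementation.

```javascript
// Build a minimal RSS 2.0 feed string from a list of blog posts.
// Post fields (title, slug, date) are a hypothetical shape.
function renderRssFeed(title, link, posts) {
  const items = posts
    .map(
      (p) =>
        "<item><title>" + p.title + "</title>" +
        "<link>" + link + "/" + p.slug + "</link>" +
        "<pubDate>" + p.date + "</pubDate></item>"
    )
    .join("");
  return (
    '<?xml version="1.0"?>' +
    '<rss version="2.0"><channel>' +
    "<title>" + title + "</title>" +
    "<link>" + link + "</link>" +
    items +
    "</channel></rss>"
  );
}
```

An aggregator polling this feed sees new <item> entries as posts are published, which is how subscribers receive updates automatically.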
Web Standards and Technologies
Web standards form the foundational protocols, languages, and guidelines that ensure websites are interoperable, accessible, and performant across diverse devices and browsers. Organizations like the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF) develop these standards to promote consistency in web development. Core technologies such as HTML, CSS, and JavaScript, along with communication protocols like HTTP, enable structured content delivery, styling, and dynamic behavior while adhering to best practices for security and usability. HTML5, standardized by the W3C as a Recommendation on October 28, 2014, serves as the primary markup language for structuring web content, introducing semantic elements like <article> and <section> for better document outlining, as well as native support for multimedia through <video> and <audio> tags without plugins.[75] CSS3, developed modularly by the W3C since the early 2000s, allows developers to apply styles through independent modules such as the CSS Syntax Module Level 3 (published December 24, 2021), which defines stylesheet parsing, and others handling layouts, animations, and typography for enhanced visual presentation.[76] ECMAScript, the scripting language standard maintained by Ecma International, reached its 2025 edition in June 2025, providing the basis for JavaScript implementations that enable client-side interactivity, with features like async/await for asynchronous operations and temporal APIs for date handling.[77]
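The semantic elements mentioned above can be combined into a minimal document outline; the text and the media file name are placeholders.

```html
<!-- Minimal semantic HTML5 outline; content is placeholder only. -->
<article>
  <header>
    <h1>Article title</h1>
  </header>
  <section>
    <p>Body text of the section.</p>
    <!-- Native multimedia playback, no plugin required;
         clip.mp4 is a hypothetical file -->
    <video src="clip.mp4" controls></video>
  </section>
</article>
```

Elements such as <article> and <section> carry meaning about document structure, which helps assistive technologies and search engines interpret the page beyond its visual layout.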
Accessibility standards, crucial for inclusive web experiences, are outlined in the W3C's Web Content Accessibility Guidelines (WCAG) 2.2, released as a Recommendation on October 5, 2023, which expands on prior versions by adding nine new success criteria addressing mobility, low vision, cognitive limitations, and focus visibility, aiming for conformance levels A, AA, or AAA to ensure usability for people with disabilities.
Communication protocols underpin website efficiency and security. HTTP/2, defined in RFC 7540 by the IETF in May 2015, improves upon HTTP/1.1 by introducing multiplexing, header compression, and server push to reduce latency and enhance page load times, particularly for resource-heavy sites.[78] HTTPS, which encrypts HTTP traffic using TLS, saw widespread adoption in the 2010s, rising from about 40% of top websites in 2014 to over 90% by 2020, driven by browser warnings for non-secure sites and free certificate authorities like Let's Encrypt launched in 2015.[79] For search engine optimization (SEO), foundational practices include using meta tags like <title> and <meta name="description"> to provide concise page summaries for crawlers, and XML sitemaps to map site structure, as recommended by Google to improve indexing and visibility in search results.[80]
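A head section applying these basic SEO practices might look as follows; the title and description values are placeholders.

```html
<!-- Basic SEO metadata; values are placeholders. -->
<head>
  <title>Example Page: a short, descriptive title</title>
  <meta name="description"
        content="A concise summary of the page shown in search results.">
</head>
```

Crawlers read the title and description to build the snippet shown in results pages, which is why both are kept short and specific to the page's content.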
Mobile-first design principles emphasize adaptability to varying screen sizes. Responsive design, enabled by CSS media queries in the W3C's Media Queries Level 3 specification (updated May 21, 2024), allows stylesheets to adapt layouts based on device characteristics like width or orientation, using rules such as @media (max-width: 600px) to reflow content fluidly.[81] Progressive Web Apps (PWAs) extend this by leveraging service workers—JavaScript scripts defined in the W3C's Service Workers specification (updated March 6, 2025)—to cache assets and enable offline functionality, combined with the Web App Manifest for installable, app-like experiences that work across platforms without native app stores.[82]
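A mobile-first stylesheet built on such media queries might look like this; the .layout class name and the 600px breakpoint are illustrative.

```css
/* Mobile-first: base styles target small screens,
   then a media query adapts the layout for wider viewports.
   Class name and breakpoint are illustrative. */
.layout {
  display: block; /* single column on narrow screens */
}

@media (min-width: 600px) {
  .layout {
    /* two-column grid once the viewport is wide enough */
    display: grid;
    grid-template-columns: 1fr 2fr;
  }
}
```

Because the base rules serve the smallest screens and the media query only adds complexity for larger ones, the same stylesheet reflows fluidly across phones, tablets, and desktops.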