from Wikipedia
Shadertoy
Original authors: Inigo Quilez & Pol Jeremias
Initial release: February 14, 2013
Stable release: 0.8.3 / March 3, 2016
Written in: GLSL, JavaScript, PHP
Type: 3D computer graphics tool community
Website: www.shadertoy.com

Shadertoy is an online community and tool for creating and sharing shaders through WebGL, used for both learning and teaching 3D computer graphics in a web browser.

Overview

A procedural image made in Shadertoy with distance fields, modeled, shaded, lit and rendered in realtime

Shadertoy is an online community and platform for computer graphics professionals, academics[1] and enthusiasts who share, learn and experiment with rendering techniques and procedural art through GLSL code[citation needed]. There are more than 52 thousand public contributions as of mid-2021, coming from thousands of users. WebGL[2] allows Shadertoy to access the compute power of the GPU to generate procedural art, animation, models, lighting, state-based logic and sound.

History


Shadertoy was created by Pol Jeremias and Inigo Quilez in January 2013 and came online in February the same year.

The roots of the effort are in Inigo's "Shadertoy" section[3] of his computer graphics educational website.[4] With the arrival of the initial WebGL implementation by Mozilla's Firefox in 2009, Quilez created the first online live-coding environment and curated repository of procedural shaders. This content was donated by 18 authors from the demoscene and showcased advanced real-time and interactive animations never seen on the Web before, such as raymarched metaballs, fractals and tunnel effects.

After working together on several real-time rendering projects for years, in December 2012 Quilez and Jeremias decided to create a new Shadertoy site that would follow the tradition of the original Shadertoy page, with its demoscene-flavored, resource- and size-constrained real-time graphics content, but would add social and community features and embrace an open-source attitude.

The site launched with a live editor, real-time playback, browsing and searching capabilities, and tagging and commenting features. Content-wise, Shadertoy provided a fixed and limited set of textures for its users to utilize in creative ways. Over the years Shadertoy added extra features, such as webcam and microphone input support, video, music, virtual reality rendering and multi-pass rendering.

There are over 31 thousand contributions in total from thousands of users, several of which are referenced in academic papers[citation needed]. Shadertoy also hosts annual competitions and events.[5]

Features

  • Editing: syntax highlighted editor with immediate visual feedback
  • Social: commenting on shadertoys, voting (liking)
  • Sharing: permanent URLs, embedded in other websites, private shader sharing
  • Rendering: floating point buffer based multipass and history
  • Media inputs: microphone, webcam, keyboard, mouse, VR HMDs, soundcloud, video, textures

Mentions


Shadertoy is referenced in several sources:

  • NVidia developer blog, Jun 2016, Shadertoy Contest 2016 Announced.[6]
  • Siggraph Real-Time Live!, 2015, an interactive sound visualizing project.[7]
  • Hacker News, 2014, Shadertoy adds procedural GPU-generated music in the browser.[8]
  • Numerical Methods for Ray Tracing Implicitly Defined Surfaces.[9]
  • CS 371 Course at Williams College, 2014, Inspiration for CS 371[10]
  • Real-Time Rendering, Aug 2015, Seven Things for August 20, 2015.[11]

References

from Grokipedia
Shadertoy is a free online platform and community dedicated to the creation, sharing, and exploration of computer graphics through programmable shaders written in GLSL and rendered with WebGL in web browsers. Co-founded by software engineers Iñigo Quilez and Pol Jeremias Vila under their company Beautypi, the site launched publicly in February 2013, building on earlier experiments dating back to 2009. It provides an online editor with syntax-highlighted code editing, immediate visual feedback, and support for advanced features including multi-pass rendering, procedural sound generation, virtual reality output, texture and audio inputs, and keyboard interactions. Users can fork existing shaders, comment on submissions, rate them via likes, and discover content through galleries, fostering a collaborative space for artists, programmers, and educators to experiment with ray marching, fractals, signed distance fields, and other graphics techniques. As of 2018, Shadertoy had attracted creators from more than 100 countries and thousands of shared artworks and tools, emphasizing open sharing and crowdsourced innovation in browser-based graphics, as recognized in its nomination for the STARTS Prize.

Introduction

Overview

Shadertoy is an online community and platform that enables users to create, edit, and share procedural shaders directly in web browsers using WebGL and GLSL (the OpenGL Shading Language). It functions as a collaborative space for developers, artists, and hobbyists to experiment with real-time shader programming without requiring specialized software installations. The core purpose of Shadertoy revolves around generating procedural content, where users write code to produce dynamic visual outputs such as artistic patterns, fluid animations, ray-traced 3D models, complex simulations, and visuals that respond to audio inputs in real time. These capabilities make it a versatile tool for exploring mathematical and algorithmic approaches to computer graphics. Shadertoy features tens of thousands of public shaders contributed by thousands of users worldwide, underscoring its significance in making GPU-accelerated graphics accessible to a broad audience beyond professional environments. This scale reflects its community-driven evolution, fostering inspiration and learning through shared examples. At its technical foundation, Shadertoy leverages HTML Canvas elements combined with WebGL for rendering, allowing shaders to execute on the user's GPU directly within the browser for immediate feedback and high-performance computation.

History

Shadertoy was co-founded by Iñigo Quilez, a demoscene veteran known for procedural graphics productions, and Pol Jeremias Vila under their company Beautypi, both motivated by Quilez's early experiments beginning in 2009. Quilez's work on the 4KB demoscene intro Elevated (2009) demonstrated what real-time procedural rendering could achieve in a few kilobytes of code, and his personal shader demos hosted on his website evolved into a shared platform concept by late 2012. The site officially launched in February 2013 as a collaborative space for creating and sharing live-coded fragment shaders, drawing inspiration from demoscene traditions of compact, algorithmic visuals. Initially focused on single-pass rendering with inputs like time, mouse, and audio, it quickly attracted contributors interested in procedural art and graphics experimentation. Key milestones included the addition of multi-pass rendering in early 2016, enabling complex simulations by allowing shaders to reference previous frame outputs across up to four buffers. Around the same period, support for virtual reality was introduced through a dedicated mainVR entry point, facilitating stereoscopic rendering for VR headsets and expanding applications to immersive environments. By 2016, the platform hosted thousands of user-submitted shaders, reflecting growing interest in browser-based graphics amid WebGL's maturation. Shadertoy's growth was propelled by advancements in web standards and the rising popularity of ray-marching techniques, fostering a community-driven repository that integrated seamlessly with emerging web technologies for real-time visuals.

Technical Features

Shader Editor

The Shader Editor in Shadertoy serves as the primary interface for developing GLSL fragment shaders, providing a web-based environment that facilitates real-time code editing and visualization. The layout features a preview pane utilizing a WebGL canvas to render shader outputs instantaneously, alongside a code editor with syntax highlighting for GLSL. This setup allows users to observe dynamic effects as they type, with automatic compilation upon code modifications for immediate feedback. Basic controls in the preview area include options to pause animation, record output, and toggle full-screen mode.

The basic workflow centers on the "Image" tab, where users author fragment shaders that compute pixel colors on a per-fragment basis. Editing occurs directly in the browser, with live rendering updating the preview pane to reflect changes, enabling iterative experimentation without manual compilation steps. Error highlighting appears in the editor to identify syntax issues, streamlining debugging. Shaders are saved and shared via unique URLs generated upon publication.

Shadertoy primarily supports fragment shaders written in GLSL ES 1.00, targeting WebGL 1.0 or higher for browser compatibility, with no software download required. Predefined uniforms simplify common inputs, such as iTime for elapsed seconds since shader start (enabling animations), iResolution for viewport dimensions in pixels, and iMouse for mouse coordinates and click state (x, y for position; z, w for click detection). These are automatically provided and cannot be modified by users. Mobile browser support exists but is limited by input handling differences, such as touch gestures not fully mapping to mouse events.

A typical shader is built around the mandatory entry point function void mainImage(out vec4 fragColor, in vec2 fragCoord). Within this function, computations determine the output color for each pixel based on fragCoord, the pixel's coordinates in the output, which are typically divided by iResolution.xy to obtain normalized coordinates from 0.0 to 1.0. For instance:

```glsl
// Built-in uniforms (provided automatically by Shadertoy):
// uniform vec3  iResolution;  // viewport resolution (in pixels)
// uniform float iTime;        // shader playback time (in seconds)
// uniform vec4  iMouse;       // mouse position (x, y) and click (z, w)

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    // Normalized pixel coordinates (from 0 to 1)
    vec2 uv = fragCoord/iResolution.xy;

    // Time-varying color
    vec3 col = 0.5 + 0.5*cos(iTime + uv.xyx + vec3(0,2,4));

    fragColor = vec4(col, 1.0);
}
```

This example produces a simple animated color gradient, illustrating how uniforms integrate with core computations.

Rendering and Inputs

Shadertoy's rendering pipeline employs a full-screen quad approach, where the GPU renders a quadrilateral spanning the entire canvas using two triangles. Each invocation of the fragment shader computes the color for a single pixel based on its coordinates, accessed via the built-in fragCoord variable and typically normalized against the canvas resolution. This setup allows shaders to generate procedural visuals directly in the fragment stage without vertex processing or geometry, focusing computation on per-pixel effects.

Standard inputs are provided through built-in uniforms that supply essential runtime data to the shader. The iTime uniform delivers the elapsed time in seconds since the shader started, enabling animations and temporal effects. iResolution is a vec3 representing the viewport dimensions in pixels, with the z-component indicating the pixel aspect ratio (usually 1.0). iFrame tracks the current frame number as an integer, useful for frame-specific logic, while iChannelTime[4] offers per-channel timing in seconds for synchronized multi-input scenarios. Additional uniforms like iTimeDelta (frame render time in seconds), iFrameRate (frames per second), iMouse (mouse coordinates and click state as a vec4), iDate (current date and time as a vec4), and iSampleRate (audio sample rate, typically 44100 Hz) further support interactive and time-based rendering. These are accessible across image, buffer, and sound shader types.

Media inputs enhance shaders with external data via the iChannel samplers (up to four: iChannel0–3), each a sampler2D texture. Webcam capture is supported by assigning the device to an iChannel (e.g., iChannel0), providing real-time video as a rectangular texture whose resolution is queryable via iChannelResolution[4]. Uploaded images serve as static 2D textures, often in sRGB format requiring gamma correction (e.g., pow(texture(iChannelX, uv).rgb, vec3(2.2)) for linear space). Videos function similarly but evolve over time, with no texture wrapping or mipmapping by default. Microphone audio is integrated as a texture (typically 512×2 pixels) assigned to an iChannel, where the first row encodes FFT spectrum data (frequency magnitudes) and the second row holds raw waveform samples; FFT access uses UV y=0.25, with x mapping to frequency normalized by half the sample rate. These inputs are sampled using texture(iChannelX, uv), where uv ranges from 0 to 1, allowing integration into pixel computations for effects like audio-reactive visuals or image-based distortions.

Output is handled directly within the shader's mainImage(out vec4 fragColor, in vec2 fragCoord) function, where the computed color is assigned to fragColor (e.g., fragColor = vec4(finalColor, 1.0);) and written to the framebuffer. In basic mode, this represents a single-pass render without an integrated post-processing chain, ensuring immediate display of the fragment results. Execution is inherently GPU-bound, leveraging parallel fragment processing for real-time performance, with a typical target of 60 frames per second on capable hardware. Complex shaders may drop below this on lower-end GPUs due to intensive computations, prompting optimizations like reduced sampling or algorithmic simplifications, though fallbacks ensure compatibility across devices.
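The audio texture layout described above can be put to work in a few lines. The following sketch (assuming an audio source, music or microphone, is bound to iChannel0) draws spectrum bars from the FFT row and overlays the raw waveform as a bright line:

```glsl
// Audio-reactive sketch: assumes iChannel0 holds Shadertoy's 512x2
// audio texture (row y=0.25 = FFT spectrum, row y=0.75 = waveform).
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord / iResolution.xy;

    // Spectrum magnitude for this column's frequency bin.
    float fft  = texture(iChannel0, vec2(uv.x, 0.25)).x;

    // Raw waveform sample for this column.
    float wave = texture(iChannel0, vec2(uv.x, 0.75)).x;

    vec3 col = vec3(0.0);
    if (uv.y < fft)              col = vec3(uv.x, 0.4, 1.0 - uv.x); // bar
    if (abs(uv.y - wave) < 0.01) col = vec3(1.0);                   // line

    fragColor = vec4(col, 1.0);
}
```

Because the FFT row is normalized by half the sample rate, the left edge of the screen corresponds to low frequencies (bass) and the right edge to high ones.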

Advanced Capabilities

Shadertoy enables multi-pass rendering through four dedicated buffers (A through D), which facilitate intermediate computations by allowing the output of one buffer to serve as input for subsequent passes or the main image shader. These buffers operate on a frame-by-frame basis, supporting techniques such as ray marching in Buffer A to generate volumetric data that feeds into Buffer B for compositing or post-processing. This pipeline structure enhances complexity without requiring external tools, though buffers reset upon resolution changes or shader reloads. The Common tab provides a mechanism for sharing code snippets across all buffers, the image shader, sound output, and cubemaps, promoting reusability of utilities like procedural noise functions.

Specialized inputs extend functionality, including cube maps accessed via the iChannel uniforms (e.g., iChannel0 as a samplerCube) when bound to a cubemap buffer, where developers generate or load six-face textures for seamless spherical mapping. Additionally, VR mode, introduced in 2015, integrated head tracking through the WebVR API to deliver stereoscopic rendering for immersive experiences, but was discontinued by around 2024.

Audio capabilities support real-time reactive shaders via FFT textures, which expose frequency spectra from built-in music tracks or generated waveforms, enabling synchronized visuals. Microphone input, available through the miscellaneous input options, treats live audio as a texture resource for modulating visuals based on user-captured sound. Export features include downloading the shader code in GLSL format for offline use or porting, alongside embeddable iframes that allow integration into external websites while preserving interactive playback.

Despite these advances, Shadertoy imposes limitations to maintain browser compatibility, such as exclusive use of fragment shaders without vertex shader support, strict adherence to GLSL ES 1.00 syntax, and lack of persistent storage for edits by non-logged-in users, requiring account login for saving or forking creations.
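A common use of the buffer system is a feedback loop, in which a buffer reads its own previous frame. A minimal sketch (assuming this code lives in the Buffer A tab, with Buffer A bound to its own iChannel0 and the Image pass simply displaying Buffer A):

```glsl
// Buffer A: fading trail that follows the mouse. Assumes this buffer's
// previous frame is bound to iChannel0 (self-feedback).
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord / iResolution.xy;

    // Read last frame's result and fade it slightly each frame.
    vec3 prev = texture(iChannel0, uv).rgb * 0.97;

    // Stamp a soft bright dot at the current mouse position (pixels).
    float d = length(fragCoord - iMouse.xy);
    vec3 col = prev + vec3(1.0, 0.8, 0.3) * smoothstep(8.0, 0.0, d);

    fragColor = vec4(col, 1.0);
}
```

The same self-feedback pattern underlies fluid simulations, cellular automata, and progressive path tracers on the platform, with the buffer acting as persistent state between frames.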

Community and Content

User Contributions

Users on Shadertoy create a wide array of content, primarily focused on real-time rendering techniques. Common types include procedural art such as fractals and landscapes, technical demos featuring ray-traced scenes and particle systems, simple interactive games implemented within shaders, and educational examples demonstrating noise functions or lighting models. The platform hosts tens of thousands of public shaders contributed by thousands of users worldwide, showcasing significant scale and diversity. Shaders are organized into categories like "Popular," "New," and "Trending" to highlight favorites, recent uploads, and rising works. Users maintain profiles that serve as portfolios, allowing them to curate and display their collections of shaders.

Notable trends in user contributions include the increasing adoption of ray-marching techniques combined with signed distance fields (SDFs) for rendering complex scenes, a development that gained momentum shortly after the platform's launch in 2013. Past events, such as the 2017 Shadertoy competition, encouraged focused creation with weekly themes or ties to global challenges to foster innovation. Curation relies on community mechanisms rather than formal moderation: shaders gain visibility through accumulated views and likes, promoting high-engagement content. The forking feature enables users to iterate on existing shaders, building upon others' work to refine techniques and outputs collaboratively. Iconic examples include Quilez's demonstrations of distance functions, which illustrate foundational SDF concepts for procedural modeling, and various community-driven ray tracers that showcase advanced real-time rendering effects.
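The ray-marching-with-SDFs style mentioned above reduces to a compact pattern: a signed distance function describes the scene, and the shader steps a ray forward by the distance to the nearest surface ("sphere tracing"). A minimal illustrative sketch, shading a single sphere by march depth only:

```glsl
// Minimal sphere tracer over a signed distance field:
// a unit sphere at the origin, camera at z = -3.
float sdScene(vec3 p) { return length(p) - 1.0; }  // sphere SDF

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    // Centered, aspect-corrected screen coordinates.
    vec2 p = (2.0*fragCoord - iResolution.xy) / iResolution.y;

    vec3 ro = vec3(0.0, 0.0, -3.0);    // ray origin (camera)
    vec3 rd = normalize(vec3(p, 2.0)); // ray direction

    // March: advance by the distance to the nearest surface.
    float t = 0.0;
    for (int i = 0; i < 64; i++) {
        float d = sdScene(ro + rd*t);
        if (d < 0.001 || t > 10.0) break;
        t += d;
    }

    // Shade hits by depth; misses get a dark background.
    vec3 col = (t < 10.0) ? vec3(1.0 - t*0.25) : vec3(0.05);
    fragColor = vec4(col, 1.0);
}
```

Replacing sdScene with combinations of primitive distance functions (boxes, tori, smooth unions) is how many of the platform's complex scenes are modeled.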

Social and Sharing Features

Shadertoy facilitates user interaction and collaboration through a suite of integrated tools designed for showcasing and disseminating procedural graphics creations. Each published shader receives a unique, permanent URL, enabling straightforward sharing via hyperlinks across platforms. Additionally, the platform generates embed codes that allow shaders to be incorporated into external websites using standard iframe elements, broadening their reach beyond the native environment.

A core sharing mechanism is the fork functionality, which permits users to duplicate an existing shader as a starting point for modifications, preserving the original while enabling iterative development and variant creation. This feature aligns with the platform's open-source principles, where all shader code remains fully visible and editable upon forking, fostering a branching model akin to open-source development, though without integrated git-like version tracking.

Interaction among users is supported by comment sections attached to every shader, providing spaces for feedback, technical discussions, and community suggestions directly on the creation. Shaders also feature a liking system, where users can upvote appreciated works to boost their prominence in popularity-based rankings and explorations. Playlists offer a curation tool, allowing registered users to assemble and share themed collections of shaders for organized viewing or recommendation.

User accounts enhance personalization and management capabilities, including the ability to save unpublished drafts, designate shaders as private or unlisted to control visibility, and follow creators' outputs through profile subscriptions. While the platform is entirely free to access, logging in is optional for browsing and viewing public shaders but mandatory for saving, forking, or uploading new content. Forum-like discussions primarily unfold within these comment threads, supplemented by full code visibility that encourages direct study of and learning from peers' work.

Impact and Applications

Educational Uses

Shadertoy provides an accessible entry point for beginners to explore core GLSL concepts, including vectors, matrices, and shading techniques, without requiring local software installation or hardware configuration. The platform's browser-based environment delivers instant visual feedback on code modifications, allowing learners to iteratively build and refine shaders focused on fragment processing. This setup is particularly effective for hands-on experimentation with real-time graphics, as highlighted in its design for live-coding procedural effects.

In academic settings, Shadertoy has been integrated into university curricula and workshops to teach computer graphics principles. For instance, Carnegie Mellon University's CS 15-462 course employs it for assignments on real-time shading, where students implement lighting models like diffuse, specular, Fresnel, and ambient components using provided GLSL templates. Similarly, the University of Cambridge's Introduction to Graphics course recommends Shadertoy as a tool for experimenting with shader programming. Conference workshops, starting from 2015, have featured dedicated sessions on using Shadertoy to teach advanced topics such as raymarching, procedural texturing, and volumetric rendering through live demonstrations.

For self-study, Shadertoy offers built-in example shaders that serve as starting points for topics like noise functions and fractals, complemented by detailed articles from co-founder Quilez on techniques such as raymarching and distance fields. Community-driven resources, including extensions like the 2023 VSCode Shader Toy plugin, enable integration with development environments for offline editing and previewing. These tools support gradual progression from basic pixel manipulation to complex procedural animations.

The platform's pedagogical strengths lie in its emphasis on visual immediacy, which aids comprehension of GPU parallelism and shader execution, while inherent constraints, such as browser compatibility and resource limits, foster creative problem-solving. By 2025, Shadertoy has been referenced in numerous academic papers on procedural graphics education, underscoring its impact in fostering intuitive learning of real-time rendering concepts.

Notable Mentions and Integrations

Shadertoy has garnered significant industry recognition through sponsored contests and prestigious events. In 2016, NVIDIA sponsored the Shadertoy Competition, a series of challenges designed to highlight advanced shader programming techniques, culminating in winners selected based on creativity and technical prowess. The platform's shaders have been showcased at SIGGRAPH Real-Time Live! events since 2015, where top entries from dedicated competitions were presented interactively to audiences, emphasizing real-time graphics innovations. Early buzz appeared on Hacker News in 2014, with discussions praising features like procedural GPU-generated music that pushed browser-based rendering boundaries. Developer blogs have similarly noted Shadertoy's role in democratizing access to complex graphics experiments.

In media and cultural spheres, Shadertoy has influenced the demoscene, a creative subculture focused on audiovisual programming. It was prominently featured in the 2024 Revision demoparty's shader jam, where participants live-coded effects using the platform's tools during networked sessions. Shaders from the site have inspired procedural art techniques in games and films, adapting mathematical models for dynamic visuals in production environments.

Integrations extend Shadertoy's reach beyond the web. The Kodi visualization addon, which renders audio-reactive visuals from the platform, has seen updates through 2025, supporting versions up to Kodi 21 with ongoing maintenance for compatibility. Developers commonly adapt GLSL code from Shadertoy for porting to frameworks like Unity and Three.js, enabling reuse in 3D scenes via ShaderMaterial and uniform mappings. In 2025, efforts to port popular Shadertoy shaders to WebGPU have further advanced its use in compute shader research and cross-platform development. Browser enhancements include unofficial Chrome extensions, such as the Shadertoy plugin, which adds forking capabilities, draft saving, and export options to streamline workflows.

Adaptations for offline use include desktop tools like C++-based fragment shader editors, which replicate Shadertoy's environment for local development as of 2023. By 2025, Shadertoy has been referenced in numerous academic publications, advancing web-based graphics through examples in procedural rendering and real-time shading.

