Shadertoy
| Shadertoy | |
|---|---|
| Original authors | Inigo Quilez & Pol Jeremias |
| Initial release | February 14, 2013 |
| Stable release | 0.8.3 / March 3, 2016 |
| Written in | GLSL, JavaScript, PHP |
| Type | 3D computer graphics tool and community |
| Website | www |
Shadertoy is an online community and tool for creating and sharing shaders through WebGL, used for both learning and teaching 3D computer graphics in a web browser.
Overview
Shadertoy is an online community and platform for computer graphics professionals, academics[1] and enthusiasts who share, learn and experiment with rendering techniques and procedural art through GLSL code. As of mid-2021 there are more than 52 thousand public contributions from thousands of users. WebGL[2] allows Shadertoy to access the compute power of the GPU to generate procedural art, animation, models, lighting, state-based logic and sound.
History
Shadertoy was created by Pol Jeremias and Inigo Quilez in January 2013 and came online in February of the same year.
The roots of the effort lie in Quilez's "Shadertoy" section[3] of his computer-graphics educational website.[4] With the arrival of the initial WebGL implementation in Mozilla's Firefox in 2009, Quilez created the first online live-coding environment and curated repository of procedural shaders. The content was donated by 18 authors from the demoscene and showcased advanced real-time, interactive animations never before seen on the Web, such as raymarched metaballs, fractals and tunnel effects.
After having worked together on several real-time rendering projects for years, Quilez and Jeremias decided in December 2012 to create a new Shadertoy site that would follow the tradition of the original Shadertoy page, with its demoscene-flavored, resource- and size-constrained real-time graphics content, but would add social and community features and embrace an open-source attitude.
The page launched with a live editor, real-time playback, browsing and searching capabilities, and tagging and commenting features. Content-wise, Shadertoy provided a fixed, limited set of textures for its users to employ in creative ways. Over the years Shadertoy added extra features, such as webcam and microphone input support, video, music, virtual-reality rendering and multi-pass rendering.
There are over 31 thousand contributions in total from thousands of users, several of which are referenced in academic papers. Shadertoy also hosts annual competitions and events.[5]
Features
- Editing: syntax-highlighted editor with immediate visual feedback
- Social: commenting on shadertoys, voting (liking)
- Sharing: permanent URLs, embedded in other websites, private shader sharing
- Rendering: floating point buffer based multipass and history
- Media inputs: microphone, webcam, keyboard, mouse, VR HMDs, SoundCloud, video, textures
Mentions
Shadertoy is referenced in several sources:
- NVidia developer blog, Jun 2016, Shadertoy Contest 2016 Announced.[6]
- Siggraph Real-Time Live!, 2015, an interactive sound visualizing project.[7]
- Hacker News, 2014, Shadertoy adds procedural GPU-generated music in the browser.[8]
- Numerical Methods for Ray Tracing Implicitly Defined Surfaces.[9]
- CS 371 Course at Williams College, 2014, Inspiration for CS 371[10]
- Real-Time Rendering, Aug 2015, Seven Things for August 20, 2015.[11]
References
- ^ McGuire, Morgan. "Midterm Inspiration" (PDF). CS371: Computational Graphics [Fall 2014]. Archived (PDF) from the original on 2020-11-15. Retrieved 29 June 2024.
- ^ "Khronos Releases Final WebGL 1.0 Specification". Khronos Group. March 3, 2011. Retrieved 2 June 2012.
- ^ "Shader Toy". www.iquilezles.org. Archived from the original on 2019-06-30. Retrieved 2016-08-07.
- ^ "Inigo Quilez".
- ^ "Siggraph 2015 Shadertoy Competition". Archived from the original on 2016-09-10. Retrieved 2016-08-07.
- ^ "NVidia developer blog". 2016. Retrieved 2 June 2016.
- ^ "Shadertoy Competition at Siggraph 2015 . Real-Time Live!". Archived from the original on 2016-09-10. Retrieved 2015-08-13.
- ^ "Hacker News". Y Combinator. Retrieved 2020-08-31.
- ^ "Numerical Methods for Ray Tracing Implicitly Defined Surfaces" (PDF). Williams College. Archived from the original (PDF) on 2015-09-06. Retrieved 2014-09-25.
- ^ "CS 371" (PDF). Williams College.[dead link]
- ^ "Real-Time Rendering - Seven Things for August 20, 2015". realtimerendering.com. 2015. Retrieved 20 August 2015.
External links
- Shadertoy

Shadertoy
Overview
Shadertoy is an online community and platform that enables users to create, edit, and share procedural shaders directly in web browsers using WebGL and GLSL (OpenGL Shading Language).[3] It functions as a collaborative space for developers, artists, and hobbyists to experiment with real-time graphics programming without requiring specialized software installations. The core purpose of Shadertoy revolves around generating procedural content: users write code to produce dynamic visual outputs such as artistic patterns, fluid animations, ray-traced 3D models, complex lighting simulations, and visuals that respond to audio inputs in real time.[3][4] These capabilities make it a versatile tool for exploring mathematical and algorithmic approaches to computer graphics.[5]

As of 2024, Shadertoy features over 80,000 public shaders contributed by tens of thousands of users worldwide, underscoring its significance in making GPU-accelerated graphics accessible to a broad audience beyond professional environments.[4] This scale reflects its community-driven evolution, fostering inspiration and learning through shared examples.[6] At its technical foundation, Shadertoy leverages HTML5 Canvas elements combined with WebGL for rendering, allowing shaders to execute on the user's GPU directly within the browser for immediate feedback and high-performance computation.[3]

History
Shadertoy was co-founded by Iñigo Quilez, a demoscene veteran known for procedural graphics productions since the late 1990s, and Pol Jeremias Vila under their company Beautypi, both motivated by Quilez's early WebGL experiments beginning in 2009.[7][2] Quilez's work on the 4KB demoscene production Elevated (2009) demonstrated compact real-time procedural rendering, and his personal shader demos, hosted on his website, evolved into a shared-platform concept by late 2012.[7] The site officially launched in February 2013 as a collaborative space for creating and sharing live-coded fragment shaders, drawing inspiration from demoscene traditions of compact, algorithmic visuals.[2] Initially focused on single-pass rendering with inputs like time, mouse, and audio, it quickly attracted contributors interested in procedural art and graphics experimentation via HTML5 and WebGL.[8]

Key milestones included the addition of multi-pass rendering in early 2016, enabling complex simulations by allowing shaders to reference previous frame outputs across up to four buffers.[9] Around the same period, support for virtual reality was introduced through a dedicated mainVR entry point, facilitating stereoscopic rendering for headsets like the Oculus Rift and expanding applications to immersive environments.[10] By 2016, the platform hosted thousands of user-submitted shaders, reflecting growing interest in browser-based graphics amid WebGL's maturation.[7]
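A minimal sketch of the VR entry point mentioned above, using the mainVR signature documented by Shadertoy; the "scene" here is a hypothetical placeholder that simply shades by ray direction:

```glsl
// mainVR is called instead of mainImage when a VR headset is active.
// The platform supplies a per-pixel camera ray origin and direction
// for each eye, so the shader can render a stereoscopic view.
void mainVR( out vec4 fragColor, in vec2 fragCoord,
             in vec3 fragRayOri, in vec3 fragRayDir )
{
    // Placeholder scene: color each pixel by its ray direction,
    // mapping the -1..1 direction components into 0..1 color.
    vec3 col = 0.5 + 0.5*fragRayDir;
    fragColor = vec4(col, 1.0);
}
```

In a real shader the ray origin and direction would typically feed a raymarching loop, so the same scene code can serve both mainImage (with a hand-built camera) and mainVR.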
Shadertoy's growth was propelled by advancements in WebGL standards and the rising popularity of procedural generation techniques, fostering a community-driven repository that integrated seamlessly with emerging web technologies for real-time visuals.[8]
Technical Features
Shader Editor
The Shader Editor in Shadertoy serves as the primary interface for developing GLSL fragment shaders, providing a web-based environment for real-time code editing and visualization. The layout features a preview pane that uses a WebGL canvas to render shader output instantaneously, alongside a code editor with syntax highlighting for GLSL. This setup lets users observe dynamic effects as they type, with automatic compilation upon code modification for immediate feedback. Basic controls in the preview area include options to pause animation, record output, and toggle full-screen mode.[11]

The basic workflow centers on the "Image" tab, where users author fragment shaders that compute pixel colors on a per-fragment basis. Editing occurs directly in the browser, with live rendering updating the preview pane to reflect changes, enabling iterative experimentation without manual compilation steps. Error highlighting appears in the editor to flag syntax issues, streamlining debugging. Shaders are saved and shared via unique URLs generated upon publication.[12]

Shadertoy primarily supports fragment shaders written in GLSL ES 1.00, targeting WebGL 1.0 or higher for browser compatibility, with no software download required. Predefined uniforms simplify common inputs, such as iTime for elapsed seconds since shader start (enabling animations), iResolution for viewport dimensions in pixels, and iMouse for mouse coordinates and click state (x, y for position; z, w for click detection). These are provided automatically and cannot be modified by users. Mobile browser support exists but is limited by input-handling differences, such as touch gestures not fully mapping to mouse events.[12][13]
A typical shader structure begins with the (predeclared) uniform inputs followed by the mandatory entry point function void mainImage(out vec4 fragColor, in vec2 fragCoord). Within this function, computations determine the output color for each pixel based on fragCoord, which holds the pixel's coordinates and is typically divided by iResolution.xy to obtain normalized coordinates from 0.0 to 1.0. For instance:
```glsl
// Built-in uniforms (predeclared by Shadertoy; shown here for reference)
uniform vec3  iResolution; // viewport resolution (in pixels)
uniform float iTime;       // shader playback time (in seconds)
uniform vec4  iMouse;      // mouse position (x, y) and click (z, w)

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    // Normalized pixel coordinates (from 0 to 1)
    vec2 uv = fragCoord/iResolution.xy;

    // Time-varying color
    vec3 col = 0.5 + 0.5*cos(iTime + uv.xyx + vec3(0,2,4));

    fragColor = vec4(col, 1.0);
}
```
Rendering and Inputs
Shadertoy's rendering pipeline employs a full-screen quad approach, where the GPU renders a quadrilateral spanning the entire canvas using two triangles. Each invocation of the fragment shader computes the color for a single pixel based on its UV coordinates, typically derived from the built-in fragCoord variable normalized against the canvas resolution. This setup allows shaders to generate procedural visuals directly in the fragment stage without vertex processing or geometry, focusing computation on per-pixel effects.[14]
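Because every pixel of the quad is shaded independently, per-pixel coordinate setup is usually the first step in a shader. A small sketch (the circle is an arbitrary example shape) showing how fragCoord is commonly recentered and divided by the canvas height so shapes are not stretched on non-square canvases:

```glsl
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    // Center the coordinates and scale by height:
    // p.y spans -1..1, p.x spans a wider range on widescreen canvases
    vec2 p = (2.0*fragCoord - iResolution.xy) / iResolution.y;

    // Signed distance to a circle of radius 0.5 at the center
    float d = length(p) - 0.5;

    // Light inside the circle, dark outside, with a soft edge
    vec3 col = mix(vec3(0.9), vec3(0.1), smoothstep(0.0, 0.01, d));
    fragColor = vec4(col, 1.0);
}
```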
Standard inputs are provided through built-in uniforms that supply essential runtime data to the shader. The iTime uniform delivers the elapsed time in seconds since the shader started, enabling animations and temporal effects. iResolution is a vec3 representing the viewport dimensions in pixels, with the z-component indicating pixel aspect ratio (usually 1.0). iFrame tracks the current frame number as an integer, useful for frame-specific logic, while iChannelTime[4] offers per-channel timing in seconds for synchronized multi-input scenarios. Additional uniforms like iTimeDelta (frame render time in seconds), iFrameRate (frames per second), iMouse (mouse coordinates and click state as vec4), iDate (current date and time as vec4), and iSampleRate (audio sample rate, typically 44100 Hz) further support interactive and time-based rendering. These are accessible across image, buffer, and sound shader types.[14]
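A short sketch combining several of the uniforms listed above (iResolution, iTime, iMouse) into an interactive effect; the specific visual, a pulsing dot that follows the mouse, is an illustrative choice:

```glsl
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord / iResolution.xy;   // pixel position in 0..1
    vec2 m  = iMouse.xy / iResolution.xy;   // mouse position in 0..1

    // Distance to the mouse, corrected for the canvas aspect ratio
    float d = length((uv - m) * vec2(iResolution.x/iResolution.y, 1.0));

    // Radius oscillates over time via iTime
    float radius = 0.04 + 0.01*sin(iTime*3.0);

    // Light dot on a dark background
    vec3 col = mix(vec3(1.0), vec3(0.1, 0.2, 0.4),
                   smoothstep(radius, radius*1.5, d));
    fragColor = vec4(col, 1.0);
}
```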
Media inputs enhance shaders with external data via the iChannel samplers (up to four: iChannel0–3), each a sampler2D texture. Webcam capture is supported by assigning the device to an iChannel (e.g., iChannel0), providing real-time video as a rectangular texture whose resolution is queryable via iChannelResolution[4]. Uploaded images serve as static 2D textures, often in sRGB format requiring gamma correction (e.g., pow(texture(iChannelX, uv).rgb, vec3(2.2)) for linear space). Videos function similarly but evolve over time, with no texture wrapping or mipmapping by default. Microphone audio is integrated as a texture (typically 512×2 pixels) assigned to an iChannel, where the first row encodes FFT spectrum data (frequency magnitudes) and the second row holds raw waveform samples; FFT access uses UV y=0.25, with x mapping frequency normalized by half the sample rate. These inputs are sampled using texture(iChannelX, uv), where uv ranges from 0 to 1, allowing integration into pixel computations for effects like audio-reactive visuals or image-based distortions.[14][15][16]
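The channel conventions above can be sketched in a single shader. This example assumes, hypothetically, that the user has bound an sRGB image to iChannel0 and the microphone to iChannel1; it applies the gamma correction and FFT-row sampling described in the text:

```glsl
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 uv = fragCoord / iResolution.xy;

    // sRGB texture -> approximately linear space via gamma 2.2
    vec3 img = pow(texture(iChannel0, uv).rgb, vec3(2.2));

    // Low-frequency energy: sample the FFT row (y = 0.25) of the
    // 512x2 microphone texture near the low end of the spectrum
    float bass = texture(iChannel1, vec2(0.05, 0.25)).x;

    // Modulate brightness with the audio signal
    vec3 col = img * (0.5 + 0.5*bass);
    fragColor = vec4(col, 1.0);
}
```

Sampling the second row (y = 0.75) instead would read the raw waveform, which is useful for oscilloscope-style visuals.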
Output is handled directly within the shader's mainImage(out vec4 fragColor, in vec2 fragCoord) function, where the computed color is assigned to fragColor (e.g., fragColor = vec4(finalColor, 1.0);) and written to the canvas framebuffer. In basic mode, this represents a single-pass render without an integrated post-processing chain, ensuring immediate display of the fragment results.[14]
Execution is inherently GPU-bound, leveraging parallel fragment processing for real-time performance, with a typical target of 60 frames per second on capable hardware. Complex shaders may drop below this on lower-end GPUs due to intensive computations, prompting optimizations like reduced sampling or algorithmic simplifications, though fallbacks ensure compatibility across devices.[14]
