Examine individual changes
This page allows you to examine the variables generated by the Edit Filter for an individual change.
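The variables listed below are the same ones that edit filter conditions test against. As a purely illustrative sketch (not an actual filter on this wiki, with arbitrary example thresholds), a condition written in the AbuseFilter rule language might combine several of them like this:

```
/* Illustrative sketch only: flag large mainspace additions
   by accounts that are not yet autoconfirmed.
   All thresholds are arbitrary example values. */
page_namespace == 0 &                 /* main (article) namespace */
edit_delta > 4000 &                   /* the edit grew the page by over 4000 bytes */
user_editcount < 100 &                /* relatively new account */
!("autoconfirmed" in user_groups)     /* not yet autoconfirmed */
```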
Variables generated for this change
Variable | Value |
---|---|
Edit count of the user (user_editcount) | 4530 |
Name of the user account (user_name) | 'Kjerish' |
Age of the user account (user_age) | 293206139 |
Groups (including implicit) the user is in (user_groups) | [
0 => 'extendedconfirmed',
1 => '*',
2 => 'user',
3 => 'autoconfirmed'
] |
Rights that the user has (user_rights) | [
0 => 'extendedconfirmed',
1 => 'createaccount',
2 => 'read',
3 => 'edit',
4 => 'createtalk',
5 => 'writeapi',
6 => 'viewmyprivateinfo',
7 => 'editmyprivateinfo',
8 => 'editmyoptions',
9 => 'abusefilter-log-detail',
10 => 'urlshortener-create-url',
11 => 'centralauth-merge',
12 => 'abusefilter-view',
13 => 'abusefilter-log',
14 => 'vipsscaler-test',
15 => 'collectionsaveasuserpage',
16 => 'reupload-own',
17 => 'move-rootuserpages',
18 => 'createpage',
19 => 'minoredit',
20 => 'editmyusercss',
21 => 'editmyuserjson',
22 => 'editmyuserjs',
23 => 'purge',
24 => 'sendemail',
25 => 'applychangetags',
26 => 'viewmywatchlist',
27 => 'editmywatchlist',
28 => 'spamblacklistlog',
29 => 'mwoauthmanagemygrants',
30 => 'reupload',
31 => 'upload',
32 => 'move',
33 => 'autoconfirmed',
34 => 'editsemiprotected',
35 => 'skipcaptcha',
36 => 'ipinfo',
37 => 'ipinfo-view-basic',
38 => 'transcode-reset',
39 => 'transcode-status',
40 => 'createpagemainns',
41 => 'movestable',
42 => 'autoreview',
43 => 'enrollasmentor'
] |
Whether the user is editing from the mobile app (user_app) | false |
Whether the user is editing through the mobile interface (user_mobile) | false |
Page ID (page_id) | 74947242 |
Page namespace (page_namespace) | 0 |
Page title without namespace (page_title) | 'Gaussian splatting' |
Full page title (page_prefixedtitle) | 'Gaussian splatting' |
Edit protection level of the page (page_restrictions_edit) | [] |
Page age in seconds (page_age) | 1561240 |
Action (action) | 'edit' |
Edit summary/reason (summary) | 'Started page' |
Old content model (old_content_model) | 'wikitext' |
New content model (new_content_model) | 'wikitext' |
Old page wikitext, before the edit (old_wikitext) | '#REDIRECT [[Volume rendering#Splatting]]
{{Rcat shell|
{{R to section}}
}}' |
New page wikitext, after the edit (new_wikitext ) | ''''Gaussian splatting''' is a [[volume rendering]] technique that deals with the direct rendering of volume data without converting the data into surface or line [[Geometric primitive|primitives]].<ref name="splatting">{{cite web|last=Westover|first=Lee Alan|title=SPLATTING: A Parallel, Feed-Forward Volume Rendering Algorithm|url=http://www.cs.unc.edu/techreports/91-029.pdf|access-date=28 June 2012|date=July 1991}}{{dead link|date=September 2023}}</ref> The technique was originally introduced as splatting by Lee Westover in the early 1990s.<ref name="fastsplat">{{cite web|last=Huang|first=Jian|title=Splatting|url=http://web.eecs.utk.edu/~huangj/CS594S02/splatting.ppt|access-date=5 August 2011|format=PPT|date=Spring 2002}}</ref> With advancements in computer graphics, newer methods such as 3D and 4D Gaussian splatting have been developed to offer [[Real-time computer graphics|real-time]] radiance field rendering and dynamic scene rendering respectively.<ref name="3d"/><ref name="4d"/>
== Development ==
[[Volume rendering]] focuses on the generation of images from discrete samples of volume data that are usually sampled in three dimensions. This data can originate from a variety of sources, such as [[CT scan]]s or ozone density readings. Traditional methods transformed this volumetric data into lines and surface [[Geometric primitive|primitives]] to be displayed on computer graphics displays. However, this approach sometimes introduced [[Artifact (error)|artifacts]] and hampered interactive viewing.<ref name="splatting"/>
To address these issues, direct rendering techniques, which operate on the original volume data, were developed. These methods are classified into two primary categories: feed-backward methods and feed-forward methods. The splatting algorithm, being a feed-forward method, is directly concerned with the rendering of [[Rectilinear polygon|rectilinear volume]] meshes. Splatting can be customized to render volumes as either clouds or surfaces by altering the shading functions, providing flexibility in the rendering process.<ref name="splatting"/>
Since its inception, splatting has undergone various improvements. Some significant developments include textured splats, [[Spatial anti-aliasing|anti-aliasing]], image-aligned sheet-based splatting, post-classified splatting, and the introduction of a splat primitive known as FastSplats.<ref name="fastsplat"/>
=== 3D Splatting ===
Recent advancements in novel-view synthesis have showcased the utility of [[Neural radiance field|Radiance Field]] methods. To enhance visual quality while ensuring real-time display rates, a new method was introduced that uses 3D Gaussians. This method integrates sparse points produced during camera calibration, representing scenes with 3D Gaussians which retain properties of continuous volumetric radiance fields. Additionally, an [[Interleaving (data)|interleaved]] optimization/density control of the 3D Gaussians was introduced along with a rapid visibility-aware rendering algorithm supporting anisotropic splatting.<ref name="3d">{{cite arXiv|authors=Bernhard Kerbl, Georgios Kopanas, Thomas Leimkühler, George Drettakis|date=8 Aug 2023|title=3D Gaussian Splatting for Real-Time Radiance Field Rendering|eprint=2308.04079|class=cs.GR}}</ref> This technique has shown potential in synthesizing high-quality 3D scenes from 2D images in real-time.<ref name="hackaday">{{cite web|url=https://hackaday.com/2023/09/02/high-quality-3d-scene-generation-from-2d-source-in-realtime|title=HIGH QUALITY 3D SCENE GENERATION FROM 2D SOURCE, IN REALTIME|last=Papp|first=Donald|access-date=October 18, 2023|website=hackaday.com|publisher=[[Hackaday]]}}</ref>
=== 4D Gaussian Splatting ===
Extending the concept of 3D Gaussian Splatting, the 4D Gaussian Splatting incorporates a time component, allowing for dynamic scene rendering. It represents and renders dynamic scenes, with a focus on modeling complex motions while maintaining efficiency.<ref name="4d">{{cite arXiv|authors=Guanjun Wu, Taoran Yi, Jiemin Fang, Lingxi Xie, Xiaopeng Zhang, Wei Wei, Wenyu Liu, Qi Tian, Xinggang Wang|date=12 Oct 2023|title=4D Gaussian Splatting for Real-Time Dynamic Scene Rendering|eprint=2310.08528|class=cs.CV}}</ref> The method uses a HexPlane to connect different adjacent Gaussians, providing an accurate representation of position and shape deformations. By utilizing only a single set of canonical 3D Gaussians, and predictive analytics, the 4D Gaussian splatting method models how they move over different timestamps.<ref name="venturebeat">{{cite web|url=https://venturebeat.com/ai/actors-worst-fears-come-true-new-4d-gaussian-splatting-method-captures-human-motion|title=Actors’ worst fears come true? New 4D Gaussian splatting method captures human motion|last=Franzen|first=Carl|access-date=October 18, 2023|website=venturebeat.com|publisher=[[VentureBeat]]}}</ref>
Achievements of this technique include real-time rendering on dynamic scenes with high resolutions, while maintaining quality. It showcases potential applications for future developments in film and other media, although there are current limitations regarding the length of motion captured.<ref name="venturebeat"/>
== See Also ==
* [[Neural radiance field]]
* [[Volume rendering]]
* [[Computer graphics]]
== References ==
<references />
{{Computer vision footer}}' |
Unified diff of changes made by edit (edit_diff) | '@@ -1,4 +1,27 @@
-#REDIRECT [[Volume rendering#Splatting]]
-{{Rcat shell|
-{{R to section}}
-}}
+'''Gaussian splatting''' is a [[volume rendering]] technique that deals with the direct rendering of volume data without converting the data into surface or line [[Geometric primitive|primitives]].<ref name="splatting">{{cite web|last=Westover|first=Lee Alan|title=SPLATTING: A Parallel, Feed-Forward Volume Rendering Algorithm|url=http://www.cs.unc.edu/techreports/91-029.pdf|access-date=28 June 2012|date=July 1991}}{{dead link|date=September 2023}}</ref> The technique was originally introduced as splatting by Lee Westover in the early 1990s.<ref name="fastsplat">{{cite web|last=Huang|first=Jian|title=Splatting|url=http://web.eecs.utk.edu/~huangj/CS594S02/splatting.ppt|access-date=5 August 2011|format=PPT|date=Spring 2002}}</ref> With advancements in computer graphics, newer methods such as 3D and 4D Gaussian splatting have been developed to offer [[Real-time computer graphics|real-time]] radiance field rendering and dynamic scene rendering respectively.<ref name="3d"/><ref name="4d"/>
+
+== Development ==
+
+[[Volume rendering]] focuses on the generation of images from discrete samples of volume data that are usually sampled in three dimensions. This data can originate from a variety of sources, such as [[CT scan]]s or ozone density readings. Traditional methods transformed this volumetric data into lines and surface [[Geometric primitive|primitives]] to be displayed on computer graphics displays. However, this approach sometimes introduced [[Artifact (error)|artifacts]] and hampered interactive viewing.<ref name="splatting"/>
+
+To address these issues, direct rendering techniques, which operate on the original volume data, were developed. These methods are classified into two primary categories: feed-backward methods and feed-forward methods. The splatting algorithm, being a feed-forward method, is directly concerned with the rendering of [[Rectilinear polygon|rectilinear volume]] meshes. Splatting can be customized to render volumes as either clouds or surfaces by altering the shading functions, providing flexibility in the rendering process.<ref name="splatting"/>
+
+Since its inception, splatting has undergone various improvements. Some significant developments include textured splats, [[Spatial anti-aliasing|anti-aliasing]], image-aligned sheet-based splatting, post-classified splatting, and the introduction of a splat primitive known as FastSplats.<ref name="fastsplat"/>
+
+=== 3D Splatting ===
+Recent advancements in novel-view synthesis have showcased the utility of [[Neural radiance field|Radiance Field]] methods. To enhance visual quality while ensuring real-time display rates, a new method was introduced that uses 3D Gaussians. This method integrates sparse points produced during camera calibration, representing scenes with 3D Gaussians which retain properties of continuous volumetric radiance fields. Additionally, an [[Interleaving (data)|interleaved]] optimization/density control of the 3D Gaussians was introduced along with a rapid visibility-aware rendering algorithm supporting anisotropic splatting.<ref name="3d">{{cite arXiv|authors=Bernhard Kerbl, Georgios Kopanas, Thomas Leimkühler, George Drettakis|date=8 Aug 2023|title=3D Gaussian Splatting for Real-Time Radiance Field Rendering|eprint=2308.04079|class=cs.GR}}</ref> This technique has shown potential in synthesizing high-quality 3D scenes from 2D images in real-time.<ref name="hackaday">{{cite web|url=https://hackaday.com/2023/09/02/high-quality-3d-scene-generation-from-2d-source-in-realtime|title=HIGH QUALITY 3D SCENE GENERATION FROM 2D SOURCE, IN REALTIME|last=Papp|first=Donald|access-date=October 18, 2023|website=hackaday.com|publisher=[[Hackaday]]}}</ref>
+
+=== 4D Gaussian Splatting ===
+Extending the concept of 3D Gaussian Splatting, the 4D Gaussian Splatting incorporates a time component, allowing for dynamic scene rendering. It represents and renders dynamic scenes, with a focus on modeling complex motions while maintaining efficiency.<ref name="4d">{{cite arXiv|authors=Guanjun Wu, Taoran Yi, Jiemin Fang, Lingxi Xie, Xiaopeng Zhang, Wei Wei, Wenyu Liu, Qi Tian, Xinggang Wang|date=12 Oct 2023|title=4D Gaussian Splatting for Real-Time Dynamic Scene Rendering|eprint=2310.08528|class=cs.CV}}</ref> The method uses a HexPlane to connect different adjacent Gaussians, providing an accurate representation of position and shape deformations. By utilizing only a single set of canonical 3D Gaussians, and predictive analytics, the 4D Gaussian splatting method models how they move over different timestamps.<ref name="venturebeat">{{cite web|url=https://venturebeat.com/ai/actors-worst-fears-come-true-new-4d-gaussian-splatting-method-captures-human-motion|title=Actors’ worst fears come true? New 4D Gaussian splatting method captures human motion|last=Franzen|first=Carl|access-date=October 18, 2023|website=venturebeat.com|publisher=[[VentureBeat]]}}</ref>
+
+Achievements of this technique include real-time rendering on dynamic scenes with high resolutions, while maintaining quality. It showcases potential applications for future developments in film and other media, although there are current limitations regarding the length of motion captured.<ref name="venturebeat"/>
+
+== See Also ==
+* [[Neural radiance field]]
+* [[Volume rendering]]
+* [[Computer graphics]]
+
+== References ==
+<references />
+
+{{Computer vision footer}}
' |
New page size (new_size) | 5367 |
Old page size (old_size) | 74 |
Size change in edit (edit_delta) | 5293 |
Lines added in edit (added_lines) | [
0 => ''''Gaussian splatting''' is a [[volume rendering]] technique that deals with the direct rendering of volume data without converting the data into surface or line [[Geometric primitive|primitives]].<ref name="splatting">{{cite web|last=Westover|first=Lee Alan|title=SPLATTING: A Parallel, Feed-Forward Volume Rendering Algorithm|url=http://www.cs.unc.edu/techreports/91-029.pdf|access-date=28 June 2012|date=July 1991}}{{dead link|date=September 2023}}</ref> The technique was originally introduced as splatting by Lee Westover in the early 1990s.<ref name="fastsplat">{{cite web|last=Huang|first=Jian|title=Splatting|url=http://web.eecs.utk.edu/~huangj/CS594S02/splatting.ppt|access-date=5 August 2011|format=PPT|date=Spring 2002}}</ref> With advancements in computer graphics, newer methods such as 3D and 4D Gaussian splatting have been developed to offer [[Real-time computer graphics|real-time]] radiance field rendering and dynamic scene rendering respectively.<ref name="3d"/><ref name="4d"/>',
1 => '',
2 => '== Development ==',
3 => '',
4 => '[[Volume rendering]] focuses on the generation of images from discrete samples of volume data that are usually sampled in three dimensions. This data can originate from a variety of sources, such as [[CT scan]]s or ozone density readings. Traditional methods transformed this volumetric data into lines and surface [[Geometric primitive|primitives]] to be displayed on computer graphics displays. However, this approach sometimes introduced [[Artifact (error)|artifacts]] and hampered interactive viewing.<ref name="splatting"/>',
5 => '',
6 => 'To address these issues, direct rendering techniques, which operate on the original volume data, were developed. These methods are classified into two primary categories: feed-backward methods and feed-forward methods. The splatting algorithm, being a feed-forward method, is directly concerned with the rendering of [[Rectilinear polygon|rectilinear volume]] meshes. Splatting can be customized to render volumes as either clouds or surfaces by altering the shading functions, providing flexibility in the rendering process.<ref name="splatting"/>',
7 => '',
8 => 'Since its inception, splatting has undergone various improvements. Some significant developments include textured splats, [[Spatial anti-aliasing|anti-aliasing]], image-aligned sheet-based splatting, post-classified splatting, and the introduction of a splat primitive known as FastSplats.<ref name="fastsplat"/>',
9 => '',
10 => '=== 3D Splatting ===',
11 => 'Recent advancements in novel-view synthesis have showcased the utility of [[Neural radiance field|Radiance Field]] methods. To enhance visual quality while ensuring real-time display rates, a new method was introduced that uses 3D Gaussians. This method integrates sparse points produced during camera calibration, representing scenes with 3D Gaussians which retain properties of continuous volumetric radiance fields. Additionally, an [[Interleaving (data)|interleaved]] optimization/density control of the 3D Gaussians was introduced along with a rapid visibility-aware rendering algorithm supporting anisotropic splatting.<ref name="3d">{{cite arXiv|authors=Bernhard Kerbl, Georgios Kopanas, Thomas Leimkühler, George Drettakis|date=8 Aug 2023|title=3D Gaussian Splatting for Real-Time Radiance Field Rendering|eprint=2308.04079|class=cs.GR}}</ref> This technique has shown potential in synthesizing high-quality 3D scenes from 2D images in real-time.<ref name="hackaday">{{cite web|url=https://hackaday.com/2023/09/02/high-quality-3d-scene-generation-from-2d-source-in-realtime|title=HIGH QUALITY 3D SCENE GENERATION FROM 2D SOURCE, IN REALTIME|last=Papp|first=Donald|access-date=October 18, 2023|website=hackaday.com|publisher=[[Hackaday]]}}</ref>',
12 => '',
13 => '=== 4D Gaussian Splatting ===',
14 => 'Extending the concept of 3D Gaussian Splatting, the 4D Gaussian Splatting incorporates a time component, allowing for dynamic scene rendering. It represents and renders dynamic scenes, with a focus on modeling complex motions while maintaining efficiency.<ref name="4d">{{cite arXiv|authors=Guanjun Wu, Taoran Yi, Jiemin Fang, Lingxi Xie, Xiaopeng Zhang, Wei Wei, Wenyu Liu, Qi Tian, Xinggang Wang|date=12 Oct 2023|title=4D Gaussian Splatting for Real-Time Dynamic Scene Rendering|eprint=2310.08528|class=cs.CV}}</ref> The method uses a HexPlane to connect different adjacent Gaussians, providing an accurate representation of position and shape deformations. By utilizing only a single set of canonical 3D Gaussians, and predictive analytics, the 4D Gaussian splatting method models how they move over different timestamps.<ref name="venturebeat">{{cite web|url=https://venturebeat.com/ai/actors-worst-fears-come-true-new-4d-gaussian-splatting-method-captures-human-motion|title=Actors’ worst fears come true? New 4D Gaussian splatting method captures human motion|last=Franzen|first=Carl|access-date=October 18, 2023|website=venturebeat.com|publisher=[[VentureBeat]]}}</ref>',
15 => '',
16 => 'Achievements of this technique include real-time rendering on dynamic scenes with high resolutions, while maintaining quality. It showcases potential applications for future developments in film and other media, although there are current limitations regarding the length of motion captured.<ref name="venturebeat"/>',
17 => '',
18 => '== See Also ==',
19 => '* [[Neural radiance field]]',
20 => '* [[Volume rendering]]',
21 => '* [[Computer graphics]]',
22 => '',
23 => '== References ==',
24 => '<references />',
25 => '',
26 => '{{Computer vision footer}}'
] |
Lines removed in edit (removed_lines) | [
0 => '#REDIRECT [[Volume rendering#Splatting]]',
1 => '{{Rcat shell|',
2 => '{{R to section}}',
3 => '}}'
] |
Whether the change was made through a Tor exit node (tor_exit_node) | false |
Unix timestamp of the change (timestamp) | '1697658800' |
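Taken together, variables like these are what allow a filter to characterize a change such as this one, in which a mainspace redirect was expanded into a full article. A minimal sketch of such a condition in the AbuseFilter rule language is shown below; it is illustrative only, and the regular expression and size threshold are assumptions chosen for the example:

```
/* Illustrative sketch only: a redirect being replaced by an article.
   The pattern and threshold are example assumptions. */
page_namespace == 0 &
action == "edit" &
removed_lines irlike "^#redirect" &    /* the old text began as a redirect */
!(new_wikitext irlike "#redirect") &   /* the new text no longer redirects */
edit_delta > 1000                      /* the page grew substantially */
```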