\chapter{Methodology}

\section{Introduction to the Methodology}
% Write 1–2 paragraphs that:
% - Explain why this chapter exists (to describe how the research was carried out).
% - Summarize the main components (research design, tools, implementation, validation).
% - Link back to the Problem Statement and Literature Review: this chapter shows *how* you addressed the problem.

This chapter explains how the research was carried out and how the core algorithms were developed. The synthesis did not arise from a gradual accumulation of partial solutions, but from a decisive insight gained after years of experience with \(3\)D illustration. That insight was to frame the entire problem in terms of affine linear algebra, which provided the clarity and rigor needed to resolve clipping and occlusion systematically.

The implementation was carried out in \LaTeX{} and Lua, not by accident but because I am a mathematics textbook illustrator, and these are the tools I know and use daily. This choice ensured that the algorithms were embedded directly in the environment where they are most relevant, while also demonstrating their applicability beyond it. The following sections present the design, implementation, and validation of the algorithms, showing how they directly address the fundamental gaps identified in the Problem Statement and Literature Review.

\section{Research Design and Overall Approach}
% Write 1–2 paragraphs that:
% - State the overall research design (qualitative, quantitative, mixed-methods, computational, experimental).
% - Justify why this design fits your research objectives.
% - Mention whether the approach is exploratory, confirmatory, or developmental.

The research design of this work is computational and developmental in nature. Its objective was not to test an existing theory, but to create and validate a new one.
The approach is best described as exploratory, since the central algorithms were developed from first principles through problem-driven experimentation, guided by years of practical experience with \(3\)D illustration. This design is well-suited to the research objectives, which required the invention of novel methods for clipping and occlusion ordering of affine simplices. Rather than relying on preexisting frameworks, the work advances a new synthesis, implemented and tested in Lua and \LaTeX{}. The emphasis throughout is on demonstrating the feasibility and rigor of the approach, supported by concrete computational validation.

\section{Detailed Description of the Proposed Approach}
% Write several paragraphs that:
% - Describe the methodology, algorithm, framework, or system step by step.
% - Break it into subcomponents if necessary (e.g., preprocessing, computation, output).
% - Keep the description precise but readable: focus on the "what" before the "why."
%
% Overview paragraph
% Describe the full workflow: parametric objects → tessellation (with transformations) → clipping → occlusion → rendering
% Mention that the following subsections describe each component in detail

Users are provided with commands that automatically tessellate, clip, and occlusion-sort parametric objects with projective transformations applied. The following subsections describe each component of the system in detail.

\subsection{How Users Interact with the Software}
% Describe the LaTeX macros and Lua functions available to authors
% Add placeholders for example commands, syntax, and usage patterns

\subsubsection{Command Name: \texttt{\textbackslash displaysegments}}
This command clips intersecting simplices produced by the tessellation of parametric objects and performs occlusion sorting on the resulting set. It ensures that overlapping simplices are rendered correctly according to their depth and spatial relationships.
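A minimal end-to-end sketch of a figure built with these commands follows. This is illustrative only: it assumes the append commands and \verb|\displaysegments| may be issued inside a \texttt{tikzpicture} environment, and the option values imitate the per-command examples given in this chapter.

```latex
\begin{tikzpicture}
  % Append a transformed curve and a point to the list of segments ...
  \appendcurve[
    ustart = {0}
    ,ustop = {1}
    ,usamples = {36}
    ,x = {cos(tau*u)}
    ,y = {sin(tau*u)}
    ,z = {0}
    ,draw options = {draw, blue}
    ,transformation = {euler(pi/2,pi/3,pi/6)}
  ]
  \appendpoint[
    x = {1}
    ,y = {0}
    ,z = {0}
    ,fill options = {fill = red}
    ,transformation = {euler(pi/2,pi/3,pi/6)}
  ]
  % ... then clip, occlusion-sort, and emit the resulting paths.
  \displaysegments
\end{tikzpicture}
```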
\subsubsection{Command Name: \texttt{\textbackslash appendpoint}}
This command appends a projectively transformed \(0\)-dimensional affine simplex (a point) to the list of simplices, also called the list of segments. An example of its usage is as follows:
\begin{verbatim}
\appendpoint[
  x = {cos(tau/6)}
  ,y = {tau/6}
  ,z = {0}
  ,fill options = {
    fill = red
    ,fill opacity = 0.7
  }
  ,transformation = {euler(pi/2,pi/3,pi/6)}
]
\end{verbatim}

\subsubsection{Command Name: \texttt{\textbackslash appendlabel}}
This command appends a label positioned at a projectively transformed \(0\)-dimensional affine simplex (a point) to the list of simplices. An example of its usage is as follows:
\begin{verbatim}
\appendlabel[
  x = {cos(tau/6)}
  ,y = {tau/6}
  ,z = {0}
  ,name = {Hi!}
  ,transformation = {euler(pi/2,pi/3,pi/6)}
]
\end{verbatim}
Note: math mode inside labels is not yet fully supported.

\subsubsection{Command Name: \texttt{\textbackslash appendsurface}}
This command tessellates a projectively transformed \(2\)-dimensional parametric surface into triangles, and adds those triangles to the list of segments. An example of its usage is as follows:
\begin{verbatim}
\appendsurface[
  ustart = {0}
  ,ustop = {1}
  ,usamples = {18}
  ,vstart = {0}
  ,vstop = {1}
  ,vsamples = {9}
  ,x = {sphere(tau*u/36,pi*v/18)[1][1]}
  ,y = {sphere(tau*u/36,pi*v/18)[1][2]}
  ,z = {sphere(tau*u/36,pi*v/18)[1][3]}
  ,fill options = {
    preaction = {
      fill = green
      ,fill opacity = 0.8
    }
    ,postaction = {
      draw
      ,line join = round
      ,line cap = round
    }
  }
  ,transformation = {euler(pi/2,pi/3,pi/6)}
]
\end{verbatim}

\subsubsection{Command Name: \texttt{\textbackslash appendcurve}}
This command tessellates a projectively transformed parametric curve into line segments, and adds those line segments to the list of segments.
An example of its usage is as follows:
\begin{verbatim}
\appendcurve[
  ustart = {0}
  ,ustop = {1}
  ,usamples = {36}
  ,x = {cos(tau*u)}
  ,y = {sin(tau*u)}
  ,z = {0}
  ,draw options = {
    draw
    ,red
  }
  ,transformation = {euler(pi/2,pi/3,pi/6)}
  ,arrow tip = {true}
  ,arrow tip options = {fill = green}
  ,arrow tail = {true}
]
\end{verbatim}

\subsubsection{Command Name: \texttt{\textbackslash appendsolid}}
This command tessellates a projectively transformed parametric solid into boundary triangles, and adds those triangles to the list of segments. An example of its usage is as follows:
\begin{verbatim}
\appendsolid[
  ustart = {0}
  ,ustop = {1}
  ,usamples = {18}
  ,vstart = {0}
  ,vstop = {1}
  ,vsamples = {6}
  ,wstart = {0}
  ,wstop = {1}
  ,wsamples = {2}
  ,x = {w*sphere(tau*u/36,pi*v/18)[1][1]}
  ,y = {w*sphere(tau*u/36,pi*v/18)[1][2]}
  ,z = {w*sphere(tau*u/36,pi*v/18)[1][3]}
  ,fill options = {
    preaction = {
      fill = green
      ,fill opacity = 0.8
    }
    ,postaction = {
      draw
      ,line join = round
      ,line cap = round
    }
  }
  ,transformation = {euler(pi/2,pi/3,pi/6)}
]
\end{verbatim}

\subsection{How Parametric Objects Are Tessellated into Simplices with Transformations Applied}
% Describe the representation of parametric objects
% Explain how surfaces or curves are divided into points, line segments, and triangles
% Explain how transformations (translation, rotation, scaling) are applied to simplices during tessellation
% Add notes for adjustable resolution, user options, and affine coordinates

A zero-dimensional parametric object requires only one sample and is tessellated by a single zero-dimensional affine simplex (a point). A one-dimensional parametric object (a curve) often requires more, and is tessellated by one-dimensional affine simplices (line segments). This is done using an even subdivision of the parameter into samples. Similarly, a two-dimensional parametric object is sampled along both input parameters to obtain a mesh, which is then triangulated.
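The grid sampling and triangulation just described can be sketched as follows. This is an illustrative Python rendering of the idea, not the author's Lua code; the function and parameter names are hypothetical, chosen to echo the \texttt{usamples}/\texttt{vsamples} keys above.

```python
import math

def tessellate_surface(f, ustart, ustop, usamples, vstart, vstop, vsamples):
    """Sample a parametric surface f(u, v) -> (x, y, z) on an even grid
    and split each grid quad into two triangles."""
    def u_at(i):
        return ustart + (ustop - ustart) * i / usamples
    def v_at(j):
        return vstart + (vstop - vstart) * j / vsamples
    # Sample the (usamples+1) x (vsamples+1) grid of points.
    grid = [[f(u_at(i), v_at(j)) for j in range(vsamples + 1)]
            for i in range(usamples + 1)]
    triangles = []
    for i in range(usamples):
        for j in range(vsamples):
            a, b = grid[i][j], grid[i + 1][j]
            c, d = grid[i + 1][j + 1], grid[i][j + 1]
            # A quad's four corners need not be coplanar,
            # so each quad is emitted as two triangles.
            triangles.append((a, b, c))
            triangles.append((a, c, d))
    return triangles

# Example: a unit sphere sampled at low resolution.
def sphere(u, v):
    return (math.cos(u) * math.sin(v),
            math.sin(u) * math.sin(v),
            math.cos(v))

tris = tessellate_surface(sphere, 0, math.tau, 18, 0, math.pi, 9)
```

Any projective transformation would be applied to the sampled vertices before the triangles are appended to the list of segments.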
Triangles are used instead of quadrilaterals because quads are often not coplanar and thus ambiguous. A parametric solid is tessellated along three parameters in the same way.

\subsection{How Intersections Between Simplices Are Detected and Resolved}
% Explain the clipping algorithm for segments and triangles
% Include comments for edge cases, affine operations, and examples

Intersections are eliminated through minimal partitioning of simplices. To identify the cases, a bottom-up approach is taken: we first determine the meaningful ways to partition lower-dimensional simplices, and then extend this to higher dimensions. There is no meaningful way to partition a point by another point, so that case is omitted. A point can divide a line segment into two if they intersect, so this case is retained. A point and a triangle have no meaningful partitioning with respect to each other. A line segment can partition another line segment, and a line segment can also be partitioned by a triangle. Finally, a triangle can be partitioned by another triangle. For reasons of minimality, only one of the two intersecting simplices is partitioned in each case.

\subsubsection{Partitioning a Line Segment by a Point}
We first check whether a point lies on a line segment, and if it does, we partition the line segment at that point. To perform this test, the line segment is expressed as a one-dimensional affine basis by replacing the second endpoint with its difference from the first. We then compute the vector from the affine origin to the point. Next, we take the orthogonal projection of this vector onto the segment's direction vector. If the sum of this projection with the affine origin is nearly coincident with the point, we proceed with testing; otherwise, the point is not on the segment. To confirm, we apply Gauss--Jordan elimination to express the point in terms of the line's affine basis.
If the resulting coordinate lies within the unit interval, the point is indeed on the segment, and the segment is partitioned.

\subsubsection{Partitioning a Line Segment by Another Line Segment}
We first check whether two line segments intersect, and if they do, we partition one of them at the intersection point. To detect an intersection, each segment is expressed as a one-dimensional affine basis. The lines spanned by these bases are then intersected using Gauss--Jordan elimination. If the resulting parameters for both segments lie within their respective unit intervals, the segments intersect, and partitioning is performed.

\subsubsection{Partitioning a Line Segment by a Triangle}
If an intersection between the line segment and the triangle can be detected, we partition the line segment at that point. Both simplices are expressed as affine bases in their respective dimensions, and the intersection is obtained by solving the resulting linear system via Gauss--Jordan elimination. If the line segment's coefficient lies within the unit interval, the candidate intersection is retained; otherwise, it is discarded.

When the line segment does intersect the plane of the triangle, we determine whether the intersection point lies inside the triangle using the cross-product method \cite{51328}. In this approach, the triangle is tessellated into one-dimensional affine bases (its edges), and each vertex is connected to the point being tested, forming another one-dimensional affine basis. For every pair of affine bases emanating from a common vertex, we compute their rotationally ordered cross product. If all such cross products point in the same direction, the point lies inside the triangle. If the point lies outside, at least one rotationally ordered cross product will traverse its angle in the opposite orientation, producing an anticommutative (sign-reversed) result.
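The cross-product test just described can be sketched as follows. This is an illustrative Python rendering, not the author's Lua code; it assumes the point is already known to be coplanar with the triangle, and it counts boundary points as inside.

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def point_in_triangle(p, tri):
    """Walk the edges in rotational order; at each vertex, cross the
    edge vector with the vertex-to-point vector.  The point is inside
    exactly when all three cross products share a direction."""
    normals = []
    for i in range(3):
        a, b = tri[i], tri[(i + 1) % 3]
        normals.append(cross(sub(b, a), sub(p, a)))
    # An outside point makes at least one cross product flip sign
    # (the anticommutative case described in the text).
    return all(dot(normals[i], normals[(i + 1) % 3]) >= 0 for i in range(3))
```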
\subsubsection{Partitioning a Triangle by Another Triangle}
The partitioning of a triangle by another triangle builds upon the previous cases. The first step is to detect intersections between their edges. If two intersections are found (the expected outcome, though safeguards are included for degenerate cases), the cutting triangle partitions the other along the line defined by these two points. This produces a triangle and a quadrilateral, the latter of which is subdivided into two triangles. The intersections themselves are computed by treating each edge as a line segment and determining its intersection with the opposing triangle.

\subsection{How Occlusion Ordering Is Determined}
% Explain orthogonal projection onto the viewing plane
% Describe the localized ordering of overlapping simplices
% Explain how the partial transitive order is resolved into a global ordering
% Add comments for pseudocode, diagrams, and examples

Occlusion ordering in our system is performed by first orthogonally projecting the two simplices onto the viewing plane. If a point of overlap is detected, we sort them according to the inverse orthogonal projection of that point back onto each shape.

\subsubsection{Point Versus Point}
If the points are not coincident but their projections coincide, then they are ordered by depth. Otherwise, the test remains inconclusive.

\subsubsection{Point Versus Line Segment}
This routine determines the occlusion relationship between a point and a line segment. It first expresses the line segment in terms of an affine basis, given by its origin and direction vector. The point is then projected orthogonally onto the line defined by the segment, producing a candidate projection. If the projection of the point onto the viewing plane (its \(xy\)-coordinates) is nearly identical to the projection of this candidate, the test proceeds; otherwise, the point and line segment are considered not to occlude each other.
Next, the algorithm checks whether the projection lies within the bounds of the segment itself by comparing vector signs and computing a normalized coefficient. If the projection falls within the unit interval of the segment, the algorithm reduces the problem to comparing the depth of the point and its projection on the line. If these conditions fail, the test is inconclusive.

\subsubsection{Point Versus Triangle}
This routine compares the occlusion relationship between a point and a triangle. The point and the triangle are first projected orthogonally onto the viewing plane, where a cross-product test is applied to check whether the projected point lies inside the projected triangle. If this test fails, the point and triangle are considered not to occlude one another. If the point lies inside the projection, the algorithm then projects the point vertically onto the plane of the triangle. Using an affine basis for the triangle, the barycentric coordinates of the point with respect to the triangle are solved via Gauss--Jordan elimination. If the solution lies within the unit simplex (both coordinates nonnegative and summing to at most one, which ensures the projection is inside the triangle), the algorithm reduces the problem to a point-point occlusion comparison between the original point and its projection on the triangle. If any step fails to satisfy these conditions, the test remains inconclusive.

\subsubsection{Line Segment Versus Line Segment}
This routine determines the occlusion relationship between two line segments. First, both segments are projected onto the viewing plane, and their direction vectors are computed. If the direction vectors are not parallel, the algorithm solves for parameters \(t\) and \(s\) in the affine equations of the two lines using Gauss--Jordan elimination. If both parameters lie within the unit interval, the intersection point of the two segments is found, and the occlusion is reduced to a point-point comparison of the coincident intersection.
If the direction vectors are parallel, the algorithm instead falls back to endpoint tests: each endpoint of one segment is compared against the other segment using the point-line occlusion procedure. If any endpoint is found to occlude, the segments are ordered accordingly. If no consistent ordering can be determined, the test remains inconclusive.

\subsubsection{Line Segment Versus Triangle}
This routine compares the occlusion relationship between a line segment and a triangle. The algorithm begins by testing each endpoint of the segment against the triangle using the point-triangle occlusion procedure. It then tests the segment itself against each of the triangle's edges using the line-segment occlusion procedure. If any of these comparisons establishes a definite ordering (occluded in front or behind), that result is returned. If no consistent conclusion can be drawn from the endpoint and edge tests, the routine returns inconclusive.

\subsubsection{Triangle Versus Triangle}
This routine compares the occlusion relationship between two triangles. The algorithm first tests each edge of the first triangle against the second using the line-segment--triangle occlusion procedure. If any edge establishes a definite ordering (in front or behind), that result is immediately returned. If these edge tests are inconclusive, the algorithm proceeds by testing the vertices of the first triangle against the second, and the vertices of the second triangle against the first, using the point-triangle occlusion procedure. If any of these vertex tests determines a clear ordering, that result is returned. If neither the edge nor the vertex tests establish a consistent relationship, the routine concludes that the occlusion status is inconclusive.
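The pairwise routines above yield only a partial order of "behind/in front" verdicts; a global back-to-front drawing order can then be extracted with a topological sort. The following Python sketch illustrates the idea (the actual implementation is in Lua; \texttt{compare} is a stand-in for the pairwise occlusion procedures, and cycle handling is deliberately omitted, matching the cyclic-overlap limitation noted later in this chapter).

```python
from collections import defaultdict, deque

def occlusion_sort(simplices, compare):
    """Resolve pairwise occlusion verdicts into a back-to-front order.
    compare(a, b) -> -1 (a behind b), 1 (a in front of b), or 0
    (inconclusive: the pair imposes no ordering constraint)."""
    n = len(simplices)
    succ = defaultdict(list)   # draw each key before its successors
    indegree = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            c = compare(simplices[i], simplices[j])
            if c == -1:        # i is behind j: draw i first
                succ[i].append(j)
                indegree[j] += 1
            elif c == 1:       # j is behind i: draw j first
                succ[j].append(i)
                indegree[i] += 1
    # Kahn's algorithm; cyclic overlap would leave nodes unprocessed.
    queue = deque(i for i in range(n) if indegree[i] == 0)
    order = []
    while queue:
        i = queue.popleft()
        order.append(simplices[i])
        for j in succ[i]:
            indegree[j] -= 1
            if indegree[j] == 0:
                queue.append(j)
    return order
```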
\subsection{How the Final Illustration Is Rendered in LaTeX}
% Describe how ordered simplices are passed to TikZ via LuaTeX
% Include notes for integration, examples, and figure placement

Once the simplices are ordered, the command \verb|\displaysegments| identifies the type of each simplex along with any assigned options, and prints empty path statements that the user may customize with their desired options.

\section{Rationale and Development Process of the Approach}
% Write 1–2 paragraphs that:
% - Explain the reasoning behind choosing this method (why this over alternatives).
% - Discuss influences from existing literature or practice.
% - If you adapted or extended something, explain how and why.

I chose this method over alternatives because I wanted a reliable way to render intersecting low-resolution parametric objects accurately. Existing graphics software in the \TeX{} ecosystem often handled these cases incorrectly, which motivated me to build a system of my own. The goal was not only to address this gap within \TeX{} but also to contribute a more principled solution to computer graphics more broadly.

\section{Data, Tools, and Resources Used}
% Write 1–2 paragraphs that:
% - Identify datasets (source, size, characteristics).
% - List software libraries, frameworks, or languages.
% - Note hardware or computational resources if relevant.
% - Explain why these resources were chosen (availability, reliability, suitability).

This project does not rely on external datasets; it operates entirely on geometric primitives generated within the system itself. The implementation is written primarily in Lua, chosen for its seamless integration with Lua\LaTeX{}. All major libraries for vector arithmetic, cross-product tests, and affine-basis manipulations were developed independently by the author. A few minor helper functions, notably the Gauss--Jordan solver and the topological sort, were assisted by ChatGPT.
For visualization and rendering within \LaTeX{}, the Ti\textit{k}Z package was used to draw and display all simplices.

\section{Implementation Details}
% Write several paragraphs (with equations, pseudocode, or diagrams if appropriate) that:
% - Provide enough technical detail for reproducibility.
% - Specify parameters, configurations, or assumptions used.
% - Use structured elements (tables, figures, algorithms) for clarity where needed.

For the implementation details, please see the source code.

\section{Validation Strategy}
% Write 1–2 paragraphs that:
% - Describe how you plan to test or evaluate the approach.
% - Define benchmarks, comparison methods, or criteria for success.
% - Explain whether validation is internal (self-testing), external (against existing methods), or both.

To validate the algorithm, all clipping and occlusion cases will be tested systematically. This includes points, line segments, and triangles in intersecting and non-intersecting configurations, as well as triangle-triangle and line-triangle clipping scenarios. Practical examples will be provided to illustrate correctness. Validation is primarily internal, but results can be cross-checked against existing computational geometry tools for external verification.

\section{Challenges Encountered and Solutions Implemented}
% Write 2–3 paragraphs that:
% - Describe obstacles (technical, theoretical, practical).
% - Explain how you addressed them (workarounds, adjustments, innovations).
% - Reflect on what these solutions reveal about the strengths/weaknesses of the method.

Being, to my knowledge, the first to coherently orchestrate linear algebra for the clipping and occlusion sorting of \(0\)- to \(2\)-dimensional affine simplices, I encountered several failed attempts before finally achieving a correct solution. The major breakthrough came when I realized that starting with low-dimensional cases and progressively working upward was the key to success.
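The systematic case testing described in the Validation Strategy can be sketched, for the segment-segment case, as a handful of assertions of the following kind. This is an illustrative Python sketch in two dimensions, not the author's Lua test suite, and a closed-form \(2\times 2\) solve (Cramer's rule) stands in for the Gauss--Jordan elimination used in the actual code.

```python
def segment_intersection(p1, p2, q1, q2, eps=1e-9):
    """Solve p1 + t*(p2-p1) = q1 + s*(q2-q1) for t and s in the plane;
    return the intersection point if both lie in the unit interval."""
    r = (p2[0] - p1[0], p2[1] - p1[1])
    d = (q2[0] - q1[0], q2[1] - q1[1])
    det = r[0] * (-d[1]) - (-d[0]) * r[1]   # determinant of [r | -d]
    if abs(det) < eps:
        return None                         # parallel or degenerate
    b = (q1[0] - p1[0], q1[1] - p1[1])
    t = (b[0] * (-d[1]) - (-d[0]) * b[1]) / det
    s = (r[0] * b[1] - b[0] * r[1]) / det
    if -eps <= t <= 1 + eps and -eps <= s <= 1 + eps:
        return (p1[0] + t * r[0], p1[1] + t * r[1])
    return None

# Intersecting, non-intersecting, and parallel configurations.
assert segment_intersection((0, 0), (1, 1), (0, 1), (1, 0)) == (0.5, 0.5)
assert segment_intersection((0, 0), (1, 0), (0, 1), (1, 1)) is None
assert segment_intersection((0, 0), (1, 1), (2, 2), (3, 3)) is None
```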
\section{Limitations of the Current Methodology}
% Write 1–2 paragraphs that:
% - Acknowledge shortcomings, constraints, or simplifying assumptions.
% - Be transparent but constructive: limitations set up justification for future work.

The current methodology does not account for cyclic overlap; this aspect is deferred to a future edition.

\section{Remaining Challenges and Directions for Future Work}
% Write 1–2 paragraphs that:
% - Point out unresolved methodological questions.
% - Suggest how future research could refine or extend the approach.
% - Distinguish between what you couldn’t address due to scope vs. what remains genuinely open.

The main remaining challenge is the elimination of cyclic overlap. Additionally, future work may include clipping objects by other objects and potentially working with their intersection sets.

\section{Summary of the Methodology}
% Write 1 short paragraph that:
% - Recaps the key methodological choices.
% - Restates why this methodology is appropriate for the research problem.
% - Provides a smooth transition into the Results and Analysis chapter.

In this chapter, a computational and developmental approach was presented for systematically tessellating, clipping, and occlusion-sorting zero- to two-dimensional affine simplices. The methodology leverages affine linear algebra to detect intersections and determine depth ordering, implemented entirely in Lua and \LaTeX{} with visualization via Ti\textit{k}Z. By addressing low-dimensional cases first and progressively building up, the system ensures accurate handling of points, line segments, and triangles, while all major clipping and occlusion scenarios are implemented. With the methodology fully specified and operational, the next chapter focuses on validating the approach: testing all relevant cases, examining correctness, and demonstrating the effectiveness of the algorithms in practice.