## An affirmation of a concept that exists in Calculus 2: the integral of (1/x)

There are certain concepts in Calculus 2, which introduces definite and indefinite integrals, that are taught to college and university students, and that are actually considered basic information in higher Math. One of them is that the integral of (1/x) is the natural logarithm of (x).

Yet, some people just like to go around and dispute such things, much as the claim sometimes becomes popular that (2+2) does not equal (4). And so, what I have just done is to set aside the obvious fact that people who have studied Calculus at a much higher level than I have, have found an analytical proof, and to ask the question:

‘What would happen if the integrals of simple power functions were plotted, whose powers are slightly more negative and slightly more positive than (-1), next to this accepted answer, the natural logarithm of (x)?’ The accepted answer should always fall between those two curves, even after some plausible arbitrary constant is added to each power-function integral, such as one which sets all the functions equal to zero when the parameter equals one. Not only that, but it’s easy for me to plot some functions. And so, the following two worksheets have resulted:

Testing the Integral of (1/x) – EPUB File for Mobile Devices

Testing the Integral of (1/x) – PDF File for Desktop and Laptop Computers
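The squeeze that the question describes can also be checked numerically. The following is a minimal sketch of my own (it is not one of the worksheets above), using the antiderivative of a power function with its constant chosen so that every curve equals zero when the parameter equals one:

```python
import math

def F(p, x):
    """Integral of t**p from 1 to x, i.e. (x**(p+1) - 1) / (p+1),
    with the arbitrary constant chosen so that F(p, 1) == 0."""
    return (x**(p + 1) - 1.0) / (p + 1)

# ln(x) should always fall between the integrals for powers
# slightly more negative and slightly more positive than (-1):
for x in (0.25, 0.5, 2.0, 10.0, 1000.0):
    below = F(-1.01, x)   # power slightly more negative than -1
    above = F(-0.99, x)   # power slightly more positive than -1
    assert below < math.log(x) < above
    print(f"x = {x:>7}: {below:.5f} < ln(x) = {math.log(x):.5f} < {above:.5f}")
```

As the chosen powers are brought closer to (-1), both bounding curves converge onto the natural logarithm.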

Further, I’d just like to remind the reader that a function can easily be defined which follows a continuous curve, except at one parameter value, at which it has a different value, such that the neighbouring intervals in the domain of said function do not include that point as an endpoint, in either case. The only question which remains is whether such a function is the correct answer to a given question. And, because such functions are possible, whether there are exceptions to how this function is to be computed depends on additional information.
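As a minimal illustration of such a function (the expression and the values here are my own invention, not taken from the worksheets), a curve can be given a different value at exactly one parameter value:

```python
def f(x):
    """Equals (x*x - 1)/(x - 1), which simplifies to x + 1, at every
    parameter value except x == 1, where an unrelated value is assigned."""
    if x == 1.0:
        return 5.0
    return (x * x - 1.0) / (x - 1.0)

# The limit from either side of x == 1 is 2.0, yet f(1.0) is 5.0:
print(f(0.999999), f(1.0), f(1.000001))
```

Whether such a definition is a *correct* answer cannot be decided from the formula alone; it depends on what additional information the question supplies.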

(Update 1/26/2020, 20h20 : )

## The role Materials play in CGI

When content-designers work with their favorite model editors or scene editors, in 3D, towards providing either a 3D game or another type of 3D application, they will often not map their 3D models directly to texture images. Instead, they will often connect each model to one Material, and the Material will then base its behavior on zero or more texture images. And a friend of mine has asked what this describes.

Effectively, these Materials replace what a programmed shader would do, to define the surface properties of the simulated 3D model. They tend to have a greater role in CPU rendering / ray tracing than they do with raster-based, DirectX- or OpenGL-based graphics, but high-level editors may also be able to apply Materials to the hardware-rendered graphics, IF they can provide some type of predefined shader that implements what the Material is supposed to implement.

A Material will often state such parameters as Gloss, Specular, Metallicity, etc. When a camera-reflection-vector is computed, this reflection vector will point in some 3D direction relative to the defined light sources. Hence, a dot-product can be computed between it and the direction of each light source. Gloss represents the power to which this dot-product is raised, and higher powers result in specular highlights that become narrower. Often, Gloss must be compensated for the fact that the integral of a power function is less than (1.0) times that of a lower power, and that therefore, the average brightness of a glossy surface would otherwise seem to decrease…
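As a sketch of this idea, here is a simplified, Phong-style specular term of my own; it is not taken from any particular engine, and the (gloss + 2) / 2 factor is just one commonly used compensation, assumed here for illustration:

```python
import math

def specular(reflection_dot_light, gloss):
    """Dot-product of the camera-reflection-vector with the light
    direction, clamped and raised to the Gloss power.  The
    (gloss + 2) / 2 factor compensates for the fact that higher
    powers integrate to smaller values, which would otherwise dim
    the average brightness of glossier surfaces."""
    d = max(0.0, reflection_dot_light)
    return (gloss + 2.0) / 2.0 * d ** gloss

# Higher gloss -> a narrower highlight: the term falls off faster
# as the angle away from perfect alignment grows:
for n in (8, 64):
    print([round(specular(math.cos(a), n), 3) for a in (0.0, 0.2, 0.4)])
```

Note how, at 0.4 radians off-axis, the Gloss-64 highlight has already fallen to nearly nothing, while the Gloss-8 highlight is still visible.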

But, if a content-designer enrolls a programmed shader, especially a Fragment Shader, then this shader replaces everything that a Material would otherwise have provided. It is often less practical, though not impossible, to implement a programmed shader in software-rendered contexts, which is the main reason the use of Materials still prevails there.

Also, the notion often occurs to people, however unproven, that Materials will only provide basic shading options, such as ‘DOT3 Bump-Mapping’, so that programmed shaders need to be used if more sophisticated shading options are required, such as Tangent-Mapping. Yet, as I just wrote, every blend-mode a Material offers is defined by some sort of predefined shader – i.e., by a pre-programmed algorithm.

OGRE is an open-source rendering system which requires that content-designers assign Materials to their models, even though hardware-rendering is being used, and these Materials then cause shaders to be loaded. Hence, if an OGRE content-designer wants to code his own shader, he must first also define his own Material, which will then load his custom shader.