An exercise in converting an arbitrary video clip into ASCII-art.

One of the throwback activities in Computing, which has existed since the 1990s, is so-called ‘ASCII-Art’, in which regular text characters represent an image.

When this form of Art is created by a Human, it can look quite nice. But, if a mere computer program is given a sequence of images to convert into characters as a batch process, the results are usually inferior, because all the program can do is translate each cell of the image into an ASCII character, whose brightness is supposed to represent the original brightness of that cell. The complex shape of the actual text characters is not taken into account – at least, not by any programs I have access to – and those shapes also interfere with the viewer’s ability to recognize the intended image, because they just add random ‘noise’ to it. Had the viewer merely been given grey-scale tiles instead, the image would probably have been easier to recognize.
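As a sketch of what that naive per-cell translation amounts to – the function name and the character ramp below are my own illustration, not taken from any real tool – each brightness sample simply indexes into a string of characters ordered by how much ‘ink’ they print:

function rowToAscii(row, invert = false) {
    // Characters ordered from visually lightest to darkest (an assumption;
    // real tools tune such a ramp per font).
    const RAMP = " .:-=+*#%@";

    // Convert one row of 8-bit grey-scale samples (0 = black, 255 = white)
    // into a string of ASCII characters. By default, bright pixels map to
    // dense characters, as suits bright text on a dark screen; 'invert'
    // flips the order, for dark characters on a light (paper) background.
    return row.map(v => {
        const level = Math.min(RAMP.length - 1,
                               Math.floor((v / 256) * RAMP.length));
        const idx = invert ? (RAMP.length - 1 - level) : level;
        return RAMP[idx];
    }).join("");
}

For example, `rowToAscii([0, 128, 255])` yields a faint, a medium, and a dense character, and passing `invert = true` reverses that order – which is also all that the ‘invert’ flag mentioned further down really does.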

In spite of recognizing this, I have persevered, and converted an arbitrary video-clip of mine into ASCII-art, programmatically. The following is the link by which it can be viewed:

And yes, the viewer will need to enable JavaScript from my site in order to see an actual animation, because JavaScript is what advances the actual ‘iframe’.

(Updated 6/26/2021, 14h45… )

(As of 6/20/2021, 20h25: )

A valid question to ask could be whether ASCII-art ever served any real purpose. The truthful answer is that, as early as the 1970s, users were connecting to time-sharing computers via mechanical terminals, in the form of Teletypewriters.

Because those terminals were only able to print text on paper, and because some users wanted to print images – usually for very non-academic purposes – they allowed their printers to hammer out ASCII-art. Just like my example, it took the form of black characters on a white (paper) background. And such a session would take up about ½ an hour of time, just so that the person could obtain a permanent copy of an illicit image.

The ASCII-art which was played with in the 1990s assumed that everybody had CRT monitors, and was therefore calculated to display properly as bright characters on a black background.

Software today that generates ASCII-art generally has an ‘invert’ flag, which reverses the brightness order, in case people view their results in a word processor or text editor, as dark characters on a light background.

(Update 6/22/2021, 15h10: )

(Concerning .FLIC / .FLI / .FLC Files.)

Another piece of Computing Arcana worth mentioning is that of so-called ‘FLIC Video Files’. This is a video format seldom used anymore. However, while it was in use, certain software authors also wrote a program named ‘aaflip’ – which can be installed with the Debian package ‘aview’ – that can input FLIC Videos as streams and render them in real time, as ASCII-art.

One reason this works well is the fact that, like Animated GIF Files, FLIC Videos are palletized. This means that, especially in videos suitable for rendering as ASCII-art, each pixel can only have 1 colour out of 64 possible colours. Hence, it was straightforward to code ‘aaflip’ to translate each possible pixel value into 1 ASCII character.
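A minimal sketch of that translation, assuming a 64-entry grey palette – the names and the character ramp here are my own illustration, not the actual ‘aaflip’ code – shows why it is so cheap: the table is built once, after which every frame is a pure lookup per pixel.

// An arbitrary example ramp, ordered from light to dark.
const RAMP = " .,:;irsXA253hMHGS#9B&@";

// Build a 64-entry lookup table once: palette index -> ASCII character.
const table = Array.from({ length: 64 },
    (_, i) => RAMP[Math.floor(i * RAMP.length / 64)]);

// Translate one frame of palette indices into lines of ASCII text.
function frameToAscii(indices, width) {
    let out = "";
    for (let i = 0; i < indices.length; i++) {
        out += table[indices[i]];
        if ((i + 1) % width === 0) out += "\n";
    }
    return out;
}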

However, I have two issues with trends that still exist, regarding this use of ‘aaflip’:

• Preparing the FLIC Video constitutes manual work, because the palette must be well-chosen. Under Linux, this can be done using ‘GIMP-GAP‘. And,
• Some people will actually recommend that users ‘SSH’ in to a Linux computer, to view the resulting ASCII animation. This requires that non-Linux users have the cooperation of a Linux user.

Having said that, I’d say that ‘some format’ like a FLIC Video is actually more efficient to stream, than actual ASCII, because…

• A colour belonging to a 64-colour palette only requires 6 bits to define, while 7/8-bit ASCII is being assumed. And,
• The very low-resolution video required is also compressed, as videos should be, by the FLIC Format, while the ASCII that my suggested link streams has not been. Mind you, ASCII Text could also just be Zipped or GZipped… (:1)
• (Redundant.) If a FLIC Video is to be streamed, then the receiving computer needs the corresponding decoder. Hence, to deploy this to an unprepared audience would essentially require that the equivalent of ‘aaflip’ be implemented in-browser, preferably in JavaScript. And the original authors of ‘aaflip’ have not invested the additional time that would be required, even to allow that program to render Animated GIF Files as an alternative (as far as I know).
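The arithmetic behind the first point can be sketched quickly. The helper names are hypothetical, purely for illustration; the frame dimensions assume the 118-character width my script uses, with a guessed height:

// Bytes needed to bit-pack N palette indices at 'bitsPerIndex' bits each.
function packedBytes(cells, bitsPerIndex = 6) {
    return Math.ceil(cells * bitsPerIndex / 8);
}

// Bytes needed for one 8-bit ASCII character per cell (newlines ignored).
function asciiBytes(cells) {
    return cells;
}

// A hypothetical 118x54 frame:
const cells = 118 * 54;   // 6372 cells
// packedBytes(cells) -> 4779 bytes, versus asciiBytes(cells) -> 6372 bytes,
// before either stream is additionally compressed.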

Yet, the reader is free to explore Arcana in any way they choose.

(Update 6/22/2021, 19h15: )

After having done more research on this, I’ve found that the chain of actions I just proposed, using GIMP and then ‘aaflip’, does not work on any modern Linux version. I can see two possible reasons why it does not:

(Update 6/22/2021, 19h45: )

Some readers might be puzzled as to why I’m skeptical about the possibility that ‘aaflip’ just can’t do what it used to be able to do – let’s say, ‘because the libraries have lost their code’. The reason I’d give is the simple fact that Debian Maintainers were able to compile both, and Debian executables generally have to be compiled against library versions belonging to the same branch of Debian. One thing which Debian Maintainers are known for, however, is compiling batches of code without verifying in thorough ways whether they still work.

(Update 6/23/2021, 8h30: )

1:)

I have just completed a task mentioned earlier. It was not a very good solution (yet) to change the URL of an iFrame for every animation frame, so that a separate request to my server took place each time. In response to this weakness, I have now re-implemented my viewer JavaScript, to download one ZIP File, and then to decompress that into the entire sequence of animation / ASCII frames (within the Web-browser).

Here is the JavaScript of the viewer, that runs on the browser, for all to see…


function sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

// 'playDest' will be the element ID for the ASCII to be played to.
// 'obj' will be a DataHandler object, with the Boolean attribute .continuePlay
async function play(myData, playDest, obj) {

    var i = 0;
    var text = "";

    // Get all entries from the blob (using the zip.js library).
    const reader = new zip.ZipReader(new zip.BlobReader(myData));
    const entries = await reader.getEntries();

    while ((i < entries.length) && obj.continuePlay) {

        let entry = entries[i];

        // Filter zipped entries to valid filenames.
        if ((! entry.directory) && entry.filename.match(/^out_[0-9]{4}\.(txt|html)$/)) {

            // Get current entry content as text by using a TextWriter.
            text = await entry.getData(new zip.TextWriter());

            // Set field.
            document.getElementById(playDest).innerHTML = text;

            await sleep(500);
        }

        i = i + 1;
    }

    // Close the ZipReader.
    reader.close();

    // Render XMLHttpRequest object as passive as possible.
    obj.xhr.abort();
    obj.continuePlay = false;
}

// Construct an object, that will stream animation frames.
// myDest: The ID of the HTML Element, the Inner HTML of which is to receive the frames.
// base_URL: The Folder URL which must contain the file 'content.zip'.
function DataHandler (myDest, base_URL) {
    this.baseURL = base_URL;
    this.dest = myDest;
    this.xhr = new XMLHttpRequest();

    // Afterthought: Some means should exist, to stop playback...
    this.continuePlay = false;
    this.timeStopped = 0;

    this.receiveZip = function (obj) {
        if (obj.xhr.status < 400) {
            play(obj.xhr.response, obj.dest, obj);
        } else {
            alert("Could not retrieve Zip file.");
            obj.xhr.abort();
        }
    }

    this.verifyTimeout = function (obj) {
        if (obj.xhr.readyState == 4) {
            alert("Could not retrieve Zip file.");
            obj.xhr.abort();
        }
    }

    // 'start()' will be used to start the player-object.
    this.start = function () {
        // When methods are specified as handlers, 'this' works differently
        // from how it would work, according to true OO semantics.
        // Therefore, a local variable must be created as a synonym, for later use.
        var that = this;

        // Just in case 'start()' is called again, before 'stop()' has been called.
        var now = new Date();
        if (this.continuePlay || ((now.getTime() - this.timeStopped) < 1500)) {
            return 0;
        }

        this.continuePlay = true;
        this.xhr.open("GET", this.baseURL + "content.zip", true);

        // Load the zip file as a Blob.
        this.xhr.responseType = 'blob';
        this.xhr.timeout = 5000;  // Abort after 5 seconds.

        // Use 'that' !
        this.xhr.onload = function () { that.receiveZip(that); }
        this.xhr.ontimeout = function () { that.verifyTimeout(that); }
        this.xhr.send();
    }

    // 'stop()' - Stop playback when called on DataHandler object.
    this.stop = function () {
        var now = new Date();
        this.timeStopped = now.getTime();
        this.continuePlay = false;
    }
}

// In theory, any number of player objects could be created.
var dataHandler1 = new DataHandler("myField",
    "http://dirkmittler.homeip.net/text/Brotherhood_Wolf-asciiart/");

function myLoad() {
    // 'zip.configure(...)' must only be called once, when the page is loaded.
    zip.configure({ useWebWorkers: false });

    // In practice, only one stream of ASCII characters will be played...
    dataHandler1.start();
}


(Code Revised 6/24/2021, 22h30. This code revision is a major one, because it replaces the use of non-object global variables with an object-oriented style of writing JavaScript. Edit 6/25/2021, 4h20: Additionally, I have fleshed out the code with more comments, and changed the name of one method, to make the code easier for my reader to understand, and perhaps even to use for themselves. Edit 6/25/2021, 6h35: This time, I added a ‘.stop()’ method to the ‘DataHandler’ object, which controls playback of one stream. Edit 6/25/2021, 7h10: Although it should theoretically be possible, I dislike naming functions in my HTML body as methods of one object instance. For that reason, in the demo that runs above, I have added wrappers around ‘dataHandler1.start()’ and ‘dataHandler1.stop()’, both of which are global functions – aka ‘Free Functions’ – and either Free Function is the ‘onclick’ event of a button. Making those handlers methods might just create problems with the way ‘this’ works.)

I used a JavaScript Library, which can be found at This Location.

When it comes to the subject of how I generated the ASCII-art itself (on my server), that subject is a little more complex.
The act of converting a Video stream into characters was performed using the following (Linux) shell-script:


#!/bin/bash

AC='/usr/bin/asciiart'
U2D='/usr/bin/unix2dos'
FF='/usr/bin/ffmpeg'
IM='/usr/bin/convert'
RC='/usr/bin/recode'

if [ ! -f "$1" ] ; then
	exit 1
fi

mkdir "${1%.*}-asciiart" || exit 1

if ls out_????.??? 1>/dev/null 2>/dev/null ; then
	echo "Excuse me, I need to create files named out_0000.png..."
	exit 1
fi

echo "Creating frame-files..."
sleep 2

$FF -i "$1" -an -vf fps=2 -vsync 0 out_%4d.png || exit 1

for i in out_????.png
do
	$IM ${i} -colorspace Gray -separate -average -colors 16 "${i%.png}_e.png"

	echo "<pre>" > "${i%.png}.txt"
	echo "<code>" >> "${i%.png}.txt"

	$AC -i -w 118 "${i%.png}_e.png" | $RC utf8..html >> "${i%.png}.txt"

	echo >> "${i%.png}.txt"
	echo "</code>" >> "${i%.png}.txt"
	echo "</pre>" >> "${i%.png}.txt"

	$U2D -q "${i%.png}.txt"
	mv "${i%.png}.txt" "${1%.*}-asciiart"
	echo "Generated ${i%.png}.txt"
done

rm -f out_????.txt
rm -f out_????.png
rm -f out_????_e.png

However, this would only quantize the brightness range down to 16 shades of grey, linearly subdividing the maximum range, which in turn leads to the inferior quality. As the reader may have noticed, I wanted to quantize to fewer steps than what the makers of ‘aalib’ apply.
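What linearly subdividing the range amounts to can be sketched as follows. This is my own illustration of the principle, not the code that ‘aalib’ or ImageMagick actually run:

// Quantize an 8-bit brightness value (0..255) down to 'levels' equal bins,
// then re-expand the bin index to a representative 0..255 brightness.
// Everything finer than 256/levels is simply discarded.
function quantize(v, levels = 16) {
    const bin = Math.min(levels - 1, Math.floor(v * levels / 256));
    return Math.round(bin * 255 / (levels - 1));
}

For instance, with 16 levels, an input of 127 lands in bin 7 and comes back out as 119: a whole span of nearby brightnesses collapses onto one value, which is exactly the loss of detail complained about above.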

In order to bring out the video better, what I did beforehand was to use ‘GIMP-GAP’ to reorganize the video into a single image that has (n) layers, where the animation is to have (n) frames. Then, within GIMP, I give the menu command ‘Image -> Mode -> Indexed…’, and set 32 colours for the optimized palette. Next, still within GIMP, I assign a new palette, choosing the one that offers 32 shades of grey. That can be found under ‘Colors -> Map -> Set Colormap’. This I export to a .GIF File, which, in turn, my script above further quantizes to 16.

Finally, I give the command:


$ zip content.zip out_????.txt



(Or the equivalent command that uses the actual filename extensions, in case they’ve been renamed to ‘.html’.)

(Update 6/24/2021, 14h00: )

If the reader’s browser predates the year 2015, then they will not be able to view the animation, and one important reason is the fact that I coded trendy JavaScript, which makes use of the ‘let’ keyword (instead of always using ‘var’). ‘let’ was only introduced into the JavaScript specification in 2015, and happens to be the cleanest way to do what the code above is supposed to do.
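For readers unfamiliar with the difference, here is a self-contained illustration of why ‘let’ matters inside loops. This snippet is separate from my viewer code, purely for demonstration:

// 'var' is function-scoped: every closure created in the loop shares one
// variable, so all of them see its final value.
function withVar() {
    const fns = [];
    for (var i = 0; i < 3; i++) { fns.push(() => i); }
    return fns.map(f => f());   // [3, 3, 3]
}

// 'let' is block-scoped: each iteration gets its own binding, so each
// closure remembers the value it was created with.
function withLet() {
    const fns = [];
    for (let i = 0; i < 3; i++) { fns.push(() => i); }
    return fns.map(f => f());   // [0, 1, 2]
}

With ‘var’, code that captures the loop variable (as my viewer effectively does with ‘entry’) needs extra workarounds, such as an immediately-invoked function; with ‘let’, it just works.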

I suppose the question exists separately, of whether a browser dating back to 2015 could also load the JavaScript Library which I made use of, because I don’t know what advanced JS features that Library may use.

(Update 6/24/2021, 17h20: )

There’s a small piece of minutiae which I may as well log about my own activities, having to do with 2 missing frames at the end of the sequence displayed according to this posting.

At first, the sequence went from ‘out_0001.html’ to ‘out_0205.html’. At that time, I wrote 2 different viewers, which were hard-coded to retrieve that exact frame-range from my server.

But at some later point in time, I chose a better way to quantize the original video, which only generated 203 actual frames of ASCII text, no longer 205. Because those last 2 frames no longer matched, I deleted them from the server, and wrote my 3rd viewer, which only plays the ASCII frames actually stored in the ZIP File.

What I did not realize, once I had uploaded the 3rd version of my viewer, was that the earlier 2 viewers were still accessible on my server, and that those would first play most of the animation, after which they would display an error message over the missing ASCII frames ‘out_0204.html’ and ‘out_0205.html’.

Sorry. That error has now been rectified.

(Update 6/26/2021, 14h45: )

There is one more observation to add, about my initial inability to use the ‘aaflip’ program – the only purpose of which is to view FLI or FLC Videos as ASCII-art – to view 8-bit FLC Videos. There is no deficiency on any of my computers in actually viewing those as videos: ‘ffmpeg’ will decode (but not encode) them, and ‘VLC’ and ‘Xine’ will also play them fine.

When viewing prospective videos, I noticed that those had been composed with very exaggerated contrast, between an almost-white, bright subject and a completely black background. Therefore, any ability to view those as ASCII-art is of limited relevance, since most videos that originate in the wild have a greater range of colours, which must also be converted into ASCII characters…

Further, there are 2 problems which will befall GIMP users, who might wish to Export an (n)-layered image as an FLC animation:

• The user may have given the image a palette of only 64 colours. This results in an output-palette of 256, out of which only the first 64 are actually used (differently from how GIMP exports GIFs). And,
• For some strange reason, GIMP will only output that animation to run at 25 FPS – i.e., at 40ms /Frame – regardless of what the user chose.

Dirk